Apache Airflow (or simply Airflow) is a platform to programmatically author, schedule, and monitor workflows.
When workflows are defined as code, they become more maintainable, versionable, testable, and collaborative.
Use Airflow to author workflows as directed acyclic graphs (DAGs) of tasks. The Airflow scheduler executes your tasks on an array of workers while following the specified dependencies. Rich command line utilities make performing complex surgeries on DAGs a snap. The rich user interface makes it easy to visualize pipelines running in production, monitor progress, and troubleshoot issues when needed.
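As a quick illustration, here is a minimal sketch of a DAG written with the Airflow 2.x TaskFlow API; the pipeline and task names are made up for this example:

```python
# Minimal DAG sketch using the Airflow 2.x TaskFlow API.
# The pipeline and task names here are illustrative only.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule_interval="@daily", start_date=datetime(2021, 1, 1), catchup=False)
def example_pipeline():
    @task
    def extract():
        # Return values are passed between tasks via XCom (small metadata only).
        return {"rows": 42}

    @task
    def report(stats: dict):
        print(f"extracted {stats['rows']} rows")

    # Calling one task with another's result defines the dependency.
    report(extract())


example_dag = example_pipeline()
```

The scheduler infers the `extract -> report` dependency from the function calls and runs the tasks in that order.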
Airflow works best with workflows that are mostly static and slowly changing. When the DAG structure is similar from one run to the next, it clarifies the unit of work and continuity. Other similar projects include Luigi, Oozie and Azkaban.
Airflow is commonly used to process data, but has the opinion that tasks should ideally be idempotent (i.e., results of the task will be the same, and will not create duplicated data in a destination system), and should not pass large quantities of data from one task to the next (though tasks can pass metadata using Airflow's XCom feature). For high-volume, data-intensive tasks, a best practice is to delegate to external services specializing in that type of work.
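For instance, one common way to make a task idempotent is to key its writes on the run's logical date, so a re-run overwrites the same partition instead of appending duplicates. A sketch, with a hypothetical destination path:

```python
# Sketch of an idempotent task: writes are keyed on the run's logical date,
# so re-running the task overwrites the same partition rather than creating
# duplicate rows. The destination path is hypothetical.
from airflow.decorators import task


@task
def load_partition(ds=None):
    # Airflow injects 'ds' (the logical date, YYYY-MM-DD) when it is
    # declared as a keyword argument of the task function.
    target = f"warehouse/events/dt={ds}"
    print(f"overwriting partition {target}")
```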
Airflow is not a streaming solution, but it is often used to process real-time data, pulling data off streams in batches.
Apache Airflow is tested with:
|            | Main version (dev)  | Stable version (2.2.0) |
|------------|---------------------|------------------------|
| Python     | 3.6, 3.7, 3.8, 3.9  | 3.6, 3.7, 3.8, 3.9     |
| Kubernetes | 1.18, 1.19, 1.20    | 1.18, 1.19, 1.20       |
| PostgreSQL | 9.6, 10, 11, 12, 13 | 9.6, 10, 11, 12, 13    |
| MySQL      | 5.7, 8              | 5.7, 8                 |
Note: MySQL 5.x versions are unable to or have limitations with running multiple schedulers -- please see the Scheduler docs. MariaDB is not tested/recommended.
Note: SQLite is used in Airflow tests. Do not use it in production. We recommend using the latest stable version of SQLite for local development.
Note: Python v3.10 is not supported yet. For details, see #19059.
Note: If you're looking for documentation for the main branch (latest development branch): you can find it on s.apache.org/airflow-docs.
For more information on Airflow Improvement Proposals (AIPs), visit the Airflow Wiki.
Documentation for dependent projects like provider packages, Docker image, and Helm Chart can be found in the documentation index.
We publish Apache Airflow as the `apache-airflow` package in PyPI. Installing it, however, might sometimes be tricky because Airflow is a bit of both a library and an application. Libraries usually keep their dependencies open, and applications usually pin them, but we should do neither and both simultaneously. We decided to keep our dependencies as open as possible (in `setup.py`) so users can install different versions of libraries if needed. This means that `pip install apache-airflow` will not work from time to time or will produce an unusable Airflow installation.
To have a repeatable installation, however, we keep a set of "known-to-be-working" constraint files in the orphan `constraints-main` and `constraints-2-0` branches. We keep those "known-to-be-working" constraints files separately per major/minor Python version. You can use them as constraint files when installing Airflow from PyPI. Note that you have to specify the correct Airflow tag/version/branch and Python version in the URL.
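The constraint URLs used in the commands below follow a fixed pattern; a small sketch of how the pieces fit together, with versions chosen to match those commands:

```python
# Sketch: how the constraint-file URL is composed from the Airflow version
# (tag/branch) and the major.minor Python version.
AIRFLOW_VERSION = "2.2.0"
PYTHON_VERSION = "3.7"
CONSTRAINT_URL = (
    "https://raw.githubusercontent.com/apache/airflow/"
    f"constraints-{AIRFLOW_VERSION}/constraints-{PYTHON_VERSION}.txt"
)
print(CONSTRAINT_URL)
# https://raw.githubusercontent.com/apache/airflow/constraints-2.2.0/constraints-3.7.txt
```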
Only `pip` installation is currently officially supported.
While it is possible to install Airflow with tools like Poetry or `pip-tools`, they do not share the same workflow as `pip`, especially when it comes to constraint vs. requirements management. Installing via Poetry or `pip-tools` is not currently supported.
If you wish to install Airflow using those tools, you should use the constraint files and convert them to the appropriate format and workflow that your tool requires.
```bash
pip install 'apache-airflow==2.2.0' \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.2.0/constraints-3.7.txt"
```

```bash
pip install 'apache-airflow[postgres,google]==2.2.0' \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.2.0/constraints-3.7.txt"
```
For information on installing provider packages, check providers.
Apache Airflow is an Apache Software Foundation (ASF) project, and our official source code releases follow the ASF Release Policy.
Following the ASF rules, the source packages released must be sufficient for a user to build and test the release provided they have access to the appropriate platform and tools.
There are other ways of installing and using Airflow. Those are "convenience" methods - they are
not "official releases" as stated by the
ASF Release Policy, but they can be used by the users
who do not want to build the software themselves.
Those are - in the order of most common ways people install Airflow:

- PyPI releases, to install Airflow using the standard `pip` tool
- Docker Images, to install Airflow via the `docker` tool, use them in Kubernetes, Helm Charts, `docker swarm`, etc. You can read more about using, customising, and extending the images in the Latest docs, and learn details on the internals in the IMAGES.rst document.
- Tags in GitHub, to retrieve the git project sources that were used to generate official source packages
All those artifacts are not official releases, but they are prepared using officially released sources. Some of those artifacts are "development" or "pre-release" ones, and they are clearly marked as such following the ASF Policy.
- **DAGs**: Overview of all DAGs in your environment.
- **Tree**: Tree representation of a DAG that spans across time.
- **Graph**: Visualization of a DAG's dependencies and their current status for a specific run.
- **Task Duration**: Total time spent on different tasks over time.
- **Gantt**: Duration and overlap of a DAG.
- **Code**: Quick way to view source code of a DAG.
As of Airflow 2.0.0, we support a strict SemVer approach for all packages released.
There are a few specific rules that we agreed to that define details of versioning of the different packages:
For example, `amazon 3.0.3` providers can happily be installed with Airflow 2.1.2. If there are limits of cross-dependencies between providers and Airflow packages, they are present in providers as `install_requires` limitations. We aim to keep backwards compatibility of providers with all previously released Airflow 2 versions, but there will sometimes be breaking changes that might make some, or all, providers have a minimum Airflow version specified. A change of that minimum supported Airflow version is a breaking change for the provider, because installing the new provider might automatically upgrade Airflow (which might be an undesired side effect of upgrading the provider).
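As a hypothetical sketch of what such an `install_requires` limitation looks like in a provider's packaging (the package name and version numbers are made up):

```python
# Hypothetical provider packaging sketch: the minimum supported Airflow
# version is expressed as an install_requires limit, which is why
# installing a new provider can pull in a newer Airflow.
from setuptools import setup

setup(
    name="apache-airflow-providers-example",
    version="1.0.0",
    install_requires=["apache-airflow>=2.1.0"],
)
```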
Apache Airflow version life cycle:
| Version | Current Patch/Minor | State     | First Release | Limited Support | EOL/Terminated |
|---------|---------------------|-----------|---------------|-----------------|----------------|
| 2       | 2.2.0               | Supported | Dec 17, 2020  | TBD             | TBD            |
| 1.10    | 1.10.15             | EOL       | Aug 27, 2018  | Dec 17, 2020    | June 17, 2021  |
| 1.9     | 1.9.0               | EOL       | Jan 03, 2018  | Aug 27, 2018    | Aug 27, 2018   |
| 1.8     | 1.8.2               | EOL       | Mar 19, 2017  | Jan 03, 2018    | Jan 03, 2018   |
| 1.7     | 1.7.1.2             | EOL       | Mar 28, 2016  | Mar 19, 2017    | Mar 19, 2017   |
Limited support versions will be supported with security and critical bug fixes only. EOL versions will not get any fixes nor support. We always recommend that all users run the latest available minor release for whatever major version is in use. We highly recommend upgrading to the latest Airflow major release at the earliest convenient time and before the EOL date.
As of Airflow 2.0, we agreed to certain rules we follow for Python and Kubernetes support. They are based on the official release schedule of Python and Kubernetes, nicely summarized in the Python Developer's Guide and Kubernetes version skew policy.
We drop support for Python and Kubernetes versions when they reach EOL. We drop support for those EOL versions in main right after the EOL date, and it is effectively removed when we release the first new MINOR (or MAJOR if there is no new MINOR) version of Airflow. For example, for Python 3.6 it means that we drop support in main right after 23.12.2021, and the first MAJOR or MINOR version of Airflow released after that date will not have it.
The "oldest" supported version of Python/Kubernetes is the default one until we decide to switch to
later version. "Default" is only meaningful in terms of "smoke tests" in CI PRs, which are run using this
default version and the default reference image available. Currently
apache/airflow:2.2.0 images are Python 3.7 images as we are preparing for 23.12.2021 when will
Python 3.6 reaches end of life.
We support a new version of Python/Kubernetes in main after they are officially released. As soon as we make them work in our CI pipeline (which might not be immediate, mostly due to dependencies catching up with new versions of Python), we release new images/support in Airflow based on the working CI setup.
Want to help build Apache Airflow? Check out our contributing documentation.
Official Docker (container) images for Apache Airflow are described in IMAGES.rst.
More than 400 organizations are using Apache Airflow in the wild.
Airflow is the work of the community, but the core committers/maintainers are responsible for reviewing and merging PRs as well as steering conversations around new feature requests. If you would like to become a maintainer, please review the Apache Airflow committer requirements.
If you would love to have Apache Airflow stickers, t-shirts, etc., then check out the Redbubble Shop.
The CI infrastructure for Apache Airflow has been sponsored by: