🔗 Original link: https://airflow.apache.org/docs/apache-airflow/stable/installation.html

This page describes installations using the apache-airflow package published on PyPI, but some of the information may also be useful when installing with other tools.

:::info Note:
Airflow is also distributed as a Docker image (OCI image). Consider using it to guarantee that the software will always run the same no matter where it is deployed. For more information, see: Docker Image for Apache Airflow.
:::

1、Prerequisites

Airflow is tested with:

  • Python: 3.6, 3.7, 3.8
  • Databases:
    • PostgreSQL: 9.6, 10, 11, 12, 13
    • MySQL: 5.7, 8
    • SQLite: 3.15.0+
  • Kubernetes: 1.18.15, 1.19.7, 1.20.2


Note: MySQL 5.x versions are unable to run multiple schedulers, or have limitations doing so; please see: Scheduler. MariaDB is not tested or recommended.

Note: SQLite is used in Airflow tests. Do not use it in production. We recommend using the latest stable version of SQLite for local development.

Please note that with respect to Python 3 support, Airflow 2.0.0 has been tested with Python 3.6, 3.7, and 3.8, but does not yet support Python 3.9.

2、Installation tools

The official way of installing Airflow is with the pip tool. A recent (November 2020) change in pip's dependency resolver means that only pip 20.2.4 is currently officially supported, although you may have success with 20.3.3+ (it remains to be confirmed whether all of the initial issues from the pip 20.3.0 release have been fixed in 20.3.3). To install Airflow, either downgrade pip to version 20.2.4 with pip install --upgrade pip==20.2.4 or, if you use pip 20.3, add the option --use-deprecated legacy-resolver to your pip install command.
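The two options above, spelled out as shell commands (the bare `apache-airflow` install shown here is illustrative; in practice you would typically add a constraint file, as described in the next section):

```bash
# Option 1: pin pip to the last release that used the old resolver.
pip install --upgrade pip==20.2.4
pip install apache-airflow

# Option 2: keep pip 20.3+, but opt into the legacy resolver for this install.
pip install --use-deprecated legacy-resolver apache-airflow
```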

While there have been some successes using other tools such as poetry or pip-tools, they do not share the same workflow as pip, especially when it comes to constraint vs. requirements management. Installing via Poetry or pip-tools is not currently supported. If you wish to install Airflow using those tools, you should use the constraint files and convert them to the format and workflow that your tool requires.
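For reference, a typical constraint-based install with pip looks like the following sketch (assuming Airflow 2.0.0 on Python 3.8; adjust the versions embedded in the URL to match your environment):

```bash
AIRFLOW_VERSION=2.0.0
PYTHON_VERSION=3.8
CONSTRAINT_URL="https://raw.githubusercontent.com/apache/airflow/constraints-${AIRFLOW_VERSION}/constraints-${PYTHON_VERSION}.txt"

# The constraint file pins every transitive dependency to a tested version.
pip install "apache-airflow==${AIRFLOW_VERSION}" --constraint "${CONSTRAINT_URL}"
```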

3、Airflow extra dependencies

The apache-airflow PyPI basic package only installs what’s needed to get started. Additional packages can be installed depending on what will be useful in your environment. For instance, if you don’t need connectivity with Postgres, you won’t have to go through the trouble of installing the postgres-devel yum package, or whatever equivalent applies on the distribution you are using.

Most of the extra dependencies are linked to a corresponding provider package. For example, the “amazon” extra has a corresponding apache-airflow-providers-amazon provider package that gets installed. When you install Airflow with such extras, the necessary provider packages are installed automatically (the latest PyPI versions of those packages). However, you can freely upgrade and install provider packages independently of the main Airflow installation, as sketched below.
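A sketch of installing the “amazon” extra mentioned above, and later upgrading its provider package independently (the version numbers and constraint URL are illustrative):

```bash
# Installing Airflow with the "amazon" extra automatically pulls in
# the apache-airflow-providers-amazon provider package.
pip install "apache-airflow[amazon]==2.0.0" \
    --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.0.0/constraints-3.8.txt"

# The provider package can then be upgraded independently of Airflow itself.
pip install --upgrade apache-airflow-providers-amazon
```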

For the list of the extras and what they enable, see: Reference for package extras.