Airflow needs to know how to connect to your environment. Information such as hostname, port, login and passwords to other systems and services is handled in the Admin -> Connections section of the UI. The pipeline code you will author will reference the ‘conn_id’ of the Connection objects.
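For illustration, the sketch below retrieves a connection by its ID inside task code; it assumes a connection named my_prod_db already exists and uses the generic BaseHook lookup rather than any specific provider hook.

```python
from airflow.hooks.base import BaseHook

# Look up the connection by conn_id; credentials stay in the metastore
# (or in an environment variable / secrets backend), not in DAG code.
conn = BaseHook.get_connection("my_prod_db")  # "my_prod_db" is a placeholder ID
print(conn.host, conn.login, conn.port)
```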

Connections can be created and managed using either the UI or environment variables.

See the Connections Concepts documentation for more information.

1. Creating a Connection with the UI

Open the Admin -> Connections section of the UI. Click the Create link to create a new connection.

  1. Fill in the Conn Id field with the desired connection ID. It is recommended that you use lower-case characters and separate words with underscores.
  2. Choose the connection type with the Conn Type field.
  3. Fill in the remaining fields. See Handling of special characters in connection params for a description of the fields belonging to the different connection types.
  4. Click the Save button to create the connection.

2. Editing a Connection with the UI

Open the Admin -> Connections section of the UI. Click the pencil icon next to the connection you wish to edit in the connection list.

Modify the connection properties and click the Save button to save your changes.

3. Creating a Connection from the CLI

You may add a connection to the database from the CLI.

Obtain the URI for your connection (see Generating a Connection URI).

Then add the connection like so:

```bash
airflow connections add 'my_prod_db' \
    --conn-uri 'my-conn-type://login:password@host:port/schema?param1=val1&param2=val2'
```

Alternatively you may specify each parameter individually:

```bash
$ airflow connections add 'my_prod_db' \
    --conn-type 'my-conn-type' \
    --conn-login 'login' \
    --conn-password 'password' \
    --conn-host 'host' \
    --conn-port 'port' \
    --conn-schema 'schema' \
    ...
```

4. Exporting Connections from the CLI

You may export connections from the database using the CLI. The supported formats are json, yaml and env.

You may specify the target file as a parameter:

```bash
$ airflow connections export connections.json
```

Alternatively, you may specify the --format parameter to override the format:

```bash
$ airflow connections export /tmp/connections --format yaml
```

You may also specify - for STDOUT:

```bash
$ airflow connections export -
```

The JSON format contains an object whose keys are the connection IDs and whose values are the connection definitions, each expressed as a JSON object. The following is a sample JSON file.

```json
{
    "airflow_db": {
        "conn_type": "mysql",
        "host": "mysql",
        "login": "root",
        "password": "plainpassword",
        "schema": "airflow",
        "port": null,
        "extra": null
    },
    "druid_broker_default": {
        "conn_type": "druid",
        "host": "druid-broker",
        "login": null,
        "password": null,
        "schema": null,
        "port": 8082,
        "extra": "{\"endpoint\": \"druid/v2/sql\"}"
    }
}
```

The YAML file structure is similar to that of JSON: each key is a connection ID and its value is the definition of the connection, expressed as a YAML mapping. The following is a sample YAML file.

```yaml
airflow_db:
  conn_type: mysql
  extra: null
  host: mysql
  login: root
  password: plainpassword
  port: null
  schema: airflow
druid_broker_default:
  conn_type: druid
  extra: '{"endpoint": "druid/v2/sql"}'
  host: druid-broker
  login: null
  password: null
  port: 8082
  schema: null
```

You may also export connections in .env format. The key is the connection ID, and the value describes the connection using the URI. The following is a sample ENV file.

```
airflow_db=mysql://root:plainpassword@mysql/airflow
druid_broker_default=druid://druid-broker:8082?endpoint=druid%2Fv2%2Fsql
```

5. Storing a Connection in Environment Variables

The environment variable naming convention is [AIRFLOW_CONN_{CONN_ID}](https://airflow.apache.org/docs/apache-airflow/stable/cli-and-env-variables-ref.html#envvar-AIRFLOW_CONN_-CONN_ID), all uppercase.

So if your connection ID is my_prod_db then the variable name should be AIRFLOW_CONN_MY_PROD_DB.

:::tips 🔖 Note
Single underscores surround CONN. This is in contrast with the way airflow.cfg parameters are stored, where double underscores surround the config section name. Connections set using environment variables will not appear in the Airflow UI, but you will still be able to use them in your DAG files.
:::

The value of this environment variable must use airflow’s URI format for connections. See the section Generating a Connection URI for more details.
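As a quick illustration that such connections resolve like any other, the hedged sketch below sets the variable in-process and then looks it up by conn_id; in practice the variable would be set in the scheduler's and workers' environment instead.

```python
import os

from airflow.hooks.base import BaseHook

# Define the connection purely via an environment variable;
# nothing is written to the metastore database.
os.environ["AIRFLOW_CONN_MY_PROD_DB"] = (
    "my-conn-type://login:password@host:5432/schema?param1=val1"
)

conn = BaseHook.get_connection("my_prod_db")
print(conn.host, conn.port)  # host 5432
```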

5.1 Using .bashrc (or similar)

If storing the environment variable in something like ~/.bashrc, add as follows:

```bash
$ export AIRFLOW_CONN_MY_PROD_DATABASE='my-conn-type://login:password@host:port/schema?param1=val1&param2=val2'
```

5.2 Using docker .env

If using a Docker .env file, you may need to remove the single quotes.

```
AIRFLOW_CONN_MY_PROD_DATABASE=my-conn-type://login:password@host:port/schema?param1=val1&param2=val2
```

6. Connection URI format

In general, Airflow’s URI format is like so:

```
my-conn-type://my-login:my-password@my-host:5432/my-schema?param1=val1&param2=val2
```

:::tips 🔖 Note
The params param1 and param2 are just examples; you may supply arbitrary urlencoded, JSON-serializable data there.
:::

The above URI would produce a Connection object equivalent to the following:

```python
Connection(
    conn_id='',
    conn_type='my_conn_type',
    description=None,
    login='my-login',
    password='my-password',
    host='my-host',
    port=5432,
    schema='my-schema',
    extra=json.dumps(dict(param1='val1', param2='val2')),
)
```

You can verify a URI is parsed correctly like so:

```python
>>> from airflow.models.connection import Connection
>>> c = Connection(uri='my-conn-type://my-login:my-password@my-host:5432/my-schema?param1=val1&param2=val2')
>>> print(c.login)
my-login
>>> print(c.password)
my-password
```
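The query parameters end up in the connection's extra field; if you need them as a dictionary, the extra_dejson property deserializes them (continuing the session above):

```python
>>> print(c.extra_dejson)
{'param1': 'val1', 'param2': 'val2'}
```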

6.1 Generating a connection URI

To make connection URI generation easier, the [Connection](https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/models/connection/index.html#airflow.models.connection.Connection) class has a convenience method [get_uri()](https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/models/connection/index.html#airflow.models.connection.Connection.get_uri). It can be used like so:

```python
import json
from airflow.models.connection import Connection

c = Connection(
    conn_id='some_conn',
    conn_type='mysql',
    description='connection description',
    host='myhost.com',
    login='myname',
    password='mypassword',
    extra=json.dumps(dict(this_param='some val', that_param='other val*')),
)
print(f"AIRFLOW_CONN_{c.conn_id.upper()}='{c.get_uri()}'")
# AIRFLOW_CONN_SOME_CONN='mysql://myname:mypassword@myhost.com?this_param=some+val&that_param=other+val%2A'
```

Additionally, if you have created a connection, you can use the airflow connections get command.

```bash
$ airflow connections get sqlite_default
Id: 40
Conn Id: sqlite_default
Conn Type: sqlite
Host: /tmp/sqlite_default.db
Schema: null
Login: null
Password: null
Port: null
Is Encrypted: false
Is Extra Encrypted: false
Extra: {}
URI: sqlite://%2Ftmp%2Fsqlite_default.db
```

6.2 Handling of special characters in connection params

:::tips 🔖 Note
This process is automated as described in the section Generating a Connection URI.
:::

Special handling is required for certain characters when building a URI manually.

For example, if your password contains a /, this fails:

```python
>>> c = Connection(uri='my-conn-type://my-login:my-pa/ssword@my-host:5432/my-schema?param1=val1&param2=val2')
ValueError: invalid literal for int() with base 10: 'my-pa'
```

To fix this, you can encode with [quote_plus()](https://docs.python.org/3/library/urllib.parse.html#urllib.parse.quote_plus):

```python
>>> c = Connection(uri='my-conn-type://my-login:my-pa%2Fssword@my-host:5432/my-schema?param1=val1&param2=val2')
>>> print(c.password)
my-pa/ssword
```
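The encoded value shown above can be produced programmatically with quote_plus() instead of escaping by hand:

```python
>>> from urllib.parse import quote_plus
>>> quote_plus('my-pa/ssword')
'my-pa%2Fssword'
```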

7. Securing Connections

Airflow uses Fernet to encrypt passwords in the connection configurations stored in the metastore database. This guarantees that connection passwords cannot be read or manipulated without the encryption key. For information on configuring Fernet, see Fernet.
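For reference, a Fernet key can be generated with the cryptography package; this is only a sketch of key generation, and the Fernet documentation describes where to place the resulting key (the fernet_key option, or the AIRFLOW__CORE__FERNET_KEY environment variable).

```python
from cryptography.fernet import Fernet

# Generate a new Fernet key; store the printed value as your fernet_key.
fernet_key = Fernet.generate_key()
print(fernet_key.decode())
```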

In addition to retrieving connections from environment variables or the metastore database, you can enable a secrets backend to retrieve connections. For more details see Secrets backend.

8. Custom connection types

Airflow allows the definition of custom connection types, including modifications of the add/edit form for connections. Custom connection types are defined in community-maintained providers, but you can also add a custom provider that adds custom connection types. See Provider packages for a description of how to add custom providers.

The custom connection types are defined via Hooks delivered by the providers. The Hooks can implement methods defined in the protocol class DiscoverableHook. Note that your custom Hook should not derive from this class; it is a dummy example that documents expectations regarding the class fields and methods your Hook might define. Another good example is [JdbcHook](https://airflow.apache.org/docs/apache-airflow-providers-jdbc/stable/_api/airflow/providers/jdbc/hooks/jdbc/index.html#airflow.providers.jdbc.hooks.jdbc.JdbcHook).

By implementing those methods in your hooks and exposing them via the hook-class-names array in the provider metadata, you can customize Airflow in the following ways (a minimal sketch follows the list):

  • Adding custom connection types
  • Adding automated Hook creation from the connection type
  • Adding custom form widgets to display and edit custom “extra” parameters in your connection URL
  • Hiding fields that are not used for your connection
  • Adding placeholders showing examples of how fields should be formatted
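A minimal sketch of such a hook follows; the class attributes and the get_ui_field_behaviour method mirror what DiscoverableHook describes, but the connection type, labels, and field choices here are illustrative assumptions, not a drop-in implementation.

```python
from airflow.hooks.base import BaseHook


class MyServiceHook(BaseHook):
    """Hypothetical hook making a 'my_service' connection type discoverable."""

    # Attributes described by DiscoverableHook; the values are examples only.
    conn_name_attr = "my_service_conn_id"
    default_conn_name = "my_service_default"
    conn_type = "my_service"
    hook_name = "My Service"

    @staticmethod
    def get_ui_field_behaviour():
        # Customize the add/edit connection form in the UI.
        return {
            "hidden_fields": ["port", "extra"],         # fields this connection does not use
            "relabeling": {"host": "Service URL"},      # rename fields in the form
            "placeholders": {"login": "my-user-name"},  # example values shown in the form
        }
```

The provider's metadata then lists the hook's fully qualified class name in its hook-class-names array so that Airflow can discover the connection type.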

You can read more details about how to add custom provider packages in Provider packages.