[airflow.executors.dask_executor.DaskExecutor](https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/executors/dask_executor/index.html#airflow.executors.dask_executor.DaskExecutor) allows you to run Airflow tasks in a Dask Distributed cluster.

Dask clusters can be run on a single machine or on remote networks. For complete details, consult the Distributed documentation.

To create a cluster, first start a Scheduler:

```shell
# default settings for a local cluster
DASK_HOST=127.0.0.1
DASK_PORT=8786

dask-scheduler --host $DASK_HOST --port $DASK_PORT
```

Next, start at least one Worker on any machine that can connect to the Scheduler host:

```shell
dask-worker $DASK_HOST:$DASK_PORT
```

Edit your airflow.cfg to set your executor to [airflow.executors.dask_executor.DaskExecutor](https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/executors/dask_executor/index.html#airflow.executors.dask_executor.DaskExecutor) and provide the Dask Scheduler address in the `[dask]` section. For more information on setting the configuration, see Setting Configuration Options.
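For illustration, the relevant airflow.cfg entries might look like the sketch below, which assumes the local-cluster defaults used above (`127.0.0.1:8786`); option names can vary between Airflow versions, so check the configuration reference for your release:

```ini
[core]
executor = DaskExecutor

[dask]
# address of the dask-scheduler started earlier
cluster_address = 127.0.0.1:8786
```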

Please note:

• Each Dask worker must be able to import Airflow and any dependencies you require.
• Dask does not support queues. If an Airflow task was created with a queue, a warning will be raised, but the task will still be submitted to the cluster.
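Since every worker must be able to import Airflow and your task dependencies, it can be useful to check a worker environment before pointing the executor at it. A minimal sketch of such a check, using only the standard library (the module list is a placeholder; substitute your own dependencies):

```python
import importlib.util


def missing_modules(names):
    """Return the subset of `names` that cannot be imported in this environment."""
    return [name for name in names if importlib.util.find_spec(name) is None]


# On a worker host you would check something like
# missing_modules(["airflow", "pandas"]) and expect an empty list.
print(missing_modules(["json", "no_such_module_xyz"]))  # → ['no_such_module_xyz']
```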