GitHub Stats

    • Stars: 65
    • Forks: 20
    • Release Date: April 11, 2025 (about 2 months ago)

    Detailed Description


    mcp-server-apache-airflow


    A Model Context Protocol (MCP) server implementation for Apache Airflow, enabling seamless integration with MCP clients. This project provides a standardized way to interact with Apache Airflow through the Model Context Protocol.

    About

    This project implements a Model Context Protocol server that wraps Apache Airflow's REST API, allowing MCP clients to interact with Airflow in a standardized way. It uses the official Apache Airflow client library to ensure compatibility and maintainability.
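
    The snippet below is an illustrative sketch of the kind of call the server wraps, using the official client library directly; the host and credentials are placeholders, and the project's internal wiring may differ:

    # Illustrative only: listing DAGs through the official apache-airflow-client,
    # roughly the call an MCP "list DAGs" tool would wrap.
    # Host and credentials below are placeholders.
    import airflow_client.client
    from airflow_client.client.api import dag_api

    configuration = airflow_client.client.Configuration(
        host="http://localhost:8080/api/v1",
        username="your-username",
        password="your-password",
    )

    with airflow_client.client.ApiClient(configuration) as api_client:
        dags = dag_api.DAGApi(api_client).get_dags()
        print([dag.dag_id for dag in dags.dags])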

    Feature Implementation Status

    | Feature | API Path | Status |
    | --- | --- | --- |
    | DAG Management | | |
    | List DAGs | /api/v1/dags | ✅ |
    | Get DAG Details | /api/v1/dags/{dag_id} | ✅ |
    | Pause DAG | /api/v1/dags/{dag_id} | ✅ |
    | Unpause DAG | /api/v1/dags/{dag_id} | ✅ |
    | Update DAG | /api/v1/dags/{dag_id} | ✅ |
    | Delete DAG | /api/v1/dags/{dag_id} | ✅ |
    | Get DAG Source | /api/v1/dagSources/{file_token} | ✅ |
    | Patch Multiple DAGs | /api/v1/dags | ✅ |
    | Reparse DAG File | /api/v1/dagSources/{file_token}/reparse | ✅ |
    | DAG Runs | | |
    | List DAG Runs | /api/v1/dags/{dag_id}/dagRuns | ✅ |
    | Create DAG Run | /api/v1/dags/{dag_id}/dagRuns | ✅ |
    | Get DAG Run Details | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id} | ✅ |
    | Update DAG Run | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id} | ✅ |
    | Delete DAG Run | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id} | ✅ |
    | Get DAG Runs Batch | /api/v1/dags/~/dagRuns/list | ✅ |
    | Clear DAG Run | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/clear | ✅ |
    | Set DAG Run Note | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/setNote | ✅ |
    | Get Upstream Dataset Events | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/upstreamDatasetEvents | ✅ |
    | Tasks | | |
    | List DAG Tasks | /api/v1/dags/{dag_id}/tasks | ✅ |
    | Get Task Details | /api/v1/dags/{dag_id}/tasks/{task_id} | ✅ |
    | Get Task Instance | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id} | ✅ |
    | List Task Instances | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances | ✅ |
    | Update Task Instance | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id} | ✅ |
    | Clear Task Instances | /api/v1/dags/{dag_id}/clearTaskInstances | ✅ |
    | Set Task Instances State | /api/v1/dags/{dag_id}/updateTaskInstancesState | ✅ |
    | Variables | | |
    | List Variables | /api/v1/variables | ✅ |
    | Create Variable | /api/v1/variables | ✅ |
    | Get Variable | /api/v1/variables/{variable_key} | ✅ |
    | Update Variable | /api/v1/variables/{variable_key} | ✅ |
    | Delete Variable | /api/v1/variables/{variable_key} | ✅ |
    | Connections | | |
    | List Connections | /api/v1/connections | ✅ |
    | Create Connection | /api/v1/connections | ✅ |
    | Get Connection | /api/v1/connections/{connection_id} | ✅ |
    | Update Connection | /api/v1/connections/{connection_id} | ✅ |
    | Delete Connection | /api/v1/connections/{connection_id} | ✅ |
    | Test Connection | /api/v1/connections/test | ✅ |
    | Pools | | |
    | List Pools | /api/v1/pools | ✅ |
    | Create Pool | /api/v1/pools | ✅ |
    | Get Pool | /api/v1/pools/{pool_name} | ✅ |
    | Update Pool | /api/v1/pools/{pool_name} | ✅ |
    | Delete Pool | /api/v1/pools/{pool_name} | ✅ |
    | XComs | | |
    | List XComs | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries | ✅ |
    | Get XCom Entry | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key} | ✅ |
    | Datasets | | |
    | List Datasets | /api/v1/datasets | ✅ |
    | Get Dataset | /api/v1/datasets/{uri} | ✅ |
    | Get Dataset Events | /api/v1/datasetEvents | ✅ |
    | Create Dataset Event | /api/v1/datasetEvents | ✅ |
    | Get DAG Dataset Queued Event | /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri} | ✅ |
    | Get DAG Dataset Queued Events | /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents | ✅ |
    | Delete DAG Dataset Queued Event | /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri} | ✅ |
    | Delete DAG Dataset Queued Events | /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents | ✅ |
    | Get Dataset Queued Events | /api/v1/datasets/{uri}/dagRuns/queued/datasetEvents | ✅ |
    | Delete Dataset Queued Events | /api/v1/datasets/{uri}/dagRuns/queued/datasetEvents | ✅ |
    | Monitoring | | |
    | Get Health | /api/v1/health | ✅ |
    | DAG Stats | | |
    | Get DAG Stats | /api/v1/dags/statistics | ✅ |
    | Config | | |
    | Get Config | /api/v1/config | ✅ |
    | Plugins | | |
    | Get Plugins | /api/v1/plugins | ✅ |
    | Providers | | |
    | List Providers | /api/v1/providers | ✅ |
    | Event Logs | | |
    | List Event Logs | /api/v1/eventLogs | ✅ |
    | Get Event Log | /api/v1/eventLogs/{event_log_id} | ✅ |
    | System | | |
    | Get Import Errors | /api/v1/importErrors | ✅ |
    | Get Import Error Details | /api/v1/importErrors/{import_error_id} | ✅ |
    | Get Health Status | /api/v1/health | ✅ |
    | Get Version | /api/v1/version | ✅ |

    Setup

    Dependencies

    This project depends on the official Apache Airflow client library (apache-airflow-client). It will be automatically installed when you install this package.

    Environment Variables

    Set the following environment variables:

    AIRFLOW_HOST=<your-airflow-host>        # Optional, defaults to http://localhost:8080
    AIRFLOW_USERNAME=<your-airflow-username>
    AIRFLOW_PASSWORD=<your-airflow-password>
    AIRFLOW_API_VERSION=v1                  # Optional, defaults to v1
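
    As a rough sketch (not this project's actual code), the variables above can be resolved with the standard library, falling back to the documented defaults:

    # Illustrative sketch: resolving the variables above with their documented
    # defaults. The project's real configuration code may differ.
    import os

    airflow_host = os.environ.get("AIRFLOW_HOST", "http://localhost:8080")
    airflow_username = os.environ.get("AIRFLOW_USERNAME")
    airflow_password = os.environ.get("AIRFLOW_PASSWORD")
    api_version = os.environ.get("AIRFLOW_API_VERSION", "v1")

    base_url = f"{airflow_host.rstrip('/')}/api/{api_version}"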
    

    Usage with Claude Desktop

    Add to your claude_desktop_config.json:

    {
      "mcpServers": {
        "mcp-server-apache-airflow": {
          "command": "uvx",
          "args": ["mcp-server-apache-airflow"],
          "env": {
            "AIRFLOW_HOST": "https://your-airflow-host",
            "AIRFLOW_USERNAME": "your-username",
            "AIRFLOW_PASSWORD": "your-password"
          }
        }
      }
    }
    

    For read-only mode (recommended for safety):

    {
      "mcpServers": {
        "mcp-server-apache-airflow": {
          "command": "uvx",
          "args": ["mcp-server-apache-airflow", "--read-only"],
          "env": {
            "AIRFLOW_HOST": "https://your-airflow-host",
            "AIRFLOW_USERNAME": "your-username",
            "AIRFLOW_PASSWORD": "your-password"
          }
        }
      }
    }
    

    Alternative configuration using uv:

    {
      "mcpServers": {
        "mcp-server-apache-airflow": {
          "command": "uv",
          "args": [
            "--directory",
            "/path/to/mcp-server-apache-airflow",
            "run",
            "mcp-server-apache-airflow"
          ],
          "env": {
            "AIRFLOW_HOST": "https://your-airflow-host",
            "AIRFLOW_USERNAME": "your-username",
            "AIRFLOW_PASSWORD": "your-password"
          }
        }
      }
    }
    

    Replace /path/to/mcp-server-apache-airflow with the actual path where you've cloned the repository.

    Selecting the API groups

    You can select which API groups to expose by setting the --apis flag.

    uv run mcp-server-apache-airflow --apis "dag,dagrun"
    

    The default is to use all APIs.
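
    As an illustrative sketch (not the project's actual CLI code), a comma-separated flag like this can be parsed and validated against the allowed values listed below:

    # Illustrative sketch of parsing and validating a comma-separated --apis
    # flag; the project's real CLI may be structured differently.
    import argparse

    ALLOWED_APIS = {
        "config", "connections", "dag", "dagrun", "dagstats", "dataset",
        "eventlog", "importerror", "monitoring", "plugin", "pool",
        "provider", "taskinstance", "variable", "xcom",
    }

    parser = argparse.ArgumentParser()
    parser.add_argument("--apis", default=",".join(sorted(ALLOWED_APIS)))
    args = parser.parse_args(["--apis", "dag,dagrun"])

    selected = {name.strip() for name in args.apis.split(",")}
    unknown = selected - ALLOWED_APIS
    if unknown:
        raise SystemExit(f"Unknown API group(s): {', '.join(sorted(unknown))}")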

    Allowed values are:

    • config
    • connections
    • dag
    • dagrun
    • dagstats
    • dataset
    • eventlog
    • importerror
    • monitoring
    • plugin
    • pool
    • provider
    • taskinstance
    • variable
    • xcom

    Read-Only Mode

    You can run the server in read-only mode by using the --read-only flag. This will only expose tools that perform read operations (GET requests) and exclude any tools that create, update, or delete resources.

    uv run mcp-server-apache-airflow --read-only
    

    In read-only mode, the server exposes only tools such as:

    • Listing DAGs, DAG runs, tasks, variables, connections, etc.
    • Getting details of specific resources
    • Reading configurations and monitoring information
    • Testing connections (non-destructive)

    Write operations, such as creating, updating, or deleting DAGs, variables, and connections, or triggering DAG runs, are not available in read-only mode.
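
    Conceptually, the filtering can be pictured as tagging each tool with whether it only reads, then dropping everything else; the sketch below is an illustration, not the project's actual implementation:

    # Illustrative sketch of read-only filtering: expose only tools that
    # perform read (GET-style) operations. Not the project's actual code.
    from dataclasses import dataclass

    @dataclass
    class Tool:
        name: str
        read_only: bool  # True for GET-style operations

    tools = [
        Tool("list_dags", read_only=True),
        Tool("get_dag", read_only=True),
        Tool("trigger_dag_run", read_only=False),
        Tool("delete_variable", read_only=False),
    ]

    read_only_mode = True
    exposed = [t for t in tools if t.read_only or not read_only_mode]
    print([t.name for t in exposed])  # ['list_dags', 'get_dag']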

    You can combine read-only mode with API group selection:

    uv run mcp-server-apache-airflow --read-only --apis "dag,variable"
    

    Manual Execution

    You can also run the server manually:

    make run
    

    make run accepts the following options:

    • --port: Port to listen on for SSE (default: 8000)
    • --transport: Transport type (stdio/sse, default: stdio)

    Alternatively, you can run the SSE server directly, which accepts the same parameters:

    make run-sse
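
    For context, an MCP server built on the Python SDK's FastMCP can switch between these transports roughly as sketched below; this assumes the mcp package and is not necessarily how this repository's entry point is written:

    # Hedged sketch, assuming the MCP Python SDK's FastMCP; the repository's
    # actual entry point may differ.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("mcp-server-apache-airflow", port=8000)  # port is used by SSE

    if __name__ == "__main__":
        mcp.run(transport="stdio")  # or transport="sse" to serve over HTTP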
    

    Installing via Smithery

    To install Apache Airflow MCP Server for Claude Desktop automatically via Smithery:

    npx -y @smithery/cli install @yangkyeongmo/mcp-server-apache-airflow --client claude
    

    Development

    Setting up Development Environment

    1. Clone the repository:
    git clone https://github.com/yangkyeongmo/mcp-server-apache-airflow.git
    cd mcp-server-apache-airflow
    
    2. Install development dependencies:
    uv sync --dev
    
    3. Create a .env file for environment variables (optional for development):
    touch .env
    

    Note: No environment variables are required for running tests. The AIRFLOW_HOST defaults to http://localhost:8080 for development and testing purposes.

    Running Tests

    The project uses pytest for testing, with the following command available:

    # Run all tests
    make test
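
    As an illustrative example, a pytest case for the documented AIRFLOW_HOST default might look like the following; get_airflow_host is a hypothetical helper, not a function from this repository:

    # Illustrative pytest sketch; get_airflow_host is a hypothetical helper.
    import os

    def get_airflow_host() -> str:
        return os.environ.get("AIRFLOW_HOST", "http://localhost:8080")

    def test_airflow_host_defaults_to_localhost(monkeypatch):
        monkeypatch.delenv("AIRFLOW_HOST", raising=False)
        assert get_airflow_host() == "http://localhost:8080"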
    

    Code Quality

    # Run linting
    make lint
    
    # Run code formatting
    make format
    

    Continuous Integration

    The project includes a GitHub Actions workflow (.github/workflows/test.yml) that automatically:

    • Runs tests on Python 3.10, 3.11, and 3.12
    • Executes linting checks using ruff
    • Runs on every push and pull request to the main branch

    The CI pipeline ensures code quality and compatibility across supported Python versions before any changes are merged.

    Contributing

    Contributions are welcome! Please feel free to submit a Pull Request.

    The package is deployed automatically to PyPI when project.version is updated in pyproject.toml. Follow semver for versioning.

    Please include a version update in your PR so that changes to core logic are released.

    License

    MIT License

    Star History

    [Star history chart: repository stars over time, February-July 2025]
