
In this section, you’ll learn how to use Azure Key Vault as a secrets backend on Astro Private Cloud.

Prerequisites

  • A Deployment.
  • The Astro CLI.
  • An Astro project initialized with astro dev init.
  • An existing Azure Key Vault linked to a resource group.
  • Your Key Vault URL. To find this, go to your Key Vault overview page > Vault URI.

If you do not already have Key Vault configured, see the Microsoft Azure documentation.
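If you use the Azure CLI, you can also read the vault URI directly instead of going through the portal. A sketch, assuming your vault is named <your-keyvault-name> and you are signed in with az login:

```shell
# Print the vault URI (the value to use as <your-vault-url> in Step 3)
az keyvault show --name "<your-keyvault-name>" --query properties.vaultUri -o tsv
```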

Step 1: Register Astro Private Cloud as an app on Azure

Follow the Microsoft Azure documentation to register a new application for Astro Private Cloud. At a minimum, you need to add a secret that Astro Private Cloud can use to authenticate to Key Vault. Note the value of the application’s client ID and secret for Step 3.
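If you prefer the command line over the portal, the registration can be sketched with the Azure CLI. The display name below is only an example, and you must be signed in with az login:

```shell
# Register a new application for Astro Private Cloud
az ad app create --display-name "astro-private-cloud-keyvault"

# Create a client secret for the app. Note the "password" field in the
# output (the client secret) along with the app's client ID for Step 3.
az ad app credential reset --id "<your-client-id>"
```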

Step 2: Create an access policy

Follow the Microsoft documentation to create a new access policy for the application that you just registered. The settings you need to configure for your policy are:
  • Configure from template: Select Key, Secret, & Certificate Management.
  • Select principal: Select the name of the application that you registered in Step 1.
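The same policy can also be created from the Azure CLI. A sketch with placeholder names, granting the secret permissions the Airflow backend needs to read secrets:

```shell
# Allow the app registered in Step 1 (by its client ID) to read and list secrets
az keyvault set-policy \
  --name "<your-keyvault-name>" \
  --spn "<your-client-id>" \
  --secret-permissions get list
```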

Step 3: Set up Key Vault locally

In your Astro project, add the following line to your requirements.txt file:
apache-airflow-providers-microsoft-azure
In your Dockerfile, add the following environment variables with your own values:
ENV AZURE_CLIENT_ID="<your-client-id>" # Found on App Registration page > 'Application (Client) ID'
ENV AZURE_TENANT_ID="<your-tenant-id>" # Found on App Registration page > 'Directory (tenant) ID'
ENV AZURE_CLIENT_SECRET="<your-client-secret>" # Found on App Registration page > Certificates and Secrets > Client Secrets > 'Value'
ENV AIRFLOW__SECRETS__BACKEND=airflow.providers.microsoft.azure.secrets.key_vault.AzureKeyVaultBackend
ENV AIRFLOW__SECRETS__BACKEND_KWARGS='{"connections_prefix": "airflow-connections", "variables_prefix": "airflow-variables", "vault_url": "<your-vault-url>"}'
This tells Airflow to look for variable information at the airflow-variables-* path in Azure Key Vault and connection information at the airflow-connections-* path. In the next step, you’ll run an example Dag to test this configuration locally.
By default, this setup requires that you prefix any secret names in Key Vault with airflow-connections or airflow-variables. If you don’t want to use prefixes in your Key Vault secret names, set the values for "sep", "connections_prefix", and "variables_prefix" to "" within AIRFLOW__SECRETS__BACKEND_KWARGS.
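To make the naming concrete, here is a rough Python sketch of how a secret name is composed from the prefix, separator, and a variable key or connection ID. This is an illustration, not the provider’s actual code; because Key Vault secret names can only contain letters, digits, and dashes, underscores are replaced with the separator:

```python
def keyvault_secret_name(prefix: str, key: str, sep: str = "-") -> str:
    """Sketch of the Key Vault secret name Airflow looks up for a given
    variable key or connection ID. Underscores are replaced because Key
    Vault secret names only allow letters, digits, and dashes."""
    return f"{prefix}{sep}{key}".replace("_", sep)

# A variable with key "my_var" is looked up at:
print(keyvault_secret_name("airflow-variables", "my_var"))          # airflow-variables-my-var
# A connection with conn_id "my_postgres" is looked up at:
print(keyvault_secret_name("airflow-connections", "my_postgres"))   # airflow-connections-my-postgres
# With empty prefixes and sep set to "", the secret name is just the key:
print(keyvault_secret_name("", "myvar", sep=""))                    # myvar
```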
If you want to deploy your project to a hosted Git repository before deploying to Astronomer, be sure to save <your-client-id>, <your-tenant-id>, and <your-client-secret> in a secure manner. When you deploy to Astronomer, you should set these values as secrets with the Astro Private Cloud UI.

Step 4: Test Key Vault locally

To test your Key Vault setup on Astro Private Cloud locally, create a new secret in Key Vault containing either a variable or a connection. Once you create a test secret, write a simple Dag that retrieves it and add this Dag to your project’s dags directory. For example, you can use the following Dag to print the value of a variable to your task logs:
from datetime import datetime

from airflow import DAG
from airflow.models import Variable
from airflow.operators.python import PythonOperator

def print_var():
    my_var = Variable.get("<your-variable-key>")
    print(f'My variable is: {my_var}')

with DAG('example_secrets_dags', start_date=datetime(2022, 1, 1), schedule=None) as dag:
    test_task = PythonOperator(
        task_id='test-task',
        python_callable=print_var,
    )
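The test secret referenced by the Dag can be created in the portal or, as a sketch, with the Azure CLI. The vault name is a placeholder, and the secret name uses the airflow-variables prefix so that Variable.get("my-test-variable") resolves it:

```shell
# Create a test variable named "my-test-variable" in Key Vault
az keyvault secret set \
  --vault-name "<your-keyvault-name>" \
  --name "airflow-variables-my-test-variable" \
  --value "my-test-variable"
```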
To test your changes:
  1. Run astro dev stop followed by astro dev start to push your changes to your local Airflow environment.
  2. In the Airflow UI (http://localhost:8080), trigger your new Dag.
  3. Click on test-task > View Logs. If you ran the example Dag above, you should see the contents of your secret in the task logs:
    {logging_mixin.py:109} INFO - My variable is: my-test-variable
    
Once you confirm that the setup was successful, you can delete this Dag.

Step 5: Push changes to Astro Private Cloud

Once you’ve confirmed that your secrets are being imported correctly to your local environment, you’re ready to configure the same feature in a Deployment on Astro Private Cloud.
  1. In the Astro Private Cloud UI, add the same environment variables found in your Dockerfile to your Deployment environment variables. Specify the AZURE_CLIENT_ID, AZURE_TENANT_ID, and AZURE_CLIENT_SECRET variables as Secret to ensure that your credentials are stored securely.
  2. In your Astro project, delete the environment variables from your Dockerfile.
  3. Deploy your changes to Astro Private Cloud.
From here, you can store any Airflow variables or connections as secrets on Key Vault and use them in your project.