Prerequisites
- A user account on AWS with access to AWS cloud resources.
- The AWS CLI.
- The Astro CLI.
- An Astro project.
Retrieve AWS user credentials locally
Run `aws configure` to obtain your user credentials locally. If your organization uses AWS IAM Identity Center (single sign-on), run `aws configure sso` instead.
The AWS CLI then stores your credentials in two separate files:
- `.aws/config`
- `.aws/credentials`
The location of the `.aws` folder depends on your operating system:
- Linux: `/home/<username>/.aws`
- Mac: `/Users/<username>/.aws`
- Windows: `%UserProfile%/.aws`
Configure your Astro project
To give Airflow access to your user credentials, mount the `.aws` folder as a volume in Docker. For Airflow 3, use the provided `docker-compose.override.yml`. For Airflow 2, replace `api-server` with `webserver` and remove the `dag-processor` block.
- In your Astro project, create a file named `docker-compose.override.yml` with the following configuration. The source path of the mounted volume differs between Mac, Linux, and Windows.
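A minimal sketch of such an override file, assuming a Mac source path, the default Astro Runtime Airflow home of `/usr/local/airflow`, and Airflow 3 service names (for Airflow 2, replace `api-server` with `webserver` and remove the `dag-processor` block):

```yaml
# docker-compose.override.yml: mounts your local .aws folder into the
# Airflow containers. Replace <username> with your local username; use
# the Linux or Windows source path from the table above if applicable.
services:
  scheduler:
    volumes:
      - /Users/<username>/.aws:/usr/local/airflow/.aws
  dag-processor:
    volumes:
      - /Users/<username>/.aws:/usr/local/airflow/.aws
  api-server:
    volumes:
      - /Users/<username>/.aws:/usr/local/airflow/.aws
  triggerer:
    volumes:
      - /Users/<username>/.aws:/usr/local/airflow/.aws
```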
- In your Astro project’s `.env` file, add the following environment variables. Make sure that the volume path is the same as the one you configured in the `docker-compose.override.yml`.
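As a sketch, assuming the `.aws` folder is mounted to `/usr/local/airflow/.aws` inside the containers, the `.env` entries point the AWS SDK at the mounted files via the standard `AWS_CONFIG_FILE` and `AWS_SHARED_CREDENTIALS_FILE` variables:

```text
AWS_CONFIG_FILE=/usr/local/airflow/.aws/config
AWS_SHARED_CREDENTIALS_FILE=/usr/local/airflow/.aws/credentials
```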
Airflow can authenticate to AWS in several ways, including:
- Mounted user credentials in the `~/.aws/config` file.
- Configurations in `aws_access_key_id`, `aws_secret_access_key`, and `aws_session_token`.
- An explicit username and password provided in the connection.

This setup uses the mounted user credentials in `~/.aws/config`.
Test your credentials with a secrets backend
Now that Airflow has access to your user credentials, you can use them to connect to your cloud services. Use the following example setup to test your credentials by pulling values from different secrets backends.
- Create a secret for an Airflow variable or connection in AWS Secrets Manager. All Airflow variables and connection keys must be prefixed with the following strings, respectively:
  - `airflow/variables/<my_variable_name>`
  - `airflow/connections/<my_connection_id>`
For example, to store a variable named `my_secret_var`, you need to give the secret the name `airflow/variables/my_secret_var`. When setting the secret type, choose Other type of secret and select the Plaintext option. If you’re creating a connection URI or a non-dict variable as a secret, remove the brackets and quotation marks that are pre-populated in the plaintext field.
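Alternatively, you can create the secret from the AWS CLI. This is a sketch using the example variable name above; substitute your own name and value:

```shell
# Create a plaintext secret holding the value of the Airflow variable.
aws secretsmanager create-secret \
  --name airflow/variables/my_secret_var \
  --secret-string "my secret value"
```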
- Add the following environment variables to your Astro project’s `.env` file. For additional configuration options, see the Apache Airflow documentation. Make sure to specify your `region_name`.
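A minimal sketch of a Secrets Manager backend configuration for the `.env` file, using the prefixes from the previous step; the `region_name` value is a placeholder you must replace:

```text
AIRFLOW__SECRETS__BACKEND=airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend
AIRFLOW__SECRETS__BACKEND_KWARGS={"connections_prefix": "airflow/connections", "variables_prefix": "airflow/variables", "region_name": "<your-region>"}
```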
- Run the following command to start Airflow locally:
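With the Astro CLI from the prerequisites installed, the local Airflow environment is started with:

```shell
astro dev start
```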
- Access the Airflow UI at `localhost:8080` and create an Airflow AWS connection named `aws_standard` with no credentials. See Connections. When you use this connection in your dag, it will fall back to using your configured user credentials.
- Add a dag that uses the secrets backend to your Astro project’s `dags` directory. You can use the following example dag to retrieve `<my_variable_name>` and `<my_connection_id>` from the secrets backend and print them to the terminal.
- In the Airflow UI, unpause your dag and click Play to trigger a dag run.
- View logs for your dag run. If the connection was successful, your masked secrets appear in your logs. See Airflow logging.
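A minimal sketch of the example dag referenced in the steps above, using Airflow 2-style import paths. The variable name `my_secret_var` and connection ID `my_secret_conn` are placeholders for the secrets you created in AWS Secrets Manager; replace them with your own names:

```python
"""Example dag that reads a variable and a connection from the
configured secrets backend and prints them to the task logs."""
from datetime import datetime

from airflow.decorators import dag, task
from airflow.hooks.base import BaseHook
from airflow.models import Variable


@dag(start_date=datetime(2024, 1, 1), schedule=None, catchup=False)
def secrets_backend_test():
    @task
    def print_var():
        # Resolved via the secrets backend as airflow/variables/my_secret_var.
        print(Variable.get("my_secret_var"))

    @task
    def print_conn():
        # Resolved via the secrets backend as airflow/connections/my_secret_conn.
        conn = BaseHook.get_connection("my_secret_conn")
        # Secret fields are masked when they appear in the logs.
        print(conn.get_uri())

    print_var()
    print_conn()


secrets_backend_test()
```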
