Remote Execution Agents generate task logs in your Kubernetes cluster. By default, these logs remain in agent Pods and are lost when the Pods terminate. Configure logging to preserve these logs and make them accessible.
**Airflow 3:** This feature is only available for Airflow 3.x Deployments.
## Logging approaches

### External logging provider (recommended)

Export logs to your logging platform (Splunk, Elasticsearch, CloudWatch, and so on) using a logging sidecar, then configure the Airflow UI to display links to the external logs.

### Object storage (Airflow UI display)

Store logs in object storage (S3, GCS, or Azure Blob Storage) and configure the Astro API server to fetch and display them in the Airflow UI. Logs appear after task completion.

### Real-time streaming to object storage

Extend object storage logging with a Vector sidecar that streams partial logs while tasks run. This provides real-time log visibility in the Airflow UI.

## Comparison
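As an illustrative sketch only, a minimal Vector sidecar configuration that tails task log files and streams them to S3 might look like the following. The log path, bucket name, and key prefix are hypothetical assumptions, not Astronomer defaults:

```yaml
# vector.yaml -- hypothetical sidecar configuration; the log mount path,
# bucket name, and key prefix below are illustrative assumptions.
sources:
  task_logs:
    type: file
    include:
      - /var/log/airflow/tasks/**/*.log   # assumed shared log volume path

sinks:
  s3_logs:
    type: aws_s3
    inputs:
      - task_logs
    bucket: my-airflow-task-logs          # hypothetical bucket name
    key_prefix: "remote-execution/logs/"
    region: us-east-1
    encoding:
      codec: text
```

The sidecar reads from a volume shared with the task container, so logs stream out while the task is still running rather than only after completion.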
| Feature | External Logging | Object Storage | Real-Time Streaming |
|---|---|---|---|
| Data location | Your logging platform | Your object storage | Your object storage |
| UI experience | Link to external platform | View in Airflow UI | View in Airflow UI (live) |
| Log availability | Near real-time | After task completion | During task execution |
| Setup complexity | Medium | Medium | High |
| Storage costs | Platform-dependent | Standard object storage | Higher (many small files) |
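For the object storage approach, Airflow's standard remote logging settings control where task logs are written. A hedged sketch of the relevant environment variables on an agent Pod spec follows; the bucket path and connection ID are placeholders, and Astro may manage some of these values for you:

```yaml
# Hypothetical excerpt of an agent Pod spec enabling Airflow remote logging.
# Bucket path and connection ID are illustrative placeholders.
env:
  - name: AIRFLOW__LOGGING__REMOTE_LOGGING
    value: "True"
  - name: AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER
    value: "s3://my-airflow-task-logs/tasks"   # placeholder bucket path
  - name: AIRFLOW__LOGGING__REMOTE_LOG_CONN_ID
    value: "aws_default"                       # placeholder connection ID
```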
## Prerequisites
- Remote Execution Agent installed and registered
- Workload identity configured for your Kubernetes cluster
- One of the following:
- External logging: Logging platform endpoint
- Object storage: S3, GCS, or Azure Blob container
- Real-time streaming: Object storage + Vector configuration