---
parent: Infrastructure Catalog
title: AWS Airflow
nav_exclude: false
---
Airflow is an open-source platform to programmatically author, schedule, and monitor workflows. More information here: airflow.apache.org
No requirements.
No providers.
The following input variables are required:

Description: Standard name_prefix module input. (The prefix counts toward the 64-character maximum length for certain resource types.)

Type: string

Description: Standard environment module input.

Type:

```hcl
object({
  vpc_id          = string
  aws_region      = string
  public_subnets  = list(string)
  private_subnets = list(string)
})
```

Description: Standard resource_tags module input.

Type: map(string)

Description: The command to run on the Airflow container.

Type: string
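
A minimal invocation covering the required inputs might look like the following sketch. The module source path and the name of the container-command variable (`container_command` here) are assumptions for illustration, since the generated list above omits the variable names:

```hcl
# Hypothetical module invocation showing only the required inputs.
module "airflow" {
  source = "./modules/aws-airflow" # assumed path

  name_prefix = "myproj-dev-" # counts toward 64-char limits on some resources

  environment = {
    vpc_id          = "vpc-0123456789abcdef0"
    aws_region      = "us-east-1"
    public_subnets  = ["subnet-aaaa1111", "subnet-bbbb2222"]
    private_subnets = ["subnet-cccc3333", "subnet-dddd4444"]
  }

  resource_tags = {
    project = "data-platform"
    env     = "dev"
  }

  # Variable name assumed; see the required-input list above.
  container_command = "webserver"
}
```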
The following input variables are optional (have default values):
Description: Optional. Overrides the Docker image used for Airflow execution.
Type: string
Default: "airflow"
Description: Optional. The number of CPU cores.
Type: number
Default: 2
Description: Optional. The amount of RAM to use, in GB.
Type: number
Default: 4
Description: A map of environment variables to pass to the Airflow image.
Type: map(string)
Default: {}
Description: A map of environment variable secrets to pass to the Airflow image. Each secret value should be either a Secrets Manager URI or a local JSON or YAML file reference in the form /path/to/file.yml:name_of_secret.
Type: map(string)
Default: {}
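
The two accepted value formats for that secrets map might look like this (the variable name `environment_secrets` is assumed for illustration; the generated docs above omit it):

```hcl
# Hypothetical secrets map showing both supported value formats.
environment_secrets = {
  # Secrets Manager URI form:
  SMTP_PASSWORD = "arn:aws:secretsmanager:us-east-1:123456789012:secret:smtp-pass"

  # Local file reference form (path/to/file.yml:name_of_secret):
  FERNET_KEY = "./secrets.yml:fernet_key"
}
```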
Description: The Git repo reference to clone onto the Airflow server.
Type: string
Default: null
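
Overriding the optional inputs could then look like the sketch below. Every variable name here is an assumption for illustration, since the generated list above gives only descriptions and defaults:

```hcl
# Hypothetical overrides for the optional inputs; required inputs omitted for brevity.
module "airflow" {
  source = "./modules/aws-airflow" # assumed path

  # ...required inputs as described above...

  container_image     = "apache/airflow" # default: "airflow"
  container_num_cores = 4                # default: 2
  container_ram_gb    = 8                # default: 4

  environment_vars = {
    AIRFLOW__CORE__LOAD_EXAMPLES = "False"
  }

  git_repository = "https://github.com/example/airflow-dags.git" # default: null
}
```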
The following outputs are exported:
Description: Link to the Airflow web UI.
Description: Link to Airflow logs in CloudWatch.
Description: Command to launch the Airflow web server via ECS.
Description: Summary of resources created by this module.
Source code for this module is available using the links below.
NOTE: This documentation was auto-generated using terraform-docs and s-infra from slalom.dataops. Please do not attempt to manually update this file.