Pantheon Data is seeking an experienced Airflow Data Engineer to design, develop, and maintain robust data pipelines that support mission-critical analytics and data integration initiatives for DoD and Federal clients.
Requirements
- Proficiency in Python and SQL for data manipulation and transformation.
- Hands-on experience with Apache Airflow, dbt, or similar workflow orchestration and data transformation tools.
- Experience with cloud platforms (AWS, Azure, or GCP), including managed data services.
- Familiarity with CI/CD, Docker, and Kubernetes.
- Understanding of data warehousing concepts, metadata management, and distributed computing frameworks (e.g., Spark).
Responsibilities
- Design, build, and manage data pipelines and workflows using Apache Airflow and other orchestration tools.
- Develop and optimize ETL/ELT processes to ensure timely and accurate data delivery across multiple environments.
- Collaborate with data scientists, analysts, and DevOps engineers to operationalize data models and analytics solutions.
- Integrate diverse data sources (structured, unstructured, APIs) into centralized data platforms.
- Implement and maintain data quality, lineage, and observability frameworks.
- Support Airflow DAG deployments in containerized or cloud environments (e.g., AWS ECS, Kubernetes).
- Automate monitoring, alerting, and logging for data pipelines to ensure reliability and transparency.
Other
- 4-8+ years of experience in data engineering or data pipeline development.
- Strong communication skills and the ability to work effectively, including remotely, within agile, cross-functional teams.
- Ability to meet deadlines and deliver high-quality work.
- U.S. citizenship and the ability to obtain and maintain a DoD Secret clearance.