FlightSafety International is looking to build scalable, reliable, and automated data solutions on Azure, Databricks, DBT, and Airflow that support enterprise business intelligence and advanced analytics.
Requirements
- 2+ years of experience with Databricks, Delta Live Tables, and Unity Catalog
- 2+ years of experience with Python and PySpark
- Experience with DBT for data modeling and transformation
- Experience with Apache Airflow for workflow orchestration (a minimal sketch follows this list)
- Experience with Data Vault 2.0 modeling (certification preferred)
- Strong understanding of ELT/ETL concepts, data modeling, and cloud data platforms
- Experience with data ingestion and replication tools (e.g., Fivetran, SQDR, Rivery)
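
For context on the orchestration experience called for above, here is a minimal Airflow sketch of a two-step ELT workflow. The DAG id, schedule, and the extract/transform callables are hypothetical placeholders, assuming Airflow 2.4+; this illustrates the pattern, not FlightSafety's actual pipelines.

```python
# Minimal Airflow DAG sketch: extract, then transform, on a daily schedule.
# All names here are hypothetical and purely illustrative.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Hypothetical placeholder: pull raw records from a source system.
    print("extracting raw data")


def transform():
    # Hypothetical placeholder: apply cleansing and modeling logic.
    print("transforming data")


with DAG(
    dag_id="example_elt",             # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ spelling; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Dependency: extract must finish before transform starts.
    extract_task >> transform_task
```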
Responsibilities
- Design and develop scalable ETL/ELT pipelines using Azure Data Factory (ADF), Databricks, and DBT
- Implement real-time and batch data processing using Delta Live Tables (DLT)
- Orchestrate data workflows using Databricks LakeFlow, Apache Airflow, and ADF pipelines
- Design and implement Data Vault 2.0 models for cloud-based data warehousing
- Develop data ingestion and replication solutions using tools such as Fivetran, SQDR, Rivery, or custom Python scripts
- Write Python and PySpark code for data transformation, cleansing, and automation (see the sketch after this list)
- Monitor and optimize pipeline performance, ensuring data quality and reliability
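
To give a flavor of the transformation and cleansing work above, here is a minimal PySpark sketch. The column names, sample rows, and deduplication rule are hypothetical, assumed purely for illustration; real pipelines would read from and write to Delta tables.

```python
# Minimal PySpark cleansing sketch: trim strings, drop incomplete rows, dedupe.
# The schema and data below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cleansing_example").getOrCreate()

# Hypothetical raw input; in practice this would come from a Delta table or an ingestion tool.
raw = spark.createDataFrame(
    [("  Alice ", "2024-01-01"), ("Bob", None), ("  Alice ", "2024-01-01")],
    ["customer_name", "signup_date"],
)

cleansed = (
    raw
    .withColumn("customer_name", F.trim(F.col("customer_name")))  # strip stray whitespace
    .dropna(subset=["signup_date"])                               # discard incomplete rows
    .dropDuplicates(["customer_name", "signup_date"])             # remove exact repeats
)

cleansed.show()
```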
Other
- Collaborate with analysts, architects, and business stakeholders to understand data needs and deliver consistent datasets
- Maintain documentation for data flows, models, and pipeline logic
- Support data governance, metadata management, and compliance initiatives
- Participate in Agile ceremonies and contribute to sprint planning, reviews, and retrospectives
- Infrequent travel as needed