BNSF Tech leverages cutting-edge technology to enhance freight transportation across a network spanning 28 states and three Canadian provinces. The Data & AI Engineer role supports this goal by leading the development and implementation of advanced data engineering solutions.
Requirements
- Proficiency in programming languages such as Python, SQL, and Java
- Strong understanding of data warehousing concepts including dimensional modeling and star schema
- Experience with ETL processes and cloud-based data integration solutions
- Familiarity with database management systems (SQL, NoSQL), modern lakehouse table formats (Apache Iceberg, Delta Lake), and distributed query engines such as Trino
- Hands-on experience with data streaming and processing technologies such as Apache Kafka, AWS Kinesis, Apache Flink, and Apache Spark
- Experience with data pipeline orchestration, datamart creation, and building certified datasets
- Knowledge of DevSecOps standards, CI/CD tools such as Jenkins, and source control platforms such as GitHub
Responsibilities
- Design and develop scalable data streaming solutions for real-time data processing
- Create and maintain robust data pipelines for both batch and streaming data workflows
- Build and manage large-scale data warehouses to support business intelligence and analytics
- Develop and certify datasets by integrating multiple data sources to ensure accuracy and reliability
- Monitor, troubleshoot, and optimize data pipelines to ensure high performance and scalability
- Ensure data security, compliance, and adherence to industry standards and regulations
- Write efficient, well-documented code with comprehensive testing and validation
Other
- Minimum 6 years of experience in software engineering, data engineering, or database management
- Excellent problem-solving, communication, and collaboration skills
- Ability to quickly learn new technologies and adapt to changing environments
- Annual incentive bonus program
- Generous leave policies and time-off benefits
- Opportunities for professional growth and development in a supportive environment