Mission Lane is seeking a lead for its Data Engineering team to design, develop, and maintain complex data pipelines and to drive software engineering best practices.
Requirements
- Expert-level SQL skills.
- Strong Python programming skills.
- Strong understanding of software engineering principles and best practices (e.g., version control, testing, CI/CD).
- Extensive experience with data warehousing technologies, preferably Snowflake.
- Experience with dbt (data build tool).
- Experience with cloud platforms, preferably GCP (Google Cloud Platform), including services such as Cloud Functions and GCS.
- Experience designing and implementing reliable and resilient ETL/ELT pipelines.
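The "reliable and resilient" pipeline requirement above usually comes down to patterns like retries with backoff around flaky extract steps. A minimal sketch, with hypothetical names (`with_retries`, `flaky_extract` are illustrative, not part of any specific stack):

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn(), retrying with exponential backoff on failure."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries; surface the error to the orchestrator
            time.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}

def flaky_extract():
    # Simulated extract step that fails twice before succeeding.
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source error")
    return ["row1", "row2"]

rows = with_retries(flaky_extract)
```

In practice an orchestrator (e.g. Airflow) or a library such as `tenacity` would supply this behavior; the point is that transient failures are absorbed while persistent ones still fail loudly.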
Responsibilities
- Design, develop, and maintain complex, high-performance data pipelines using Python, SQL, dbt, and Snowflake on GCP.
- Lead the effort to advance code quality, test coverage, and maintainability of our data pipelines.
- Champion and implement software engineering best practices, including code reviews, testing methodology, CI/CD, and documentation.
- Drive the adoption of data observability tools and practices (e.g., data lineage, automated alerting).
- Research, evaluate, and recommend new technologies and tools to improve our data platform.
- Contribute to the data architecture and design of our data warehouse.
- Troubleshoot and resolve complex data pipeline issues, ensuring data quality and reliability.
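The data-quality and test-coverage responsibilities above often take the form of column-level assertions (the same idea as dbt's `not_null` generic test). A minimal sketch with a hypothetical helper (`check_not_null` and the sample rows are illustrative only):

```python
def check_not_null(rows, column):
    """Return a simple test result: did every row have a non-null value?"""
    failures = [i for i, row in enumerate(rows) if row.get(column) is None]
    return {"column": column, "passed": not failures, "failing_rows": failures}

orders = [
    {"order_id": 1, "amount": 25.0},
    {"order_id": 2, "amount": None},  # should be flagged by the check
]

result = check_not_null(orders, "amount")
```

A failing result like this would typically feed the automated alerting mentioned above rather than silently passing bad rows downstream.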
Other
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering.
- Excellent communication, collaboration, and problem-solving skills.
- Unlimited paid time off, 401(k) match, a monthly wellness stipend, health/dental/vision insurance options, disability coverage, paid parental leave, flexible spending account (for childcare and healthcare), life insurance, and a remote-friendly work environment.