Mission Lane is combining the power of data, technology, and exceptional service to pave a clear way forward for millions of people on the path to financial success. By attracting top talent and leveraging cutting-edge technology, we’re enabling people to unlock real financial progress.
Requirements
- Strong analytical SQL skills
- Strong Python skills
- Understanding of software engineering principles and best practices (e.g., version control, testing, CI/CD)
- Experience with data warehousing technologies, preferably Snowflake
- Experience with cloud platforms, preferably GCP (Google Cloud Platform), including services like Cloud Functions and GCS
- Experience designing and implementing reliable and resilient ETL/ELT pipelines
- Experience with our stack: Snowflake, dbt, Monte Carlo, Airflow
Responsibilities
- Design, develop, and maintain high-performance data pipelines using Python, SQL, dbt, and Snowflake on GCP.
- Help advance the code quality, test coverage, and maintainability of our data pipelines.
- Champion and implement software engineering best practices, including code reviews, testing methodology, CI/CD, and documentation.
- Support the adoption of data quality tools and practices (e.g., data lineage, automated alerting).
- Research, evaluate, and recommend new technologies and tools to improve our data platform.
- Contribute to the data architecture and design of our data warehouse.
- Collaborate effectively with software engineering teams to define data structures, streamline ingestion processes, and ensure data consistency.
Other
- Bachelor's degree in Computer Science, Engineering, or a related field
- 2+ years of experience in data engineering
- Excellent communication, collaboration, and problem-solving skills
- Willingness to participate in an on-call rotation supporting critical data pipelines
- Mission Lane does not sponsor new applicant employment authorization. No third-party recruiters, please.