The Lifetime Value Co. is seeking a Data Engineer to build, maintain, and optimize data pipelines for its genealogy data products, processing large-scale datasets while ensuring data integrity and accessibility.
Requirements
- 1-2 years of experience using Python in a production environment
- Hands-on experience or familiarity with Apache Airflow (DAGs, operators, scheduling)
- Experience working with Docker or other OCI-compatible container frameworks
- Proficiency in Git and standard version control workflows (branches, pull requests, etc.)
- Experience building and consuming RESTful APIs
- Comfort working with data formats such as CSV and JSON, and an understanding of how data flows across systems
- Experience with AWS services such as S3 and IAM, and familiarity with core cloud infrastructure concepts
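As an illustration of the CSV/JSON comfort listed above, here is a minimal Python sketch of the kind of format handling involved; the field names (`name`, `birth_year`) are hypothetical examples, not taken from the role description:

```python
import csv
import io
import json

def csv_to_json_records(csv_text: str) -> str:
    """Parse CSV text into dicts and serialize them as a JSON array.

    Field names here (name, birth_year) are hypothetical examples.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    records = [
        # Cast birth_year to int so downstream consumers get typed data
        {"name": row["name"], "birth_year": int(row["birth_year"])}
        for row in reader
    ]
    return json.dumps(records)

sample = "name,birth_year\nAda,1815\nAlan,1912\n"
print(csv_to_json_records(sample))
```

The same read-transform-serialize shape recurs in pipeline tasks regardless of scale; only the storage layer (e.g. S3 objects instead of in-memory strings) changes.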
Responsibilities
- Build and maintain scalable, reusable data pipelines to process large genealogy datasets
- Analyze source data and collaborate with stakeholders to clarify requirements, challenge assumptions, and propose improvements
- Design, develop, and maintain RESTful APIs to expose data and services to other teams and systems
- Write production-ready code following best engineering practices, including testing, peer reviews, and CI/CD workflows
- Troubleshoot data pipeline issues and provide ongoing support to internal teams relying on data services
- Contribute to the continuous improvement of tools, workflows, and infrastructure
- Work with various technologies and cloud services under the guidance of a tech lead
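The responsibilities above emphasize testable, reusable pipeline code. A minimal sketch of such a transform step, with a hypothetical record shape (`source_id`, `name`) chosen purely for illustration:

```python
import json
from typing import Iterable, Iterator

def normalize_records(rows: Iterable[dict]) -> Iterator[dict]:
    """Normalize raw records: strip whitespace from names and
    drop rows missing the required key field.

    The 'source_id' and 'name' fields are hypothetical examples.
    """
    for row in rows:
        if not row.get("source_id"):
            continue  # skip records we cannot key downstream
        yield {
            "source_id": row["source_id"],
            "name": row.get("name", "").strip(),
        }

raw = [{"source_id": "a1", "name": " Ada "}, {"name": "orphan"}]
print(json.dumps(list(normalize_records(raw))))
```

Keeping each step a pure function over iterables makes it straightforward to unit-test in isolation and to reuse across DAG tasks.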
Other
- Join an established and collaborative data team as an individual contributor
- Strong communication skills and an analytical mindset are expected
- Proficiency in English (written and spoken) is required
- 100% remote work culture promoting flexibility and work-life balance
- Opportunities for professional growth, milestone bonuses, and anniversary gifts