Link Logistics Real Estate is seeking a Senior Data Engineer to own and modernize its research and analytics pipelines, ensuring stakeholders have timely, accurate, and trusted data for high-impact decisions.
Requirements
- Strong experience with Azure Databricks and Spark (PySpark)
- Advanced SQL and Python skills for large-scale data transformation and automation
- Experience with dbt for transformation and modeling
- Familiarity with Snowflake and with integrating data into downstream tools such as Power BI
- Strong grasp of MLOps principles: reproducibility, version control, workflow orchestration, basic CI/CD, and monitoring
- Understanding of data quality, testing, and governance best practices
- Bonus: Experience building internal data tools or dashboards with Streamlit
Responsibilities
- Develop, maintain, and document data ingestion, transformation, and ML model pipelines.
- Diagnose existing code and processes, separating intentional design from accidental or legacy artifacts.
- Identify and remediate technical debt.
- Propose and implement improvements in reliability, performance, and maintainability.
- Automate model retraining, scoring, and delivery.
- Develop basic CI/CD and monitoring for analytics workflows.
- Serve as the main technical liaison to the enterprise technology organization.
Other
- Translate business requirements into concrete technical specs.
- Advocate for research/data needs in cross-team discussions.
- Scope and specify data and engineering requirements for new analytics and research projects.
- Drive clarity on priorities, timelines, and “definitions of done.”
- Proactively identify ambiguities and communicate risks or blockers.