AmeriLife is looking for a skilled Data Engineer to design, build, and run robust data pipelines that power enterprise-wide analytics and reporting, ensuring data reliability, scalability, and performance in a rapidly evolving industry.
Requirements
- 5+ years of hands-on ETL development and production support.
- Expert-level SQL Server (T-SQL), including performance tuning and query plan analysis.
- Strong SSIS development, deployment, and troubleshooting experience in production.
- Solid Azure Data Factory experience for data pipeline orchestration and monitoring.
- Proven ability to analyze complex data issues and deliver timely, stable solutions.
- Exposure to the Databricks stack (PySpark/Spark SQL, Delta Lake, Jobs); Unity Catalog experience a plus.
- Proficiency in Python for automation and lightweight transforms.
Responsibilities
- Analyze, troubleshoot, and resolve operational support tickets across BI, ETL processes, and data pipelines.
- Maintain, enhance, and deploy SSIS packages with strong logging, error handling, and recoverability.
- Develop, optimize, and troubleshoot complex T-SQL (queries, stored procedures, views, functions).
- Build and orchestrate ADF pipelines (triggers, activities, linked services, IRs) with parameterization and environment promotion.
- Ensure data accuracy and integrity in production; define validation checks and monitoring.
- Perform root-cause analysis of failures and implement permanent fixes to prevent recurrence.
- Document runbooks, operational processes, and known fixes for knowledge transfer.
Other
- Strong analytical and problem-solving skills with keen attention to detail.
- Excellent communication skills and a collaborative, ownership-oriented work style.
- Ability to shape model-ready datasets (clean dimension/fact outputs, SCD outputs, RLS helper columns) in partnership with semantic modelers.
- Familiarity with GitHub/Azure DevOps (CI/CD, releases, environment configuration).
- Willingness to contribute to reliability practices (alerting, retries, idempotent reruns, SLAs) and maintain on-call readiness.