FHI is hiring a Data Engineer to own and support ETL/data pipelines, integrations, and reporting; diagnose and resolve issues; improve data quality; and partner with BI to deliver reliable, scalable solutions.
Requirements
- Strong SQL (queries, tuning, stored procedures, views, functions).
- Practical experience with Python and Pandas, including inbound/outbound API integrations, scripting, logging, error handling, and core language constructs such as containers and loops (see the illustrative sketch after this list).
- Solid grasp of data architecture and modeling (normalized and denormalized models, star/snowflake schemas).
- Working knowledge of the software development life cycle (SDLC).
- Experience maintaining data pipelines and integrations in SQL Server, Snowflake, or similar environments, and delivering BI/reporting with Power BI, SSRS, and Workday.
- Familiarity with source control (e.g., Git) and ERP data flows (e.g., Workday).
- Certifications: Microsoft Azure Data Engineer Associate (DP-203), AWS Certified Data Analytics – Specialty, Snowflake SnowPro Core, or any relevant SQL Server/Power BI certification.
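To make the Python/Pandas requirement above concrete, here is a minimal, illustrative sketch of the kind of extract-transform-load script this role maintains. It is not FHI's actual stack: the endpoint URL, the `order_id` column, and the SQLite target are hypothetical stand-ins (a production pipeline here would point at SQL Server or Snowflake).

```python
# Illustrative ETL sketch only; endpoint, column names, and target are assumed.
import logging
import sqlite3

import pandas as pd
import requests

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

API_URL = "https://example.com/api/orders"  # hypothetical inbound API


def extract(url: str) -> pd.DataFrame:
    """Pull records from the API, logging and re-raising on failure."""
    try:
        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
    except requests.RequestException:
        log.exception("extract failed for %s", url)
        raise
    # Assumes the API returns a JSON array of records.
    return pd.DataFrame(resp.json())


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Basic cleanup: drop duplicates and rows missing the (assumed) key column."""
    return df.drop_duplicates().dropna(subset=["order_id"])


def load(df: pd.DataFrame) -> None:
    """Append to a local SQLite table; a real pipeline would target SQL Server/Snowflake."""
    with sqlite3.connect("warehouse.db") as conn:
        df.to_sql("orders", conn, if_exists="append", index=False)
    log.info("loaded %d rows", len(df))


if __name__ == "__main__":
    load(transform(extract(API_URL)))
```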
Responsibilities
- Maintain and support ETL, data flows, and reporting processes; serve as Level-1 production support for data flow incidents.
- Monitor, schedule, and optimize BI jobs; maintain integration catalogs and documentation.
- Troubleshoot upstream/downstream impacts; translate changes into end-to-end reporting solutions (Power BI, SSRS, Workday).
- Build and refine data pipelines; assemble large datasets that meet functional and non-functional needs.
- Automate manual processes and improve data delivery and reliability.
- Create clear documentation (ETL processes, object usage, data models) and test/validate code changes (a sketch of an automated validation check follows this list).
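As an illustration of the test/validate responsibility above, this is a hedged sketch of an automated data-quality gate of the sort that replaces manual checks. The column names (`order_id`, `amount`) and the 1% null tolerance are hypothetical assumptions, not a stated FHI standard.

```python
# Illustrative data-quality check only; columns and thresholds are assumed.
import pandas as pd


def validate(df: pd.DataFrame) -> list[str]:
    """Return human-readable data-quality failures (empty list if clean)."""
    problems = []
    if df.empty:
        problems.append("dataset is empty")
    if df.duplicated(subset=["order_id"]).any():  # hypothetical key column
        problems.append("duplicate order_id values found")
    null_rate = df["amount"].isna().mean()  # hypothetical measure column
    if null_rate > 0.01:  # assumed 1% tolerance
        problems.append(f"amount null rate {null_rate:.1%} exceeds 1% threshold")
    return problems


if __name__ == "__main__":
    sample = pd.DataFrame({"order_id": [1, 2, 2], "amount": [9.99, None, 5.00]})
    for problem in validate(sample):
        print("FAIL:", problem)
```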
Other
- Proven ability to work independently, multi-task, and deliver in a rapidly changing environment.
- Excellent analytical, problem-solving, and communication skills.
- Bachelor’s in Computer Science or related field, or equivalent experience.
- Remote (US) with working hours aligned to Eastern time. Strong preference for candidates based in North Carolina.
- Prolonged periods of sitting and working on a computer.