OUC is seeking a Data Engineer to design, build, and optimize scalable data pipelines, cloud architecture, and business intelligence, translating complex business needs into secure, high-performance data solutions.
Requirements
- Expertise in SQL, Python, and Scala, with a focus on optimized query performance for structured and unstructured data processing.
- Proficiency in two or more of the following languages for diverse technical implementations: JavaScript (Snowpark), DAX, Power Query (M), WDL, Power Fx, C#, Perl, VBA, R, Julia.
- Proficiency in ETL/ELT tools, including Talend, Dataiku, Alteryx, Snowpipe, Azure Data Factory, and SSIS (an illustrative pipeline sketch follows this list).
- Experience in workflow automation using Power Automate and cloud-based integration platforms.
- Strong knowledge of cloud-native data solutions, including Snowflake, Databricks, AWS S3, and Azure Synapse, as well as relational databases such as Oracle and MySQL.
- Proficiency in VM-based deployments using Azure VMs, AWS EC2, or Google Compute Engine for scalable data processing.
- Expertise in data modeling methodologies (Inmon, Kimball, Data Vault) that underpin robust analytics solutions.
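For context on the kind of work the SQL/Python and ETL/ELT bullets describe, here is a minimal, illustrative sketch of an incremental load. Everything in it is hypothetical: the `src_orders`/`dw_orders` tables, the `updated_at` watermark, and the use of Python's built-in `sqlite3` module as a stand-in for a warehouse connection (Snowflake, Azure Synapse, Oracle, etc.). It is not OUC's actual pipeline code, only a sketch of the pattern.

```python
"""Illustrative incremental ETL/ELT load; all table and column names are hypothetical."""
import sqlite3


def incremental_load(conn: sqlite3.Connection, since: str) -> int:
    """Copy source rows changed after `since` into the target table, upserting by id."""
    # Extract: pull only rows past the watermark from the (hypothetical) source table.
    rows = conn.execute(
        "SELECT id, customer, amount, updated_at FROM src_orders WHERE updated_at > ?",
        (since,),
    ).fetchall()
    # Load: upsert into the (hypothetical) warehouse table so reruns are idempotent.
    conn.executemany(
        """
        INSERT INTO dw_orders (id, customer, amount, updated_at)
        VALUES (?, ?, ?, ?)
        ON CONFLICT(id) DO UPDATE SET
            customer   = excluded.customer,
            amount     = excluded.amount,
            updated_at = excluded.updated_at
        """,
        rows,
    )
    conn.commit()
    return len(rows)


if __name__ == "__main__":
    # In-memory database so the sketch runs as-is, with a tiny sample of source data.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE src_orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL, updated_at TEXT)")
    conn.execute("CREATE TABLE dw_orders  (id INTEGER PRIMARY KEY, customer TEXT, amount REAL, updated_at TEXT)")
    conn.executemany(
        "INSERT INTO src_orders VALUES (?, ?, ?, ?)",
        [(1, "Acme", 120.0, "2024-01-02"), (2, "Globex", 75.5, "2024-01-05")],
    )
    print(f"loaded {incremental_load(conn, since='2024-01-01')} rows")
```

In practice the same extract/upsert pattern would typically run inside an orchestrated tool such as Azure Data Factory, SSIS, or Snowpipe rather than as a standalone script.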
Responsibilities
- Performs ETL/ELT processes, designs data models, and implements multidimensional data architectures to optimize data processing for analytics, reporting, business intelligence, and operational decision-making (an illustrative star-schema sketch follows this list).
- Develops and enhances scalable data pipelines, integrations, and storage solutions across on-premises and cloud environments, ensuring high-performance, reliable data accessibility.
- Builds and maintains AI/ML-ready data infrastructure, enabling seamless integration of machine learning models and automated analytics workflows.
- Executes query auditing, code reviews, quality assurance (QA), and data governance to uphold best practices in security, performance, and compliance, ensuring integrity, reliability, and accessibility of data assets.
- Translates business requirements into technical designs, optimized data flows, and infrastructure solutions, ensuring alignment with organizational objectives.
- Collaborates with data scientists, ML engineers, analysts, and IT teams to streamline data workflows, troubleshoot performance bottlenecks, and maintain scalable architectures for structured and unstructured data.
- Generates effort assessments for data engineering projects, defining Work Breakdown Structures (WBS) to scope and estimate tasks, timelines, and resource requirements.
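To illustrate the data modeling responsibility above, the sketch below sets up a small Kimball-style star schema: conformed dimensions plus a fact table at a declared grain. The business process (order lines), the dimensions, and every column name are hypothetical examples chosen for the sketch, not an actual OUC model; `sqlite3` again stands in for the target warehouse so the example runs as-is.

```python
"""Illustrative Kimball-style star schema; all tables and columns are hypothetical."""
import sqlite3

STAR_SCHEMA_DDL = [
    # Conformed dimensions carry the descriptive context used to slice the facts.
    """CREATE TABLE dim_date (
           date_key     INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
           full_date    TEXT NOT NULL,
           month        INTEGER NOT NULL,
           year         INTEGER NOT NULL
       )""",
    """CREATE TABLE dim_customer (
           customer_key INTEGER PRIMARY KEY,   -- surrogate key
           customer_id  TEXT NOT NULL,         -- natural/business key
           segment      TEXT
       )""",
    # The fact table stores measures at a declared grain: one row per order line.
    """CREATE TABLE fact_order_line (
           date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
           customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
           quantity     INTEGER NOT NULL,
           amount       REAL NOT NULL
       )""",
]


def build_star_schema(conn: sqlite3.Connection) -> None:
    """Create the dimension and fact tables that make up the star schema."""
    for ddl in STAR_SCHEMA_DDL:
        conn.execute(ddl)
    conn.commit()


if __name__ == "__main__":
    build_star_schema(sqlite3.connect(":memory:"))
```

Analytics and BI queries then join the fact table to its dimensions, which is the multidimensional structure this responsibility refers to.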
Other
- This is a hybrid position offering two remote workdays per week; onsite days must include Tuesdays and Thursdays.
- Bachelor’s Degree in Information Technology, Computer Science, Computer Engineering, Management Information Systems, Mathematics, Statistics, or a related field of study from an accredited college or university.
- Master’s Degree in Data Engineering, Information Systems, or a related field of study from an accredited college or university (preferred).
- Minimum of three (3) years of experience in SQL or Python programming, ETL/ELT processes, data warehousing and cloud, data modeling and architecture, data security and governance, and visualization and reporting.
- Experience with cloud infrastructure components as well as self-service analytics platforms.