The company is looking to modernize its data engineering and migration processes, moving from on-premises SQL Server solutions to cloud-based platforms such as Microsoft Fabric, Snowflake, and Databricks. This involves designing, building, and maintaining data integration solutions, ensuring compatibility with state-based deployment operations and CI/CD pipelines, and providing operational support for existing and new data systems.
Requirements
- Hands-on experience with T-SQL, Spark SQL, SSIS, Python, Synapse, Azure Data Factory, OneLake, Data Lake, dbt, etc.
- Strong technical knowledge of Azure data services (Azure SQL, Synapse, Microsoft Fabric, Databricks) and experience applying them.
- Strong understanding of ETL/ELT design and data integration across APIs, JSON, XML, Excel, and proprietary formats (illustrated in the sketch after this list).
- Proficiency in Python and PowerShell for data engineering workflows.
- Intermediate to advanced SSIS expertise for supporting and optimizing legacy ETL processes.
- Experience with state-based database deployments (e.g., SSDT, Redgate, dbForge, ApexSQL).
- Proven experience with set-based development techniques, object-oriented concepts, etc.
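For illustration only, the following is a minimal Python sketch of the kind of API-to-staging integration described above. The endpoint URL, connection string, staging table `stg.WellHeader`, and column names are placeholders invented for this sketch, not part of the company's actual systems.

```python
import requests
import pyodbc

# Placeholder source endpoint and target connection (assumptions for this sketch).
API_URL = "https://example.com/api/v1/wells"
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=legacy-sql01;DATABASE=Staging;Trusted_Connection=yes;"
)


def fetch_rows(url: str) -> list[dict]:
    """Pull JSON records from the source API (single page, for brevity)."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()  # assumed to be a list of JSON objects


def load_rows(rows: list[dict]) -> None:
    """Insert the records into a hypothetical staging table in one batch."""
    with pyodbc.connect(CONN_STR) as conn:
        conn.cursor().executemany(
            "INSERT INTO stg.WellHeader (WellId, WellName, SpudDate) VALUES (?, ?, ?)",
            [(r["wellId"], r["wellName"], r.get("spudDate")) for r in rows],
        )
        conn.commit()


if __name__ == "__main__":
    load_rows(fetch_rows(API_URL))
```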
Responsibilities
- Perform data engineering and modernization by designing, building, and maintaining data integration, transfer, and analysis solutions for structured and semi-structured data, leveraging T-SQL, Spark SQL, SSIS, Python, Synapse, Azure Data Factory, OneLake, Data Lake, dbt, etc.
- Lead the technical migration of data solutions, primarily SQL Server, to Azure (Microsoft Fabric), Snowflake, Databricks, or other cloud data technologies.
- Implement and adhere to data management best practices, as well as adopted security, development, and coding standards, in solution design, performance tuning, and optimization.
- Ensure solutions are compatible with state-based deployment operations and are CI/CD-ready, using tools such as SSDT projects, ARM/Bicep, DAB, etc.
- Provide operational support by investigating failed jobs, troubleshooting data issues, and responding to user-reported errors in collaboration with DBAs and other members of the IT teams (see the sketch after this list). Perform diagnostics, index tuning, and resource and performance optimization.
- Ensure continuity and reliability of existing data solutions during modernization.
- Initially, develop and support SQL Server solutions such as stored procedures, SSIS packages, and views; transition and deliver reports via Spotfire; and transfer data between systems or extract data to be sent to outside partners or agencies. These solutions are mostly T-SQL and SSIS, with occasional Python and PowerShell.
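For illustration only, a minimal Python sketch of the kind of operational check implied by the support bullet above, assuming the legacy packages run from the SSIS catalog (SSISDB); the server name and 24-hour lookback window are placeholders invented for this sketch.

```python
import pyodbc

# Placeholder connection to the SQL Server instance hosting SSISDB (assumption).
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=legacy-sql01;DATABASE=SSISDB;Trusted_Connection=yes;"
)

# In catalog.executions, status = 4 means the execution failed.
FAILED_EXECUTIONS = """
SELECT e.execution_id, e.folder_name, e.project_name, e.package_name, e.start_time
FROM catalog.executions AS e
WHERE e.status = 4
  AND e.start_time >= DATEADD(HOUR, -24, SYSDATETIMEOFFSET())
ORDER BY e.start_time DESC;
"""


def failed_jobs_last_24h() -> list:
    """Return SSIS executions that failed within the last 24 hours."""
    with pyodbc.connect(CONN_STR) as conn:
        return conn.cursor().execute(FAILED_EXECUTIONS).fetchall()


if __name__ == "__main__":
    for row in failed_jobs_last_24h():
        print(f"{row.start_time}  {row.folder_name}/{row.project_name}/{row.package_name}")
```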
Other
- Bachelor’s Degree from an accredited institution.
- Field of Study: Computer Science, Computer Information Systems or related.
- At least two (2) years of experience with Oil & Gas data workflows and applications such as WellView, Enertia, ProCount, Petra, Cygnet, OSIPI (Aveva), etc.
- Strong problem-solving mindset with a bias for automation and scalability.
- Self-starter who thrives in a modernization and transformation environment, with disciplined adherence to standards and processes in a fast-paced, results-oriented setting.