Costco IT is responsible for the technical future of Costco Wholesale, the third-largest retailer in the world. This role focuses on data engineering: building and delivering automated data pipelines from a wide range of internal and external data sources to support Costco's Sustainability Commitment and work toward a more sustainable future.
Requirements
- 5+ years’ experience engineering and operationalizing data pipelines with large and complex datasets.
- 3+ years’ hands-on experience with Informatica PowerCenter and/or IICS.
- 4+ years’ experience working with cloud technologies such as Dataflow, Data Fusion, Pub/Sub, Dataform, dbt, GCS, BigQuery, Cloud SQL, Firestore/Datastore, Apigee, ADLS, Azure Databricks, Spark, Azure Synapse, Cosmos DB, and other big data technologies.
- Extensive experience working with a variety of data sources: DB2, SQL, Oracle, flat files (CSV, delimited), APIs, XML, and JSON.
- Advanced SQL skills required. Solid understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources (a minimal illustrative sketch follows this list).
- 3+ years’ experience with Data Modeling, ETL, and Data Warehousing.
- Strong understanding of database storage concepts (data lake, relational databases, NoSQL, Graph, data warehousing).
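As a hedged sketch of the kind of SQL work described above, the snippet below runs a parameterized aggregate query against BigQuery from Python. The project, dataset, table, and column names (`my-project.sales.daily_transactions`, `warehouse_id`, `net_sales`, `member_id`) are hypothetical placeholders, not part of the posting.

```python
# Illustrative only: a parameterized aggregate query against BigQuery.
# Table and column names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application default credentials

query = """
    SELECT warehouse_id,
           SUM(net_sales) AS total_sales,
           COUNT(DISTINCT member_id) AS unique_members
    FROM `my-project.sales.daily_transactions`
    WHERE transaction_date BETWEEN @start_date AND @end_date
    GROUP BY warehouse_id
    ORDER BY total_sales DESC
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
        bigquery.ScalarQueryParameter("end_date", "DATE", "2024-01-31"),
    ]
)

# Execute the query and print one summary row per warehouse.
for row in client.query(query, job_config=job_config).result():
    print(row.warehouse_id, row.total_sales, row.unique_members)
```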
Responsibilities
- Builds data models and develops data pipelines to store data in defined data models and structures.
- Develops and operationalizes data pipelines to create enterprise-certified data sets that are made available for consumption (BI, advanced analytics, APIs/services).
- Works in tandem with Data Architects, Data Stewards and Data Quality Engineers to design data pipelines and recommends ongoing optimization of data storage, data ingestion, data quality and orchestration.
- Designs, develops, and implements ETL/ELT processes using Informatica Intelligent Cloud Services (IICS).
- Uses Google Cloud and Azure services such as Azure SQL DW (Synapse), ADLS, Azure Event Hubs, Cosmos DB, Databricks, and Delta Lake to improve and speed up delivery of our data products and services (an illustrative ingestion sketch follows this list).
- Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights to the organization.
- Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery.
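As a minimal sketch of the batch ingestion work described above (GCS CSV files landed into BigQuery via a Dataflow-style Apache Beam pipeline), the example below is illustrative only; the bucket, project, dataset, table, and schema are hypothetical placeholders and do not reflect Costco's actual pipelines.

```python
# Illustrative sketch: batch ingestion of CSV files from GCS into BigQuery
# using Apache Beam (runnable on Dataflow). All resource names are placeholders.
import csv

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line):
    """Turn one CSV line into a dict matching the target table schema."""
    warehouse_id, txn_date, net_sales = next(csv.reader([line]))
    return {
        "warehouse_id": warehouse_id,
        "transaction_date": txn_date,
        "net_sales": float(net_sales),
    }


def run():
    options = PipelineOptions()  # pass --runner=DataflowRunner etc. as needed
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadCSV" >> beam.io.ReadFromText(
                "gs://my-bucket/sales/*.csv", skip_header_lines=1)
            | "ParseRows" >> beam.Map(parse_row)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:sales.daily_transactions",
                schema="warehouse_id:STRING,transaction_date:DATE,net_sales:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```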
Other
- Communicates technical concepts to non-technical audiences both in written and verbal form.
- Performs peer reviews of other data engineers’ work.
- Experience delivering data solutions through agile software development methodologies.
- Exposure to the retail industry.
- Excellent verbal and written communication skills.