Accenture Federal Services is hiring a GCP Data Pipeline Developer & BigQuery Data Engineer to design, develop, and maintain data pipelines on Google Cloud Platform (GCP). The role centers on analyzing and optimizing large datasets in BigQuery and implementing ETL processes that ensure data quality and integrity, supporting data-driven decision-making across the organization.
Requirements
- Proficiency in foundational GCP services.
- Strong experience with BigQuery.
- Solid understanding of data engineering principles.
- Advanced SQL skills for querying and manipulating data.
- Experience with PySpark for data transformations and analytics (see the illustrative sketch after this list).
- Familiarity with ETL tools such as Qlik Data Integration or Fivetran.
- Ability to design and implement scalable data solutions.
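As a rough illustration of the PySpark and BigQuery work this role involves, the sketch below reads a table from BigQuery, aggregates it, and writes the result back. It is a minimal sketch, not a reference implementation: it assumes the spark-bigquery connector is available on the cluster, and every project, dataset, table, column, and bucket name is a placeholder.

```python
# Minimal PySpark sketch: read from BigQuery, aggregate, write back.
# Assumes the spark-bigquery connector; all names below are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-revenue-rollup")  # hypothetical job name
    .getOrCreate()
)

# Read a source table from BigQuery (table path is a placeholder).
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales.orders")
    .load()
)

# Example transformation: daily revenue per region, excluding test rows.
daily_revenue = (
    orders
    .filter(~F.col("is_test"))
    .groupBy(F.to_date("order_ts").alias("order_date"), "region")
    .agg(F.sum("amount").alias("revenue"))
)

# Write results back to BigQuery; the connector's indirect write path
# requires a staging GCS bucket (bucket name is a placeholder).
(
    daily_revenue.write.format("bigquery")
    .option("table", "my-project.analytics.daily_revenue")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```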
Responsibilities
- Develop and maintain GCP data pipelines.
- Design and optimize BigQuery data structures (e.g., partitioning and clustering; see the sketch after this list).
- Utilize SQL and PySpark for data processing.
- Collaborate with team members on data engineering projects.
- Implement ETL processes using tools like Google Cloud Dataflow, Qlik Data Integration, or Fivetran.
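To make the BigQuery design responsibility concrete, here is a minimal sketch of creating a partitioned, clustered table via the google-cloud-bigquery client library. The project, dataset, table, and column names are placeholders; partitioning and clustering choices in practice depend on the actual query patterns.

```python
# Minimal sketch: create a partitioned, clustered BigQuery table.
# Assumes the google-cloud-bigquery library; all names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

ddl = """
CREATE TABLE IF NOT EXISTS `my-project.analytics.events`
(
  event_ts   TIMESTAMP NOT NULL,
  user_id    STRING,
  event_name STRING,
  payload    JSON
)
PARTITION BY DATE(event_ts)     -- prune scans to the dates a query touches
CLUSTER BY user_id, event_name  -- co-locate rows commonly filtered together
"""

client.query(ddl).result()  # blocks until the DDL job completes
```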
Other
- Strong analytical and problem-solving skills.
- Excellent collaboration and communication skills.
- Experience with other cloud platforms and big data technologies.
- Previous experience in a similar role within a data-driven organization.
- Certifications related to GCP, BigQuery, or data engineering.