Cognizant is looking for a GCP Data Engineer to build scalable data solutions on Google Cloud Platform using big data technologies, helping clients envision, build, and run more innovative and efficient businesses.
Requirements
- Strong hands-on experience with GCP, Kafka, Spark, Spring Boot, Java, and REST APIs.
- Proven expertise in data warehousing, data lakes, and big data technologies.
- Solid understanding of cloud-native architectures and data integration strategies.
- GCP certification (e.g., Google Cloud Professional Data Engineer or Architect) is a plus.
Responsibilities
- Design and implement scalable data pipelines and architecture using GCP services, Kafka, and REST APIs.
- Develop and deploy high-performance applications using Apache Spark, Spring Boot, and Java.
- Build and manage data lakes and data warehousing solutions to support analytics and reporting.
- Leverage GCP tools (e.g., BigQuery, Dataflow, Pub/Sub, Cloud Storage) to optimize data processing and integration.
- Lead big data initiatives to extract insights from large datasets and support business decision-making.
- Stay current with industry trends and emerging technologies in data engineering and cloud computing.
Other
- Demonstrated experience in stakeholder and project management.
- Collaborate with cross-functional teams to ensure alignment with business goals and technical requirements.
- Manage project planning, tracking, and delivery to meet timelines and budget expectations.
- Engage with stakeholders to gather requirements and provide regular updates.
- Ensure high standards of customer service and support for internal and external clients.
- Promote best practices in project and portfolio management (PPM).