The company is seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure that enable analytics and business intelligence.
Requirements
- Strong proficiency in SQL and experience with relational databases.
- Hands-on experience with at least one of Python, Scala, or Java for data processing.
- Familiarity with big data technologies (Hadoop, Spark, Kafka).
- Experience with cloud platforms (AWS, Azure, GCP) and data services.
- Knowledge of data modeling, warehousing, and architecture principles.
- Understanding of CI/CD pipelines and version control (Git).
- Experience with streaming data and real-time processing.
Responsibilities
- Design, develop, and maintain data pipelines for ingestion, transformation, and storage.
- Implement ETL processes to ensure data quality, consistency, and reliability.
- Optimize data workflows for performance and scalability.
- Manage and monitor data infrastructure (cloud and on-premises).
- Ensure compliance with data governance, security, and privacy standards.
- Troubleshoot and resolve data-related issues promptly.
- Collaborate with data scientists, analysts, and business teams to understand data requirements.
Other
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to work in a fast-paced, dynamic environment.
- Travel requirements not specified.