The company is looking to build robust, scalable data pipelines and infrastructure to power its cutting-edge applications.
Requirements
- Strong programming skills in Python, Go, and C++.
- Expertise in Linux and command-line tools.
- Hands-on experience with cloud platforms, particularly Google Cloud Platform (GCP) and its data services.
- Solid experience building and managing CI/CD pipelines with Jenkins.
- Proficiency with containerization (Docker) and orchestration (Kubernetes), including application deployment with Helm charts.
- Experience with workflow orchestration tools like Apache Airflow.
- Solid understanding of big data technologies, particularly Apache Spark.
Responsibilities
- Design, develop, and maintain scalable and reliable data pipelines using Python and Go.
- Develop and maintain robust CI/CD pipelines using Jenkins to automate the testing and deployment of data applications.
- Work within a Linux environment to manage and optimize data processing jobs.
- Automate complex infrastructure and deployment tasks through effective scripting (e.g., shell scripts).
- Build, schedule, and monitor complex data workflows using Apache Airflow.
- Process large datasets efficiently with distributed computing frameworks like Apache Spark.
- Manage and configure infrastructure as code using Ansible for automation and consistency.
Other
- Proven experience as a senior-level Software Developer, Data Engineer, DevOps Engineer, or in a similar role.
- Self-starter with a passion for building resilient, automated data systems.
- Collaborative and supportive work environment.
- Opportunities for professional growth and skill development.
- The chance to work on exciting and challenging projects that make a real impact.