Transforming data into actionable insights to empower business and technical leaders to make impactful decisions.
Requirements
- 8+ years of hands-on experience with related technologies (Spark, PySpark, shell scripting, Teradata, and Databricks).
- Proven ability to write complex and efficient SQL queries and stored procedures.
- Solid experience implementing data lake or data warehouse solutions on Databricks.
- Proficiency with Agile methodologies and DevOps tools such as Git, Jenkins, and Artifactory.
- Experience with Unix/Linux shell scripting (KSH) and basic Unix server administration.
- Hands-on experience with AWS services including S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch.
- Expertise in Databricks components such as Delta Lake, Notebooks, Pipelines, cluster management, and cloud integration (Azure/AWS).
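To illustrate the "complex and efficient SQL" called for above, here is a minimal sketch of a windowed query of the kind this role involves. The posting references Teradata SQL and Spark SQL; SQLite is used here only so the snippet runs stand-alone, and the table and column names are hypothetical.

```python
import sqlite3

# Hypothetical sales table; SQLite stands in for Teradata/Spark SQL so the
# example is self-contained and runnable.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('East', 'A', 100.0), ('East', 'B', 250.0),
        ('West', 'A', 300.0), ('West', 'B', 150.0);
""")

# Rank products by revenue within each region, then keep the top seller
# per region -- a typical window-function pattern in analytics SQL.
top_per_region = conn.execute("""
    SELECT region, product, amount FROM (
        SELECT region, product, amount,
               RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
        FROM sales
    ) WHERE rnk = 1
    ORDER BY region
""").fetchall()

print(top_per_region)
```

The same `RANK() OVER (PARTITION BY ...)` construct carries over to Teradata SQL and Spark SQL with only minor dialect differences.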
Responsibilities
- Partner with business and technical stakeholders to deeply understand and translate requirements into scalable data solutions.
- Design and document robust, high-performance data architectures that support strategic business outcomes.
- Build and optimize production-grade ETL pipelines using Spark and PySpark, ensuring reliability and efficiency.
- Develop and maintain data models that enable advanced analytics and reporting.
- Write efficient, complex SQL queries (Teradata SQL, Hive SQL, Spark SQL) across platforms such as Teradata and Databricks Unity Catalog.
- Implement and manage CI/CD pipelines to seamlessly deploy code artifacts to AWS and Databricks.
- Orchestrate and monitor Databricks jobs using Databricks Workflows, proactively troubleshooting and resolving issues.
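The orchestration work described above can be sketched as a Databricks Workflows job definition (Jobs API 2.1 JSON). All names, notebook paths, cluster settings, and the notification address below are hypothetical placeholders, not values from this posting.

```json
{
  "name": "daily_sales_etl",
  "tasks": [
    {
      "task_key": "ingest",
      "notebook_task": { "notebook_path": "/Repos/data/ingest_sales" },
      "job_cluster_key": "etl_cluster"
    },
    {
      "task_key": "transform",
      "depends_on": [ { "task_key": "ingest" } ],
      "notebook_task": { "notebook_path": "/Repos/data/transform_sales" },
      "job_cluster_key": "etl_cluster"
    }
  ],
  "job_clusters": [
    {
      "job_cluster_key": "etl_cluster",
      "new_cluster": {
        "spark_version": "13.3.x-scala2.12",
        "node_type_id": "i3.xlarge",
        "num_workers": 2
      }
    }
  ],
  "email_notifications": { "on_failure": ["data-team@example.com"] }
}
```

The `depends_on` edges express task ordering, and `on_failure` notifications support the proactive troubleshooting the role describes.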
Other
- Contribute actively to Agile ceremonies, including sprint planning, backlog grooming, daily stand-ups, demos, and retrospectives, and help foster a culture of continuous improvement.
- Familiarity with job scheduling tools like CA7 Enterprise Scheduler.
- Proficiency with collaboration tools like Jira and Confluence.
- Demonstrated creativity, foresight, and sound judgment in planning and delivering technical solutions.
- Bring your curiosity, drive, and collaborative spirit to our team—together, we’ll shape the future of data-driven decision-making!