As a leader in the Data Platform Organization, you will accelerate development of Ibotta's cutting-edge data platform, working with both engineering and analytics to develop and own stable, scalable, and approachable data platforms. The role enables data mesh concepts while heavily leveraging AWS cloud and Databricks Lakehouse technologies.
Requirements
- 3+ years of experience in software development, preferably with Scala and Python
- Preferred experience building/implementing data pipelines using Databricks
- Experience as a key contributor on medium and large data projects, from ideation to implementation
- Preferred experience with event-driven architecture design patterns and practices
- Experience with database design principles, backed by strong SQL skills
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
- Experience with the following is a strong plus:
  - AWS cloud services (EC2, S3)
  - Scala and Spark
  - Delta Lake, Apache Iceberg, or Apache Hudi
  - Message brokers such as Kafka or Kinesis
  - ETL tools and processes (Airflow or similar)
  - Infrastructure as code (Terraform, CloudFormation, etc.)
  - Building APIs and libraries
Responsibilities
- Work with cross-functional engineering teams to enable approachable and self-service data movement and access patterns
- Provide guidance and assistance to stakeholders in building complex datasets that meet business needs.
- Identify, design, and implement process improvements, including automating manual processes, optimizing data delivery, and re-designing infrastructure for greater reliability and performance.
- Work as a member of the Data Engineering squad to deliver product features and resolve data-related technical issues.
- Work with information security to keep our data secure.
- Support the engineering of distributed systems, frameworks, and design patterns enabling efficient usage of Ibotta’s Data Lake
- Use Scala or Python with Spark to collect and manage data at scale
Other
- This is a hybrid position located in Denver, Colorado, requiring 3 days in office (Tuesday, Wednesday, and Thursday).
- Candidates must live in the United States.
- Perform incident resolution and root cause analysis of critical outages. Implement solutions to systematic failures. Provide on-call support, including after-hours on a rotational basis.
- Embrace and uphold Ibotta’s Core Values: Integrity, Boldness, Ownership, Teamwork, Transparency, & A good idea can come from anywhere
- Agile (Kanban or Scrum) development experience