Nimble is looking to scale its data engineering efforts to advance its robotics moonshot and to drive the design and development of data infrastructure across its products and internal tools.
Requirements
Proficiency in writing production-grade code in at least one of: Python, Java, Rust, Go, or Scala.
Strong debugging skills and the ability to diagnose and resolve issues efficiently.
Experience with Kafka and/or Spark.
Experience working with AWS or other cloud platforms.
Experience with modern data engineering tools such as Databricks, ClickHouse, Delta Lake, Apache Iceberg, Apache Flink, Apache Parquet, Apache Arrow, and Apache Airflow.
Experience with robotics, machine learning, and mechanical engineering.
Responsibilities
Design and maintain scalable, reliable data pipelines.
Ensure timely and efficient data collection, processing, and availability.
Integrate data from multiple sources for both batch and streaming analytics.
Manage and optimize the data lakehouse for performance and reliability.
Apply best practices in data modeling, pipeline design, and query optimization.
Monitor pipeline performance and troubleshoot issues.
Develop and manage ETL/ELT processes for loading data into data warehouses and data lakes.
Other
Fast learner with a strong work ethic, high performance standards, and ambition.
1-4 years of experience in a tech company or fast-growing startup.
Willing to work extended hours and weekends if needed.
This position is based full-time at our San Francisco headquarters.
BS/MS/PhD in Computer Science, Mathematics, Computer Engineering, or a related field, or equivalent practical experience.