Senior Data Engineer

Qode

Salary not specified
Aug 21, 2025
Columbus, OH, US

Honda is seeking a Senior Data Engineer to design, build, and maintain modern data platforms that support enterprise analytics, machine learning, and digital transformation initiatives and enable data-driven decision-making.

Requirements

  • Strong proficiency in SQL, Python, and one or more data processing frameworks (e.g., PySpark, Spark, Hadoop).
  • Hands-on experience with cloud platforms (AWS, Azure, or GCP), particularly cloud-native data services (e.g., Databricks, Snowflake, Redshift, BigQuery, Synapse).
  • Experience with data pipeline orchestration tools (Airflow, ADF, Luigi, etc.).
  • Solid understanding of data warehousing concepts, dimensional modeling, and ETL/ELT design patterns.
  • Knowledge of data governance, security, and compliance best practices.
  • Experience in the automotive/manufacturing industry or with IoT/connected-vehicle data.
  • Familiarity with real-time data streaming technologies (Kafka, Kinesis, Event Hubs).

Responsibilities

  • Design, build, and optimize large-scale data pipelines and ETL/ELT processes using modern data engineering tools and cloud technologies.
  • Develop and maintain scalable data architectures to support advanced analytics and reporting requirements.
  • Collaborate with cross-functional teams (data scientists, analysts, business units) to understand data needs and deliver reliable data solutions.
  • Ensure data quality, governance, and security across all data platforms.
  • Implement best practices for data ingestion, transformation, and storage in cloud and on-prem environments.
  • Monitor, troubleshoot, and improve performance of data systems.
  • Mentor junior engineers and contribute to the establishment of engineering standards and practices.

Other

  • 6+ years of professional experience in data engineering, with proven expertise in large-scale data environments.
  • Excellent problem-solving and communication skills.
  • Exposure to MLOps pipelines and supporting data science workflows.
  • Knowledge of DevOps practices (CI/CD, infrastructure as code, containerization with Docker/Kubernetes).