


Data Engineering Intern

WillHire

From $16
Oct 16, 2025
Remote, US

WillHire is expanding into the Data Engineering vertical and is building a Data Engineering Internship Cohort to support its data and infrastructure teams in designing, building, and maintaining scalable data pipelines for its HR-tech environment.

Requirements

  • Currently pursuing (or recently completed) B.Tech/BE/M.Tech/MSc in Computer Science, Data Engineering, IT, or related fields.
  • Strong understanding of databases, SQL, and data modeling concepts.
  • Familiarity with Python/Java/Scala for data processing.
  • Basic knowledge of data warehousing, ETL concepts, and data pipelines.
  • Understanding of cloud platforms (AWS/GCP/Azure) and their data services.
  • Certifications such as AWS Certified Data Engineer, Google Cloud Data Engineer, or Microsoft Azure Data Fundamentals (even if in-progress).
  • Experience with big data tools like Hadoop, Spark, or Kafka.

Responsibilities

  • Assist in designing, developing, and maintaining ETL pipelines for structured and unstructured data.
  • Work with databases (SQL/NoSQL) to ensure data accuracy, integrity, and accessibility.
  • Support integration of APIs and third-party data sources into the data ecosystem.
  • Help optimize data storage, transformation, and retrieval for performance and scalability.
  • Contribute to data quality checks, validation processes, and monitoring frameworks.
  • Assist in developing dashboards and reports for business and product stakeholders.
  • Research new tools, frameworks, and best practices in data engineering & analytics.

Nice to Have

  • Familiarity with data visualization tools (Tableau, Power BI, Looker).
  • Hands-on experience in API integrations, streaming data, or real-time processing.
  • Participation in data hackathons, Kaggle competitions, or open-source data projects.
  • Critical thinking, attention to detail, and a proactive learning mindset.