Halvik Corp is looking to design and build infrastructure that transforms raw data into actionable insights for the US Government.
Requirements
- Strong SQL skills and database experience (coursework, projects, or internships)
- Demonstrated experience with ETL processes, data manipulation, or database management
- Programming experience in Python, Java, Scala, or similar languages
- Experience with cloud platforms (AWS, Azure, GCP)
- Familiarity with big data tools (Spark, Kafka, etc.)
- Understanding of data warehousing concepts
- Experience with version control and collaborative development
Responsibilities
- Design and implement robust ETL/ELT pipelines using modern frameworks
- Work with big data technologies (Spark, Kafka) and streaming data
- Master cloud data platforms (Snowflake, Databricks, Redshift, Synapse)
- Develop scalable database solutions using Postgres and other RDBMS
- Optimize database performance and design scalable data architectures
- Implement data quality monitoring and automated testing frameworks
- Orchestrate complex data workflows using tools such as Apache Airflow
Other
- Recent graduate with a degree in Computer Science, Data Science, Engineering, or related field
- Must be a US citizen or have lived continuously in the US for the past 3 years (US citizenship preferred)
- Submit GRE or SAT scores with your application
- Portfolio showcasing data projects or database work
- Travel requirements are not specified; travel may be required for certain projects or client meetings