AnswerRocket is seeking a Data Engineer to help build data pipelines and AI systems for enterprise clients. In this role, you will develop expertise in modern data technologies that serve as a foundation for AI and implement scalable data solutions.
Requirements
- Strong foundation in SQL and Python programming
- Experience with at least one cloud platform (AWS, Azure, or GCP) and basic understanding of cloud services
- Knowledge of database concepts, data modeling, and data warehousing fundamentals
- Understanding of version control (Git) and basic software development practices
- Experience with data pipeline orchestration tools (Airflow, dbt, or similar)
- Hands-on experience with modern data platforms like Snowflake, BigQuery, or Databricks
- Knowledge of containerization tools (Docker) and CI/CD concepts
Responsibilities
- Build and maintain data pipelines using cloud platforms and orchestration tools under senior engineer guidance
- Implement ETL/ELT processes for data ingestion, transformation, and quality validation
- Contribute to data warehouse development and optimization projects
- Support AI/ML pipeline development and model deployment processes
- Write and maintain SQL queries, Python scripts, and data transformation logic
- Assist with data quality monitoring, testing, and troubleshooting production issues
- Build simple APIs and data access interfaces following established patterns
Other
- This is a fully remote role, supported by an Atlanta-based team, and requires occasional travel to client sites based on project needs.
- Bachelor's degree in Computer Science, Data Engineering, Information Systems, or related field; or equivalent industry experience
- 2–4 years of experience working with data systems, databases, or software development
- Strong analytical and problem-solving skills with attention to detail
- Excellent communication skills and ability to work collaboratively in a team environment