Artmac Soft is seeking a Data Engineer to solve data engineering problems using AWS cloud services and Databricks.
Requirements
- Proficiency in Databricks and PySpark, with hands-on experience building and managing ETL pipelines
- Strong programming skills in Python and SQL
- Experience with AWS data services such as S3, Lambda, Glue, Redshift, and EMR
- Familiarity with data warehousing concepts and design
- Knowledge of data governance, data quality, and data security best practices
- Experience with version control systems (e.g., Git) and CI/CD pipelines
- Experience with real-time data processing and stream processing frameworks such as Kafka or Kinesis
Responsibilities
- Design, build, and maintain data engineering solutions on AWS cloud services.
- Build and manage ETL pipelines using Databricks and PySpark.
- Develop and optimize data workflows in Python and SQL.
- Work with AWS data services such as S3, Lambda, Glue, Redshift, and EMR.
- Apply data warehousing concepts and design to data solutions.
- Uphold data governance, data quality, and data security best practices.
- Use version control systems (e.g., Git) and CI/CD pipelines for development and deployment.
Other
- Bachelor's degree or equivalent combination of education and experience
- Excellent communication and collaboration skills, with the ability to work effectively in a team environment
- Ability to manage multiple tasks and projects simultaneously, prioritizing work to meet deadlines
- Strong problem-solving skills with a keen attention to detail
- Location: Richmond, Virginia (On-Site)