Abacus Insights is solving problems of massive scale and complexity in US healthcare by transforming how the industry manages and uses its data.
Requirements
- 5+ years of building or using cloud services in a production environment (AWS, Azure, GCP, etc.)
- 3+ years of building ETL data pipelines at scale with Spark/PySpark and Databricks
- Strong programming skills (Python, Java, or other OOP languages)
- Experience with AWS, Azure, and Databricks
- Experience with Airbyte and Snowflake
- Experience with CI/CD frameworks
- Experience with PySpark, Python, and SQL
Responsibilities
- Develop and implement virtual, highly performant cloud solutions that conform to US healthcare security standards
- Build data processing pipelines leveraging AWS/Azure, Airbyte, Databricks, and Snowflake
- Write PySpark, Python, and SQL code to meet requirements for clients or internal teams
- Deploy code using CI/CD frameworks
- Troubleshoot client-reported incidents: identify root causes, fix and document problems, and implement preventive measures
- Optimize the performance and cost of Databricks workflows
- Drive the technical excellence of the team, mentor other team members, and lead by example
Other
- Bachelor's degree, preferably in Computer Science, Computer Engineering, or related IT discipline
- 8+ years of commercial software development experience
- Excellent oral and written communication skills
- Strong analytical, problem-solving, organization, and prioritization skills
- Go-getter with a self-starter mindset
- Stays current with emerging technologies and development techniques