At Steampunk, our goal is to build and execute a data strategy for our clients that coordinates data collection and generation, aligns the organization and its data assets in support of the mission, and ultimately realizes mission goals as effectively as possible.
Requirements
- Experience crafting database and data warehouse solutions in the cloud (preferably GCP; alternatively Azure or AWS).
- Key must-have skill: Python.
- Big data tools: Hadoop, Spark, Kafka, etc.
- Relational SQL and NoSQL databases, including Postgres, CloudSQL, and MongoDB.
- Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- AWS cloud services: EC2, EMR, RDS, Redshift (or Azure and GCP equivalents).
- Google Professional Data Engineer certification.
Responsibilities
- Lead and architect the migration of data environments with an emphasis on performance and reliability.
- Assess and understand existing ETL jobs, workflows, BI tools, and reports.
- Address technical inquiries concerning customization, integration, enterprise architecture, and the general features and functionality of data products.
- Support an Agile software development lifecycle.
- Act as an individual contributor while mentoring junior team members.
- Contribute to the growth of our Data Exploitation Practice.
- Design and develop solutions to high-impact, complex data problems.
Other
- Ability to hold a position of public trust with the US government.
- 8-10 years of industry experience coding commercial software and a passion for solving complex problems.
- Excellent communication and customer service skills.
- Ability to work in an Agile environment.
- Experience supporting project teams of developers and data scientists who build web-based interfaces, dashboards, reports, and analytics/machine learning models.