Develop enterprise-grade data platforms, services, and pipelines.
Requirements
- Big data tools: Hadoop, Spark, Kafka, etc.
- Relational SQL and NoSQL databases, including Postgres and Cassandra.
- Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (a brief sketch follows this list).
- AWS cloud services: EC2, EMR, RDS, Redshift (or Azure equivalents).
- Data streaming systems: Storm, Spark Streaming, etc.
- Search tools: Solr, Lucene, Elasticsearch.
- Object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
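To give a sense of the day-to-day orchestration work behind the workflow tools listed above, here is a minimal sketch of an Airflow DAG in Python. It assumes Airflow 2.4+; the DAG id, task names, and schedule are illustrative only, not part of any specific role or codebase.

```python
# Minimal sketch of a two-step Airflow pipeline; ids and schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder extract step; a real task would pull from a source system.
    return {"rows": 100}


def load(**context):
    # Placeholder load step; reads the upstream result via XCom.
    result = context["ti"].xcom_pull(task_ids="extract")
    print(f"Loaded {result}")


with DAG(
    dag_id="example_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task
```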
Responsibilities
- Lead and architect the migration of data environments, ensuring performance and reliability.
- Assess and understand existing ETL jobs, workflows, BI tools, and reports.
- Address technical inquiries concerning customization, integration, enterprise architecture, and general features/functionality of data products.
- Experience crafting database/data warehouse solutions in the cloud (preferably AWS; alternatively Azure or GCP).
- Key must-have skills: Python and AWS (a brief example follows this list).
- Support an Agile software development lifecycle.
- You will contribute to the growth of our Data Exploitation Practice!
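As an illustration of the Python and AWS pairing called out above, below is a minimal sketch of writing a data extract to S3 with boto3. The bucket and key names are hypothetical, and credentials are assumed to come from the standard AWS environment/profile chain.

```python
# Minimal sketch: serialize a batch of records and store it in S3 with boto3.
# Bucket and key names are hypothetical.
import json

import boto3


def write_extract_to_s3(records, bucket="example-data-lake", key="extracts/daily.json"):
    """Upload a JSON-serialized batch of records as a single S3 object."""
    s3 = boto3.client("s3")
    s3.put_object(Bucket=bucket, Key=key, Body=json.dumps(records).encode("utf-8"))
    return f"s3://{bucket}/{key}"


if __name__ == "__main__":
    print(write_extract_to_s3([{"id": 1, "value": "a"}, {"id": 2, "value": "b"}]))
```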
Other
- Ability to hold a position of public trust with the US government.
- 2-4 years of industry experience coding commercial software and a passion for solving complex problems.
- As part of the application process, you are expected to be on camera during interviews and assessments.
- We reserve the right to take your picture to verify your identity and prevent fraud.
- We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.