The company is looking to transform raw data into actionable insights while ensuring high performance, availability, and reliability of the HPC organization’s data feeds.
Requirements
- Experience using the Linux CLI
- Experience developing scripts using Bash/Python
- Recent software development experience with Java in a Linux environment
- Familiarity with real-time streaming concepts, distributed computing fundamentals, and big data ecosystems such as Apache Storm, Apache Flink, Apache Spark, or Apache NiFi
- Experience with CI/CD concepts, principles, methodologies, and tools such as GitLab CI
- Experience with containerization technologies such as Docker
- Experience with the Git version control system
Responsibilities
- Developing and maintaining scalable real-time ETL (Extract, Transform, Load) pipelines using Apache Storm and Java to process large-scale data streams.
- Ensuring the high performance, availability, and reliability of the HPC organization’s data feeds so that raw data can be transformed into actionable insights.
Other
- CLEARANCE REQUIRED: TS/SCI w/ Polygraph
- U.S. Citizenship
- One of the following: a Master's degree in Computer Science or a related discipline from an accredited college or university; a Bachelor's degree in Computer Science or a related discipline from an accredited college or university, plus two (2) years of experience as a SWE on programs and contracts of similar scope, type, and complexity; or four (4) years of experience as a SWE on programs and contracts of similar scope, type, and complexity.