The Swift Group is seeking to design data models, build data warehouses and data lakes, automate data pipelines, and work with datasets of all sizes for Civilian, Defense, and Intelligence Community customers.
Requirements
- Experience integrating diverse data streams
- Experience with data transfer tools (e.g., NiFi, Cribl)
- Experience establishing data standards and acting as custodian of IT and service delivery data sets and streams
- Experience in analyzing complex networks and systems
- Experience in analyzing log data from various network components and operating systems
- Familiarity with Splunk
- Experience with Python
Responsibilities
- Optimize overall data/information flow by reducing redundancy and enabling accessibility within security boundaries
- Apply extraction, transformation, and loading (ETL) techniques to connect large datasets from a variety of sources
- Create data collection frameworks for structured and unstructured data
- Create cyber and operational analytic tools that help consumers build and optimize downstream products, including dataset procedures that support data mining, modeling, and production
- Develop and apply standards, tools, processes, and governance for capturing, modeling, storing, and delivering data for the enterprise
Other
- Bachelor’s or Master’s degree
- Minimum of sixteen (16) years of relevant engineering experience
- US citizenship and an active TS/SCI with Polygraph security clearance required
- Ability to manage and troubleshoot data feeds