Leidos is seeking an ETL Developer to design and implement cutting-edge data flow solutions centered on Apache NiFi, providing technical expertise in the design, development, implementation, and testing of customer tools and applications for extracting, transforming, and loading (ETL) data into an enterprise Data Lake.
Requirements
- Direct experience designing, developing, and managing complex NiFi data flow solutions in large-scale enterprise environments.
- Experience with programming languages such as Java and/or Python, and with scripting for automation.
- Experience writing and optimizing complex queries against relational and NoSQL databases (e.g., Postgres, Elasticsearch, DynamoDB).
- Experience with real-time streaming and REST API integration for seamless data connectivity.
- Experience with cloud platforms such as AWS, Azure, or OCI and their related data services.
- Ability to analyze complex data challenges, identify root causes, and implement effective solutions.
- Experience developing custom NiFi processors with Java and/or Python (a minimal sketch follows this list).
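For candidates unfamiliar with the last requirement, the following is a minimal sketch of what a custom NiFi processor looks like in Java, using NiFi's public processor API (the `nifi-api` dependency). The class name, attribute, and error handling are illustrative assumptions, not Leidos code.

```java
// Minimal custom NiFi processor sketch (illustrative only).
// Assumes the standard nifi-api dependency; names are hypothetical.
import org.apache.nifi.flowfile.FlowFile;
import org.apache.nifi.processor.AbstractProcessor;
import org.apache.nifi.processor.ProcessContext;
import org.apache.nifi.processor.ProcessSession;
import org.apache.nifi.processor.Relationship;
import org.apache.nifi.processor.exception.ProcessException;

import java.util.Collections;
import java.util.HashSet;
import java.util.Set;

public class ExampleEnrichProcessor extends AbstractProcessor {

    public static final Relationship REL_SUCCESS = new Relationship.Builder()
            .name("success")
            .description("FlowFiles that were processed successfully")
            .build();

    public static final Relationship REL_FAILURE = new Relationship.Builder()
            .name("failure")
            .description("FlowFiles that could not be processed")
            .build();

    @Override
    public Set<Relationship> getRelationships() {
        Set<Relationship> rels = new HashSet<>();
        rels.add(REL_SUCCESS);
        rels.add(REL_FAILURE);
        return Collections.unmodifiableSet(rels);
    }

    @Override
    public void onTrigger(ProcessContext context, ProcessSession session) throws ProcessException {
        // Pull the next FlowFile from the incoming queue, if any.
        FlowFile flowFile = session.get();
        if (flowFile == null) {
            return; // nothing queued; the framework will call onTrigger again
        }
        try {
            // Illustrative transformation: tag the FlowFile with an attribute.
            flowFile = session.putAttribute(flowFile, "enriched", "true");
            session.transfer(flowFile, REL_SUCCESS);
        } catch (Exception e) {
            session.transfer(flowFile, REL_FAILURE);
        }
    }
}
```

Real processors typically also declare `PropertyDescriptor`s for operator-configurable settings and stream FlowFile content via the session's read/write callbacks; the sketch above covers only the routing skeleton.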
Responsibilities
- Complex NiFi data pipeline design: Develop and implement enterprise-level ETL NiFi data pipelines for large-scale data ingestion, transformation, and processing from diverse sources (see the test sketch after this list).
- Performance optimization and tuning: Optimize NiFi data flows through processor tuning, memory management, and load balancing to ensure strong performance for both batch and real-time processing.
- Troubleshooting and problem resolution: Identify, diagnose, and resolve complex NiFi data flow issues, including performance bottlenecks, data discrepancies, and integration failures.
- Integrating with big data and cloud technologies: Seamlessly integrate NiFi with various databases, big data ecosystems, and cloud platforms (e.g., AWS, OCI, Azure), demonstrating familiarity with relevant services (e.g., Kafka, Elasticsearch, S3, SQS/SNS).
- Follow best practices and standards: Follow established best practices for NiFi development, deployment, security, and governance, ensuring adherence to enterprise-wide data management policies.
- Documentation and knowledge sharing: Create and maintain comprehensive documentation for NiFi data flows, mappings, architectures, and standard operating procedures, ensuring knowledge transfer and promoting efficient team operations.
- Collaboration and communication: Collaborate effectively with data architects, data engineers, application/service developers, and other stakeholders to translate business requirements into robust technical solutions.
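Because the role spans design through testing, a feel for NiFi's unit-test harness helps. Below is a minimal sketch using NiFi's mock framework (the `nifi-mock` dependency) against the hypothetical `ExampleEnrichProcessor` sketched under Requirements; the test name and assertions are assumptions for illustration.

```java
// Minimal unit-test sketch using NiFi's mock framework (nifi-mock dependency).
// ExampleEnrichProcessor is the hypothetical processor sketched above.
import org.apache.nifi.util.MockFlowFile;
import org.apache.nifi.util.TestRunner;
import org.apache.nifi.util.TestRunners;
import org.junit.jupiter.api.Test;

public class ExampleEnrichProcessorTest {

    @Test
    public void testRoutesToSuccessAndTagsAttribute() {
        TestRunner runner = TestRunners.newTestRunner(ExampleEnrichProcessor.class);

        // Queue a FlowFile with sample content and run one onTrigger invocation.
        runner.enqueue("sample record".getBytes());
        runner.run();

        // Expect the FlowFile on the success relationship with the new attribute set.
        runner.assertAllFlowFilesTransferred(ExampleEnrichProcessor.REL_SUCCESS, 1);
        MockFlowFile out = runner
                .getFlowFilesForRelationship(ExampleEnrichProcessor.REL_SUCCESS)
                .get(0);
        out.assertAttributeEquals("enriched", "true");
    }
}
```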
Other
- To be considered, candidates must have an active TS/SCI with polygraph security clearance.
- Ability to collaborate effectively with cross-functional teams and articulate technical concepts clearly.
- Will consider experience in lieu of a degree.
- This is a role for the restless, the over-caffeinated, the ones who ask, “what’s next?” before the dust settles on “what’s now.”
- If you’re already scheming step 20 while everyone else is still debating step 2… good. You’ll fit right in.