At Accenture Federal Services, the business problem is to help the US federal government make the nation stronger and safer, and life better for people, by developing stages of distributed parallel data processing pipelines.
Requirements
- 2 years of experience with Python, Java, or other programming languages
- 2 years of experience applying agile methodologies to the software development life cycle (SDLC)
- 2 years of experience with Git repositories and CI/CD pipelines
- 2 years of experience with distributed parallel streaming and batch data processing pipelines
- 2 years of experience integrating with data SDKs / APIs and data analytics SDKs / APIs
- 1 year of experience with Python/PySpark
- 1 year of experience with Java and the Java API for Spark
Responsibilities
- Operate and maintain the data processing pipelines in accordance with the availability requirements of the platform
- Follow agile methodologies as applied to the data engineering portion of the software development life cycle (SDLC)
- Update technical documentation, such as system design documentation (SDD); standard operating procedures (SOPs); tactics, techniques, and procedures (TTPs); and training material
Other
- Bachelor's degree in a science, technology, engineering, or mathematics (STEM) discipline
- Active TS/SCI is required
- Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States
- 2 years of experience as a software engineer or a data engineer
- Travel requirements not specified