eSimplicity is looking for a Staff Software Engineer to build tools that move, manage, and govern large-scale data across interconnected platforms. You will develop web interfaces, backend services, and automated workflows that support internal helper tools, data mesh strategies, and authenticated access to distributed data environments.
Requirements
- Minimum of 2 years of experience working on large-scale Databricks implementations.
- Proficiency in at least one of the following languages: TypeScript, JavaScript, Python.
- Proven experience working on large-scale system architectures and petabyte-scale data systems.
- Proficiency with automated testing frameworks (PyTest, Jest, Cypress, Playwright) and testing best practices.
- Experience developing, testing, and securing RESTful and GraphQL APIs.
- Proven track record with AWS cloud architecture, including networking, security, and service orchestration.
- Experience with containerization and deployment using Docker, container orchestration with Kubernetes, and infrastructure automation with Terraform.
Responsibilities
- Creates project-specific technical designs, leads product and vendor selection, and defines application and technical architectures.
- Expands and optimizes our data and data pipeline architecture, as well as data flow and collection for cross-functional teams.
- Develops new data pipelines and maintains existing ones; updates Extract, Transform, Load (ETL) processes; develops new ETL features; builds proofs of concept (PoCs) with Redshift Spectrum, Databricks, and similar technologies.
- Implements, with the support of project data specialists, large-dataset engineering: data augmentation, data quality analysis, data analytics (anomalies and trends), data profiling, and data algorithms; measures and develops data maturity models and develops data strategy recommendations.
- Assembles large, complex data sets that meet functional and non-functional business requirements.
- Identifies, designs, and implements internal process improvements, including redesigning data infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Builds the infrastructure required for optimal extraction, transformation, and loading of data from various sources using AWS and SQL technologies.
Other
- Leads and mentors all other data roles in the program.
- Strong written and verbal communication skills, including the ability to explain technical concepts to non-technical stakeholders.
- Comfortable working in a tightly integrated Agile team (15 or fewer people).
- All candidates must be able to obtain a Public Trust clearance through the U.S. Federal Government.
- Expected hours are 9:00 AM to 5:00 PM Eastern unless otherwise directed by your manager.