Modernize a legacy platform to a new AWS-based system, ensuring quality, reliability, and data integrity during the transition.
Requirements
- Experience in data ingestion testing is a must.
- Should have experience with streaming and batch processing tools such as Apache Kafka, Spark, Flink, or equivalents.
- Should know how to run tests for latency, throughput, and fault tolerance.
- Should be familiar with data profiling and anomaly detection tools.
- Should know how to test IAM roles, policies, and encryption at rest/in transit.
- Should know how to validate secure API endpoints and token-based authentication.
- Proven experience in designing and implementing automated testing frameworks from scratch.
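The latency, throughput, and fault-tolerance testing above can be illustrated with a minimal sketch. This is a generic harness, not the platform's actual test suite; the `process` function is a hypothetical stand-in for a real pipeline stage (e.g. a Kafka consume-and-transform step):

```python
import statistics
import time


def process(message):
    # Hypothetical placeholder for a real pipeline stage under test.
    return message.upper()


def run_perf_test(messages):
    """Measure per-message latency percentiles and overall throughput."""
    latencies = []
    start = time.perf_counter()
    for msg in messages:
        t0 = time.perf_counter()
        process(msg)
        latencies.append(time.perf_counter() - t0)
    elapsed = time.perf_counter() - start
    return {
        "throughput_msgs_per_sec": len(messages) / elapsed,
        "p50_latency_sec": statistics.median(latencies),
        "p99_latency_sec": statistics.quantiles(latencies, n=100)[98],
        "max_latency_sec": max(latencies),
    }


if __name__ == "__main__":
    stats = run_perf_test([f"event-{i}" for i in range(10_000)])
    for key, value in stats.items():
        print(f"{key}: {value:.6f}")
```

A fault-tolerance variant of the same harness would inject failures (broker restarts, dropped partitions) mid-run and assert that the measured throughput recovers and no messages are lost.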
Responsibilities
- Design, build, and maintain automated testing frameworks for both real-time and batch data processing capabilities.
- Implement automation to validate the provisioning of AWS infrastructure and its adherence to security and compliance standards.
- Conduct a variety of tests, including unit, integration, performance, and security testing.
- Perform DR drills to validate RTO/RPO objectives.
- Develop and validate a reconciliation system to ensure data integrity and parity between the legacy and new platforms.
- Verify that access controls, encryption, and audit logs are correctly implemented and reviewed with stakeholders.
- Validate that CI/CD pipelines support automated deployment and rollback.
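The reconciliation responsibility above can be sketched as a parity check between record sets exported from the two platforms. This is a minimal example under the assumption that both sides can export records keyed by a shared ID; the key and field names are hypothetical:

```python
import hashlib


def row_fingerprint(row):
    """Stable hash of a record's values, used to compare rows across platforms."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode()).hexdigest()


def reconcile(legacy_rows, new_rows, key="id"):
    """Report missing, extra, and mismatched records between two datasets."""
    legacy = {r[key]: row_fingerprint(r) for r in legacy_rows}
    new = {r[key]: row_fingerprint(r) for r in new_rows}
    return {
        "missing_in_new": sorted(legacy.keys() - new.keys()),
        "extra_in_new": sorted(new.keys() - legacy.keys()),
        "mismatched": sorted(k for k in legacy.keys() & new.keys()
                             if legacy[k] != new[k]),
    }


if __name__ == "__main__":
    legacy = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
    migrated = [{"id": 1, "amount": 10}, {"id": 2, "amount": 99}, {"id": 3, "amount": 30}]
    print(reconcile(legacy, migrated))
    # {'missing_in_new': [], 'extra_in_new': [3], 'mismatched': [2]}
```

In practice the fingerprints would be computed inside each platform (e.g. via a Spark job) so that only hashes, not full rows, cross the boundary.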
Other
- The candidate should understand the data flow and be able to test transformations, schema mapping, and data integrity.
- Should be good at writing test cases for completeness, accuracy, and consistency.
- Experience with Agile methodologies and managing agile backlogs is a plus.
- This position is not available for independent contractors.
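The transformation, schema-mapping, and completeness/accuracy/consistency testing noted above could look like the following sketch. The mapping and field names here are hypothetical examples, not the actual platform's schema:

```python
def transform(legacy_record):
    # Hypothetical legacy-to-new schema mapping: rename fields, cast types.
    return {
        "customer_id": int(legacy_record["cust_no"]),
        "full_name": f'{legacy_record["fname"]} {legacy_record["lname"]}'.strip(),
        "balance_cents": round(float(legacy_record["balance"]) * 100),
    }


EXPECTED_SCHEMA = {"customer_id": int, "full_name": str, "balance_cents": int}


def validate(record):
    """Check completeness (all fields present), accuracy (correct types),
    and consistency (no unexpected fields)."""
    errors = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"wrong type for {field}: {type(record[field]).__name__}")
    errors.extend(f"unexpected field: {f}" for f in record.keys() - EXPECTED_SCHEMA.keys())
    return errors


if __name__ == "__main__":
    legacy = {"cust_no": "42", "fname": "Ada", "lname": "Lovelace", "balance": "12.34"}
    out = transform(legacy)
    assert validate(out) == [], validate(out)
    print(out)
    # {'customer_id': 42, 'full_name': 'Ada Lovelace', 'balance_cents': 1234}
```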