At American Express, the business problem is to design, develop, and implement data pipelines that source data from internal and external systems using technologies such as REST and GraphQL APIs, data brokers, and ETL tools.
Requirements
- 3+ years of experience with Python and Python frameworks such as Django or FastAPI.
- Experience in developing RESTful APIs.
- Hands-on experience with the ELK stack, including building Logstash pipelines for high-volume data flows.
- Proficiency with data flow solutions such as Apache NiFi, Apache Airflow, AWS Glue, Azure Data Factory, or similar tools.
- Experience with relational database management systems such as Microsoft SQL Server and PostgreSQL.
- Basic knowledge of Linux-based systems.
- Experience working with Kubernetes, Docker, and containerized platforms.
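To illustrate the Logstash requirement above, here is a minimal sketch of a pipeline that tails a log file, parses JSON events, and ships them to Elasticsearch. The file path, index name, and host are placeholders, not details of the actual role:

```
# Hypothetical Logstash pipeline: file input -> JSON parse -> Elasticsearch.
input {
  file {
    path => "/var/log/app/events.log"
    start_position => "beginning"
  }
}
filter {
  # Each log line is assumed to be a JSON document in the "message" field.
  json {
    source => "message"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "app-events-%{+YYYY.MM.dd}"
  }
}
```

For high-volume flows, tuning batch size and worker counts on the Elasticsearch output is typically part of the job.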
Responsibilities
- Design and implement RESTful APIs to support backend data engineering pipelines and, for web applications, front-end functionality.
- Develop data flows using ETL tools.
- Spot patterns, correlate data, and understand its value in solving business use cases.
- Collaborate with the development team to define and implement CI/CD pipelines.
- Utilize Git for version control and collaborate on code reviews.
- Ensure the performance, quality, and responsiveness of data pipelines and applications.
- Identify and fix bugs and performance bottlenecks.
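The ETL responsibilities above can be sketched as a minimal extract-transform-load flow in plain Python. The record schema, filtering rule, and function names are illustrative assumptions, not part of the role description:

```python
from typing import Iterable


def extract() -> list[dict]:
    # Stand-in for sourcing data from a REST API or data broker;
    # returns fixed sample records for illustration.
    return [
        {"id": 1, "amount": "120.50", "currency": "USD"},
        {"id": 2, "amount": "-3.00", "currency": "USD"},  # refund
        {"id": 3, "amount": "80.00", "currency": "EUR"},
    ]


def transform(records: Iterable[dict]) -> list[dict]:
    # Normalize amounts to float and keep only positive USD transactions
    # (an arbitrary business rule chosen for the example).
    out = []
    for r in records:
        amount = float(r["amount"])
        if amount > 0 and r["currency"] == "USD":
            out.append({"id": r["id"], "amount": amount})
    return out


def load(records: list[dict]) -> int:
    # Stand-in for a database write (e.g. PostgreSQL);
    # returns the number of rows "written".
    return len(records)


if __name__ == "__main__":
    rows = load(transform(extract()))
    print(f"loaded {rows} rows")
```

In a production pipeline, each stage would typically be a task in an orchestrator such as Apache Airflow, with the same extract/transform/load separation.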
Other
- Bachelor’s degree in Software Engineering, Computer Science, Mathematics, or Information Systems.
- 5+ years of experience in data engineering projects.
- Soft skills: analytical mindset, critical thinking, self-driven, proactive.
- 20+ weeks paid parental leave for all parents, regardless of gender, offered for pregnancy, adoption or surrogacy
- Flexible working model with hybrid, onsite or virtual arrangements depending on role and business need