Enable business capabilities through technical solutions at Fort Mill, SC
Requirements
- AWS
- Spark
- AWS Glue
- Aurora Postgres
- EKS
- Redshift
- PySpark
Responsibilities
- Work with development teams and other project leaders/stakeholders to provide technical solutions that enable business capabilities
- Design and develop data applications using big data technologies (AWS, Spark) to ingest, process, and analyze large disparate datasets
- Build robust data pipelines in the cloud using AWS Glue, Aurora Postgres, EKS, Redshift, PySpark, Lambda, and Snowflake (see the pipeline sketch after this list)
- Build REST-based data APIs using Python and Lambda (see the API sketch after this list)
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various data sources using SQL and AWS ‘big data’ technologies
- Implement architectures that handle the organization and processing of large-scale data
- Execute strategies that inform data design and architecture, in partnership with enterprise standards
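
To illustrate the kind of pipeline work described above, here is a minimal AWS Glue PySpark job sketch. The database, table, column, and S3 path names are placeholders for illustration only and are not taken from the posting.

```python
import sys
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job

# Resolve the job name passed in by the Glue job runner
args = getResolvedOptions(sys.argv, ["JOB_NAME"])

sc = SparkContext()
glueContext = GlueContext(sc)
spark = glueContext.spark_session
job = Job(glueContext)
job.init(args["JOB_NAME"], args)

# Read source data from the Glue Data Catalog (placeholder database/table names)
source = glueContext.create_dynamic_frame.from_catalog(
    database="raw_db",
    table_name="orders",
)

# Light transformation: drop malformed rows and keep only the needed columns
cleaned = (
    source.toDF()
    .dropna(subset=["order_id"])
    .select("order_id", "customer_id", "amount", "order_date")
)

# Write the result to S3 as Parquet, partitioned for downstream Redshift/Athena reads
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)

job.commit()
```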
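Similarly, a hedged sketch of a REST-based data API handler on Lambda behind API Gateway, querying Aurora Postgres. The route, table, and environment variable names are assumptions, and it presumes the psycopg2 driver is packaged with the function.

```python
import json
import os

import psycopg2  # assumed to be bundled in the Lambda deployment package or a layer


def lambda_handler(event, context):
    """Handle GET /orders/{order_id} requests routed through API Gateway (placeholder route)."""
    order_id = (event.get("pathParameters") or {}).get("order_id")
    if not order_id:
        return {"statusCode": 400, "body": json.dumps({"error": "order_id is required"})}

    # Connection settings come from environment variables configured on the function
    conn = psycopg2.connect(
        host=os.environ["DB_HOST"],
        dbname=os.environ["DB_NAME"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
    )
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT order_id, customer_id, amount FROM orders WHERE order_id = %s",
                (order_id,),
            )
            row = cur.fetchone()
    finally:
        conn.close()

    if row is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"order_id": row[0], "customer_id": row[1], "amount": str(row[2])}),
    }
```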
Other
- Minimum 10 years of experience in data engineering
- Work with development teams and other project leaders/stakeholders
- Work with data and analytics experts
- Work across teams to deliver meaningful reference architectures