Comcast is seeking to build and manage large-scale data pipelines and systems that support its media and technology businesses.
Requirements
- Apache Spark
- Extract, Transform, Load (ETL)
- Kubernetes
- SQL
- Python
- Databricks
- Airflow
Responsibilities
- Contribute to a team responsible for creating tables using SQL within a Teradata database
- Perform object-oriented software development using Python
- Process big data with Apache Spark (see the ETL sketch after this list)
- Schedule data load jobs using Airflow (a scheduling sketch follows this list)
- Perform big data Extract, Transform, and Load (ETL) using Databricks and Kubernetes
- Store data within cloud platforms, including AWS S3, and on-premises platforms, including MinIO (a storage sketch follows this list)
- Develop and perform unit testing on integration modules
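To illustrate the Spark and ETL responsibilities above, here is a minimal PySpark sketch of one ETL step: read raw CSV records, drop invalid rows, normalize a timestamp, and write Parquet. The paths, bucket, and column names (`event_id`, `event_ts`) are hypothetical placeholders, not details from this posting.

```python
# Minimal sketch of a Spark ETL step; all paths and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl_sketch").getOrCreate()

# Extract: read raw records (hypothetical source path).
raw = spark.read.csv("s3a://example-bucket/raw/events.csv", header=True)

# Transform: drop rows with no id and parse the timestamp column.
clean = (
    raw.filter(F.col("event_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_ts"))
)

# Load: write the cleaned data as Parquet (hypothetical destination).
clean.write.mode("overwrite").parquet("s3a://example-bucket/clean/events/")
```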
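The Airflow scheduling item could look roughly like the DAG below: a single daily task wrapping a load function. The DAG id, schedule, and `load_data` callable are hypothetical stand-ins for the actual pipeline jobs.

```python
# Minimal Airflow DAG sketch; the DAG id, schedule, and task are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_data():
    # Placeholder for the actual data-load logic.
    print("loading data...")


with DAG(
    dag_id="daily_data_load",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",     # run the load once per day
    catchup=False,
) as dag:
    PythonOperator(task_id="load_data", python_callable=load_data)
```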
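For the storage item, MinIO exposes an S3-compatible API, so the same boto3 client code can target AWS S3 and an on-premises MinIO deployment by switching only the endpoint. The bucket names, endpoint URL, and credentials below are hypothetical.

```python
# Sketch of writing the same object to AWS S3 and to on-premises MinIO;
# bucket names, endpoint, and credentials are hypothetical.
import boto3

# AWS S3 client (credentials come from the standard AWS config chain).
s3 = boto3.client("s3")
s3.put_object(Bucket="example-aws-bucket",
              Key="data/part-0000.parquet", Body=b"...")

# MinIO speaks the same API; only the endpoint and credentials differ.
minio = boto3.client(
    "s3",
    endpoint_url="http://minio.internal:9000",  # hypothetical endpoint
    aws_access_key_id="minio-access-key",       # hypothetical credentials
    aws_secret_access_key="minio-secret-key",
)
minio.put_object(Bucket="example-minio-bucket",
                 Key="data/part-0000.parquet", Body=b"...")
```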
Other
- Bachelor’s degree, or foreign equivalent, in Computer Science, Engineering, or related technical field
- One (1) year of experience creating tables using SQL in a relational database
- 100% remote work eligible
- Comprehensive benefits package
- Equal opportunity workplace