AXS is looking to improve the fan experience and provide game-changing solutions for clients by building high-quality, scalable batch and real-time data pipelines that integrate with its products.
Requirements
- 4-6 years in a data engineering role
- 3+ years of experience writing Python and SQL
- Experience with AWS, Snowflake, SQL, Apache Airflow, DBT and Kubernetes
- Strong Python skills, with the ability to write ETL/ELT and automation scripts.
- Experience with streaming data technologies such as EventHub, Kinesis, Redis, SQS, and Kafka
- Strong familiarity with Agile methodologies (CI/CD, sprints, frequent deployments, etc.).
- Experience building both relational and dimensional data models
Responsibilities
- Design, implement, and maintain modern, efficient, and scalable ETL/ELT data pipelines.
- Support existing processes and frameworks, including Agile methodologies
- Define data models that fulfill business requirements
- Process, cleanse, and verify the integrity of data used for analysis
- Prepare development estimates and design documents
- Adopt best practices and industry standards in the data engineering and analytics fields
Other
- BA/BS (4-year) or MS degree in Computer Science or a related field preferred
- Strong analytical and communication skills to share findings in an understandable and actionable manner
- Proven ability and willingness to work with multiple partners and cross-functional teams
- Strong teamwork and collaboration skills, a willingness to compromise, and the ability to listen with empathy.
- Employer does not offer work visa sponsorship for this position.