Fractal is looking for an engineer to design, develop, enhance, and maintain scalable data pipelines across heterogeneous datasets in enterprise-scale data warehousing projects using GCP technologies.
Requirements
- GCP technologies such as BigQuery, Cloud Composer, Dataproc, and Cloud Run Functions
- Python-based frameworks
Responsibilities
- Design, develop, enhance, and maintain scalable data pipelines across heterogeneous datasets in enterprise-scale data warehousing projects using GCP technologies such as BigQuery, Cloud Composer, Dataproc, Cloud Run Functions, and Python-based frameworks.
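As a rough illustration of the kind of Python-based pipeline work this role involves, the sketch below shows a generic extract-transform-load skeleton. All names here (`run_pipeline`, the inline extract/transform/load callables) are hypothetical; in a real project these steps would call GCP services such as BigQuery or Dataproc through their client libraries, typically orchestrated by Cloud Composer.

```python
# Hypothetical sketch: a minimal extract-transform-load (ETL) skeleton of the
# sort a Python-based data pipeline framework composes. In production the
# extract/load callables would wrap GCP clients (e.g. BigQuery); here they are
# in-memory stand-ins so the shape of the pipeline is visible on its own.

def run_pipeline(extract, transforms, load):
    """Pull rows from a source, apply each transform in order, hand off to a sink."""
    rows = extract()
    for transform in transforms:
        rows = [transform(row) for row in rows]
    return load(rows)

# Toy usage with in-memory stand-ins for a real source and sink.
sink = []
loaded = run_pipeline(
    extract=lambda: [{"name": " Ada "}, {"name": "Grace"}],
    transforms=[lambda row: {**row, "name": row["name"].strip()}],
    load=lambda rows: sink.extend(rows) or len(sink),  # returns row count
)
```

Keeping each stage a plain callable makes the steps independently testable, which is what an orchestrator like Cloud Composer then schedules and retries as separate tasks.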
Other
- The wage range for this role takes into account the wide range of factors considered in making compensation decisions, including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs.
- The disclosed range estimate has not been adjusted for the applicable geographic differential associated with the location at which the position may be filled.
- At Fractal, it is not typical for an individual to be hired at or near the top of the range for their role, and compensation decisions depend on the facts and circumstances of each case.
- As a full-time employee of the company or as an hourly employee working more than 30 hours per week, you will be eligible to participate in the health, dental, vision, life insurance, and disability plans in accordance with the plan documents, which may be amended from time to time.
- You will be eligible for benefits on the first day of employment with the Company.