Farmers is looking to transform raw data into structured, high-quality datasets that are ready for analysis, solving low- to moderately complex business problems and enabling data-driven decision-making across the organization.
Requirements
- Utilizes modern data tools and technologies such as SQL, Python, and dbt.
- Designs and implements data warehousing solutions using platforms like Snowflake, Redshift, or BigQuery.
- Creates and manages dimensional models, star/snowflake schemas, and other data structures.
- Transforms raw data into clean, organized, and analytics-ready datasets using SQL, Python, or other relevant languages.
- Implements data transformation workflows to handle data cleansing, normalization, and enrichment.
- Conducts data validation and consistency checks to ensure the accuracy and reliability of data.
- Implements data quality monitoring and alerting mechanisms.
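The validation and data quality checks called for above can be sketched in Python. This is an illustrative example only, not a Farmers process; the column names (`policy_id`, `premium`) and rules are hypothetical.

```python
# Hypothetical row-level validation sketch: split incoming rows into
# valid and rejected sets based on simple consistency checks.
# Column names ("policy_id", "premium") are illustrative assumptions.

def validate_rows(rows):
    """Return (valid, rejected) lists; rejected entries carry their errors."""
    valid, rejected = [], []
    for row in rows:
        errors = []
        if not row.get("policy_id"):
            errors.append("missing policy_id")
        premium = row.get("premium")
        if not isinstance(premium, (int, float)) or premium < 0:
            errors.append("premium must be a non-negative number")
        if errors:
            rejected.append({"row": row, "errors": errors})
        else:
            valid.append(row)
    return valid, rejected
```

In practice, checks like these are often declared as dbt tests (e.g. `not_null`, `unique`) rather than hand-written, with alerting driven by test failures.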
Responsibilities
- Architects and builds scalable data pipelines using modern ELT (Extract, Load, Transform) tools and frameworks such as dbt (Data Build Tool), Apache Airflow, or similar.
- Automates data ingestion processes from various sources including databases, APIs, and third-party services.
- Develops and maintains logical and physical data models to support business analytics.
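The cleansing and normalization work described in the responsibilities above can be sketched as a simple transform step. This is a minimal illustration of the kind of logic a pipeline task or dbt model might perform; the field names (`customer_name`, `state`) are hypothetical.

```python
# Minimal sketch of a cleansing/normalization transform: drop rows
# missing required fields, then standardize casing and whitespace.
# Field names ("customer_name", "state") are illustrative assumptions.

def transform(raw_records):
    """Normalize raw records into analytics-ready rows."""
    cleaned = []
    for rec in raw_records:
        name = (rec.get("customer_name") or "").strip()
        if not name:
            continue  # cleansing: skip rows without a customer name
        cleaned.append({
            "customer_name": name.title(),           # normalize casing
            "state": (rec.get("state") or "").upper(),
        })
    return cleaned
```

In a real pipeline this step would typically run as SQL inside the warehouse (e.g. a dbt model on Snowflake, Redshift, or BigQuery) rather than in application code, but the shape of the logic is the same.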
Other
- High School Diploma or equivalent required.
- Bachelor's degree in computer science, data science, engineering, or a related field preferred.
- Two to four years of data engineering/SQL-related work experience required.
- Strong verbal and written communication skills.
- Demonstrated ability to secure buy-in and convince others of the best approach.