
Data Engineer

Agero

$100,000 - $140,000
Aug 19, 2025
Remote, US

Agero is seeking a Data Engineer to design, build, and maintain the core data infrastructure that powers its analytics, machine learning, and data science initiatives. The role focuses on optimizing data management processes, ensuring data quality and reliability, and developing scalable, efficient data models to support advanced analytics and data-driven decision-making.

Requirements

  • Extensive experience with Snowflake (preferred) or other cloud-based data warehousing solutions like Redshift or BigQuery.
  • Expertise in building and maintaining ETL/ELT pipelines using tools like Airflow, DBT, Fivetran, or similar frameworks (see the pipeline sketch after this list).
  • Proficiency in Python (e.g., Pandas, PySpark) for data processing and transformation.
  • Advanced SQL skills for querying and managing relational databases, plus experience with NoSQL databases (e.g., DynamoDB, MongoDB).
  • Solid understanding of data modeling techniques, including dimensional modeling (e.g., star schema, snowflake schema).
  • Knowledge of query optimization and cost management strategies for platforms like Snowflake and cloud environments.
  • Experience with data quality and observability frameworks (e.g., Great Expectations, Soda).
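
As a rough illustration of the pipeline tooling named above, a minimal daily batch ELT job in Apache Airflow (2.4+) might look like the sketch below. The DAG name, task names, and placeholder extract/load logic are hypothetical examples, not Agero's actual stack.

```python
# Minimal daily ELT sketch with Apache Airflow (hypothetical DAG, task, and table names).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Pull the previous day's rows from a source system (placeholder logic).
    print(f"extracting orders for {context['ds']}")


def load_to_warehouse(**context):
    # Load the extracted batch into the warehouse (placeholder logic).
    print(f"loading orders batch for {context['ds']}")


with DAG(
    dag_id="daily_orders_elt",      # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",              # batch cadence; streaming ingestion would use a different pattern
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
    extract >> load                 # run extract before load
```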

Responsibilities

  • Develop and maintain robust ETL/ELT pipelines to ingest data from diverse sources (relational and NoSQL databases, APIs, etc.), including implementing best practices for real-time and batch data ingestion.
  • Create and optimize data workflows using modern orchestration tools (e.g., Apache Airflow, Snowflake Tasks, Dagster, Mage).
  • Monitor and optimize cloud costs (e.g., AWS, Snowflake) by analyzing resource usage and implementing cost-saving strategies.
  • Perform query optimization in Snowflake to reduce compute costs and improve performance.
  • Develop and maintain modern data architectures, including data lakes and data warehouses (e.g., Snowflake, Databricks, Redshift), considering trade-offs of different data storage solutions and ensuring alignment with business requirements and SLAs.
  • Apply dimensional modeling techniques (Kimball), star and snowflake schemas, and normalization vs. denormalization strategies based on use cases (see the modeling sketch after this list).
  • Develop transformations using DBT (Core or Cloud), Spark (PySpark), or other frameworks.
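
To make the dimensional modeling bullet concrete, here is a small, self-contained PySpark sketch that splits a denormalized orders feed into one dimension table and one fact table, the basic move behind a star schema. All table and column names are invented for illustration.

```python
# Star-schema sketch with PySpark: split a denormalized feed into a dimension
# table and a fact table (all table/column names are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star_schema_sketch").getOrCreate()

# Denormalized source rows: customer attributes repeated on every order.
raw_orders = spark.createDataFrame(
    [
        (1, "2025-08-01", 120.0, "C-10", "Acme Corp", "Northeast"),
        (2, "2025-08-02", 75.5, "C-10", "Acme Corp", "Northeast"),
        (3, "2025-08-02", 300.0, "C-22", "Globex", "West"),
    ],
    ["order_id", "order_date", "amount", "customer_id", "customer_name", "region"],
)

# Dimension: one row per customer, descriptive attributes only.
dim_customer = (
    raw_orders
    .select("customer_id", "customer_name", "region")
    .dropDuplicates(["customer_id"])
)

# Fact: one row per order, with measures plus a foreign key into the dimension.
fact_orders = raw_orders.select("order_id", "order_date", "amount", "customer_id")

dim_customer.show()
fact_orders.show()
```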

Other

  • Willingness to travel is required, as you may need to attend on-site team meetings from time to time.
  • Strong communication and collaboration skills with the ability to explain technical concepts to both technical and non-technical audiences.
  • Ability to manage multiple priorities and work independently.
  • Bachelor's or Master's degree in a technical field and 3+ years of industry experience (2-5+ years of experience).