EverCommerce is looking for a leader to design, build, and maintain its data infrastructure and platforms, ensuring scalability, reliability, and performance in support of the company's analytical and operational needs.
Requirements
- Proficiency in programming languages such as Python, Java, or Scala, and experience with data processing frameworks such as Apache Spark, Apache Flink, or Hadoop.
- Strong understanding of distributed systems, cloud computing platforms (e.g., AWS, GCP, Azure), and containerization technologies (e.g., Docker, Kubernetes).
- Deep experience with relational databases and data warehousing technologies (e.g., PostgreSQL, MySQL, Redshift, Snowflake) and NoSQL databases (e.g., Cassandra, MongoDB).
- Hands-on experience with data pipeline orchestration tools such as Apache Airflow, Luigi, or Prefect.
- Experience with cloud-based data platforms and architectures (e.g., data lakes, lakehouses, Redshift, Databricks, Snowflake, BigQuery).
- Experience with modern data integration tools (e.g., dbt, Fivetran, Airflow, AWS Glue, AWS Data Pipeline).
- Experience with data monitoring and observability tools.
Responsibilities
- Design and implement scalable, reliable, and efficient data infrastructure solutions to support the organization's data processing, storage, and analytics needs.
- Collaborate with data engineers, software engineers, and data scientists to understand data requirements and translate them into technical specifications and architecture designs.
- Develop and maintain data pipelines for ingesting, processing, and transforming large volumes of data from various sources, ensuring data quality and integrity.
- Optimize data storage and retrieval processes, including database schema design, indexing strategies, and query optimization, to enhance performance and reduce latency.
- Implement monitoring, alerting, and logging mechanisms to proactively identify and troubleshoot issues in the data infrastructure, ensuring high availability and reliability.
- Lead the design and implementation of scalable data infrastructure at EverCommerce using cloud-based platforms and architectures (e.g., data lakes, lakehouses, Redshift, Databricks, Snowflake, BigQuery).
- Own the overall strategy for the data lake/lakehouse, data ingestion, and data processing; develop and manage ETL processes and data integration solutions using modern tools (e.g., dbt, Fivetran, Airflow, AWS Glue, AWS Data Pipeline).
Other
- 7+ years of experience in a data & analytics leadership role, managing cross-functional teams that create, engineer, build, and maintain data infrastructure and platforms.
- Proven ability to manage and lead a team of data engineers and DataOps engineers.
- Provide technical leadership and mentorship to junior members of the team, fostering a culture of collaboration, learning, and continuous improvement.
- Work closely with stakeholders to understand business requirements and priorities.
- This role can be based anywhere in the United States or Canada – if you’re close to one of our offices, we can set you up in-office or you can work 100% remotely.
- Please note that you must be eligible to work without sponsorship to qualify for this position, and this role may require travel to our Corporate Headquarters in Denver, Colorado, or to other office locations around North America.