SingleOps is looking to deliver trusted, near-real-time data products that support analytics, AI applications, and customer-facing data features in the green industry.
Requirements
- Deep mastery of SQL and extensive, hands-on experience with Snowflake
- Strong experience with dbt or similar data transformation frameworks
- Proficiency in Python, Scala, or similar languages used for data pipeline logic and automation
- Experience with orchestration tools like Azure Data Factory, Airflow, or similar
- Comfortable working in a modern, git-based development environment with CI/CD
- Experience with cloud-native data streaming technologies such as Azure Event Grid
- Exposure to and understanding of data architecture patterns such as Medallion
Responsibilities
- Build and maintain scalable, modular data pipelines using tools such as dbt and Azure Data Factory.
- Design batch and streaming data workflows that support near-real-time reporting and operational intelligence.
- Deliver high-quality, trusted datasets to enable analytics, dashboards, embedded apps, and AI use cases.
- Influence and guide the evolution of our data platform tooling and architectural decisions.
- Apply structured architectural patterns such as Medallion to build layered, reusable data models.
- Drive data quality through testing, observability, and proactive alerting (e.g., dbt tests, data contracts).
- Partner across teams to improve velocity, reusability, and access to data through documentation, lineage, and governance.
Other
- 5+ years of experience in data engineering or analytics engineering roles
- Authorization to work for any employer in the United States without sponsorship from the company
Benefits
- Comprehensive health, dental, and vision plans for you and your family
- 401(k) matching
- Unlimited paid time off, paid company holidays, and a company-wide shutdown from Dec 24 to Jan 1 for a well-deserved holiday break