Bentley Systems is looking for a Data Engineer to build the foundation for scalable, cloud-native data solutions that empower its global teams and drive innovation through actionable insights.
Requirements
- 3+ years of hands-on experience building scalable cloud data solutions using platforms such as Snowflake, Databricks, Redshift, or Azure Synapse, and cloud storage like S3 or Azure Blob Storage
- Strong command of SQL, with experience writing complex queries and stored procedures
- 1+ years of experience working with Databricks in a production environment
- Proven ability to design, develop, and deploy enterprise-grade data solutions, including data warehouses, data marts, and ETL/ELT pipelines
- Experience supporting business intelligence and analytics projects, working closely with data consumers to deliver curated datasets
- Proficiency in Python or other scripting languages for data integration and automation
- Understanding of version control systems like Git
Responsibilities
- Develop and maintain cloud-based data pipelines using tools like Matillion, Fivetran, and Azure Data Factory
- Build scalable solutions using MPP data warehouses (Snowflake, Databricks, Azure Synapse) and Azure Blob Storage
- Write and optimize complex SQL queries and stored procedures
- Integrate data using Python and other scripting languages
- Design and implement dimensional data models for analytics and reporting
- Ensure data quality, consistency, and reliability across platforms
- Collaborate with cross-functional teams to understand business needs and deliver tailored data solutions
Other
- Ability to thrive in a remote-first, globally distributed team
- Comfort communicating project status, timelines, and deliverables to stakeholders
- Excellent written and verbal communication skills, with the ability to collaborate across technical and non-technical teams
- A Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field — or equivalent practical experience
- An attractive salary and benefits package