Suffolk is tackling inefficiency in construction project operations by leveraging AI to transform project workflows, improve reporting, and accelerate RFIs, lookahead planning, progress updates, materials tracking, and more.
Requirements
- 4–6+ years in AI engineering / full-stack data applications or data science, including 2+ years building production LLM/RAG solutions.
- Hands-on expertise with Copilot Studio, Power Apps/Automate, custom connectors, and CoE Toolkit governance.
- Programming & data stack: Python, SQL, Databricks Lakehouse, vector stores.
- DevOps & IaC: GitHub Actions (or Azure DevOps) and Posit Workbench/Connect automation or comparable CI/CD tooling; strong Git/GitHub workflow discipline.
- Integration & ETL skills: Foundational understanding of ETL/ELT design, Airflow or Databricks Workflows, and REST/GraphQL API development; proven collaboration with Data Engineering on source-to-lake and lake-to-agent pipelines.
- Prior hands-on work in construction or heavy process industries (manufacturing, oil & gas, chemicals) is a significant plus.
- Demonstrated process excellence background (Lean/Six Sigma Green Belt or equivalent) with experience diagnosing process and data gaps and supporting change management plans with Operations Excellence.
Responsibilities
- Lead Lean/Six Sigma discovery workshops; map value streams, assess process and data maturity, and log low-effort/high-impact AI use cases.
- Evaluate each jobsite’s current workflows and underlying data; surface gaps that block AI adoption and develop phased improvement plans with Operations Excellence to establish the right process baseline before deploying agents.
- Convert user stories into production-ready agents in Copilot Studio, Power Apps/Automate, ChatGPT Enterprise, or code-first frameworks within days; wire them to Teams/SharePoint on the front end and Databricks Lakehouse or other sources on the back end.
- Build RAG pipelines backed by Delta tables, Unity Catalog, and Databricks Vector Search; automate infra with GitHub Actions / Posit; monitor latency, cost, adoption, and drift.
- Partner with Data Engineering to design and maintain ETL pipelines, API integrations, and event-driven connectors feeding RAG and agents.
- Blend OpenAI, Azure OpenAI, and AWS Bedrock behind secure custom connectors; package agents for seamless rollout.
- Train crews, gather feedback, iterate, and track adoption and ROI metrics; apply influence model principles to embed agents into daily routines and SOPs, and track behavior change KPIs.
Other
- Bachelor’s in CS, Engineering, Physics, or a related field; Master’s preferred.
- Strong facilitation and communication skills.
- Willing and able to travel and work on active jobsites.
- Ability to sit for long periods of time; talk and hear; and perform fine motor tasks with the hands and fingers when using a keyboard or telephone, or when writing.
- Ability to stand, walk, and reach with arms and/or hands.