Arcutis is looking to design, build, and maintain scalable Azure Databricks-based data pipelines and architectures that enable analytics, AI/ML, and reporting across its commercial functions.
Requirements
- Experience with Azure cloud services (Databricks, Azure DevOps, Azure Data Factory)
- Strong hands-on Python, PySpark, and SQL skills
- Direct experience building and leveraging API integrations in ETL pipeline development (a minimal sketch of this pattern follows this list)
- Experience integrating with current best-in-class AI models and APIs, such as the OpenAI API and Databricks-hosted AI models
- Strong working knowledge of, and experience maximizing business value with, Databricks Unity Catalog or Databricks One capabilities
- Familiarity with tools and practices in the Martech and Adtech landscape
- Knowledge of Consumer, Patient, or HCP (healthcare professional) data ecosystems
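
As a concrete illustration of the API-driven ETL skills listed above, here is a minimal sketch, assuming a Databricks-style PySpark environment and a hypothetical REST endpoint; the URL, column names, and table name are placeholders, not systems Arcutis actually uses.

```python
# Minimal sketch of an API-to-Delta ingestion step in PySpark.
# The endpoint, key column, and table name below are hypothetical placeholders.
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("api_etl_sketch").getOrCreate()

# Extract: pull JSON records from a (hypothetical) marketing API.
response = requests.get(
    "https://api.example.com/v1/campaign-metrics",  # placeholder URL
    headers={"Authorization": "Bearer <token>"},     # in practice, fetched from a secret store
    timeout=30,
)
response.raise_for_status()
records = response.json()  # assumed to be a list of flat JSON objects

# Transform: build a DataFrame and apply light cleanup.
df = spark.createDataFrame(records)
df = df.dropDuplicates().na.drop(subset=["campaign_id"])  # assumed key column

# Load: append to a Delta table for downstream analytics.
df.write.format("delta").mode("append").saveAsTable("bronze.campaign_metrics")
```

In practice, credentials would typically come from a Databricks secret scope and the raw payload would usually land in a bronze layer before heavier transformation.
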
Responsibilities
- Design, build, and maintain scalable, reliable, and cost-efficient data pipelines using Azure Databricks in support of analytics, machine learning, data science, and operational use cases
- Lead data engineering initiatives that enable AI/ML model development, LLM integrations, and AI-driven applications while ensuring scalability and alignment with enterprise and business priorities
- Architect and implement data integration frameworks across diverse Adtech and Martech ecosystems, incorporating Google Analytics, media campaign data, and third-party marketing platforms and APIs such as Salesforce
- Manage and optimize data ingestion, transformation, and storage processes using SQL, Python, and PySpark to integrate structured and unstructured data sources
- Design and maintain API integrations with internal and external systems, including Python- and PySpark-based services and AI/LLM-powered APIs for advanced analytics and automation (see the sketch following this list)
- Administer and maintain Azure-based data tools and platforms (e.g., Databricks, Azure Data Factory) to ensure operational excellence, reliability, and security
- Collaborate with internal stakeholders and external partners to evolve the data platform design and architecture supporting advanced analytics, personalization, marketing intelligence, and marketing automation
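
To make the LLM-integration responsibilities above more tangible, the following is a minimal sketch of one way an AI-powered enrichment step could sit inside a PySpark pipeline, using the OpenAI Python SDK; the model name, prompt, table names, and column names are illustrative assumptions only.

```python
# Minimal sketch of an LLM-powered enrichment step inside a PySpark pipeline.
# Model name, prompt, table names, and column names are illustrative assumptions.
from openai import OpenAI
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("llm_enrichment_sketch").getOrCreate()

def classify_feedback(text: str) -> str:
    """Ask the model to tag a free-text comment with a single topic label."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    completion = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; substitute whatever is approved
        messages=[
            {"role": "system", "content": "Reply with a one-word topic label."},
            {"role": "user", "content": text or ""},
        ],
    )
    return completion.choices[0].message.content

classify_udf = udf(classify_feedback, StringType())

# Read a (hypothetical) table of free-text feedback and append a topic column.
feedback = spark.table("silver.hcp_feedback")
enriched = feedback.withColumn("topic", classify_udf("comment_text"))
enriched.write.format("delta").mode("overwrite").saveAsTable("gold.hcp_feedback_topics")
```

A row-by-row UDF keeps the sketch short; at production scale the same call would more likely be batched (for example via a pandas UDF or the platform's model-serving endpoints) to control cost and latency.
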
Other
- Bachelor of Science degree required
- A minimum of 5 years of transferable work experience in operational support of a Data Engineering, Data Architecture, or Cloud Platforms function preferred
- Travel up to 20%
- Ability to collaborate with a broad set of stakeholders to evaluate business needs and construct technical designs that enable stakeholder priorities
- Self-driven, with the ability to independently design end-to-end data pipelines while adhering to architectural best practices