Ascent Developer Solutions is looking for a data engineer to build and optimize data pipelines, work with large-scale data processing systems, and implement data solutions in cloud environments that support its real estate lending platform.
Requirements
- 3+ years of hands-on experience with Azure Databricks
- 4+ years of experience with Python for data engineering tasks
- Strong proficiency in SQL and experience working with data warehouses
- Solid understanding of Azure cloud services (e.g., Data Lake, Data Factory, Synapse)
- Experience with version control tools such as Git, as well as CI/CD pipelines
- Familiarity with data modeling, ETL/ELT processes, and performance tuning
- Experience with Delta Lake, Spark, or PySpark
Responsibilities
- Design, develop, and maintain scalable data pipelines using Azure Databricks and Azure Data Factory
- Write efficient and reusable code in Python for data transformation and automation
- Develop and optimize complex SQL queries for data extraction and reporting
- Work with data warehouse solutions (e.g., Azure Synapse, Snowflake, or similar) to support analytics and reporting needs
- Ensure data quality, integrity, and security across all data platforms
- Monitor and troubleshoot data workflows and performance issues
- Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements
Other
- This role is on-site at Ascent's Encino office 2-3 days/week and remote from Southern California otherwise.
- Full-time, 40 hours per week; must be available for occasional overtime, including some nights and weekends.