CoStar Group is working to digitize the world's real estate, empowering people to discover properties, insights, and connections that improve their businesses and lives, specifically through the growth and development of Homes.com.
Requirements
- 3+ years of data pipeline engineering experience and/or deep database engineering experience
- Experience using Python or other scripting languages to build ETL pipelines
- Hands-on experience with cloud-based relational and non-relational databases
- Experience with NoSQL databases (e.g., DynamoDB)
- Experience with data pipeline tools (e.g., AWS Glue, Step Functions, Lambda)
- Knowledge of or experience with Apache Spark, Databricks, and Azure DevOps
Responsibilities
- Designing, building, testing, and deploying scalable, reusable, and maintainable applications that handle substantial amounts of data
- Taking full ownership of your work from development and testing to eventual deployment and support in production
- Collaborating with other engineers, product owners, designers, and leadership
- Becoming a trusted team member in matters of technical architecture, design, and code
- Advocating for evolution and improvement, both technical and non-technical, within our teams
- Gaining a deep understanding of the CoStar business, including its Analytic products
Other
- Bachelor's degree from an accredited, not-for-profit university or college, preferably in Computer Science, Data Science, or a related field
- A track record of commitment to prior employers
- A demonstrated record of building and launching successful products
- Willingness to take an active role in mentoring other developers
- Ability to analyze technical requirements and design new architectures, data models, and ETL strategies