Masterworks is looking for a data engineer to build and maintain robust, high-performance data pipelines and infrastructure supporting its fintech platform, ensuring data reliability, efficiency, and scalability.
Requirements
- Strong experience with Amazon Redshift, including query optimization and system diagnostics
- Proficiency with ETL orchestration tools such as Luigi and Apache Airflow
- Expert-level SQL skills; ability to analyze and optimize long-running queries
- Proven ability to troubleshoot high CPU usage and slow queries on Redshift
- Familiarity with data alerting and monitoring tools (e.g., CloudWatch, Datadog, custom alert systems)
- Experience with cloud platforms (AWS preferred)
- Experience with Snowflake or other modern data platforms
Responsibilities
- Design, build, and maintain scalable ETL pipelines using Luigi and Apache Airflow
- Monitor and optimize performance of Redshift clusters, particularly:
  - Diagnosing high CPU usage
  - Identifying slow or resource-intensive queries
  - Refactoring SQL for performance improvements
- Proactively build data quality alerts and notification systems to ensure pipeline health and catch missing/incomplete data early
- Work closely with analysts and stakeholders to ensure the data is accurate, available, and accessible
- Respond promptly to production issues (within 5 minutes during core hours)
- Lead or assist in potential migration projects (e.g., Redshift to Snowflake or other tools), including planning, testing, and execution
- Collaborate on data modeling and schema design to support analytics and application needs
Other
- 2-5 years of hands-on data engineering experience in a production environment
- Strong communication skills and a collaborative mindset
- High responsiveness during working hours; ability to support production data pipelines and address urgent issues quickly
- Must be eligible for full-time US work - no exceptions
- Must be able to work from our NY office - this is not a remote role