SS&C is building scalable, reliable data solutions to support analytics and downstream systems, with a focus on data ingestion, transformation, and integration into centralized data lakes.
Requirements
- Expertise in structured and unstructured data ingestion, modelling, and transformation.
- Experience with MPP database design and integration.
- Experience with S3-compatible object storage (specifically MinIO).
- Expertise in RESTful API design and integration.
- Experience with large-scale data processing frameworks.
- Experience in Data Lakehouse optimization.
- Experience with C#, .NET, SQL Server, and multi-tiered backend development.
Responsibilities
- Design, implement, and maintain data ingestion pipelines from external sources into centralized data lakes.
- Transform, normalize, and process large-scale data for downstream applications and analytics.
- Develop and maintain RESTful APIs to provide data access and integration points for internal and external systems.
- Optimize performance, scalability, and reliability of backend systems and APIs.
- Apply software design patterns, coding best practices, and automated testing methodologies.
- Monitor and troubleshoot data pipelines, APIs, and integrations in production environments.
- Stay current with emerging technologies and best practices in data engineering, backend development, and cloud-based solutions.
Other
- Strong problem-solving and troubleshooting skills.
- Excellent verbal and written communication skills.
- Bachelor’s degree in Computer Science, Software Engineering, or related technical/quantitative field (or equivalent experience).
- Familiarity with financial services applications or private markets.
- Hybrid work model.