Kensho is looking to solve hard problems in data and Artificial Intelligence to innovate and drive progress for S&P Global and its customers worldwide.
Requirements
- At least one area of core expertise, such as distributed computing, or deep knowledge of one of the technologies we use
- Practical understanding of algorithms, data structures, and design patterns
- Personal projects, open source contributions, or a portfolio of work
- Versatility and adaptability to contribute across various teams and/or our suite of products
- Effective coding, documentation, and communication habits
- Python, Django, TypeScript, Airflow
- Postgres, Elasticsearch, Kafka, Redis, Envoy
Responsibilities
- Make an immediate impact by contributing to real features, gaining feedback via code review, merging to our codebase, and deploying to our production environment
- Learn and employ best practices such as static type checking, optimizing for legibility, code review, infrastructure as code (IaC), and following a maturity model for production-readiness
- Take advantage of regularly sanctioned time (Knowledge Days!) that most teams dedicate to continuous learning and exploration beyond our day-to-day work
- Write server-side and user-facing applications built on top of our machine learning capabilities
- Design interfaces, APIs, and services that address immediate needs with an eye for future scaling and reuse
- Develop the right abstractions to enable observability & deployment to our core platform
- Build data ingestion & processing pipelines for stand-alone software products and machine learning R&D
Other
- We value in-person collaboration; interns are therefore required to work out of our Cambridge HQ or New York City office
- We are an equal opportunity employer that welcomes future Kenshins with all experiences and perspectives
- All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, or national origin
- Attend technical and non-technical discussions as well as company-wide social events
- Enjoy a collaborative, communicative environment that allows us to tackle the biggest challenges in data