The company is looking to design, develop, and deploy advanced data integration and analytics solutions on the Palantir Foundry platform to meet complex business requirements.
Requirements
- Deep technical expertise in the Foundry ecosystem
- Strong data engineering skills: PySpark, SQL, and Foundry transformations (illustrated in the sketch after this list)
- Experience integrating with cloud services (AWS, Azure, GCP) and external APIs
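For context, a Foundry transformation in a Code Repository is typically a decorated PySpark function built on the transforms.api package. The sketch below is illustrative only, assuming a single-input pipeline; the dataset paths and column names are hypothetical placeholders, not references to any real project.

```python
from transforms.api import transform_df, Input, Output
import pyspark.sql.functions as F

# Minimal Foundry transform sketch: reads one input dataset, writes one output.
# The dataset paths below are hypothetical placeholders.
@transform_df(
    Output("/Example/datasets/orders_clean"),
    raw_orders=Input("/Example/datasets/orders_raw"),
)
def orders_clean(raw_orders):
    # Keep only rows with a populated order_id and normalize the status column
    return (
        raw_orders
        .filter(F.col("order_id").isNotNull())
        .withColumn("status", F.lower(F.col("status")))
    )
```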
Responsibilities
- Build and optimize data pipelines, Ontology models, and Foundry applications (Workshop, Contour, Quiver, Slate)
- Develop custom workflows and dashboards using Foundry’s suite of tools
- Implement robust ingestion strategies for structured and unstructured data
- Apply PySpark, SQL, and Foundry transformations for data cleansing and enrichment (see the sketch after this list)
- Create operational workflows and user-facing applications within Foundry
- Integrate Foundry with cloud services (AWS, Azure, GCP) and external APIs
- Ensure compliance with data governance, lineage, and security standards (RBAC, encryption)
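As an illustration of the cleansing and enrichment work described above, the snippet below is a standalone PySpark function of the kind that would sit inside a Foundry transform like the one sketched earlier. It is a minimal sketch; the table names, column names, and join key are assumptions, not a prescribed implementation.

```python
from pyspark.sql import DataFrame
import pyspark.sql.functions as F

def cleanse_and_enrich(orders: DataFrame, customers: DataFrame) -> DataFrame:
    """Deduplicate and normalize orders, then enrich them with customer attributes."""
    cleaned = (
        orders
        .dropDuplicates(["order_id"])                           # drop duplicate order rows
        .na.fill({"currency": "USD"})                           # default missing currency codes
        .withColumn("order_total", F.col("order_total").cast("double"))
        .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
    )
    # Enrich with customer segment and region via a left join on the shared key
    return cleaned.join(
        customers.select("customer_id", "segment", "region"),
        on="customer_id",
        how="left",
    )
```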
Other
- Mentor junior team members and enforce best practices in development and deployment
- Collaborate with cross-functional stakeholders
- Stay updated on Foundry features and drive adoption of new functionalities
- Act as a subject matter expert for Palantir Foundry