Senior Software Engineer, Data Platform

Salesforce

$157,600 - $236,500
Dec 10, 2025
San Francisco, CA, US

Salesforce is looking to modernize its core data ecosystem by building a robust, software-defined Data Mesh using Snowflake, dbt, Airflow, and Informatica. This involves bridging the gap between Data Engineering and Software Engineering, treating data pipelines as production code, automating infrastructure, and optimizing distributed systems to enable AI and Analytics across the enterprise.

Requirements

  • Strong software engineering background (Python/Java/Go) applied to data problems. You are comfortable writing custom API integrations and complex Python scripts (a minimal sketch follows this list).
  • Deep production experience with Snowflake (architecture/tuning) and dbt (Jinja/Macros/Modeling).
  • Advanced proficiency with Airflow (Managed Workflows for Apache Airflow).
  • Hands-on experience with AWS services (S3, Lambda, IAM, ECS) and containerization (Docker/Kubernetes).
  • Experience with Git, CI/CD (GitHub Actions/Jenkins), and Terraform.
  • Knowledge Graph Experience: Familiarity with Graph Databases (Neo4j) or Semantic Standards (RDF/SPARQL, TopQuadrant) is a strong plus as we integrate these technologies into the platform.
  • Experience with Apache Iceberg or Delta Lake.
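
To make the "custom API integrations and complex Python scripts" requirement concrete, here is a minimal sketch of the pattern: pull cursor-paginated records from a REST endpoint and land them in S3 as JSON Lines. The endpoint URL, bucket, key prefix, and pagination fields are illustrative assumptions, not details from this posting.

    # Minimal sketch: pull paginated records from a REST API and land them in S3.
    # The endpoint, bucket, key prefix, and response fields are hypothetical placeholders.
    import json

    import boto3
    import requests

    API_URL = "https://api.example.com/v1/accounts"  # hypothetical endpoint
    BUCKET = "raw-landing-zone"                      # hypothetical bucket
    PREFIX = "accounts"                              # hypothetical key prefix


    def fetch_all(url: str, page_size: int = 500) -> list[dict]:
        """Walk a cursor-paginated API until no next page is returned."""
        records, cursor = [], None
        while True:
            params = {"limit": page_size}
            if cursor:
                params["cursor"] = cursor
            resp = requests.get(url, params=params, timeout=30)
            resp.raise_for_status()
            payload = resp.json()
            records.extend(payload["data"])
            cursor = payload.get("next_cursor")
            if not cursor:
                return records


    def land_to_s3(records: list[dict]) -> None:
        """Write the extract to S3 as a single JSON Lines object."""
        body = "\n".join(json.dumps(r) for r in records)
        s3 = boto3.client("s3")
        s3.put_object(Bucket=BUCKET, Key=f"{PREFIX}/accounts.jsonl", Body=body.encode())


    if __name__ == "__main__":
        land_to_s3(fetch_all(API_URL))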

Responsibilities

  • Design and implement scalable data pipelines and transformation logic using Snowflake (SQL) and dbt. Replace legacy hardcoded scripts with modular, testable, and reusable data components.
  • Engineer robust workflows in Airflow. Write custom Python operators and ensure DAGs are dynamic, factory-generated, and resilient to failure (see the DAG-factory sketch after this list).
  • Own the performance of your datasets. Deep dive into query profiles, optimize pruning/clustering in Snowflake, and reduce credit consumption while improving data freshness (a pruning diagnostic sketch follows this list).
  • Manage the underlying platform infrastructure (warehouses, roles, storage integration) using Terraform or Helm. Click-ops is not an option.
  • Enforce a strict "DataOps" culture. Ensure every PR includes unit tests, schema validation, and automated deployment pipelines (see the test sketch after this list).
  • Build monitoring and alerting (Monte Carlo, Grafana, New Relic, Splunk) to detect data anomalies before stakeholders do (a freshness-check sketch follows this list).
  • Work with domain teams (Sales, Marketing, Finance) to onboard them to the platform, helping them decentralize their data ownership while adhering to platform standards.
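
For the Airflow responsibility, a minimal sketch of a factory-generated DAG with a custom operator is shown below. Airflow 2.4+ is assumed (it uses the `schedule` argument); the domain names, schedules, dbt selector, and the FreshnessCheckOperator are illustrative assumptions, not taken from this posting.

    # Minimal sketch of a DAG factory with a custom operator (Airflow 2.4+ assumed).
    # Domain names, schedules, and the dbt command are illustrative placeholders.
    from datetime import datetime

    from airflow import DAG
    from airflow.models import BaseOperator
    from airflow.operators.bash import BashOperator


    class FreshnessCheckOperator(BaseOperator):
        """Hypothetical custom operator: fail the task if a dataset is stale."""

        def __init__(self, dataset: str, max_lag_hours: int = 24, **kwargs):
            super().__init__(**kwargs)
            self.dataset = dataset
            self.max_lag_hours = max_lag_hours

        def execute(self, context):
            # A real implementation would query warehouse metadata here.
            self.log.info("Checking %s is fresher than %sh", self.dataset, self.max_lag_hours)


    def build_domain_dag(domain: str, schedule: str) -> DAG:
        """Factory: one standardized DAG per onboarded domain."""
        with DAG(
            dag_id=f"{domain}_transform",
            start_date=datetime(2025, 1, 1),
            schedule=schedule,
            catchup=False,
        ) as dag:
            run_models = BashOperator(
                task_id="dbt_run",
                bash_command=f"dbt run --select {domain}",
            )
            check = FreshnessCheckOperator(task_id="freshness_check", dataset=domain)
            run_models >> check
        return dag


    # Register one DAG per domain instead of hand-writing each file.
    for name, cron in [("sales", "@hourly"), ("finance", "@daily")]:
        globals()[f"{name}_transform"] = build_domain_dag(name, cron)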
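
For the query-profile and pruning work, this is a sketch of the kind of diagnostic that might back it, assuming an open snowflake-connector-python cursor with ACCOUNT_USAGE access; the 7-day window, partition floor, and result limit are illustrative choices.

    # Minimal sketch: find recent Snowflake queries with poor micro-partition pruning.
    # Assumes an open snowflake-connector cursor; window and thresholds are illustrative.
    POOR_PRUNING_SQL = """
    SELECT query_id,
           warehouse_name,
           partitions_scanned,
           partitions_total,
           partitions_scanned / NULLIF(partitions_total, 0) AS scan_ratio
    FROM snowflake.account_usage.query_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
      AND partitions_total > 1000
    ORDER BY scan_ratio DESC
    LIMIT 20
    """


    def worst_pruned_queries(cursor) -> list[tuple]:
        """Return the queries scanning the largest fraction of their micro-partitions."""
        cursor.execute(POOR_PRUNING_SQL)
        return cursor.fetchall()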
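
For the "every PR includes unit tests and schema validation" expectation, a minimal pytest-style sketch follows. The transformation function (normalize_currency) and its column contract are hypothetical examples, not actual platform code; run the file with pytest.

    # Minimal sketch of the unit tests and schema check expected on every PR.
    # normalize_currency and EXPECTED_COLUMNS are hypothetical names for illustration.
    def normalize_currency(rows: list[dict]) -> list[dict]:
        """Toy transformation under test: cast amounts to cents, upper-case codes."""
        return [
            {**r, "amount_cents": round(r["amount"] * 100), "currency": r["currency"].upper()}
            for r in rows
        ]


    EXPECTED_COLUMNS = {"amount", "currency", "amount_cents"}


    def test_amounts_are_converted_to_cents():
        out = normalize_currency([{"amount": 12.34, "currency": "usd"}])
        assert out[0]["amount_cents"] == 1234
        assert out[0]["currency"] == "USD"


    def test_output_schema_matches_contract():
        out = normalize_currency([{"amount": 1.0, "currency": "eur"}])
        assert set(out[0]) == EXPECTED_COLUMNS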
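
For the monitoring-and-alerting responsibility, here is a minimal sketch of a freshness check of the sort that would feed those alerts, using snowflake-connector-python. Credentials come from environment variables, and the warehouse, table, timestamp column, and SLA threshold are illustrative assumptions.

    # Minimal sketch of a data-freshness check behind the alerting described above.
    # Connection parameters, table, loaded_at column, and SLA are illustrative assumptions.
    import os

    import snowflake.connector


    def hours_since_last_load(table: str) -> float:
        """Return hours elapsed since the newest row landed in `table`."""
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="MONITORING_WH",  # hypothetical warehouse
        )
        try:
            cur = conn.cursor()
            cur.execute(
                f"SELECT DATEDIFF('minute', MAX(loaded_at), CURRENT_TIMESTAMP()) FROM {table}"
            )
            (minutes,) = cur.fetchone()
            return minutes / 60
        finally:
            conn.close()


    if __name__ == "__main__":
        lag = hours_since_last_load("ANALYTICS.SALES.FCT_ORDERS")  # hypothetical table
        if lag > 6:  # illustrative freshness SLA
            raise SystemExit(f"Data is stale: last successful load was {lag:.1f}h ago")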

Other

  • 5+ years of relevant data or software engineering experience.
  • Focus: Execution & Component Ownership. You are given a problem (e.g., "Migrate this domain to dbt," "Optimize this slow pipeline") and solve it with high-quality, clean code and minimal supervision.
  • Scope: You own features and specific pipelines. You mentor junior engineers on code reviews and best practices.
  • Experience using AI coding assistants (Copilot, Cursor) to accelerate development.
  • Salesforce is an equal opportunity employer and maintains a policy of non-discrimination with all employees and applicants for employment.