Keller Postman is seeking a candidate to design, construct, install, test, and maintain highly scalable data management systems that support its litigation and arbitration efforts.
Requirements
- Proficient in Snowflake, Databricks, or similar platforms, with experience in data warehousing.
- Skilled in ETL design and data modeling.
- Proficient in SQL (complex queries, stored procedures, optimization), with familiarity with Python for data engineering tasks.
- Familiarity with Sigma Computing for reporting, data visualization, and business user self-service analytics.
- Understanding of data governance, security, and compliance frameworks (e.g., GDPR, HIPAA).
- Experience with streaming data technologies (Kafka, Event Hubs, or similar).
- Exposure to DevOps practices and Infrastructure as Code (e.g., Terraform, ARM templates).
Responsibilities
- Develop, construct, test, and maintain data architectures, including databases and large-scale processing systems.
- Design, build, and optimize data pipelines and ETL/ELT processes leveraging Snowflake and Azure services.
- Develop and maintain Snowflake data warehouses, ensuring efficient data modeling, partitioning, and performance tuning.
- Implement data flow processes that automate and streamline data collection, processing, and analysis.
- Ensure data governance, quality, and security best practices across all data platforms.
- Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision-making across the organization.
- Support CI/CD pipelines and automation for data workflows and deployments.
Other
- The ideal candidate will be in the Central Time Zone or willing to work regular Central Time business hours; being located in the Chicago area is a plus.
- Must be able to read, write, and speak fluent English.
- Ability to work in a fast-paced environment and manage multiple projects simultaneously.
- Strong communication skills, capable of conveying complex data issues to non-technical team members.
- A minimum of 5 years of experience in a data engineering role.