Northbeam needs to scale its business and invest in the right people and systems to handle rapid growth.
Requirements
- Solid understanding of the data processing needs of real-time and batch systems for transactional and analytical workloads
- Experience with SQL / Java / Scala / Spark / Python
- Experience designing and deploying high-performance systems with reliable monitoring and logging practices
- Experience with data pipeline orchestration tools and practices
- Experience with data engineering solutions such as BigQuery, Airflow, dbt, Kafka, and Pinot
- Experience working with cloud infrastructure such as Google Cloud Platform, Azure, or AWS
- Experience with infrastructure-as-code, or a desire to learn it
Responsibilities
- Build and maintain data ingestion pathways spanning APIs, file processing, and configurable inputs.
- Own and implement the data/infrastructure engineering aspects end to end, from idea to solution, as part of a cross-functional effort with your peers.
- Work on the data pipeline architecture and optimize it for readability, maintainability, and cost.
- Integrate with our machine learning and data science systems to deliver insights to our customers.
- Maintain and augment the necessary infrastructure to scale our platform.
- Write technical documentation for internal and external stakeholders.
- Provide technical leadership to other software engineers and drive the long-term technical strategy with scalable architecture and best practices.
Other
- Six to eight or more years of experience
- Prior experience working in marketing, e-commerce, or ad-tech
- Growth mindset - we’re always learning and growing
- Customer focus - we want to make the customer happy with our product
- Ownership mentality - we think like owners in the business
- Radical candor - we’re transparent and give direct feedback to one another
- Flexible PTO Policy
- 12 Company Paid Holidays