Abnormal AI is looking to optimize infrastructure costs within its R&D Engine. This role will build and maintain the core data pipelines and dashboards that support that effort, and will also support other critical initiatives such as AI enablement and engineering productivity.
Requirements
- Proficiency in writing complex SQL queries and building clean, reliable data models to support reporting and analysis.
- Hands-on experience using dbt to transform and organize data for downstream analytics workflows.
- Experience with at least one programming language (e.g., Python or R) for data exploration, statistical analysis, and automating reporting workflows; familiarity with Databricks is a plus.
- Experience integrating and analyzing cloud cost and operational datasets (e.g., AWS, Azure, Databricks).
- Proficiency with business intelligence tools (e.g., Looker, Tableau, Power BI) to build intuitive dashboards and communicate insights effectively.
Responsibilities
- Build, enhance, and maintain data pipelines and dashboards that drive transparency and optimization within our infrastructure cost program (across AWS, Azure, and related platforms)
- Support reporting and data development across AI enablement, engineering productivity, product usage, and other R&D-focused initiatives
- Partner with engineers and technical stakeholders to define, track, and optimize actionable metrics; participate in metric design, not just execution
- Apply strong SQL, dbt, and Python skills to automate measurement, ensure data quality, and maintain reliable operational metrics
Other
- 3+ years of experience working with large-scale data in analytical or product-oriented environments, with a strong focus on data exploration, interpretation, and communication.
- Strong grasp of data analyst best practices, including analysis reproducibility, validating results through testing and sanity checks, and communicating insights clearly to both technical and non-technical audiences.
- Demonstrated ability to translate ambiguous business questions into structured analyses and data models optimized for insight generation.
- Strong track record of collaboration with cross-functional partners to deliver high-impact data solutions.
- Bachelor's degree in a quantitative field such as Data Science, Mathematics, Statistics, Computer Science, Information Systems, or a related discipline.