Meta Platforms, Inc. (Meta) seeks to build the next generation of data tools that generate business insights for a product, and to design, architect, and develop software and data solutions that help product and business teams make data-driven decisions.
Requirements
- Data ETL (Extract, Transform, Load) design, implementation, and maintenance on a large scale
- Programming in Python, Perl, Java, or PHP
- Internet technologies: HTTP, HTML, CSS, or JavaScript
- Writing SQL statements
- Analyzing large volumes of data to provide data-driven insights, gaps, and inconsistencies
- Data warehousing architecture and plans
- Informatica, Talend, Pentaho, dimensional data modeling, or schema design
Responsibilities
- Design, build, and launch data pipelines to move data across systems and build the next generation of data tools that generate business insights for a product.
- Analyze user needs and software requirements to determine workability and to offer support for end users on data usage.
- Design, architect, and develop software and data solutions that help product and business teams make data-driven decisions.
- Rethink and influence strategy and roadmap for building efficient data solutions and scalable data warehouse plans.
- Design, develop, test, and launch new data models and processes into production, and provide support.
- Leverage homegrown extract, transform, and load (ETL) framework as well as off-the-shelf ETL tools, as appropriate.
- Interface closely with data infrastructure, product, and engineering teams to build and extend a cross-platform ETL and report-generation framework.
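As an illustration of the extract-transform-load work described above, here is a minimal sketch in Python; all table names, column names, and sample data are hypothetical and stand in for any real Meta pipeline:

```python
# Minimal ETL sketch: extract rows from CSV text, transform them into
# per-country revenue aggregates, and load the result into an
# in-memory SQLite table (standing in for a warehouse table).
import csv
import io
import sqlite3

# Hypothetical source data.
RAW_CSV = """user_id,country,revenue
1,US,10.50
2,BR,3.25
3,US,7.00
"""

def extract(text):
    """Extract: parse CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast revenue to float and aggregate per country."""
    totals = {}
    for row in rows:
        totals[row["country"]] = totals.get(row["country"], 0.0) + float(row["revenue"])
    return totals

def load(totals, conn):
    """Load: write the aggregates into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS revenue_by_country "
        "(country TEXT PRIMARY KEY, revenue REAL)"
    )
    conn.executemany(
        "INSERT INTO revenue_by_country (country, revenue) VALUES (?, ?)",
        totals.items(),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
result = dict(conn.execute("SELECT country, revenue FROM revenue_by_country"))
# result == {"US": 17.5, "BR": 3.25}
```

In a production setting each stage would run against real sources and a real warehouse, but the extract/transform/load separation shown here is the structure the role's pipeline work follows.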
Other
- Five years of progressive, post-baccalaureate work experience in the job offered or in a computer-related occupation
- Requires two years of experience in each of the following:
- MapReduce or an MPP system
- LAMP stack development, AND
- Hadoop, HBase, or Hive