Token Metrics is seeking a Senior Big Data Engineer to support its Data Science and Engineering teams by building frameworks for data preparation and analysis using a variety of tools and techniques.
Requirements
- 3+ years of development experience in Python, Java, or a comparable programming language
- 3+ years of SQL and NoSQL experience (Snowflake cloud data warehouse and MongoDB experience a plus)
- 3+ years of experience with schema design and dimensional data modeling
- Expert proficiency in SQL, NoSQL, Python, C++, Java, and R
- Expertise in building data lakes, data warehouses, or equivalent systems
- Expertise with the AWS cloud platform
Responsibilities
- Employing various tools and techniques to construct frameworks that prepare data using SQL, Python, R, Java, and C++.
- Applying machine learning techniques to create and maintain structures for data analysis, while staying current with the field's dominant programming and deployment practices.
- Designing and building infrastructure that makes big data accessible for analysis.
- Refactoring existing frameworks to improve their performance.
- Testing these systems to ensure they are fit for purpose.
- Building data pipelines that ingest data in diverse formats (API responses, CSV, JSON, etc.) from multiple sources.
- Preparing raw data for manipulation by Data Scientists.
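As a rough illustration of the pipeline work described above, the sketch below merges records from three source formats into one uniform schema. It is a minimal example, not part of the role description; the payloads are hypothetical in-memory stand-ins for an API response, a CSV export, and a JSON feed (in practice these would come from HTTP calls or cloud storage).

```python
import csv
import io
import json

# Hypothetical raw inputs standing in for three real sources.
API_PAYLOAD = '[{"symbol": "BTC", "price": 67000.0}]'
CSV_EXPORT = "symbol,price\nETH,3500.0\n"
JSON_FEED = '{"symbol": "SOL", "price": 150.0}'


def from_api(payload: str) -> list[dict]:
    """Parse a JSON array as returned by an (assumed) REST endpoint."""
    return json.loads(payload)


def from_csv(text: str) -> list[dict]:
    """Parse CSV rows into dicts, coercing price strings to floats."""
    rows = csv.DictReader(io.StringIO(text))
    return [{"symbol": r["symbol"], "price": float(r["price"])} for r in rows]


def from_json(text: str) -> list[dict]:
    """Parse a single JSON record into a one-element list."""
    return [json.loads(text)]


def build_pipeline() -> list[dict]:
    """Combine all sources and normalize every record to one schema."""
    records = from_api(API_PAYLOAD) + from_csv(CSV_EXPORT) + from_json(JSON_FEED)
    return [{"symbol": r["symbol"], "price": float(r["price"])} for r in records]


print(build_pipeline())
```

The normalization step at the end is the part Data Scientists typically depend on: regardless of source format, downstream code sees one consistent record shape.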
Other
- Collaborating with coworkers to ensure your approach meets the needs of each project.
- Liaising with coworkers and clients to clarify the requirements of each task.
- Ensuring that your work remains backed up and readily accessible to relevant coworkers.
- Remaining up to date with industry standards and technological advancements that improve the quality of your outputs.
- Excellent analytical and problem-solving skills.
- Ability to work both independently and as part of a team.
- Capacity to manage a pipeline of tasks successfully with minimal supervision.