Daimler Truck North America (DTNA) is looking to solve business issues with data sets by developing, constructing, testing, and maintaining data architectures that align with business requirements. The team aims to leverage data to drive business solutions and to improve data reliability, efficiency, and quality.
Requirements
- Azure Data Explorer and Azure Data Factory.
- Creating ETL and ELT data pipelines.
- Statistical analysis using SQL, KQL, and Python.
- Blending relational and time-series data sources for analysis.
- Developing quality checks for incoming data packets.
- Statistical modeling and methods.
- Team leadership: managing work across a small team of engineers.
- Developing end-to-end data modeling on live data.
- Deploying and managing large cloud compute workloads for optimal speed and cost efficiency.
- Spark and related technologies.
- Major cloud analytics platforms: AWS, Google Cloud, or Azure.
- Azure ADLS or AWS S3.
- Databricks or Jupyter.
- Python.
- Azure Functions and Logic Apps.
- Telematics IoT.
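As a minimal sketch of the kind of work listed above (blending relational and time-series sources and applying quality checks to incoming packets), the Python snippet below uses pandas; the column names, thresholds, and sample values are illustrative assumptions, not details from DTNA systems.

```python
# Sketch: blend a relational reference table with time-series telemetry
# and apply a basic quality check. All names and values are hypothetical.
import pandas as pd

# Relational source: one row per vehicle (e.g., exported from a SQL table).
vehicles = pd.DataFrame({
    "vehicle_id": ["T100", "T200"],
    "model": ["Cascadia", "eCascadia"],
})

# Time-series source: telemetry packets keyed by vehicle and timestamp.
telemetry = pd.DataFrame({
    "vehicle_id": ["T100", "T100", "T200"],
    "ts": pd.to_datetime(["2024-01-01 08:00", "2024-01-01 08:05", "2024-01-01 08:03"]),
    "engine_temp_c": [82.5, 91.0, None],
})

# Quality check: flag packets with missing or out-of-range readings.
telemetry["valid"] = telemetry["engine_temp_c"].between(-40, 150)

# Blend: enrich time-series packets with relational attributes.
blended = telemetry.merge(vehicles, on="vehicle_id", how="left")

# Simple aggregate for downstream analysis.
summary = (
    blended[blended["valid"]]
    .groupby("model")["engine_temp_c"]
    .agg(["mean", "count"])
)
print(summary)
```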
Responsibilities
- Develop, construct, test, and maintain architectures and align them with business requirements.
- Develop and maintain data pipelines in support of existing and new data requirements.
- Develop data set processes and address business issues with data sets.
- Identify ways to improve data reliability, efficiency and quality.
- Use data to discover tasks that can be automated.
- Develop methods and metrics that provide insights to product teams and business owners.
- Undertake cleansing, profiling, validation and aggregation of structured and unstructured data.
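As a hedged sketch of the cleansing, profiling, validation, and aggregation work listed above, the Python snippet below profiles and validates a small hypothetical data set; the field names, rules, and sample records are illustrative assumptions only.

```python
# Sketch: profile, validate, and aggregate a small structured data set.
# Field names, thresholds, and records are illustrative assumptions.
import pandas as pd

records = pd.DataFrame({
    "trip_id": [1, 2, 3, 4],
    "odometer_km": [120.5, None, 98.0, -5.0],   # includes a missing and an invalid value
    "driver": ["A", "B", "B", None],
})

# Profiling: per-column null rates and data types.
profile = pd.DataFrame({
    "null_rate": records.isna().mean(),
    "dtype": records.dtypes.astype(str),
})
print(profile)

# Validation: drop rows that fail simple rules, keep the rest.
valid = records.dropna(subset=["odometer_km", "driver"])
valid = valid[valid["odometer_km"] >= 0]

# Aggregation: total distance per driver from the cleansed rows.
print(valid.groupby("driver")["odometer_km"].sum())
```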
Other
- Hybrid (4 days per week in-office / 1 day remote).
- Work within an Agile team and a dynamic, fast-moving, collaborative environment.
- Communicate status to all stakeholders.
- Keep current on the latest data analytics technologies and products, including hands-on evaluations and in-depth research.
- Applicants must be legally authorized to work permanently in the country the position is located in at the time of application.