Unlock the secrets held by a data set and solve global challenges using IoT, machine learning, and artificial intelligence at Booz Allen
Requirements
- 5+ years of experience with data exploration, data cleaning, data analysis, data visualization, or data mining
- 5+ years of experience designing, developing, testing, and deploying Extract, Transform, Load (ETL) pipelines using Python and PySpark within the Databricks platform
- 5+ years of experience optimizing existing Databricks pipelines for performance, scalability, and data quality
- 5+ years of experience leveraging cloud-based data storage and processing services such as AWS S3, Azure Blob Storage, and Google Cloud Storage
- Experience developing interactive dashboards and reports using BI tools such as Tableau, Power BI, Qlik, or Excel, or Python visualization libraries such as Matplotlib, Seaborn, and Plotly
- Experience adhering to data governance, data privacy, and data security policies and procedures
- Experience with Palantir Foundry or the Maven Smart System
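The ETL experience described above can be illustrated with a minimal extract-transform-load sketch. This uses only the Python standard library rather than PySpark or Databricks, and the field names and sample data are hypothetical, chosen purely for illustration:

```python
# Minimal ETL sketch in plain Python (stdlib only). In practice this role
# calls for PySpark pipelines on Databricks; the schema here is invented.
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop rows with missing values and normalize types."""
    cleaned = []
    for row in rows:
        if not row["sensor_id"] or not row["reading"]:
            continue  # basic data-quality filter
        cleaned.append({"sensor_id": row["sensor_id"],
                        "reading": float(row["reading"])})
    return cleaned


def load(rows: list[dict]) -> str:
    """Load: serialize cleaned rows back to CSV (a stand-in for a warehouse write)."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["sensor_id", "reading"])
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()


raw = "sensor_id,reading\nA1,3.5\nA2,\nB7,4.25\n"
result = load(transform(extract(raw)))
```

In a real Databricks pipeline each stage would read from and write to cloud storage (e.g., S3 or Azure Blob Storage) via Spark DataFrames, but the extract/transform/load structure is the same.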
Responsibilities
- Create real-world impact using leadership skills and data science expertise
- Work closely with clients to understand their questions and needs
- Dig into data-rich environments to find the pieces of their information puzzle
- Guide teammates and lead the development of algorithms and systems
- Use the right combination of tools and frameworks to turn sets of disparate data points into objective answers
- Provide a deep understanding of the data, what it all means, and how it can be used
- Develop interactive dashboards and reports using BI tools such as Tableau, Power BI, Qlik, or Excel, or Python visualization libraries such as Matplotlib, Seaborn, and Plotly
Other
- Secret clearance
- Bachelor’s degree
- 5+ years of experience as a Data Scientist or Data Engineer
- Experience collaborating with cross-functional teams, including data scientists, business analysts, and product managers, to understand data requirements
- TS/SCI clearance