Takeda's Biologics Process Development team is seeking to improve the efficiency and effectiveness of its Research and Development (R&D) operations by implementing advanced digital solutions, specifically in support of the Digitalization Initiatives Project.
Requirements
- Proficient in Python for data manipulation and analysis.
- Strong experience with SQL for database management and querying.
- Familiarity with AWS services (e.g., S3, Lambda, RDS).
- Experience with Tableau for dashboards and visualizations.
- Basic knowledge of HTML and CSS for interface integration.
- Experience with FastAPI (or similar frameworks) for building APIs.
- Familiarity with Big Data technologies (e.g., Hadoop, Spark, Databricks).
Responsibilities
- Collaborate with cross-functional teams to develop and optimize data pipelines that support real-time data integration and analysis.
- Work with stakeholders to translate data requirements into technical specifications and workflows aligned with project goals.
- Support workflow optimization by enhancing data management and analysis processes, contributing to improved accuracy and decision-making.
- Ensure data-related processes comply with organizational standards, regulations, and best practices.
- Create and maintain comprehensive documentation for data processes, pipelines, and workflows, providing stakeholders with regular project updates.
Other
- Must be currently pursuing a Master’s or Doctoral degree in Computer Science, Data Science, Information Technology, or a related field.
- Must be authorized to work in the U.S. on a permanent basis without requiring sponsorship.
- Must be currently enrolled in a degree program with an expected graduation date of December 2026 or later.
- Able to work full-time (40 hours per week) during the internship dates.
- Highly reliable, detail-oriented, adaptable, and a strong team player.