Overview
Hybrid
Depends on Experience
Contract - Independent
Contract - W2
Skills
API
Amazon Lambda
Amazon RDS
Amazon S3
Amazon SQS
Amazon Web Services
Analytical Skill
Augmented Reality
Cloud Computing
Collaboration
Computerized System Validation
Continuous Delivery
Continuous Improvement
Data Modeling
Data Wrangling
Estimating
Extract, Transform, Load (ETL)
Git
GitHub
JSON
Kerberos
Management
Metrology
Microsoft Visual Studio
Pandas
Pharmaceutics
Python
Relational Databases
Remote Desktop Services
Research
Research and Development
SQL
Science
System Testing
Testing
Unit Testing
Version Control
Workflow
YAML
Job Details
- Establishing data workflows for predictive tools to enable more effective identification, characterization, and development of novel medicines and vaccines is a key objective for the client. This position sits within the Digital Sciences team in the Analytical Enabling Capabilities sub-department of Analytical Research & Development. You will be part of a team working collaboratively across a wide range of areas impacting all aspects of the drug discovery and development pipeline. A diverse array of projects, spanning data workflows, instrument metrology, and predictive sciences, ensures this Digital Sciences team helps enable work across all drug modalities including small molecule, peptide, biologics, vaccines, and beyond. The core Digital Sciences team works with a networked group of digital champions across AR&D and has close connectivity to other digital/data-facing teams across Research Laboratories, including critical IT collaborators.
- Must have/required skills:
- AWS cloud services (Lambda functions, S3, CloudFormation templates, RDS, ECR)
- Development of ETL Processes / Data Workflows / Data Pipelines / Data Wrangling / Data Ingestion.
- Python 3.9+ software development
- Python packages - Boto3, Pandas, pyodbc, openpyxl
- Python virtual environments - conda
- IDEs - Visual Studio Code or PyCharm
- Software design, development, and testing (unit testing and system testing)
- Version control - Git, GitHub
- CI/CD - GitHub Actions
- Databases - relational databases, SQL, data modeling and design
- File formats (XLSX, YAML, JSON, CSV, TSV)
- Excellent verbal and written communication skills.
- Ability to work independently and collaborate as part of a team.
- Strive for continuous improvement and suggest innovative solutions to scientists' common challenges related to data workflows.
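The required skills above center on building ETL processes in Python with Pandas. As a minimal sketch only (the column names and naming conventions here are hypothetical, not from the posting), an extract/transform/load workflow might look like this, with the Boto3 S3 upload shown as a commented step since any bucket name would be an assumption:

```python
import io

import pandas as pd


def extract(csv_text: str) -> pd.DataFrame:
    """Extract: read raw instrument data from CSV text (e.g. fetched from S3)."""
    return pd.read_csv(io.StringIO(csv_text))


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Transform: normalize column names, drop incomplete rows, enforce types."""
    df = df.rename(columns=str.lower).dropna()
    df["concentration_um"] = df["concentration_um"].astype(float)
    return df


def load(df: pd.DataFrame) -> bytes:
    """Load: serialize to CSV bytes, ready for upload to a data store."""
    body = df.to_csv(index=False).encode("utf-8")
    # With Boto3 (bucket and key are hypothetical):
    # boto3.client("s3").put_object(Bucket="my-bucket", Key="out.csv", Body=body)
    return body


# Example run: row B is dropped because its concentration is missing.
raw = "Sample,Concentration_uM\nA,1.5\nB,\nC,2.0\n"
result = load(transform(extract(raw)))
```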
- Nice to have/preferred experiences and skills:
- AWS cloud services (SQS, dead-letter queues (DLQ), SNS, EventBridge, API Gateway)
- Python packages (Cerberus, PyYAML, logging)
- Python linters and type hints; regular expressions
- Experience with data pipeline tools such as Dataiku or Trifacta
- Experience in an IT role within the pharmaceutical research sector
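Several of the preferred skills (type hints, regular expressions, the `logging` package) come together naturally in data-ingestion code. As an illustrative sketch only, assuming a hypothetical file-naming convention not specified in the posting:

```python
import logging
import re
from typing import Dict, Optional

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Hypothetical naming convention: <study>_<instrument>_<YYYYMMDD>.csv
FILENAME_PATTERN = re.compile(
    r"^(?P<study>[A-Za-z0-9]+)_(?P<instrument>[A-Za-z0-9-]+)_(?P<date>\d{8})\.csv$"
)


def parse_filename(name: str) -> Optional[Dict[str, str]]:
    """Return the parsed fields, or None (with a warning logged) if the name is invalid."""
    match = FILENAME_PATTERN.match(name)
    if match is None:
        logger.warning("Rejected file name: %s", name)
        return None
    return match.groupdict()
```

The `typing.Optional`/`typing.Dict` spellings keep the example compatible with Python 3.9, the minimum version the posting requires.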
- Responsibilities / Day-to-Day:
- Design and develop data workflows/data pipelines in Python.
- Meet with business clients/SMEs to gather requirements.
- Work with IT to implement data workflows.
- Manage projects and timelines.
- Estimate the duration of work.
- Participate in daily standup meetings.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.