Base pay range
$80.00 / hr - $95.00 / hr
Overview
Department information / introduction: Establishing data workflows for predictive tools, to enable more effective identification, characterization, and development of Client medicines and vaccines, is a key objective for - . This position sits within the Digital Sciences team in the Analytical Enabling Capabilities sub-department of Analytical Research & Development (AR&D). You will be part of a team working collaboratively across a wide range of areas impacting all aspects of the drug discovery and development pipeline. A diverse array of projects, spanning data workflows, instrument metrology, and predictive sciences, ensures that the Digital Sciences team helps enable work across all drug modalities, including small molecule, peptide, biologics, vaccines, and beyond. The core Digital Sciences team works with a networked group of digital champions across AR&D and has close connectivity to other digital / data-facing teams across - Research Laboratories, including critical IT collaborators.
Responsibilities
Design and development of data workflows / data pipelines in Python
Meet with business clients / SMEs to gather requirements
Work with IT to implement data workflows
Manage projects and timelines
Estimate the duration of work
Participate in daily standup meetings
Present updates to collaborators
Qualifications
Sr. Data Engineer
Education: A degree in computer science or a related field, or a degree in a chemistry discipline with strong programming capabilities
Experience: 7-8 years of relevant experience
Must-have / Required skills
Cloud Services – AWS (Lambda functions, S3, CloudFormation templates, RDS, ECR)
Development of ETL processes / data workflows / data pipelines / data wrangling / data ingestion (a minimal sketch follows this list)
Python 3.9+ software development
Python packages - Boto3, Pandas, pyodbc, openpyxl
Python virtual environments - conda
IDEs - Visual Studio Code or PyCharm
Software design, development, and testing (unit testing and system testing)
Version control - Git, GitHub
CI/CD - GitHub Actions
Databases - relational databases, SQL, data modeling and design
File formats - XLSX, YAML, JSON, CSV, TSV
Excellent verbal and written communication skills
Ability to work independently and to collaborate as part of a team
Strive for continuous improvement and suggest innovative solutions to scientists' common challenges related to data workflows
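For illustration only, here is a minimal sketch of the kind of ETL step these requirements describe: an S3-triggered AWS Lambda function that ingests an uploaded XLSX file with Boto3, pandas, and openpyxl, then writes a cleaned CSV back to S3. The bucket name, output key convention, and column normalization are assumptions made for the example, not details taken from this posting.

import io
import logging
import urllib.parse

import boto3
import pandas as pd

logger = logging.getLogger()
logger.setLevel(logging.INFO)

s3 = boto3.client("s3")

# Hypothetical destination bucket for processed output; not specified in the posting.
OUTPUT_BUCKET = "example-processed-data"


def lambda_handler(event, context):
    """Triggered by an S3 ObjectCreated event; converts an uploaded XLSX file to CSV."""
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    # Object keys in S3 event notifications arrive URL-encoded.
    key = urllib.parse.unquote_plus(record["object"]["key"])
    logger.info("Ingesting s3://%s/%s", bucket, key)

    # Read the raw workbook bytes and load the first sheet with pandas + openpyxl.
    raw = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    df = pd.read_excel(io.BytesIO(raw), engine="openpyxl")

    # Light wrangling step: normalize column names so downstream SQL loads are predictable.
    df.columns = [str(c).strip().lower().replace(" ", "_") for c in df.columns]

    # Write the result back out as CSV for the next stage of the pipeline.
    out_key = key.rsplit(".", 1)[0] + ".csv"
    s3.put_object(Bucket=OUTPUT_BUCKET, Key=out_key, Body=df.to_csv(index=False).encode("utf-8"))
    return {"rows": len(df), "output_key": out_key}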
Nice to have / Preferred experiences and skills
Cloud Services – AWS (SQS, DLQ, SNS, EventBridge, API Gateway)
Python packages - Cerberus, PyYAML, logging (see the configuration-validation sketch after this list)
Python linters and type hints; regular expressions
Experience with data pipeline tools such as Dataiku or Trifacta
Experience in an IT role within the pharmaceutical research sector
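As a hedged illustration of the preferred Cerberus / PyYAML / logging skills, the sketch below loads a small YAML workflow configuration and validates it against a Cerberus schema before a pipeline run. The configuration fields and schema are hypothetical; a real workflow would define its own.

import logging

import yaml                      # PyYAML
from cerberus import Validator   # Cerberus

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("workflow_config")

# Hypothetical workflow configuration; the real fields would come from the team's pipelines.
CONFIG_YAML = """
workflow: assay-results-ingest
source_bucket: example-raw-data
file_format: xlsx
retries: 3
"""

SCHEMA = {
    "workflow": {"type": "string", "required": True},
    "source_bucket": {"type": "string", "required": True},
    "file_format": {"type": "string", "allowed": ["xlsx", "csv", "tsv", "json"]},
    "retries": {"type": "integer", "min": 0, "max": 10},
}


def load_config(text: str) -> dict:
    """Parse a YAML workflow config and validate it against the schema."""
    config = yaml.safe_load(text)
    validator = Validator(SCHEMA)
    if not validator.validate(config):
        # Surface every schema violation before failing the pipeline run.
        logger.error("Invalid workflow config: %s", validator.errors)
        raise ValueError(validator.errors)
    logger.info("Loaded workflow %r", config["workflow"])
    return config


if __name__ == "__main__":
    print(load_config(CONFIG_YAML))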
Notes
Location: Required to be onsite at least 2-3 days per week at the West Point, PA site
Positions available: 2
This is not typical IT work
Someone who can work with scientists to understand the data generated from their experiments and help automate the electronic lab notebook
Someone who has expertise generating scientific data and can work with analytical or genomics data
Someone who has expertise building data pipelines
Seniority level
Mid-Senior level
Employment type
Full-time
Job function
Other
Industries
IT Services and IT Consulting