Data Integration Engineer / Industry (Biotech, Pharma, Healthcare)

Overview

On Site
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)

Skills

Data Integration
GxP
FTP
SFTP
S3
APIs
SaaS
Informatica
ETL
Data Factory
Databricks
Terraform
AWS
GCP
Data Governance
Compliance
Redshift
data pipeline
Airflow
Biotech
Pharma
Healthcare

Job Details

Senior Data Integration Engineer

Location: Vacaville, CA

Duration: 6-12 Months+

Company: A leading organization in the Biotech, Pharma, or Healthcare IT sector.

About the Role:

We are seeking a highly skilled and analytical Senior Data Integration Engineer to play a pivotal role in architecting and delivering high-impact data solutions. This position is ideal for a professional who excels at bridging technical execution with strategic project leadership. You will be responsible for leading the design, migration, and optimization of our data integration pipelines, ensuring they are scalable, reliable, and compliant with industry standards (e.g., GxP). Your ability to communicate effectively with technical teams, business stakeholders, and external partners will be key to our success.

Key Responsibilities:

Lead End-to-End Data Integration: Architect, design, and build robust data pipelines to ingest and process data from diverse sources (e.g., FTP, SFTP, S3, APIs, SaaS platforms) into analytics-ready formats.

Drive Cloud Migration & Modernization: Spearhead the migration from legacy ETL tools (e.g., Informatica) to modern cloud-based data stacks on Azure (Data Factory, Databricks, Terraform) or AWS/Google Cloud Platform, implementing best-practice architectures like the Medallion architecture.

Ensure Data Governance & Compliance: Develop and maintain GxP-compliant data integration solutions within the existing data estate, ensuring data integrity, security, and adherence to regulatory requirements.

Optimize Data Operations: Build and maintain proactive monitoring, alerting, and CI/CD processes for data pipelines to minimize downtime and enhance performance and reliability.

Collaborate Cross-Functionally: Act as a liaison between Data Engineering, Product, Finance, and IT teams to translate complex business requirements into technical specifications and deliver effective data warehousing and reporting solutions.

Required Qualifications & Skills:

Proven professional experience in data engineering, data integration, or a similar role, with a track record of progressing into technical leadership.

Expertise in SQL and hands-on experience with Python for data processing and automation.

Deep hands-on experience with cloud data platforms, preferably Microsoft Azure (Data Factory, Databricks, DevOps).

Experience with AWS (S3, Redshift) or Google Cloud Platform (BigQuery) is also valuable.

Experience with modern data pipeline and workflow orchestration tools such as Airflow.

Solid understanding of end-to-end SDLC, including UAT, cut-over, and go-live activities.

Demonstrated experience leading projects and mentoring or managing offshore teams.

Excellent communication skills with the ability to articulate technical concepts to non-technical stakeholders.

Experience in a regulated industry (Biotech, Pharma, Healthcare) is a significant plus.
