Sr. Database Engineer (W2 Only)

Overview

Remote
Depends on Experience
Contract - W2
Contract - 6 Month(s)

Skills

Data
SSIS
ADF
SQL
.NET
Java
Azure
Azure Data
Kafka
Terraform
CI/CD
Big Data
Data Lake
SNS
EHR
Healthcare

Job Details

Overview

We are seeking an experienced Sr. Database Engineer (contractor) to design, build, and optimize complex data systems. This senior-level contractor will work across multiple domains, including data architecture, pipeline development, and system operations.

Key Responsibilities

  • Design and implement scalable and reliable data architectures that support large-scale data processing, transformation, and analysis.
  • Develop, maintain, and optimize ETL/ELT pipelines using modern tools and frameworks to move and transform data from diverse sources (flat files, streaming systems, REST APIs, EHRs, etc.).
  • Build and support high-performance, cloud-based systems for real-time and batch processing (e.g., data lakes, warehouses, and mesh architectures).
  • Collaborate with stakeholders across engineering, data science, and product teams to gather requirements and deliver actionable data solutions.
  • Interface with Electronic Health Records (EHR) and healthcare data formats to ensure integration accuracy and compliance.
  • Own operational excellence for data systems including logging, monitoring, alerting, and incident response.
  • Utilize advanced programming skills (.NET, Java, or similar) and SQL to engineer robust data services.
  • Contribute to architecture frameworks and documentation to guide team standards and best practices.
  • Act as a subject matter expert (SME), mentoring junior engineers and promoting engineering excellence across the organization.

Qualifications

  • 7-10+ years of professional experience in data engineering, software development, or database systems.
  • Proven experience with SSIS and ADF.
  • Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
  • Expertise in SQL, database systems, and modern data processing tools and frameworks.
  • Strong proficiency in at least one programming language (.NET, Java, Python, etc.).
  • Demonstrated experience with modern cloud platforms (Azure, AWS, or Google Cloud Platform).
  • Familiarity with data streaming and queuing technologies (Kafka, SNS, RabbitMQ, etc.).
  • Understanding of CI/CD pipelines, infrastructure-as-code (Terraform), and containerized deployments (e.g., Kubernetes).
  • Comfortable with production system support, debugging, and performance optimization.
  • Strong problem-solving, communication, and collaboration skills.
  • High-level understanding of big data design patterns and architectural principles (e.g., data lake vs. warehouse vs. mesh).
  • Experience with RESTful APIs and integrating external data sources into internal systems.

About Apidel Technologies