Data Engineer

Overview

Remote
$65 - $70 per hour
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 Month(s)
No Travel Required

Skills

Amazon Web Services
Analytics
Apache Spark
Business Intelligence
Cloud Computing
Collaboration
Conflict Resolution
Continuous Delivery
Continuous Integration
Data Integrity
Data Modeling
Data Processing
Data Quality
Data Security
Data Warehouse
Databricks
Decision-making
Extract, Transform, Load (ETL)
Google Cloud Platform
Management
Metadata Management
Microsoft Azure
Orchestration
Performance Tuning
Problem Solving
Python
Regulatory Compliance
SQL
Scalability
Workflow

Job Details

Title: Data Engineer
Location: Remote
Duration: 6+ months, extendable
Domain/Industry: Insurance (preferred)
Years of Experience: 6-10
Type: Contract

About the Role: We are looking for a highly skilled Data Engineer with strong Databricks expertise to design, build, and maintain robust data pipelines and architectures. The role involves ensuring data integrity, scalability, and compliance while enabling data-driven decision-making across the organization.

Key Responsibilities:

  • Develop and maintain scalable data pipelines and ETL processes to support diverse business and analytics requirements.
  • Design and optimize data architectures (including data lakes, data warehouses, and other modern data platforms).
  • Manage and enhance data ingestion processes and data cataloging to ensure discoverability and governance.
  • Build, monitor, and troubleshoot data workflows and pipeline performance to ensure reliability and efficiency.
  • Ensure data quality, integrity, security, and compliance with organizational and regulatory standards.
  • Collaborate with cross-functional teams (engineering, analytics, product, and business) to understand data needs and deliver effective solutions.
  • Leverage Databricks to manage large-scale data processing, transformations, and orchestration (a minimal sketch follows this list).
  • Continuously evaluate and implement improvements for performance optimization and scalability.
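
To make the responsibilities above concrete, the sketch below shows the general shape of the pipeline work described: ingest raw data, apply basic quality rules, and land a Delta table. It is an illustration only, not part of the posting; the paths, column names, and "claims" schema are hypothetical, and the delta format assumes a Databricks runtime (or the open-source delta-spark package).

# Minimal PySpark ETL sketch: extract raw claims data, clean it, and load
# a partitioned Delta table. All paths and columns are invented examples.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("claims-etl-sketch")
    .getOrCreate()
)

# Extract: read raw CSV files landed by an upstream ingestion job
# (hypothetical mount path).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/raw/claims/")
)

# Transform: basic data quality rules, i.e. drop rows missing the business
# key, normalize the event timestamp, and deduplicate on claim_id.
clean = (
    raw.filter(F.col("claim_id").isNotNull())
       .withColumn("claim_ts", F.to_timestamp("claim_ts"))
       .dropDuplicates(["claim_id"])
)

# Load: append to a Delta table partitioned by ingest date, so downstream
# consumers get ACID guarantees and efficient partition pruning.
(
    clean.withColumn("ingest_date", F.current_date())
         .write
         .format("delta")
         .mode("append")
         .partitionBy("ingest_date")
         .save("/mnt/curated/claims/")
)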


Required Skills & Qualifications:

  • Proven experience as a Data Engineer with a strong background in Databricks.
  • Hands-on experience with building and managing data pipelines and ingestion frameworks.
  • Expertise in ETL development, orchestration, and monitoring.
  • Strong knowledge of data cataloging, governance, and metadata management.
  • Proficiency in SQL, Python, and Spark (a toy example follows this list).
  • Experience with data lakes, warehouses, and modern cloud-based data platforms (Azure, AWS, or Google Cloud Platform).
  • Strong problem-solving skills with the ability to troubleshoot complex data workflow issues.
  • Knowledge of data security, compliance, and best practices.
  • Familiarity with BI/analytics tools and experience collaborating with stakeholders.
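
As a toy illustration of how the SQL, Python, and Spark skills listed above fit together, the snippet below builds a small in-memory DataFrame from Python and queries it with Spark SQL. The claims data is invented for the example.

# SQL + Python + Spark in one place: register a DataFrame as a temp view,
# then query it with SQL. Values are made up for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-python-spark-sketch").getOrCreate()

df = spark.createDataFrame(
    [("A-1001", 250.0), ("A-1002", 980.5), ("A-1003", 125.0)],
    ["claim_id", "amount"],
)
df.createOrReplaceTempView("claims")

# Aggregate with plain SQL from Python.
totals = spark.sql("SELECT COUNT(*) AS n, SUM(amount) AS total FROM claims")
totals.show()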


Preferred Qualifications:

  • Experience with Delta Lake, Lakehouse architectures, and data modeling.
  • Exposure to orchestration tools such as Airflow or Azure Data Factory (a minimal DAG sketch follows this list).
  • Knowledge of CI/CD practices for data pipelines.
  • Relevant certifications in Databricks, Azure, AWS, or Google Cloud Platform.
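
For the orchestration exposure mentioned above, here is a minimal Airflow DAG sketch that triggers an existing Databricks job once a day. The job ID and connection name are hypothetical; the operator comes from the apache-airflow-providers-databricks package, and the schedule argument assumes Airflow 2.4 or later.

# Minimal Airflow DAG: run a pre-defined Databricks job daily.
# job_id and the connection name are placeholder values.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksRunNowOperator,
)

with DAG(
    dag_id="claims_pipeline_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Trigger an existing Databricks job via the Databricks connection
    # configured in Airflow (hypothetical job_id).
    run_etl = DatabricksRunNowOperator(
        task_id="run_claims_etl",
        job_id=12345,
        databricks_conn_id="databricks_default",
    )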