Databricks Engineer with Snowflake - Contract-to-Hire or FTE - Remote - W2 Only

Overview

Remote
$60 - $80 per hour
Contract - W2
Contract - 6 Month(s)
10% Travel

Skills

Databricks
Python
Snowflake
ETL
SQL
Spark
Data Modeling

Job Details

Hello! I'm searching for an experienced Databricks Engineer for a contract-to-hire or FTE role based in Columbus, OH. This role is 100% remote. Experience with Snowflake, Python, Unity Catalog, SQL, Data Flows, ETL, and Data Modeling would be helpful in this role. This person must be a self-starter and a leader, and must be comfortable working with clients, both technical and non-technical. Good communication skills are a must. The hourly pay range for this position is $60-$80 (W2), based on experience, and a full-time hire is a possibility as well. Job details are below. If you are qualified and interested in learning more, please apply today!

This job is 100% remote. The closer your working hours are to Eastern Time, the better.

You must be able to work in the United States without sponsorship to be eligible for this job.

You must be able to pass a background check and drug screen to be eligible for this job.

Databricks Engineer
A Databricks Engineer builds, configures, and optimizes scalable, secure, and high-performance data solutions using the Databricks Lakehouse architecture. This role is hands-on, focusing on developing data pipelines, implementing governance and security controls, and ensuring optimal performance across the platform.

Key Responsibilities

  • Development & Implementation: Build and maintain Databricks-based solutions, including lakehouse architecture, ETL/ELT pipelines, and streaming/batch data processing workflows.
  • Performance Optimization: Configure clusters, tune Spark jobs, apply efficient partitioning strategies, and manage autoscaling to ensure cost and performance efficiency.
  • Data Governance & Security: Apply RBAC, Unity Catalog, data masking/encryption, and audit logging to meet compliance and security requirements.
  • Infrastructure Automation: Use Infrastructure-as-Code tools (Terraform, Bicep, ARM, CloudFormation) to automate Databricks environment setup and deployments.
  • Cloud Integration: Connect Databricks with Azure, AWS, or Google Cloud Platform services (e.g., Data Factory, Synapse, ADLS, S3, BigQuery).
  • Advanced Analytics Enablement: Support data science and analytics teams with MLflow, Databricks SQL, Feature Store, and AutoML integrations.
  • Collaboration: Work closely with data architects, analysts, and business stakeholders to implement solutions based on defined requirements and designs.
  • Documentation & Support: Maintain technical documentation, troubleshoot platform issues, and contribute to best practices across the engineering team.

Required Skills & Experience

  • 5+ years of experience working with enterprise cloud data platforms, including 2+ years of hands-on Databricks engineering work.
  • Strong knowledge of Apache Spark (PySpark), Delta Lake, and Databricks features like Unity Catalog and Workflows.
  • Proficiency in Python, SQL, and/or Scala for data modeling and transformation.
  • Experience with Azure, AWS, or Google Cloud Platform data services, including storage, networking, and security.
  • Familiarity with CI/CD pipelines and orchestration tools (Airflow, ADF, Databricks Workflows).
  • Strong problem-solving skills and ability to work in Agile environments.

Preferred Qualifications

  • Certifications such as Databricks Certified Data Engineer Professional, Azure Data Engineer, or AWS/Google Cloud Platform equivalents.
  • Exposure to data mesh principles, data products, or multi-region data deployments.
  • Experience in regulated industries with strict compliance and governance needs.