Databricks Engineer

Overview

Remote
$110,000 - $120,000 per annum
Full Time

Skills

Databricks
Azure
AWS

Job Details

Job Title: Databricks Engineer - Remote
Location: Remote
Job Type: Full-Time
Experience Level: Mid to Senior (5+ years)

About the Role:

We are seeking a skilled and experienced Databricks Engineer with a strong background in both Azure and AWS cloud platforms. As a key member of our Data Engineering team, you will design, develop, and optimize scalable data solutions using Apache Spark on Databricks, enabling advanced analytics and data-driven decision-making across the organization.

This is a remote position, offering flexibility and the opportunity to work with a global, cross-functional team on exciting data initiatives.

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines and ETL processes using Databricks on Azure and AWS.
  • Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver optimal solutions.
  • Optimize performance of big data jobs using Apache Spark, and implement best practices in data engineering workflows.
  • Implement data quality checks, data governance, and monitoring processes to ensure reliability and compliance.
  • Leverage cloud-native services (e.g., Azure Data Lake, AWS S3, Azure Synapse, AWS Glue, Lambda) to support data engineering tasks.
  • Develop and maintain CI/CD pipelines for data workloads using tools like GitHub Actions, Azure DevOps, or AWS CodePipeline.
  • Ensure security and compliance of data infrastructure and adhere to company policies on data privacy and access control.
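As a purely illustrative sketch of the kind of Databricks pipeline work listed above (not part of the formal requirements), a minimal PySpark job might look like the following; the storage path, table, and column names are hypothetical placeholders.

# Illustrative only - a minimal PySpark ETL job of the kind described above.
# The path, table, and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, a session is already provided

# Read raw events from cloud object storage (e.g., an S3 or Azure Data Lake path).
raw = spark.read.json("s3://example-bucket/raw/events/")

# Basic cleansing and a simple data-quality check.
cleaned = (
    raw.dropDuplicates(["event_id"])
       .filter(F.col("event_id").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
)

# Persist as a Delta table, partitioned by date, for downstream analytics.
(cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("analytics.events_cleaned"))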

Required Qualifications:

  • 5+ years of experience in data engineering or software development with a focus on cloud platforms.
  • Proven expertise in Databricks, including hands-on experience with Apache Spark (Scala and/or PySpark).
  • Strong experience with Azure (e.g., Azure Data Lake, Azure Synapse, Azure Functions, Azure Key Vault).
  • Proficiency with AWS services related to data processing (e.g., S3, Glue, Athena, Lambda, Redshift).
  • Solid knowledge of SQL, data modeling, and data warehousing concepts.
  • Experience with version control and CI/CD tools (e.g., Git, GitHub, Azure DevOps).
  • Excellent problem-solving and communication skills, with the ability to work independently in a remote environment.

Preferred Qualifications:

  • Databricks certifications (e.g., Databricks Certified Data Engineer Associate/Professional).
  • Experience with orchestration tools like Apache Airflow, Azure Data Factory, or AWS Step Functions.
  • Familiarity with containerization (e.g., Docker) and infrastructure as code (e.g., Terraform, CloudFormation).
  • Understanding of data governance frameworks and tools like Unity Catalog, Purview, or AWS Glue Data Catalog.

What We Offer:

  • 100% remote work with flexible hours
  • Competitive salary and performance-based bonuses
  • Generous PTO and holidays
  • Health, dental, and vision insurance
  • Learning & development opportunities, including certification reimbursements
  • Collaborative and inclusive company culture

Oscar Associates Limited (US) is acting as an Employment Agency in relation to this vacancy.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy, and Dice believes it to correctly reflect the job opportunity.