Databricks Data Engineer--Remote with Quarterly travel to Gaithersburg, MD

Hybrid in Gaithersburg, MD, US • Posted 10 hours ago • Updated 10 hours ago
Contract Independent
Contract W2
Hybrid
Depends on Experience

Job Details

Skills

  • .NET
  • Accessibility
  • Agile
  • Amazon Web Services
  • Analytics
  • Apache Spark
  • Artificial Intelligence
  • Cloud Computing
  • Collaboration
  • Computer Science
  • Configuration Management
  • Continuous Delivery
  • Continuous Integration
  • Critical Thinking
  • Cross-functional Team
  • Data Analysis
  • Data Engineering
  • Data Governance
  • Data Integrity
  • Data Lake
  • Data Management
  • Data Quality
  • Data Science
  • Data Security
  • Data Storage
  • Databricks
  • Extract, Transform, Load (ETL)
  • Identity Management
  • Management
  • Metadata Management
  • Microsoft Azure
  • Optimization
  • Provisioning
  • Python
  • Regulatory Compliance
  • R
  • Storage
  • Streaming
  • Unstructured Data
  • Workflow

Summary


Title: Databricks Data Engineer
Location: Remote with quarterly travel to Gaithersburg, MD
Duration: 6 months, with possible extension

Note: The ideal candidate will be based in, or willing to relocate to, the Washington, DC or Indianapolis, IN area to support periodic on-site activities.

We are seeking a Databricks Data Engineer to develop and support new and existing data pipelines and data analytics environments in an Azure cloud-based data lake. As a data engineer, you will translate business requirements into data engineering solutions that support an enterprise-scale Microsoft Azure data analytics platform. You will support ongoing maintenance of ETL operations and development of new pipelines, ensuring data quality and sound data management. The ideal candidate brings deep expertise in Databricks, a solid foundation in advanced AI technologies, and the critical thinking to build innovative solutions and resolve technical issues through cross-functional team collaboration.

In this role, you will:
- Design, build, and optimize scalable data solutions using Databricks and Medallion Architecture.
- Manage ingestion routines for processing multi-terabyte datasets efficiently for multiple projects simultaneously, where each project may have multiple Databricks workspaces.
- Integrate data from various structured and unstructured sources to enable high-quality business insights, applying data analysis techniques to derive insights from large datasets.
- Implement effective data management strategies to ensure data integrity, availability, and accessibility. Identify opportunities for cost optimization in data storage, processing, and analytics operations.
- Monitor and support user requests, addressing platform or performance issues, cluster stability, Spark optimization, and configuration management.
- Collaborate with the team to enable advanced AI-driven analytics and data science workflows.
- Integrate with various Azure services including Azure Functions, Storage Services, Data Factory, Log Analytics, and User Management for seamless data workflows.
- Experience with the above Azure services is a plus.
- Provision and manage infrastructure using Infrastructure-as-Code (IaC).
- Apply best practices for data security, data governance, and compliance, ensuring support for federal regulations and public trust standards.
- Proactively collaborate with technical and non-technical teams to gather requirements and translate business needs into data solutions.

For this position, you must possess:
- BS degree in Computer Science or a related field with 3+ years of experience, or a Master's degree with 2+ years of experience
- 3+ years of experience designing and developing ingestion flows (structured, streaming, and unstructured data) using cloud platform services, with attention to data quality
- Databricks Data Engineer certification and 2+ years of experience maintaining Databricks platform and development in Spark
- Ability to work directly with clients and act as front-line support for incoming client requests; clearly document and express solutions in the form of architecture and interface diagrams
- Proficiency in Python, Spark, and R is essential; .NET-based development is a plus.
- Knowledge and experience with data governance, including metadata management, enterprise data catalog, design standards, data quality governance, and data security.
- Experience with Agile process methodology, CI/CD automation, and cloud-based developments (Azure, AWS).
- Not required, but a plus: additional education, certifications in Azure cloud, and/or knowledge of FinOps principles and cost management

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: RTX1614ae
  • Position Id: 8934163