Azure Databricks Data Engineer

Overview

Remote
$60 - $100 per hour
Contract - Independent
Contract - 6 Months
No Travel Required

Skills

Expert-level Azure Databricks
Azure Cloud Platform
Data Engineering
Programming
Data Integration
Streaming & Batch
Security & Compliance

Job Details

Azure Databricks Data Engineer

Position Overview

Contract Duration: 6 months
Location: Remote or Hybrid (Kansas City area)
Rate: Hourly (contract position)
Work Authorization: Must be legally authorized to work in the US without sponsorship

About the Project

Join CFA's initiative to extend our industry-renowned lending platform through Azure Databricks-powered data integration. You'll architect, and build integrations into, a centralized data lake that unifies disparate systems (Salesforce, loan servicing platforms, document management, DocuSign, Conga) into a single source of truth for agricultural finance operations.

Core Responsibilities

  • Design and implement Azure Databricks data lake architecture integrating multiple enterprise data sources
  • Build real-time streaming and batch data processing pipelines for Salesforce, loan servicing, and document management systems (see the ingestion sketch after this list)
  • Develop data quality, validation, and cleansing processes to ensure data integrity across integrated systems
  • Create analytics-ready data structures optimized for business intelligence and operational reporting
  • Implement data governance, security controls, and compliance measures for SOC 2 Type II requirements
  • Collaborate with API/service layer team to expose unified data through REST and GraphQL endpoints
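
To give candidates a concrete sense of the pipeline work described above, here is a minimal, hypothetical sketch: Databricks Auto Loader streaming newly landed loan-servicing files into a bronze Delta table with PySpark. The storage paths, container names, and table name are illustrative placeholders, not actual project values.

```python
# Hypothetical sketch only: stream newly landed loan-servicing files from Blob Storage
# into a bronze Delta table with Databricks Auto Loader. All paths, containers, and
# table names below are illustrative placeholders, not actual project values.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

raw_path = "abfss://raw@examplelake.dfs.core.windows.net/loan_servicing/"        # assumed landing zone
checkpoint = "abfss://meta@examplelake.dfs.core.windows.net/checkpoints/loans"   # assumed checkpoint/schema store

events = (
    spark.readStream.format("cloudFiles")                  # Auto Loader: incremental file discovery
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", checkpoint)       # lets Auto Loader infer and track the schema
    .load(raw_path)
    .withColumn("_ingested_at", F.current_timestamp())     # audit column for lineage
)

(
    events.writeStream.format("delta")
    .option("checkpointLocation", checkpoint)
    .trigger(availableNow=True)                            # process the backlog, then stop; remove for continuous streaming
    .toTable("bronze.loan_servicing_events")               # assumed bronze-layer Delta table
)
```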

Required Technical Skills

  • Expert-level Azure Databricks experience with Delta Lake, Spark SQL, and PySpark (see the merge sketch after this list)
  • Azure Cloud Platform: Data Factory, Event Hubs, Blob Storage, Key Vault, Azure AD integration
  • Data Engineering: ETL/ELT pipeline design, data modeling (dimensional/star schema), data warehouse patterns
  • Programming: Python (primary), SQL, Scala (a plus)
  • Data Integration: Experience with Salesforce APIs, REST/SOAP integrations, document system connectors
  • Streaming & Batch: Real-time data ingestion patterns, scheduled synchronization strategies
  • Security & Compliance: Encryption at rest/in transit (AES-256, TLS 1.3), RBAC, data classification

Preferred Qualifications

  • Experience integrating loan servicing or financial systems data
  • Knowledge of document management systems (DocuSign, Conga) and email archival
  • Familiarity with GraphQL data exposure patterns
  • Azure certifications (Data Engineer Associate) or Databricks certifications
  • Agricultural finance or fintech domain experience
  • Experience with SOC 2 or similar compliance frameworks

Technical Environment

  • Cloud: Microsoft Azure (Databricks, Data Factory, Blob Storage, Key Vault)
  • Tools: Git/GitHub, Azure DevOps/GitHub Actions, Docker, Azure CLI
  • Data Sources: Salesforce, NLS loan servicing, email archives, DocuSign, Conga
  • Backend Stack: Node.js/Python APIs, PostgreSQL, Redis
  • Frontend Stack: React 18+, TypeScript (awareness helpful for API integration)

Success Criteria

  • Operational data lake with all source systems integrated and synchronized
  • Data quality processes achieving >95% accuracy and completeness (see the check after this list)
  • Real-time pipelines with <5 minute latency for critical data streams
  • Documentation enabling knowledge transfer and long-term maintainability
  • Security controls meeting SOC 2 Type II requirements
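
As a rough illustration of how the completeness target above might be verified, here is a hypothetical PySpark check; the table name and critical-column list are placeholders rather than the project's actual data-quality framework.

```python
# Hypothetical sketch only: fail a pipeline run if completeness of critical columns
# drops below the 95% target. Table and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

df = spark.table("silver.sf_account")                       # assumed curated table
critical_columns = ["Id", "Name", "LoanOfficerId"]          # assumed critical fields

total = max(df.count(), 1)                                  # guard against an empty table
completeness = {
    col: df.filter(F.col(col).isNotNull()).count() / total  # share of non-null values per column
    for col in critical_columns
}

failures = {col: round(pct, 3) for col, pct in completeness.items() if pct < 0.95}
if failures:
    raise ValueError(f"Completeness below 95% for: {failures}")
```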

How to Apply

Submit your resume along with your Git portfolio or examples of Azure Databricks projects showing multi-source data integration, real-time and batch pipeline implementations at scale, and any security/compliance work (SOC 2, GDPR, etc.) you want to showcase.

CFA is an Equal Opportunity Employer
