Databricks Engineer (Databricks Certification Required)

Overview

Remote
$100,000 - $120,000
Full Time

Skills

Databricks
ETL
AWS
Azure
Airflow
DevOps
CI/CD
Terraform
API
SQL
Python
PySpark
Splunk

Job Details

Hi,

We have an urgent requirement for our direct client; please go through the job description below. If you are interested, please send your updated resume in Word format to and reach me @ .

Title: Databricks Engineer (Databricks Certification Required)

Location: Remote

Duration: Full Time

Role & Responsibilities Overview:

  • Develop and optimize ETL pipelines from various data sources using Databricks in the cloud (AWS, Azure, etc.)
  • Implement standardized pipelines with automated testing, Airflow scheduling, Azure DevOps for CI/CD, Terraform for infrastructure as code, and Splunk for monitoring
  • Continuously improve systems through performance enhancements and cost reductions in compute and storage
  • Data Processing and API Integration: Utilize Spark Structured Streaming for real-time data processing and integrate data outputs with REST APIs
  • Lead Data Engineering Projects to manage and implement data-driven communication systems
  • Apply Scrum and Agile methodologies to coordinate global delivery teams, run scrum ceremonies, manage backlog items, and handle escalations
  • Integrate data across different systems and platforms
  • Use strong verbal and written communication skills to manage client discussions

Candidate Profile:

  • 8+ years of experience developing and implementing ETL pipelines from various data sources using Databricks in the cloud
  • Based in the US
  • Experience in the insurance domain/data is a must
  • Programming languages: SQL, Python
  • Technologies: IaaS (AWS, Azure, or Google Cloud Platform), Databricks platform, Delta Lake storage, Spark (PySpark, Spark SQL)
  • Good to have: Airflow, Splunk, Kubernetes, Power BI, Git, Azure DevOps