Databricks Developer

  • New York City, NY
  • Posted 13 hours ago | Updated 12 hours ago

Overview

Work arrangement: Hybrid
Compensation: Depends on Experience
Employment type: Contract - W2
Contract length: 12 Month(s)

Skills

Databricks
Apache Spark
Python
Scala
Hadoop
Hive
Kafka

Job Details

Job Role: Databricks Developer

Job Location: New York City, NY / Toronto, Canada

Duration: 12+ Months

Your future duties and responsibilities:

The Databricks Developer will be responsible for designing, developing, and maintaining scalable data pipelines and solutions using Databricks. The ideal candidate will have a strong background in data engineering, experience with big data technologies, and a deep understanding of Databricks and Apache Spark.

Key Responsibilities:

  • Design and develop scalable data pipelines and ETL processes using Databricks and Apache Spark.
  • Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Optimize and tune data pipelines for performance and scalability.
  • Implement data quality checks and validations to ensure data accuracy and consistency.
  • Monitor and troubleshoot data pipelines to ensure reliable and timely data delivery.
  • Develop and maintain documentation for data pipelines, processes, and solutions.
  • Implement best practices for data security, governance, and compliance.
  • Participate in code reviews and contribute to the continuous improvement of the codebase and development practices.

Required qualifications to be successful in this role:

  • Strong experience with Databricks and Apache Spark.
  • Proficiency in programming languages such as Python, Scala, or Java.
  • Experience with big data technologies such as Hadoop, Hive, and Kafka.
  • Strong SQL skills and experience with relational databases.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Knowledge of data warehousing concepts and technologies.
  • Experience with version control systems such as Git.
  • Strong problem-solving skills and attention to detail.
  • Excellent communication and collaboration skills.

Desired qualifications/non-essential skills:

  • Experience with Delta Lake and Databricks Delta.
  • Experience with data visualization tools such as Power BI, Tableau, or Looker.
  • Knowledge of machine learning and data science concepts.
  • Experience with CI/CD pipelines and DevOps practices.
  • Certification in Databricks, AWS, Azure, or Google Cloud.

Education: At least a bachelor's degree (or equivalent experience) in Computer Science, Software/Electronics Engineering, Information Systems, or a closely related field is required for the project.


About iTech US, Inc.