Overview
- Location: Remote
- Compensation: Depends on Experience
- Employment Type: Contract - W2 or Contract - Independent
Skills
- Spark
- Python
- PySpark
- Databricks
Job Details
Title: Databricks Developer
- Experience: 7+ years
- Position Type: Contract
- Location: Remote
- Must-Have Skills: Spark, PySpark, Python, Databricks
We are seeking a skilled Databricks Developer to join our team.
As a Databricks Developer, you will be responsible for designing, building, and maintaining data pipelines and analytics solutions using the Databricks platform.
You will collaborate with cross-functional teams to analyze data requirements, develop data models, and implement solutions that drive insights and decision-making.
Responsibilities:
- Design and develop data pipelines and ETL processes using Databricks to ingest, process, and transform large volumes of data.
- Implement data lake architecture and data modeling best practices using Databricks Delta Lake.
- Collaborate with data scientists and analysts to deploy machine learning models and perform advanced analytics on Databricks.
- Optimize and tune Databricks jobs and Spark applications for performance and scalability.
- Implement security controls and data governance policies to ensure compliance and data integrity within the Databricks environment.
- Develop monitoring and alerting solutions to proactively identify and address issues in Databricks clusters and jobs.
- Stay up-to-date with the latest developments in Databricks and big data technologies, and evaluate new tools and techniques to enhance our data platform.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Databricks Developer or similar role, with hands-on experience in designing and building data solutions using Databricks.
- Proficiency in Apache Spark and either Scala or Python.
- Experience with data lake architecture, data modeling, and ETL processes.
- Strong understanding of distributed computing principles and big data technologies.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
Preferred Qualifications:
- Experience with other big data technologies such as Hadoop, Kafka, or Apache Flink.
- Certifications in Databricks or related technologies.
- Experience with containerization technologies such as Docker and Kubernetes.
- Knowledge of SQL and relational databases.
- Contributions to open-source projects or active participation in the developer community.