Overview
- On Site
- Depends on Experience
- Accepts corp-to-corp applications
- Contract - W2
- Contract - Independent
Job Details
Job Description:
Our client is seeking a highly skilled and motivated Databricks developer with Python experience to join a dynamic team. The role involves designing and implementing scalable data processing solutions using Azure Databricks, collaborating with data scientists and analysts to meet data needs effectively, and ensuring data integrity throughout those processes. The position requires on-site presence three days a week, with the flexibility to work remotely the other two. It offers the opportunity to work in a technically advanced environment where innovative, data-driven solutions are a priority.
Responsibilities:
- Design and implement scalable and efficient data processing solutions using Azure Databricks.
- Work collaboratively with data scientists and data analysts to understand and address data requirements.
- Optimize existing data pipelines and workflows to improve performance and scalability.
- Ensure high standards of data quality and integrity are maintained throughout data transformation and loading processes.
- Develop, maintain, and iterate on data architecture and best practices to support data-driven decision-making.
- Identify, troubleshoot, and resolve data-related issues promptly to ensure minimal disruption to operations.
- Implement performance monitoring and set up alerting for both pipeline performance and data-integrity issues.
Qualifications:
- Proven proficiency in ETL processes across relational databases (e.g., Oracle) and in integrating Databricks with NoSQL databases.
- Strong working knowledge and experience in Python, PySpark, and Scala.
- Ability to work effectively in both team settings and independently.
- Experience with MongoDB or similar databases is highly desirable.
- Excellent problem-solving skills and ability to think algorithmically.
- Strong communication skills, capable of conveying complex systems and logic to non-technical stakeholders.
- Must be available to work on-site three days per week and remotely two days per week.
- Bachelor’s degree in Computer Science, Engineering, or a related field is preferred.
- Previous experience in a similar role, demonstrating a track record of managing complex data projects.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.