Overview
Remote
$60 - $70
Contract - W2
Contract - 12 Months
Skills
Databricks
Job Details
Job Title: Databricks Lead Engineer
Duration: 6 to 12 Months
Location: Remote (100%)
Tax Term: W2
Experience: 12+ Years
Job Responsibilities:
- Implementing highly performant data pipelines from multiple sources using Databricks
- Applying expertise in Databricks, Python, Scala, Azure Synapse, and Azure Data Factory
- Integrating the end-to-end data pipeline to move data from source systems to target data repositories, ensuring data quality and consistency are maintained at all times
- Developing and delivering documentation on data engineering capabilities, standards, and processes; participating in coaching, mentoring, design reviews, and code reviews
- Working with other members of the project team to support delivery of additional project components (e.g., API interfaces)
- Evaluating the performance and applicability of multiple tools against customer requirements
- Working within an Agile delivery/DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints
- Integrating Databricks with other technologies (e.g., ingestion tools, visualization tools)
Knowledge, Skills, and Abilities:
- 7+ years of relevant and progressive data engineering experience
- Deep technical knowledge of and experience with Databricks, Python, Scala, and the Microsoft Azure architecture and platform, including Synapse, Azure Data Factory (ADF) pipelines, and Synapse stored procedures
- Hands-on experience building data pipelines across a variety of source and target locations (e.g., Databricks, Synapse, SQL Server, Data Lake, file-based storage, SQL and NoSQL databases)
- Experience with engineering practices such as code refactoring, design patterns, CI/CD, and building highly scalable data applications and processes
- Knowledge of advanced data engineering concepts such as dimensional modeling, ETL, data governance, and data warehousing involving structured and unstructured data
- Thorough knowledge of Synapse and SQL Server, including T-SQL and stored procedures
- Experience working with and supporting cross-functional teams in a dynamic environment
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets
- Databricks and Azure Big Data Architecture certifications would be a plus
- Team-oriented, with strong collaboration, prioritization, and adaptability skills
- "Self-starter" attitude and the ability to make decisions with minimal guidance from others
- Innovative and passionate about your work and the work of your teammates
- Ability to comprehend and analyze operational systems and ask appropriate questions to determine how to improve, migrate, or modify the solution to meet business needs
- Knowledge of the Agile development process