Overview
On Site
$40 - $50
Contract - W2
Contract - 12 Month(s)
100% Travel
Able to Provide Sponsorship
Skills
Databricks & AWS Big Data Architecture certification would be a plus.
Solid experience with SQL-based databases is required.
Delta Lake
dbConnect
DB API 2.0
Databricks Workflows orchestration
Job Details
Title: Databricks Developer
Location: DMV Area
Rate: $50 on W2
In this role, the Databricks Developer is responsible for solving real-world, cutting-edge problems while meeting both functional and non-functional requirements.
Responsibilities
- Work closely with the Architect and Lead to design solutions that meet functional and non-functional requirements.
- Review and understand architecture and solution design artifacts.
- Proactively implement engineering methodologies, standards, and leading practices.
- Identify, communicate, and mitigate Risks, Assumptions, Issues, and Decisions throughout the full lifecycle.
Minimum qualifications
- Databricks & AWS Big Data Architecture certification would be a plus.
- Solid experience with SQL-based databases is required.
- Two years' experience with Hadoop/Spark in an administrative, development, or support role is required.
- Experience with engineering practices such as code refactoring, design patterns, CI/CD, and building highly scalable data applications and processes.
- Knowledge of advanced data engineering concepts such as dimensional modeling, ETL, data governance, and data warehousing involving structured and unstructured data.
- Must have excellent coding skills in either Python or Scala.
- Must have experience in the Data Engineering domain.
- Must have implemented at least two projects end-to-end in Databricks.
- Must have hands-on experience with the following Databricks components:
- Delta Lake
- dbConnect
- DB API 2.0
- Databricks Workflows orchestration
- Must be well versed in the Databricks Lakehouse concept and the Medallion architecture, including their implementation in enterprise environments.
- Must have a good understanding of how to build complex data pipelines.
- Must have extensive knowledge of the Spark and Hive data processing frameworks.
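For candidates unfamiliar with the Medallion architecture named above, here is a minimal sketch of the bronze → silver → gold layering in plain Python. In an actual Databricks pipeline each layer would be a Delta Lake table processed with PySpark; the record fields and cleaning rules below are illustrative assumptions, not part of this posting.

```python
# Illustrative Medallion-style pipeline in plain Python.
# In Databricks this logic would read/write Delta Lake tables via PySpark;
# the field names and validation rules here are hypothetical.

from collections import defaultdict

# Bronze layer: raw ingested records, kept as-is (duplicates and bad rows allowed).
bronze = [
    {"order_id": 1, "region": "east", "amount": "100.0"},
    {"order_id": 1, "region": "east", "amount": "100.0"},         # duplicate
    {"order_id": 2, "region": "west", "amount": "not-a-number"},  # invalid row
    {"order_id": 3, "region": "east", "amount": "50.5"},
]

def to_silver(rows):
    """Silver layer: deduplicate on order_id and enforce a numeric amount."""
    seen, out = set(), []
    for row in rows:
        if row["order_id"] in seen:
            continue
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # drop rows that fail validation
        seen.add(row["order_id"])
        out.append({**row, "amount": amount})
    return out

def to_gold(rows):
    """Gold layer: business-level aggregate (revenue per region)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'east': 150.5}
```

In a real Databricks workflow, each layer transition would typically run as a task in a Databricks Workflows job.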
Preferred Qualifications/Skills
- Good to have Unity Catalog and basic data governance knowledge.
- Good to have MicroStrategy or Tableau reporting experience.
- Good to have knowledge of Docker and Kubernetes.
- Good to have Linux/Unix Admin or development skills.