Job Details
"Must have experience in Data Engineering domain .
Must have excellent coding skills either Python or Scala, preferably Python.
Must have implemented at least 2 project end-to-end in Databricks.
Must have at least experience on databricks which consists of various components: Delta lake, dbConnect, db API 2.0, Databricks workflows orchestration
Must be well versed with Databricks Lakehouse concept and its implementation in enterprise environments.
Must be strong in SQL and sprak-sql.
Must have extensive knowledge of the Spark and Hive data processing frameworks.
Must have worked on at least one cloud platform (Azure, AWS, or Google Cloud Platform) and its most common services, such as ADLS/S3, ADF/Lambda, Cosmos DB/DynamoDB, ASB/SQS, and cloud databases.
Good to have: Unity Catalog and basic data governance knowledge.
CI/CD experience building pipelines for Databricks jobs.
Knowledge of dbt, Docker, and Kubernetes.