Overview
Remote
Depends on Experience
Contract - W2
Skills
Azure
AWS
GCP
Python
Databricks
Job Details
Experience:
- Experience with multi-cloud ecosystems (Azure, AWS, Google Cloud Platform)
- Expertise with PySpark, Scala, or Python for data processing and transformation within Databricks
- CI/CD experience leveraging GitLab pipelines, Databricks Asset Bundles (DABs), or similar tooling
- Bachelor's degree in CIS or a business-related field, required for eligibility for future conversion to full-time
- 3-5 years of relevant experience as a Data Engineer or Developer, with significant hands-on production experience in Databricks workflows and data lakehouse platforms
- Proficiency in Databricks Unity Catalog, API integrations and connector tools for BI platforms, and associated data toolkit frameworks (data governance, lineage tracking, access control, etc.)
- Ability to work fluently with delimited file formats (CSV, TSV, etc.), XML, and JSON for correct parsing and validation of content during ingestion and processing
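As a hypothetical illustration of the file-format fluency described above (not part of the role's actual codebase), the sketch below parses delimited, JSON, and XML records with Python's standard library and applies a simple presence check before ingestion; the function names and required fields are invented for this example.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def parse_delimited(text, delimiter=","):
    """Parse delimited text (CSV, or TSV with delimiter='\t') into row dicts."""
    reader = csv.DictReader(io.StringIO(text), delimiter=delimiter)
    return list(reader)

def validate_record(record, required_fields):
    """Return True if every required field is present and non-empty."""
    return all(record.get(field) for field in required_fields)

# CSV: header row becomes the dict keys
csv_rows = parse_delimited("id,name\n1,Alice\n2,Bob")

# JSON: a single record parsed from a string
json_rec = json.loads('{"id": 3, "name": "Carol"}')

# XML: flatten one element's children into a dict
xml_root = ET.fromstring("<user><id>4</id><name>Dan</name></user>")
xml_rec = {child.tag: child.text for child in xml_root}
```

In a Databricks pipeline the same validation idea would typically be expressed with PySpark readers and schema enforcement rather than the standard library, but the parsing concerns (delimiters, required fields, nested structure) are the same.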