Overview
- Remote
- Compensation: Depends on Experience
- Accepts corp to corp applications
- Contract - W2
- Contract - Independent
- Contract - 12 Month(s)
Skills
- Data Architect
- Databricks
- AWS
- Data Pipelines
- Kafka
- Delta Lake
Job Details
Architecture & Design
- Define and design end-to-end data lakehouse architecture leveraging Databricks, Delta Lake, and cloud-native services.
- Create reference architectures for batch, real-time, and streaming data pipelines (a minimal batch example appears after this list).
- Architect data ingestion, curation, storage, and governance frameworks.
- Ensure the platform is scalable, secure, and optimized for performance and cost.
- Establish standards for data lineage, metadata management, and compliance.
- Work with enterprise architects to align Databricks solutions with the overall cloud strategy (AWS).
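
As a rough illustration of one bronze-to-silver hop in such a lakehouse, the sketch below assumes a Databricks runtime with Spark and Delta Lake available; the bucket path, table names, and columns (orders, order_id, order_ts) are hypothetical placeholders, not details from this posting.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()  # supplied by the Databricks runtime

    # Bronze: land raw files as-is, stamping each row for lineage.
    raw = (spark.read.format("json")
           .load("s3://example-bucket/raw/orders/")           # hypothetical landing path
           .withColumn("_ingested_at", F.current_timestamp()))
    raw.write.format("delta").mode("append").saveAsTable("bronze.orders")

    # Silver: curate the bronze copy (typing, de-duplication) for analytics use.
    silver = (spark.table("bronze.orders")
              .withColumn("order_ts", F.to_timestamp("order_ts"))
              .dropDuplicates(["order_id"]))
    silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

The same bronze-to-silver pattern repeats per source, with gold-level aggregates built on the curated silver tables.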
Leadership & Delivery
- Lead and mentor a team of data engineers in building robust data pipelines.
- Collaborate with data scientists, BI teams, and business stakeholders to enable advanced analytics and AI/ML use cases.
- Drive adoption of DevOps and CI/CD practices for data engineering (an illustrative pipeline test is sketched after this list).
- Review designs and solutions to ensure adherence to architectural principles.
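
As one illustration of CI/CD for data engineering, the sketch below shows the kind of unit test a pipeline's CI job might run, assuming pytest and a local Spark session; dedupe_orders and its contract are hypothetical and stand in for any real transformation.

    import pytest
    from pyspark.sql import SparkSession

    @pytest.fixture(scope="session")
    def spark():
        # Small local session so the test runs in CI without a cluster.
        return SparkSession.builder.master("local[1]").appName("pipeline-ci").getOrCreate()

    def dedupe_orders(df):
        # Hypothetical transformation under test: keep one row per order_id.
        return df.dropDuplicates(["order_id"])

    def test_dedupe_orders_removes_duplicates(spark):
        df = spark.createDataFrame([(1, "a"), (1, "a"), (2, "b")], ["order_id", "sku"])
        assert dedupe_orders(df).count() == 2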
Implementation & Optimization
- Build, optimize, and manage large-scale PySpark/SQL pipelines in Databricks.
- Enable real-time data processing through Kafka, Kinesis, or Event Hubs (see the streaming sketch after this list).
- Implement security best practices including RBAC, data masking, and encryption.
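
A minimal sketch of a real-time ingest path with a masking step, assuming Spark Structured Streaming with the Kafka connector available on the cluster; the broker address, topic, JSON field, checkpoint location, and target table are hypothetical placeholders.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # Read raw events from Kafka as a stream.
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical broker
              .option("subscribe", "orders")                      # hypothetical topic
              .load()
              .selectExpr("CAST(value AS STRING) AS payload", "timestamp"))

    # Masking: hash the customer email pulled from the payload so the
    # clear-text value never lands in the lakehouse.
    curated = (events
               .withColumn("customer_email", F.get_json_object("payload", "$.email"))
               .withColumn("customer_email", F.sha2(F.col("customer_email"), 256)))

    # Continuously append the curated stream to a Delta table.
    (curated.writeStream
            .format("delta")
            .option("checkpointLocation", "s3://example-bucket/_checkpoints/orders")
            .toTable("silver.orders_stream"))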