Databricks Engineer
Full Time
No Travel Required
Remote
$65 - $75/hr


CogniSoft Technologies
Job Details
Skills
- Data Masking
- Business Intelligence
- Continuous Integration
- Data Engineering
- Cloud Computing
- Collaboration
- Communication
- Continuous Delivery
- Analytics
- Apache Kafka
- Apache Spark
- Artificial Intelligence
- Amazon Kinesis
- Amazon Redshift
- Amazon S3
- Amazon Web Services
- Leadership
- Machine Learning (ML)
- Meta-data Management
- Encryption
- Extract, Transform, Load (ETL)
- Git
- IT Management
- Data Processing
- Data Quality
- Data Storage
- Python
- DevOps
- ELT
- Real-time
- Regulatory Compliance
- Scalability
- Databricks
- Optimization
- Performance Tuning
- PySpark
- RBAC
- SQL
- Security Controls
- Storage
- Streaming
- Terraform
- Unity Catalog
- Workflow
Summary
Role: Databricks Engineer
Location: REMOTE
Duration: Contract-to-hire after a 6-month term
We are looking for a hands-on Databricks Engineer with strong AWS experience to design, build, and optimize scalable data pipelines and lakehouse solutions. The role focuses on implementing robust batch and streaming data solutions using Databricks, Delta Lake, and AWS cloud-native services, ensuring high performance, scalability, and security.
Key Responsibilities
• Build and maintain end-to-end data pipelines using Databricks, Delta Lake, and AWS services
• Develop batch, real-time, and streaming data processing workflows
• Implement data ingestion, transformation, curation, and storage pipelines
• Build and optimize large-scale PySpark and SQL-based jobs in Databricks
• Enable real-time data processing using Kafka, AWS Kinesis, or similar streaming tools
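For illustration only (not part of this role's codebase): the streaming workflows described above typically reduce to windowed aggregations over event streams. A minimal sketch of a tumbling-window count in plain Python, assuming events arrive as `(epoch_seconds, key)` pairs from Kafka or Kinesis:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds=60):
    """Group events into fixed (tumbling) windows by timestamp and
    count occurrences per key -- the same shape of computation a
    Spark Structured Streaming job performs at scale.

    `events` is an iterable of (epoch_seconds, key) pairs; all names
    here are illustrative, not from any specific API.
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_seconds)
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Example: three events, two landing in the same 60-second window
events = [(0, "click"), (30, "click"), (65, "view")]
print(tumbling_window_counts(events))
# {0: {'click': 2}, 60: {'view': 1}}
```

In Databricks the equivalent would be expressed declaratively with Structured Streaming's windowing functions rather than hand-rolled loops.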
Data Lakehouse Implementation
• Work on Databricks-based lakehouse architecture using Delta Lake
• Implement scalable and optimized data storage and processing frameworks
• Ensure data quality, consistency, and reliability across pipelines
• Support metadata management, data lineage, and governance implementation
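As a sketch of the data-quality responsibility above, row-level validation usually means checking required fields and types before records land in curated tables. A minimal plain-Python example (function and field names are illustrative):

```python
def check_row_quality(row, required_fields, numeric_fields=()):
    """Return a list of data-quality violations for one record.

    `required_fields` must be present and non-null; `numeric_fields`
    must parse as numbers. All names are illustrative placeholders.
    """
    violations = []
    for field in required_fields:
        if row.get(field) in (None, ""):
            violations.append(f"missing:{field}")
    for field in numeric_fields:
        value = row.get(field)
        if value is not None:
            try:
                float(value)
            except (TypeError, ValueError):
                violations.append(f"not_numeric:{field}")
    return violations

row = {"id": "42", "amount": "abc", "region": None}
print(check_row_quality(row, ["id", "region"], ["amount"]))
# ['missing:region', 'not_numeric:amount']
```

In practice these checks would run as expectations in a framework such as Delta Live Tables rather than per-row Python.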
Cloud & Platform Engineering (AWS)
• Work with AWS services such as S3, Glue, Lambda, Kinesis, and Redshift
• Ensure pipelines are scalable, secure, and cost-optimized in AWS environments
• Implement security controls including RBAC, encryption, and data masking
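One of the security controls listed above, data masking, is often implemented as deterministic hashing of PII columns so masked values can still be joined on. A hedged plain-Python sketch (the salt and field names are placeholders, not a Databricks feature):

```python
import hashlib

def mask_pii(record, pii_fields, salt="example-salt"):
    """Replace PII values with a salted SHA-256 digest: joins on the
    masked value still work, but the raw value is not exposed.

    The hard-coded salt is a placeholder; a real pipeline would fetch
    it from a secrets manager.
    """
    masked = dict(record)
    for field in pii_fields:
        if masked.get(field) is not None:
            digest = hashlib.sha256(
                (salt + str(masked[field])).encode()
            ).hexdigest()
            masked[field] = digest[:16]  # truncated for readability
    return masked

user = {"user_id": 7, "email": "a@example.com"}
print(mask_pii(user, ["email"])["user_id"])  # non-PII fields unchanged
```

The same idea appears natively in Unity Catalog as column masks and row filters.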
Optimization & Best Practices
• Tune Spark jobs for performance and cost efficiency
• Monitor and troubleshoot data pipeline issues in production
• Follow CI/CD and DevOps practices for deploying data engineering solutions
• Ensure adherence to data engineering standards and best practices
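The tuning work above often comes down to a handful of Spark configuration settings. The values below are illustrative starting points only, not recommendations from this posting:

```python
# Illustrative Spark tuning settings; values are starting points,
# not universal recommendations.
spark_conf = {
    # Let Spark coalesce and re-split partitions at runtime (AQE)
    "spark.sql.adaptive.enabled": "true",
    # Target partition size for shuffles under AQE
    "spark.sql.adaptive.advisoryPartitionSizeInBytes": "128m",
    # Broadcast small dimension tables instead of shuffling them
    "spark.sql.autoBroadcastJoinThreshold": "64m",
    # Default shuffle parallelism for medium-sized jobs
    "spark.sql.shuffle.partitions": "200",
}

# These would typically be applied via SparkSession.builder.config(...)
for key, value in spark_conf.items():
    print(f"{key}={value}")
```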
Collaboration
• Work closely with BI teams and business stakeholders
• Support analytics and AI/ML data requirements through curated datasets
• Collaborate with architects to ensure alignment with AWS-based data strategy
Technical Leadership & Architecture
• Lead the design and implementation of scalable, end-to-end data solutions
Required Skills & Qualifications
• Strong hands-on experience with Databricks.
• Proficiency in Python, PySpark, and SQL
• Strong experience in AWS cloud services (S3, Glue, Lambda, Kinesis, Redshift)
• Experience building ETL/ELT data pipelines
• Strong understanding of Delta Lake and lakehouse concepts
• Experience with streaming and batch data processing
• Knowledge of CI/CD tools and Git
• Strong troubleshooting and performance tuning skills
• Databricks Certified Data Engineer Professional certification is mandatory
Nice to Have
• Infrastructure as Code (Terraform/CloudFormation)
• Data quality & observability frameworks
• Deeper Databricks-specific features (DLT, Unity Catalog, Workflows)
• Security & compliance depth
• DevOps tooling specifics
• Leadership/communication expectations
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
- Dice Id: 10527831
- Position Id: 8953639
- Posted 1 day ago
Company Info
CogniSoft Technologies focuses on providing niche Business Intelligence & Data Analytic solutions to businesses across various industries. We are Tableau Alliance Partners. We are also an SAP certified Company.
We are a team of experienced, dedicated, hardworking, and innovative professionals who have proven their expertise across business verticals such as Banking and Financial Services, Healthcare, Supply Chain, Insurance, Software Development, IT Consulting, and Business Consulting.