Senior Data Engineer (34455)

Overview

Remote
Contract - W2

Skills

Unstructured Data
Analytical Skills
Data Security
Regulatory Compliance
GitHub
Business Intelligence
Testing
Data Engineering
Data Modeling
Warehouse
Optimization
Snowflake Schema
PySpark
Cloud Computing
Amazon S3
Step Functions
Amazon Redshift
Python
SQL
Data Manipulation
Extract, Transform, Load (ETL)
ELT
Data Governance
Encryption
Management
Communication
Documentation
Collaboration
Orchestration
Terraform
Microsoft Azure
Apache Kafka
Streaming
Continuous Integration
Continuous Delivery
Amazon Web Services
Data Analysis
Databricks
Agile
DevOps
Git
Workflow

Job Details

We are seeking a Senior Data Engineer with proven expertise in Databricks, Snowflake, and AWS. This fully remote W2 contract role requires hands-on experience designing, developing, and optimizing scalable, secure, and high-performance data pipelines across modern cloud ecosystems.

Responsibilities
  • Design, develop, and maintain data pipelines and ETL workflows using Databricks and Snowflake (a rough PySpark sketch follows this list).
  • Implement data ingestion, transformation, and orchestration solutions across structured and unstructured data sources.
  • Develop and optimize data models, warehouse schemas, and partitioning strategies for analytical performance.
  • Build and maintain AWS-based data infrastructure (e.g., S3, Lambda, Glue, Redshift, IAM, CloudFormation).
  • Ensure data security and compliance through encryption/decryption processes and governance frameworks (e.g., encrypt/decrypt guest reservation data).
  • Implement CI/CD pipelines for data engineering using tools like GitHub Actions, AWS CodePipeline, or Azure DevOps.
  • Collaborate with data scientists, analysts, and architects to align infrastructure with business intelligence needs.
  • Monitor, troubleshoot, and resolve data pipeline performance or reliability issues.
  • Document technical solutions and follow best practices for code versioning, testing, and deployment.
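
As a rough illustration of the pipeline work described in the first responsibility above, a minimal Databricks-style PySpark job might look like the sketch below. The bucket, table, and column names are hypothetical placeholders, not details from this posting.

    # Minimal PySpark ETL sketch (all names are assumed placeholders):
    # read raw JSON from S3, normalize it, and write a Delta table.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("reservations_etl").getOrCreate()

    # Extract: semi-structured reservation events landed in S3
    raw = spark.read.json("s3://example-bucket/raw/reservations/")

    # Transform: parse timestamps, derive a partition column,
    # and drop malformed rows
    cleaned = (
        raw.withColumn("booked_at", F.to_timestamp("booked_at"))
           .withColumn("booking_date", F.to_date("booked_at"))
           .filter(F.col("reservation_id").isNotNull())
    )

    # Load: write a Delta table partitioned for analytical queries
    (cleaned.write.format("delta")
            .mode("overwrite")
            .partitionBy("booking_date")
            .saveAsTable("analytics.reservations_clean"))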


Must Have
  • 5+ years of experience in Data Engineering, building and maintaining cloud-based data solutions.
  • Hands-on experience with Snowflake (mandatory):
    • Expertise in Snowflake SQL, data modeling, staging, warehouse optimization, time travel, and data sharing.
    • Experience integrating Snowflake with Databricks and AWS data services (see the connector sketch after this list).
  • Strong proficiency in Databricks (PySpark, Delta Lake, notebook development).
  • Solid knowledge of AWS Cloud services such as S3, Glue, Athena, Lambda, Step Functions, and Redshift.
  • Proficiency with Python and SQL for data manipulation and ETL logic.
  • Strong understanding of ETL/ELT frameworks, data lakehouse architectures, and data governance principles.
  • Experience with data encryption, decryption, and key management best practices.
  • Excellent communication, documentation, and collaboration skills.
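
As a hedged sketch of basic Snowflake access from Python, including the time travel feature noted above, the snippet below uses the snowflake-connector-python package; the account, credentials, and table name are placeholders only.

    # Connect to Snowflake, then run a time-travel query.
    # Account, user, warehouse, database, and table are assumed names.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account",
        user="example_user",
        password="...",          # in practice, pull from a secrets manager
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # Time travel: count rows as the table existed one hour ago
        cur.execute("SELECT COUNT(*) FROM reservations AT(OFFSET => -3600)")
        print(cur.fetchone()[0])
    finally:
        conn.close()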

Nice to Have
  • Experience with Airflow, dbt, or AWS Glue Workflows for orchestration (a minimal Airflow sketch follows this list).
  • Familiarity with Terraform or CloudFormation for infrastructure as code.
  • Exposure to Azure Data Factory, Google BigQuery, or Kafka streaming pipelines.
  • Knowledge of CI/CD automation for data pipelines.
  • Certifications (e.g., AWS Certified Data Analytics - Specialty, Databricks Certified Data Engineer, SnowPro Core).
  • Experience working in agile environments with DevOps and Git-based workflows.
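
For the orchestration item above, a minimal Airflow DAG (Airflow 2.x style) might look like the sketch below; the DAG id, schedule, and task callables are illustrative assumptions.

    # Two-task daily DAG: extract, then load. All names are assumed.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw data from the source system")

    def load():
        print("load transformed data into the warehouse")

    with DAG(
        dag_id="reservations_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",    # parameter name as of Airflow 2.4
        catchup=False,
    ) as dag:
        t1 = PythonOperator(task_id="extract", python_callable=extract)
        t2 = PythonOperator(task_id="load", python_callable=load)
        t1 >> t2  # run extract before load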

About Myticas LLC