Senior Data Engineer

Overview

Remote
On Site
Hybrid
$70,000 - $80,000
Contract - W2
Contract - Independent

Skills

SQL
Python
ELT
Data Modeling
Data Warehouse
Amazon Web Services
Apache Spark
Hadoop
GCP
Azure
Terraform or CloudFormation
Apache Kafka
Apache Storm
Power BI
Data Engineering
Problem Solving
Advanced Analytics
Analytics
Cloud Computing
Communication
Data Governance
Data Quality
Scalability
Data Integrity

Job Details


Role Overview
We are looking for a mid-level Data Engineer with 3-4 years of hands-on experience to join a client-facing project team. In this role, you'll develop scalable data pipelines, design efficient data models, and work with modern tools such as Apache Spark, Snowflake, and Python to support advanced analytics and reporting use cases.

Responsibilities
  • Build and manage scalable batch and real-time data pipelines
  • Design and implement ETL/ELT processes using Apache Spark, dbt, and similar tools
  • Write efficient SQL queries for analytics and data warehousing
  • Develop and maintain data warehouse solutions (e.g., Snowflake, BigQuery)
  • Collaborate with stakeholders to gather requirements and design data models
  • Use Python to write clean, reusable code for data ingestion, transformation, and automation
  • Ensure high data integrity, performance, and scalability across systems
  • Follow best practices for data governance, security, and compliance

Required Qualifications
  • 3-4 years of experience as a Data Engineer or similar role
  • Strong proficiency in SQL and Python
  • Experience building ETL pipelines and working with Apache Spark or similar frameworks
  • Solid understanding of data warehousing concepts and architecture
  • Hands-on experience with Snowflake, BigQuery, or equivalent technologies
  • Experience in data modeling and designing schemas
  • Familiarity with cloud platforms such as AWS, Google Cloud Platform, or Azure
  • Strong problem-solving and communication skills

Preferred Skills (Nice to Have)
  • Experience with streaming platforms like Kafka or Kinesis
  • Exposure to CI/CD practices in data engineering workflows
  • Background in consulting or multi-client projects
  • Familiarity with data quality frameworks and monitoring solutions
