Senior Data Engineer

Overview

Hybrid
$80 - $86 per hour
Contract - W2
Contract - 6 Month(s)

Skills

Python
PySpark
Spark
Apache Kafka
Snowflake
Streaming
Testing
Terraform
Shell Scripting
Scripting
Scalability
SQL
Version Control
Workflow
Machine Learning Operations (ML Ops)
Machine Learning (ML)
Google Cloud Platform
ELT
Documentation
Databricks
Data-flow Diagrams
Data Warehouse
Data Science
Data Modeling
Decision-making
Data Governance
Data Engineering
Amazon Web Services
Analytics
Big Data
Cloud Computing
Collaboration
Continuous Delivery
Continuous Integration
Extract, Transform, Load (ETL)
Microsoft Azure
Orchestration
Software Development
Specification Gathering
Git

Job Details

Industry: Mass Media & Entertainment

Job Title: Senior Data Engineer

Location: Hybrid in Seattle, Los Angeles, Glendale, Burbank

Duration: 6 months +

Rate: up to $86.20/HR

Job Description

Our client is looking for a Senior Data Engineer. Data is essential for all their decision-making needs, whether it's related to product design, measuring advertising effectiveness, helping users discover new content, or building new businesses in emerging markets. This data is deeply valuable and provides insights into how they can continue improving their service for users, advertisers, and content partners. Their Audience team is seeking a highly motivated Data Engineer with a strong technical background who is passionate about diving deep into Big Data to develop state-of-the-art data solutions.

About the Role: We are seeking a highly skilled Senior Data Engineer with expertise in Python, PySpark, SQL, Databricks, Snowflake, and Airflow to join our growing data team. This role involves building and maintaining scalable data pipelines, working cross-functionally with data science, product, and architecture teams, and enabling data-driven decision-making across the organization.

Key Responsibilities:

Design, implement, and maintain ETL/ELT pipelines using Python, PySpark, Databricks, and Airflow to deliver clean, reliable, and timely data.

Build optimized data models and develop advanced SQL queries within Snowflake and other cloud data platforms.

Use Python and Shell Scripting to develop data transformation scripts, automation tools, and utilities to support data engineering and analytics.

Collaborate closely with data scientists, product managers, data architects, and external engineering teams to understand requirements and deliver high-quality data solutions.

Support data science workflows by providing well-structured datasets, enabling feature engineering, and integrating model outputs into production pipelines.

Produce and maintain comprehensive documentation, including technical specifications, data flow diagrams, and pipeline monitoring procedures.

Ensure high reliability, scalability, and performance of data systems, implementing best practices for testing, monitoring, and recovery.

Participate in architecture reviews and provide input on strategic data platform decisions.

Mentor junior team members and support peer code reviews.

Required Skills & Qualifications:

5+ years of experience in data engineering, software development, or related fields.

Strong expertise in PySpark, Python, and SQL.

Hands-on experience with Databricks, Snowflake, and Airflow in production environments.

Solid understanding of data warehousing, data modeling, and pipeline orchestration in cloud environments (e.g., AWS, Azure, or Google Cloud Platform).

Experience working with data science teams, understanding ML lifecycle requirements and model deployment strategies.

Strong collaboration and communication skills with technical and non-technical stakeholders.

Excellent documentation skills and experience producing technical design artifacts.

Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

Nice to Have:

Familiarity with Kafka or other streaming technologies.

Experience with ML Ops, feature stores, or deploying machine learning models in production.

Understanding of data governance, security, and compliance standards.

Experience with CI/CD, version control (Git), and infrastructure-as-code tools (e.g., Terraform).

Education Required: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.