Senior Data Engineer (AWS | Snowflake | Databricks | Python)

Overview

Work Arrangement: Hybrid
Compensation: Depends on Experience
Employment Type: Contract - Independent
Duration: 12 months

Skills

AWS Glue
AWS Lambda
Redshift
EMR
Athena
Python
Pandas
PySpark
Terraform
AWS Data Engineer
ETL
Data Pipelines
Cloud Data Engineering
Data Lake
S3
Spark
Snowflake
West Coast

Job Details

About the Role:

We are seeking a Senior AWS Lead Data Engineer with strong hands-on development experience in Python, Pandas, PySpark, Terraform, and AWS services (Glue, Lambda, S3, Redshift, EMR).
This is a lead-level individual contributor position, not a pure architecture role: it is ideal for a technically strong engineer who enjoys building and optimizing data pipelines.

Key Responsibilities:

  • Design, develop, and optimize data pipelines using AWS Glue, Redshift, S3, Lambda, EMR, and Athena.

  • Build and maintain ETL/ELT processes to integrate and transform data from multiple sources.

  • Collaborate with cross-functional teams to understand data needs and deliver high-quality solutions.

  • Write robust, efficient code using Python, Pandas, and PySpark (see the sketch following this list).

  • Implement infrastructure as code (IaC) using Terraform.

  • Apply data quality, governance, and security best practices.

  • Monitor, troubleshoot, and resolve pipeline performance issues.

  • Stay current with AWS data technologies and continuously improve pipeline efficiency.
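
By way of illustration, the day-to-day work above centers on code like the following minimal PySpark sketch. The bucket names, paths, and column names are hypothetical, and a production Glue job would typically add Glue-specific APIs (DynamicFrames, job bookmarks), error handling, and monitoring:

    # Minimal PySpark batch transform: read raw CSV from S3,
    # clean it, and write partitioned Parquet back to S3.
    # Bucket names, paths, and columns below are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

    raw = spark.read.option("header", True).csv("s3://example-raw/orders/")

    cleaned = (
        raw
        .dropDuplicates(["order_id"])                         # de-duplicate on the business key
        .withColumn("order_ts", F.to_timestamp("order_ts"))   # normalize types
        .withColumn("order_date", F.to_date("order_ts"))      # derive a partition column
        .filter(F.col("order_id").isNotNull())                # drop unusable rows
    )

    (cleaned.write
        .mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated/orders/"))

    spark.stop()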


Required Skills and Experience:

  • Bachelor's or Master's degree in Computer Science, IT, or a related field.

  • 5-8 years of professional experience as a Data Engineer, with a strong focus on AWS-based data solutions.

  • Expertise in:

    • AWS Glue, Redshift, S3, Lambda, EMR, Athena

    • Python, Pandas, PySpark, SQL

    • Terraform (IaC)

  • Hands-on experience with AWS RDS, PostgreSQL, and SAP HANA.

  • Solid understanding of ETL/ELT processes, data modeling, and data warehousing (a brief example follows this list).

  • Familiarity with CI/CD and version control (Git).

  • Strong analytical, debugging, and problem-solving skills.
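
As a small illustration of the ETL/ELT and data-quality expectations above, here is a hedged Pandas sketch of pre-load validation; the file path, column names, and thresholds are hypothetical:

    # Lightweight pre-load data-quality checks with Pandas.
    # File path and column names are hypothetical.
    import pandas as pd

    df = pd.read_parquet("curated/orders.parquet")

    issues = []
    if df["order_id"].isna().any():
        issues.append("null order_id values found")
    if df["order_id"].duplicated().any():
        issues.append("duplicate order_id values found")
    if (df["amount"] < 0).any():
        issues.append("negative order amounts found")

    if issues:
        # Fail fast so bad data never reaches the warehouse.
        raise ValueError("data-quality checks failed: " + "; ".join(issues))
    print(f"{len(df)} rows passed all checks")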


Preferred Qualifications:

  • AWS Certifications:

    • AWS Certified Data Analytics

    • AWS Certified Developer

    • AWS Certified Solutions Architect

  • Experience with AWS AI/ML services such as SageMaker, Textract, Rekognition, and Bedrock, or with other GenAI/LLM tools.

  • Familiarity with Apache Spark, Hadoop, or Kafka.

  • Experience with data visualization tools (Tableau, Power BI, or AWS QuickSight).

  • Knowledge of Azure DevOps / Pipelines.

  • Familiarity with data governance and catalog tools (AWS Glue Data Quality, Collibra, AWS Glue DataBrew).


Sample Certifications (Preferred):

  • AWS Certified Developer (2019)

  • Denodo Platform Certified Developer (2018)

  • Tableau Desktop Qualified Associate (2018)

  • Hortonworks HDP Certified Administrator (2016)

  • Cloudera Hadoop Developer (2014)

  • Oracle Data Warehousing 11g Essentials (2011)

  • Oracle Business Intelligence 10 Foundation Essentials (2011)


Additional Information:

  • Remote opportunity; preference for candidates located in Southern California or within the West Coast (Pacific) time zone.

  • Candidates must be authorized to work in the United States without current or future sponsorship.

  • An immediate start is available for the right candidate.
