AWS Datalake Engineer

Overview

Remote
$70 - $80
Contract - W2
Contract - Independent
Contract - 24 Month(s)

Skills

AWS
Datalake
Python
Java

Job Details

Job Description:
Job Title: Data Lakehouse Engineer (AWS + Python + DevOps)
Location: Remote, with travel to Dallas, TX one week per quarter (every 3 months); expenses paid.
Type: 24+ Months Contract
Experience Level: 14+ years required.

Job Summary
We are seeking a highly skilled Data Lakehouse Engineer to join our team and help solve critical business and technology challenges. You will play a key role in building and maintaining a scalable Data Lakehouse solution that ensures the right data reaches the right users at the right time, empowering both business and technical teams with trusted, governed data.

Key Responsibilities
  • Design and implement a modern Data Lakehouse architecture using AWS native services.
  • Develop and maintain ETL pipelines using AWS Glue, Lambda, Step Functions, and Python.
  • Configure and manage AWS Lake Formation for secure, governed access to data in S3.
  • Integrate and optimize data workflows using PySpark and serverless technologies.
  • Build and deploy infrastructure as code using CloudFormation, Terraform, Stacker, and the Serverless Framework.
  • Collaborate with DevOps teams using GitLab for version control and CI/CD pipelines.
  • Work with IAM for access control and DynamoDB for data storage requirements.
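As a purely illustrative sketch of the serverless ETL work described above (the event shape, field names, and handler are assumptions for illustration, not part of this posting), a minimal AWS Lambda-style handler in Python might normalize incoming records before they land in S3:

```python
import json


def normalize_record(record: dict) -> dict:
    # Trim whitespace and lowercase keys so downstream Glue jobs
    # see a consistent schema. (Record shape is assumed.)
    return {
        k.strip().lower(): (v.strip() if isinstance(v, str) else v)
        for k, v in record.items()
    }


def handler(event, context=None):
    # Hypothetical Lambda entry point: clean each record in the batch
    # and return the result in a Lambda-style response envelope.
    records = event.get("records", [])
    cleaned = [normalize_record(r) for r in records]
    return {"statusCode": 200, "body": json.dumps(cleaned)}
```

In a real pipeline, a handler like this would be invoked by Step Functions and would write its output to S3 rather than returning it directly; that wiring is omitted here.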

Technical Skills Required
Cloud & Data Engineering (AWS):
  • AWS Lake Formation
  • S3
  • AWS Glue (Crawler, Catalog, Registry, Glue Jobs)
  • AWS Step Functions
  • AWS Lambda
  • AWS IAM
  • DynamoDB
Programming & Data Processing:
  • Python (AWS-specific development)
  • PySpark
DevOps & Infrastructure as Code:
  • GitLab (CI/CD)
  • Serverless Framework
  • Stacker
  • CloudFormation
  • Terraform

Preferred Qualifications
  • Proven experience in building data lakehouse or lake-based data platforms on AWS.
  • Strong knowledge of data governance and access management using AWS Lake Formation.
  • Hands-on experience with real-time and batch data processing.
  • Familiarity with best practices in cloud security, DevOps, and CI/CD pipelines.
  • Excellent problem-solving, communication, and collaboration skills.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.