AWS Data Engineer

Overview

Remote
USD 60 - 65 per hour
Contract - W2

Skills

Reporting
Real-time
Workflow
Data Processing
Analytics
Computer Science
Data Engineering
Python
PySpark
Writing
AWS Lambda
Amazon Web Services
Amazon S3
Snowflake
Amazon Redshift
Management
Cloud Computing
Data Warehouse
SQL
Artificial Intelligence
Machine Learning (ML)

Job Details


AWS Data Engineer
Our client is seeking an AWS Data Engineer to build and optimize cloud-based data pipelines that power analytics and reporting across the enterprise. This role is focused on data engineering, with opportunities to explore AI/ML projects as the team grows. You'll work in a collaborative environment where your expertise in Python, AWS services, and modern cloud data platforms will directly impact the way the business leverages data.


  • Location: Remote (EST Hours)
  • Compensation: This role is expected to pay approximately $60 - 65 per hour (W2)
  • No visa sponsorship is available for this role


What You'll Do:
Key Responsibilities


  • Design, build, and maintain data pipelines that move data from multiple sources into Snowflake for analytics and dashboarding
  • Write efficient Python code, including PySpark jobs in AWS Glue, to transform and process large-scale datasets
  • Develop and manage AWS Lambda functions to support real-time and event-driven data workflows
  • Leverage AWS services such as IAM, S3, CloudWatch, and CloudFormation to ensure secure, scalable, and reliable data operations
  • Partner with the team to enhance integration patterns, optimize data processing, and enable downstream consumption by analytics tools


What Gets You the Job:
Qualifications


  • Bachelor's degree in Computer Science or a related field
  • 5+ years of experience in data engineering with strong coding skills in Python (including PySpark and AWS Glue)
  • Proven experience writing and deploying AWS Lambda functions
  • Solid understanding of AWS services including IAM, S3, CloudWatch, and CloudFormation
  • Hands-on experience with Snowflake (preferred) or Redshift for building and managing cloud-based data warehouses
  • Strong SQL skills and experience with pipeline development, ingestion, and transformation of diverse data types
  • Nice to have: familiarity with AI/ML concepts, LLMs, or machine learning frameworks (not a core requirement)


After applying to this role, you may receive an invitation from our AI recruiter, Avery, to schedule a virtual meeting to learn more about your background as an initial screening for this role.

Irvine Technology Corporation (ITC) connects top talent with exceptional opportunities in IT, Security, Engineering, and Design. From startups to Fortune 500s, we partner with leading companies nationwide. Our AI recruiter, Avery, helps streamline the first step of your journey so we can focus on what matters most: helping you grow. Join us. Let us ELEVATE your career!

Irvine Technology Corporation provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Irvine Technology Corporation complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.