AWS Data Engineer

Overview

  • Location: Remote
  • Compensation: Depends on Experience
  • Accepts corp-to-corp applications
  • Employment Type: Contract - Independent; Contract - W2

Job Details

Job Description:
Our client is seeking a highly skilled and experienced AWS Data Engineer to join their dynamic technology team. This role involves designing and developing robust data solutions on AWS, leveraging cutting-edge technologies like Apache Iceberg. The successful candidate will play a crucial role in building and maintaining scalable data architectures and pipelines, ensuring high data quality and security. This is a fully remote role and can be performed from an approved location.

Responsibilities:
  • Design and develop scalable, reliable, and efficient data lakehouse solutions on AWS, using Apache Iceberg as the table format alongside other AWS services.
  • Construct, automate, and maintain ETL/ELT processes to integrate data from diverse sources into the AWS ecosystem.
  • Develop and manage secure and scalable RESTful and other APIs to facilitate data access for internal teams and applications.
  • Utilize a broad range of AWS tools for data processing, storage, and analytics, including Amazon S3, Amazon EMR, and AWS Lake Formation with native Iceberg support.
  • Build and manage Apache Iceberg tables on Amazon S3, enabling features like ACID transactions, time travel, and schema evolution.
  • Implement data performance optimization strategies, including partitioning, data compaction, and fine-tuning for enhanced query performance.
  • Ensure data integrity and quality through robust validation and error-handling processes, leveraging transactional capabilities of Iceberg.
  • Implement stringent data security measures and access controls, and ensure compliance with data protection regulations, using AWS Lake Formation together with IAM or Amazon Cognito.

Qualifications:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Proven data engineering experience, with extensive hands-on work in AWS cloud services.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Strong SQL skills for querying, data modeling, and database design.
  • Expertise in AWS services including S3, EMR, Lambda, API Gateway, SageMaker, and IAM.
  • Hands-on experience in building and managing Apache Iceberg tables.
  • Experience with big data technologies like Apache Spark and Hadoop.
  • Experience in developing and deploying RESTful APIs, with knowledge of performance and security best practices.
  • Experience with ETL and workflow orchestration tools such as Apache Airflow.
  • Familiarity with DevOps practices, CI/CD pipelines, and infrastructure as code.

Pay Range: $80.00 - $88.60 Hourly


About GDH