Senior Data Engineer

  • Dallas, TX
  • Posted 3 days ago | Updated 3 days ago

Overview

Hybrid
$120,000 - $140,000
Full Time
No Travel Required

Skills

AWS
Iceberg
DynamoDB
Kafka
Glue

Job Details

Job Title: Sr. Data Engineer (AWS S3, Iceberg, DynamoDB, Kafka, Glue/Glue Streaming)

Location: Dallas, TX or NJ or Remote

Job Type: Full-Time

Job Summary:

We are looking for a skilled and proactive Sr. Data Engineer with expertise in cloud-native data platforms and streaming data architectures. The ideal candidate will have hands-on experience with AWS S3, Apache Iceberg, Amazon DynamoDB, AWS Glue/Glue Streaming, and Apache Kafka, and will play a key role in managing and optimizing our data-at-rest storage for both the cache layer and the operational data store.

Key Responsibilities:

  • Design and maintain scalable data-at-rest solutions using AWS S3, Apache Iceberg, and DynamoDB, including data exception handling.
  • Administer and optimize DynamoDB for high-performance, low-latency applications, and Iceberg as the operational data store with appropriate data retention, backup, and restore options.
  • Design and implement the physical data model with operational considerations.
  • Collaborate with other data engineers, analysts, and architects to support data modeling, data ingestion, and egress integration efforts.
  • Ensure data security, compliance, and governance across all platforms.
  • Monitor system performance and troubleshoot issues across the data stack.
  • Automate database operations using AWS CloudFormation templates (CFT) and other Infrastructure as Code (IaC) tools.
  • Design and implement AWS Glue and Glue Streaming jobs (see the illustrative sketch after this list).
  • Collaborate with the DevOps team to promote code into higher environments.
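
For illustration only, the sketch below shows the kind of Glue Streaming job referenced above: a minimal PySpark script that reads a Kafka topic and appends records to an Apache Iceberg table on S3. The broker address, topic, checkpoint bucket, and catalog/table names are hypothetical placeholders, and the sketch assumes the Glue job is configured with the Iceberg connector and Glue Data Catalog Spark settings.

    import sys
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    # Standard Glue job bootstrap.
    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    spark = glue_context.spark_session
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read the Kafka topic as a streaming DataFrame (placeholder broker and topic).
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker-1:9092")
        .option("subscribe", "orders")
        .option("startingOffsets", "latest")
        .load()
        .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value", "timestamp")
    )

    # Append each micro-batch to an Iceberg table stored on S3
    # (placeholder checkpoint location and catalog/table names).
    query = (
        events.writeStream.format("iceberg")
        .outputMode("append")
        .option("checkpointLocation", "s3://example-bucket/checkpoints/orders/")
        .toTable("glue_catalog.analytics.orders_raw")
    )
    query.awaitTermination()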

Required Qualifications:

  • 5+ years of experience as a Data Engineer in cloud environments.
  • Strong experience with AWS S3, Apache Iceberg, Amazon DynamoDB, and AWS Glue/Glue Streaming.
  • Proficiency in SQL and NoSQL, plus scripting and templating languages such as Python, Bash, or CloudFormation templates.
  • Experience with streaming data architectures and data lakehouse implementations.
  • Familiarity with monitoring and alerting tools (e.g., CloudWatch, Prometheus, Grafana).
  • Solid experience with data security, encryption, access control, version control, CI/CD pipelines, DevOps practices, metric tracking, and event logging in cloud environments.
  • Advocate for the use of Copilot AI for automation and code quality improvement.

Preferred Qualifications:

  • AWS certifications (e.g., AWS Certified Database Specialty, AWS Certified Solutions Architect).
  • Experience with Apache Spark, AWS Glue Streaming, and container orchestration (e.g., Kubernetes).
  • Experience with multi-region deployments and disaster recovery planning.

Soft Skills:

  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration abilities.
  • Ability to work independently and in cross-functional teams.
  • Self-motivated, with a strong sense of urgency.
