Cloud Data Engineer

Overview

Remote
Depends on Experience
Full Time

Skills

AWS
Azure Data Factory
PostgreSQL

Job Details

POSITION SUMMARY:

The Cloud Data Engineer / PostgreSQL & NoSQL Specialist is a hands-on, cloud-focused technical role that sits at the intersection of cloud database administration and data engineering. This position plays a critical role in managing, optimizing, and scaling PostgreSQL and NoSQL (DynamoDB) solutions on AWS, while also developing cloud-native ETL pipelines and data integration workflows. The ideal candidate has substantial experience administering Amazon RDS for PostgreSQL, tuning and designing PostgreSQL databases, and building automated, scalable data pipelines with tools such as Azure Data Factory. This role is foundational in supporting both real-time and batch data workloads across relational and non-relational ecosystems in a modern cloud environment.

QUALIFICATIONS:

  • 5+ years of experience in data engineering and integration pipeline design, especially in cloud environments.
  • 5+ years of experience working with relational databases, including deep expertise in PostgreSQL.
  • 3+ years of RDS PostgreSQL administration experience, with emphasis on performance tuning, monitoring, and security.
  • 3+ years of hands-on experience with AWS services, particularly RDS, S3, Lambda, Kinesis, CloudWatch, IAM, and DynamoDB.
  • 3+ years of experience building ETL/ELT solutions using Azure Data Factory, with pipelines involving both cloud and on-prem sources.
  • Strong skills in SQL, including writing optimized, scalable queries.
  • Working knowledge of Python or Java for automation and pipeline development.
  • Familiarity with NoSQL systems (e.g., DynamoDB, MarkLogic); DynamoDB preferred.
  • Experience with monitoring, logging, and troubleshooting data jobs in production.
  • Exposure to DevOps, CI/CD tools, and source control platforms like Git/GitHub.
  • Understanding of data security best practices, including encryption, IAM, and audit logging.
  • Understanding of database architecture and performance implications required.
  • Experience integrating Business Intelligence applications such as Power BI, Qlik Sense, or Tableau.
  • Experience with Machine Learning and Artificial Intelligence is preferred.
  • A good understanding of Data Virtualization technologies, such as Denodo, is preferred.
  • Exposure to Snowflake, dbt, or other modern warehousing and transformation tools.
  • Ability to multi-task effectively, work collaboratively as part of an Agile Team, and guide junior engineers.
  • Excellent written and verbal communication skills, sense of ownership, urgency, and drive.

MINIMUM QUALIFICATIONS:

  • Bachelor's degree in Computer Science, Information Systems, Engineering, Statistics, or a related field (foreign equivalent accepted).
  • 3+ years of hands-on experience with AWS services for data and analytics (e.g., RDS for PostgreSQL, S3, Lambda, DynamoDB).
  • 3+ years of hands-on experience with Azure Data Factory.
  • 3+ years of experience with data ingestion, extraction, and integration pipelines.
  • 3+ years of experience with Python or Java for data engineering and automation tasks.
  • 3+ years of experience writing efficient SQL queries, including performance tuning.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions, and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.