*Senior ETL/ELT & Cloud Database Engineer* (Local to Boston, MA Area)

  • Boston, MA
  • Posted 3 hours ago | Updated 3 hours ago

Overview

  • Remote
  • Hybrid
  • Depends on Experience
  • Accepts corp-to-corp applications
  • Contract - Independent
  • Contract - W2
  • Contract - 12 Month(s)

Skills

Amazon DynamoDB
Amazon RDS
Amazon S3
Analytics
Apache Airflow
Apache Kafka
Database Administration
ELT
Extract
Transform
Load
JSON
JIRA
Dimensional Modeling
Data Marts
Data Migration
Data Warehouse
Database
Debugging
PL/SQL
Oracle
PostgreSQL
Snowflake Schema
Scripting
SQL
XML
Migration
Mergers and Acquisitions
Microsoft SQL Server

Job Details

Kellton Tech is a full-service software development company offering end-to-end IT solutions, strategic technology consulting, and product development services across the Web, SMAC (Social, Mobile, Analytics, Cloud), ERP-BPM, and IoT spaces. Our methodology of inventing infinite possibilities with technology helps us develop best-in-class, cost-effective solutions for our clients.

Kellton Tech is currently looking for talented candidates for one of our listed clients. The position details are below.

 

Position: Senior Cloud DBA / ETL–ELT Data Pipeline Engineer
Location: Boston, MA (Remote)
Duration: 12+ Months (Contract)

The organization is seeking a Cloud DBA/ETL Engineer to support the maintenance, modernization, optimization, and troubleshooting of cloud-based data warehouses, data marts, and enterprise data platforms. The role involves managing cloud-native databases and data services to ensure performance, security, high availability, and compliance with governance standards.

 

Key Responsibilities:

  • Manage and optimize cloud databases such as RDS Oracle, Aurora, Postgres, and Snowflake.
  • Monitor query performance, compute scaling, storage usage, and system reliability.
  • Implement backup policies, PITR, cross-region replication, encryption, access controls, and auditing.
  • Lead schema migrations, data pipeline development, and versioned deployments.
  • Re-engineer and migrate legacy SSIS ETL code to SQL-based solutions orchestrated via Apache Airflow.
  • Develop Airflow DAGs, scheduling frameworks, and dependency management structures.
  • Conduct performance tuning, benchmarking, troubleshooting, and code unit testing.
  • Use GitHub for code management and Jira for task tracking.
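To illustrate the dependency-management side of the Airflow work above, here is a minimal sketch of the task ordering a DAG encodes, using only Python's standard library. The task names (extract_orders, load_warehouse, etc.) are hypothetical placeholders, not part of this role's actual pipelines; in production this graph would be expressed with Airflow operators and `>>` dependencies rather than a plain dict.

```python
# Minimal sketch of the task-dependency ordering an Airflow DAG encodes.
# Task names are hypothetical; a real DAG would use Airflow operators.
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_staging": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_staging"},
    "refresh_data_mart": {"load_warehouse"},
}

# static_order() yields a valid execution order respecting all dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

The same structure is what Airflow's scheduler resolves at runtime: no task runs until every upstream task has succeeded.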

 

Required Skills:

  • Experience with Oracle RDS and AWS services (S3, MWAA, DMS).
  • Strong SQL and PL/SQL translation skills across platforms (e.g., Snowflake).
  • Knowledge of data warehousing concepts (facts, dimensions, SCDs) and CDC frameworks.
  • Familiarity with file formats (JSON, XML, CSV) and scripting (Python, PowerShell, Bash).
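As a rough illustration of the cross-platform SQL translation skill listed above, the sketch below maps a few Oracle built-ins to common equivalents. The mapping table is illustrative only (Snowflake in fact accepts several Oracle functions such as NVL directly), and real migrations need a proper SQL parser rather than regex rewriting.

```python
import re

# Hypothetical, illustrative mapping of Oracle built-ins to portable
# equivalents. Real cross-platform migrations require a SQL parser.
ORACLE_TO_PORTABLE = {
    r"\bNVL\(": "COALESCE(",
    r"\bSYSDATE\b": "CURRENT_TIMESTAMP",
    r"\bSUBSTR\(": "SUBSTRING(",
}

def translate(sql: str) -> str:
    """Apply each rewrite rule to the input SQL string."""
    for pattern, replacement in ORACLE_TO_PORTABLE.items():
        sql = re.sub(pattern, replacement, sql, flags=re.IGNORECASE)
    return sql

print(translate("SELECT NVL(name, 'n/a'), SYSDATE FROM dual"))
# → SELECT COALESCE(name, 'n/a'), CURRENT_TIMESTAMP FROM dual
```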

 

Preferred:

  • Experience with Airflow configuration and advanced Snowflake features.
  • Background in large organizations or government environments.
  • Education data domain knowledge.

 

Apply: Interested candidates can apply with a detailed Word-formatted resume, along with their contact information and availability, to raj (dot) sekhar (at) kellton (dot) com. Phone: 703-592-9517.

 

Thanks for all your time!
