Senior Cloud DBA/ETL/ELT Data Pipeline Engineer - DPE 25-33042

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
No Travel Required

Skills

Workflow
Stored Procedures
Slowly Changing Dimensions
Snowflake Schema
Software Development
Python
RBAC
Recovery
Regulatory Compliance
Remote Desktop Services
Management
Mergers and Acquisitions
Microsoft SQL Server
Microsoft SSIS
Data Warehouse
Database
Unit Testing
Version Control
Windows PowerShell
Reporting
SQL
Normalization
Optimization
Oracle
PL/SQL
Extract, Transform, Load (ETL)
Database Administration
Dimensional Modeling
ELT
Encryption
Cloud Computing
Collaboration
Computerized System Validation
Data Domain
Amazon DynamoDB
Amazon RDS
Amazon S3
Streaming
Writing
PostgreSQL
Data Migration
Data Security
File Formats
GitHub
JSON
Migration
Change Data Capture
DMS
Data Engineering
Data Integration
Data Marts
Amazon Web Services
Analytics
Apache Airflow
Scripting
Storage
XML
Apache Kafka
Auditing
Bash
JIRA
Replication
Scalability
Scheduling

Job Details

Job Title: Senior Cloud DBA/ETL/ELT Data Pipeline Engineer
Location: Everett, MA 02149
Duration: 6 Months


Position Overview

The organization is undergoing a multi-year application and platform modernization effort. As part of this initiative, the IT department is seeking a highly skilled Senior Cloud Database Administrator (DBA) / ETL/ELT Data Pipeline Engineer to support the modernization, optimization, and maintenance of cloud-based data platforms, including data warehouses, data marts, and related systems.

Reporting to the Chief Applications Officer and Data Engineering & Analytics Team Leads, this role will manage cloud-hosted databases and data services, ensuring high performance, security, scalability, and alignment with governance standards.

The engineer will collaborate closely with cloud engineers, ETL developers, DBAs, technical leads, analysts, and project managers to design and implement modernized data pipelines and transformations within a scalable and cost-effective architecture.


Key Responsibilities

Cloud Database & Data Platform Management

  • Create, configure, and manage cloud-native databases and data services (e.g., Amazon RDS for Oracle, Aurora, PostgreSQL, Snowflake).

  • Optimize query execution, storage performance, and compute scaling.

  • Define and maintain policies for snapshots, point-in-time recovery (PITR), and cross-region replication (a brief sketch follows this list).

  • Implement data security controls, including encryption, access policies, masking, and auditing, in compliance with FERPA and PII protection requirements.

  • Manage schema migrations, versioned deployments, and data pipeline operations.
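
As an illustration of the backup and recovery responsibilities above, the sketch below shows one way retention and snapshot policies might be set with boto3. It is only a sketch: the region, the instance identifier "warehouse-prod", the retention window, and the snapshot name are hypothetical placeholders, not the organization's actual configuration.

```python
import boto3

# Minimal sketch, assuming boto3 is installed and AWS credentials are
# configured. The instance identifier and retention window are hypothetical.
rds = boto3.client("rds", region_name="us-east-1")

# Keep automated backups for 14 days so point-in-time recovery (PITR)
# can reach back that far.
rds.modify_db_instance(
    DBInstanceIdentifier="warehouse-prod",
    BackupRetentionPeriod=14,
    ApplyImmediately=True,
)

# Take an on-demand snapshot, e.g., before a versioned schema deployment.
rds.create_db_snapshot(
    DBInstanceIdentifier="warehouse-prod",
    DBSnapshotIdentifier="warehouse-prod-pre-release",
)
```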

ETL/ELT Engineering & Modernization

  • Migrate legacy SSIS ETL solutions to SQL-based pipelines using Apache Airflow for scheduling and dependency management (a DAG sketch follows this list).

  • Perform hands-on design, discovery, and troubleshooting of data integration workflows.

  • Re-engineer solution approaches, build code packages, fix defects, perform unit testing, and maintain source control using GitHub.

  • Develop and guide the implementation of Airflow scheduling and dependency frameworks.

  • Tune pipeline performance and benchmark cloud solutions against on-premises platforms.
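
To make the Airflow scheduling and dependency work above concrete, the following is a minimal DAG sketch. It assumes Airflow 2.x with the standard PythonOperator; the DAG id, task names, and callables are hypothetical placeholders standing in for re-engineered SSIS logic, not actual project pipelines.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical extract/load/transform steps; in practice each would run SQL
# against the warehouse or call a cloud service.
def extract_orders(**context):
    print("extract source rows")

def load_staging(**context):
    print("load rows into staging tables")

def transform_marts(**context):
    print("run SQL transformations into the data marts")

with DAG(
    dag_id="example_orders_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_staging", python_callable=load_staging)
    transform = PythonOperator(task_id="transform_marts", python_callable=transform_marts)

    # Dependency management: extract runs before load, load before transform.
    extract >> load >> transform
```

In a real migration the Python callables would typically give way to SQL operators or Snowflake/Postgres calls, but the dependency pattern shown here stays the same.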

Tools, Collaboration & Development Workflow

  • Use Jira to track, review, and complete assigned tasks.

  • Use GitHub for code management, pull requests, and collaborative development.

  • Support data engineering initiatives across a variety of backend sources (SQL Server, Oracle, Postgres, DynamoDB, Snowflake).


Required Qualifications

Technical Expertise

  • Experience with Oracle RDS.

  • Experience with AWS services such as S3, Managed Workflows for Apache Airflow (MWAA), and Database Migration Service (DMS).

  • Strong background with backend data sources (SQL Server, Oracle, Postgres, DynamoDB, Snowflake).

  • Advanced SQL coding skills with the ability to translate PL/SQL and stored procedures to other SQL platforms (e.g., Snowflake).

  • Knowledge of data warehouse and data mart concepts (normalization, facts/dimensions, slowly changing dimensions).

  • Understanding of Change Data Capture (CDC) methodologies; knowledge of Kafka or similar tools is a plus.

  • Familiarity with common file formats (JSON, XML, CSV).

  • Experience writing scripts using Python, PowerShell, or Bash.

  • Ability to write unit tests and validate migrated ETL/ELT code.

  • Experience configuring, managing, and troubleshooting Apache Airflow, including DAG management.

  • Knowledge of Snowflake features such as Snowpipe Streaming, Time Travel, zero-copy cloning, and RBAC (a brief sketch follows below).
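
As a rough illustration of the Snowflake features named in the last item above, the sketch below uses the snowflake-connector-python package. The connection parameters, database, table, and role names are hypothetical, and the statements are generic examples of time travel, zero-copy cloning, and RBAC grants rather than project code.

```python
import snowflake.connector

# Hypothetical connection details; real credentials would come from a
# secrets manager, not literals in code.
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ANALYTICS_WH",
    database="DW",
    schema="MARTS",
)

cur = conn.cursor()
try:
    # Time travel: query the table as it looked one hour ago.
    cur.execute("SELECT COUNT(*) FROM FACT_SALES AT(OFFSET => -3600)")
    print(cur.fetchone())

    # Zero-copy clone: snapshot a table before a risky transformation.
    cur.execute("CREATE TABLE FACT_SALES_BACKUP CLONE FACT_SALES")

    # RBAC: grant read access on the schema to a reporting role.
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA DW.MARTS TO ROLE REPORTING_RO")
finally:
    cur.close()
    conn.close()
```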

Additional Qualifications

  • Experience in large organizations, preferably government agencies.

  • Strong understanding of education and student data domain concepts.

  • Experience using software development and workflow tools such as GitHub and Jira.
