Overview
Remote
Accepts corp to corp applications
Contract - W2
Contract - 12+ Month(s)
Skills
Cloud
Automation
ETL
SQL
Oracle
Migration
RDS
Snowflake
DBA
ELT
CDC
Job Details
Role: Cloud Database Administrator (DBA / ETL)
Location: Remote
Job Type: W2 Contract (No C2C)
Experience: 12+ Years
The Cloud Database Administrator / ETL Engineer will support, maintain, optimize, and modernize cloud-hosted databases, data pipelines, data warehouses, and data marts. The engineer will work closely with the Chief Applications Officer, Data Engineering Leads, DBAs, and cloud engineering teams to build scalable, secure, and high-performance data solutions across cloud platforms.
Required Skills
- Strong experience with Oracle RDS.
- Hands-on experience with AWS cloud services such as S3, Managed Workflows for Apache Airflow (MWAA), and Database Migration Service (DMS).
- Experience working with multiple back-end data sources: SQL Server, Oracle, Postgres, DynamoDB, Snowflake.
- Strong advanced SQL capability; ability to translate PL/SQL and stored procedures to platforms like Snowflake.
- Understanding of data warehouse/mart concepts: normalization, facts/dimensions, slowly changing dimensions (SCDs).
- Knowledge of Change Data Capture (CDC); Kafka experience is a plus.
- Familiarity with structured/unstructured formats: JSON, XML, CSV.
- Experience automating tasks using: Python, PowerShell, Bash.
- Ability to write unit test scripts and validate migrated ETL/ELT code.
- Experience configuring and troubleshooting Apache Airflow, including DAG design and dependency management.
- Knowledge of Snowflake features: Snowpipe Streaming, Zero-Copy Cloning, Time Travel, role-based access control (RBAC).
- Experience working in large organizations; state or federal agencies preferred.
- Domain experience in Education / Student Data is a plus.
- Experience with development tools: GitHub, Jira.
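To illustrate the SCD concept listed above, Type 2 dimension handling can be sketched in plain Python; the row layout (`key`, `attrs`, `valid_from`, `valid_to`, `is_current`) and function name are hypothetical, not from any specific platform, and a production version would run as SQL MERGE logic in the warehouse.

```python
from datetime import date

# Minimal sketch of Slowly Changing Dimension Type 2 logic.
# Column names (key, attrs, valid_from, valid_to, is_current) are hypothetical.

def apply_scd2(dimension, incoming, today):
    """Close out changed rows and append new versions.

    dimension: list of dicts with 'key', 'attrs', 'valid_from', 'valid_to', 'is_current'
    incoming:  dict mapping key -> latest attribute dict from the source system
    """
    current = {r["key"]: r for r in dimension if r["is_current"]}
    for key, attrs in incoming.items():
        row = current.get(key)
        if row is not None and row["attrs"] == attrs:
            continue  # unchanged; keep the current version as-is
        if row is not None:
            row["valid_to"] = today      # expire the superseded version
            row["is_current"] = False
        dimension.append({              # insert the new current version
            "key": key,
            "attrs": attrs,
            "valid_from": today,
            "valid_to": None,
            "is_current": True,
        })
    return dimension
```

Because expired rows are retained with their validity window, the dimension preserves full history rather than overwriting in place (which would be SCD Type 1).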
Responsibilities
- Create and manage cloud-native databases and services: Oracle RDS, Aurora, Postgres, Snowflake.
- Tune database performance: query optimization, compute scaling, storage performance.
- Establish policies for snapshots, point-in-time recovery (PITR), and cross-region replication.
- Implement security controls: encryption, access policies, masking, auditing (FERPA/PII compliance).
- Manage schema migrations, data pipelines, versioned deployments.
- Re-engineer and migrate legacy SSIS ETL packages to SQL-based pipelines orchestrated via Apache Airflow.
- Perform hands-on solution design, coding, bug fixing, and unit testing.
- Develop and maintain Airflow scheduling frameworks for complex workflows.
- Conduct performance benchmarking: compare new cloud solutions with legacy on-prem systems.
- Work with Jira for task management and GitHub for code reviews, pull requests, and version control.
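One common approach to the unit-testing and benchmarking duties above is to compare the legacy and migrated pipeline outputs directly. The sketch below is a hypothetical, framework-agnostic example: it fingerprints each result set by row count plus an order-independent checksum, so a reordered but otherwise identical output still validates.

```python
import hashlib

# Hypothetical validation helper for migrated ETL/ELT output.
# Rows are plain dicts; names here are illustrative, not a specific framework's API.

def table_fingerprint(rows):
    """Return (row_count, checksum), independent of row and column order."""
    digests = sorted(
        hashlib.sha256(repr(sorted(r.items())).encode()).hexdigest()
        for r in rows
    )
    combined = hashlib.sha256("".join(digests).encode()).hexdigest()
    return len(rows), combined

def validate_migration(legacy_rows, migrated_rows):
    """True when both result sets contain exactly the same rows."""
    return table_fingerprint(legacy_rows) == table_fingerprint(migrated_rows)
```

In practice the same count/checksum comparison can be pushed down into SQL on both systems so that full result sets never leave the database.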
How to Apply
Interested candidates can share their updated resumes at:
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.