DEVOPS ENGINEER

Overview

On Site
USD 70.00 - 80.00 per hour
Contract - Independent

Skills

Snowflake
Microsoft SQL Server
Data Warehouse
ETL
ELT
SQL
Scripting
Apache Airflow
Orchestration
Informatica
Continuous Integration
Continuous Delivery
GitLab
GitHub
Jenkins
DevOps
DevSecOps
Terraform
Python
Shell Scripting
Cloud Computing
Amazon Web Services
Microsoft Azure
Google Cloud Platform
Docker
Kubernetes
Computer Networking
Virtual Private Network
Firewall
Security Controls
Encryption
Secrets Management
Provisioning
RBAC
Auditing
Regulatory Compliance
Incident Management
Backup and Recovery
Testing
Release Management
Monitoring
Dashboards
Data Quality
Metadata Management
Concurrency
Performance Tuning
Optimization
Cost Control
Migration
Healthcare
Public Health
EHS
Finance
Effective Communication
Documentation
Collaboration
Attention to Detail
Process Improvement
Computer Science
Information Systems
Data Engineering
Health Informatics

Job Details

Location: Boston, MA
Salary: $70.00 - $80.00 USD per hour
Description: Our client is currently seeking a DevOps Engineer.

Overview

Our client is seeking an experienced DevOps Engineer to support a cloud data warehouse modernization initiative, migrating from a SQL Server/AWS-based system to a Snowflake-based data platform.
The DevOps Engineer will:

  • Develop, maintain, and optimize data pipelines and integration processes.
  • Design and implement CI/CD pipelines.
  • Automate deployments and ensure operational reliability across Snowflake, Informatica, and Apache Airflow environments.
  • Collaborate closely with the EHS IT team supporting the Department of Mental Health and Department of Public Health Hospitals.


Detailed Responsibilities
  • Build and maintain CI/CD pipelines for Snowflake, Informatica (IICS), and Airflow DAG deployments.
  • Implement automated code promotion between development, test, and production environments.
  • Integrate testing, linting, and security scanning into deployment processes.
  • Develop Infrastructure as Code (IaC) using Terraform or similar tools.
  • Manage configuration and environment consistency across multi-region/multi-cloud setups.
  • Maintain secure connectivity between cloud and on-prem systems (VPNs, private links, firewalls).
  • Implement logging and alerting for Airflow DAGs, Informatica workflows, and Snowflake performance.
  • Develop proactive monitoring dashboards for job failures, data quality triggers, and warehouse usage.
  • Optimize pipeline performance, concurrency, and cost governance in Snowflake.
  • Own deployment frameworks for ETL/ELT code, SQL scripts, and metadata updates.
  • Support user access provisioning & RBAC alignment across Snowflake, Informatica, and Airflow.
  • Troubleshoot platform and orchestration issues; lead incident response during outages.
  • Enforce DevSecOps practices including encryption, secrets management, and key rotation.
  • Implement audit, logging, compliance, and backup/restore strategies aligned with governance requirements.
  • Participate in testing, deployment, and release management for new data workflows and enhancements.


Required Qualifications
  • 3-7+ years in DevOps, Cloud Engineering, or Data Platform Engineering roles.
  • Hands-on experience with:
    • Snowflake (roles, warehouses, performance tuning, cost control).
    • Apache Airflow (DAG orchestration, monitoring, deployments).
    • Informatica (IICS pipeline deployment automation preferred).
  • Strong CI/CD skills using GitLab, GitHub Actions, Azure DevOps, Jenkins, or similar.
  • Proficiency with Terraform, Python, and Shell scripting.
  • Deep understanding of cloud platforms: AWS, Azure, or Google Cloud Platform.
  • Experience with containerization (Docker, Kubernetes), especially for Airflow.
  • Strong knowledge of networking concepts and security controls.


Preferred Knowledge, Skills & Abilities
  • Experience migrating from SQL Server or other legacy DW platforms.
  • Knowledge of FinOps practices for Snowflake usage optimization.
  • Background in healthcare, finance, or regulated industries a plus.


Soft Skills
  • Effective communication with technical and non-technical stakeholders.
  • Ability to troubleshoot complex distributed data workloads.
  • Strong documentation and cross-team collaboration skills.
  • Proactive and committed to process improvement and automation.
  • Detail-oriented, with a focus on data accuracy.


Education & Certification
  • Bachelor's degree or equivalent experience in Computer Science, Information Systems, Data Engineering, Health Informatics, or related field.

By providing your phone number, you consent to receive automated text messages and calls from the Judge Group, Inc. and its affiliates (collectively "Judge") at that number regarding job opportunities, your job application, and other related purposes. Message & data rates apply, and message frequency may vary. Consistent with Judge's Privacy Policy, information obtained from your consent will not be shared with third parties for marketing or promotional purposes. Reply STOP to opt out of receiving calls and text messages from Judge, or HELP for help.

Contact:

This job and many more are available through The Judge Group. Please apply with us today!
