Overview
Hybrid
$50+
Contract - W2
Skills
DevOps
Cloud Computing
Snowflake
Apache Airflow
DAG
Informatica
Health Care
Python
Shell scripting
Terraform
Azure DevOps
GitLab
AWS
Docker
Kubernetes
SQL
FinOps
Finance
Job Details
*** Only LOCAL Candidates Eligible ***
Job Title: DevOps Engineer
Location: Boston, MA (Hybrid)
Work Location: 40 Broad Street, Boston, MA
Duration: 12+ month contract
Visa: No H1B OR CPT
Interview: Onsite interview required
Residency Requirement: Local candidates only; proof of residency required
Job Overview
The Client is seeking an experienced DevOps Engineer to support a cloud data warehouse modernization initiative. This role focuses on migrating from a SQL Server/AWS-based system to a Snowflake-based data platform.
The DevOps Engineer will design, implement, and maintain CI/CD pipelines, automate data pipeline deployments, and ensure operational reliability across Snowflake, Informatica (IICS), and Apache Airflow environments. This position works closely with the EOHHS IT team supporting the Department of Mental Health and Department of Public Health Hospitals.
Work Schedule
- Monday – Friday
- 9:00 AM – 5:00 PM EST
- Hybrid work model (local candidates required)
Key Responsibilities
- Build and maintain CI/CD pipelines for Snowflake, Informatica (IICS), and Apache Airflow DAG deployments
- Implement automated code promotion across development, test, and production environments
- Integrate testing, linting, and security scanning into deployment workflows
- Develop Infrastructure as Code (IaC) using Terraform or similar tools to manage Snowflake objects, cloud resources, and networking
- Ensure configuration and environment consistency across multi-region and multi-cloud environments
- Maintain secure connectivity between cloud and on-premise systems (VPNs, private links, firewalls)
- Implement logging, alerting, and monitoring for Airflow DAGs, Informatica workflows, and Snowflake performance
- Create proactive monitoring dashboards for job failures, data quality checks, and warehouse utilization
- Optimize Snowflake performance, concurrency, and cost governance
- Own deployment frameworks for ETL/ELT code, SQL scripts, and metadata updates
- Manage user access provisioning and RBAC across Snowflake, Informatica, and Airflow
- Troubleshoot platform and orchestration issues; lead incident response during outages
- Enforce DevSecOps best practices including encryption, secrets management, and key rotation
- Implement audit, logging, compliance, and backup/restore strategies aligned with governance requirements
- Participate in testing, deployment, and release management for new data workflows and enhancements
Required Qualifications
- 3–7+ years of experience in DevOps, Cloud Engineering, or Data Platform Engineering
- Hands-on experience with:
  - Snowflake (roles, warehouses, performance tuning, cost control)
  - Apache Airflow (DAG orchestration, monitoring, deployments)
  - Informatica IICS (pipeline deployment automation preferred)
- Strong CI/CD experience using GitLab, GitHub Actions, Azure DevOps, Jenkins, or similar tools
- Proficiency in Terraform, Python, and Shell scripting
- Strong understanding of cloud platforms: AWS, Azure, or Google Cloud Platform
- Experience with containerization technologies such as Docker and Kubernetes, particularly for Airflow
- Solid knowledge of networking concepts and security controls
- Experience migrating from SQL Server or other legacy data warehouse platforms
- Knowledge of FinOps practices for Snowflake cost optimization
- Prior experience in healthcare, finance, or other regulated industries
- Strong communication skills with both technical and non-technical stakeholders
- Proven ability to troubleshoot complex, distributed data workloads
- Excellent documentation and cross-team collaboration skills
- Proactive mindset with a focus on automation and continuous improvement
- Detail-oriented with a strong emphasis on data accuracy and reliability
Bachelor's degree (or equivalent experience) in Computer Science, Information Systems, Data Engineering, Health Informatics, or a related field.
Best Regards,
Hari Daile
Senior US IT Recruiter