DevOps Data Engineer
Hybrid in Phoenixville, PA, US • Posted 12 hours ago • Updated 12 hours ago
Job Details
Skills
- Agile
- Cloud Computing
- Continuous Integration
- Continuous Delivery
- DevOps
- Business Intelligence
- Reporting
- Microsoft Azure
Summary
Role: DevOps Data Engineer
Location: Oaks, PA
Mode of Hire: Full Time
Responsibilities:
• Design, develop, and maintain CI/CD pipelines in GitLab for automated deployment of data platform components including dbt transformations, Airflow DAGs, and Snowflake database objects across development, QA, UAT, and production environments.
• Implement and optimize blue-green deployment patterns and environment promotion strategies to ensure zero-downtime releases and safe rollback capabilities for the Data Cloud infrastructure.
• Build automated testing integration within deployment pipelines to validate data transformations, Snowflake stored procedures, functions, and materialized views before production promotion.
• Collaborate with QA teams to integrate validation frameworks and testing portals into the CI/CD workflow, ensuring data quality gates are enforced at each stage of the deployment process.
• Transition into hands-on data engineering work developing Snowflake data shares for cross-functional data access and building reporting analytics warehouses that consolidate data from multiple source systems including Investran/KYC, Geneva RSL, and Investier.
• Develop and optimize Snowflake objects including views, stored procedures, functions, and materialized views to support reporting and analytics requirements while maintaining performance and cost efficiency.
• Work within the Data Cloud/Azure infrastructure team to deploy and manage data pipeline components, coordinating with parallel teams handling Snowflake extracts and Reporting/Power BI workstreams.
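The blue-green deployment pattern described above can be sketched in Snowflake SQL: a release is built into a staging copy of the database, validated, and then atomically swapped with production. The database names and dbt target here are hypothetical illustrations, not details from this posting.

```sql
-- Hypothetical blue-green promotion for a Snowflake analytics database.

-- 1. Zero-copy clone production to create the "green" environment.
CREATE OR REPLACE DATABASE ANALYTICS_STAGING CLONE ANALYTICS;

-- 2. Apply the new release (dbt models, views, procedures) to the clone,
--    e.g. by running: dbt run --target staging

-- 3. After validation, atomically exchange the two databases.
--    SWAP WITH completes in a single operation, so consumers querying
--    ANALYTICS see the new release with no downtime.
ALTER DATABASE ANALYTICS SWAP WITH ANALYTICS_STAGING;

-- 4. Rollback, if needed, is the same swap run again:
-- ALTER DATABASE ANALYTICS SWAP WITH ANALYTICS_STAGING;
```

Because the clone is zero-copy and the swap is atomic, this gives both the zero-downtime release and the safe rollback capability the role calls for.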
Skills Required:
• Strong hands-on experience with GitLab CI/CD pipeline development and deployment automation, including YAML configuration, pipeline orchestration, and environment management strategies.
• Solid understanding of DevOps practices and principles including infrastructure as code, automated testing, continuous integration/continuous deployment, and version control workflows using Git.
• Proficiency in SQL for writing queries, stored procedures, and functions with the ability to implement data transformations based on provided specifications and requirements.
• Working knowledge of Snowflake architecture and database objects including tables, views, materialized views, stored procedures, functions, and data sharing capabilities.
• Experience with dbt (data build tool) for implementing SQL-based data transformations, including model development, testing, documentation, and deployment patterns based on existing designs.
• Familiarity with Apache Airflow for workflow orchestration, including DAG development, task dependencies, and scheduling strategies for data pipeline automation.
• Good Python scripting skills for automation tasks, data processing, and integration work between various platform components.
• Demonstrated ability to work in Agile environments and collaborate effectively with cross-functional teams including QA engineers, data analysts, business stakeholders, and infrastructure teams.
• Experience working with Azure cloud infrastructure and services, particularly as they relate to data platform deployments and CI/CD tooling integration.
• Knowledge of SnowSQL scripting and the Snowflake command-line interface for automation and deployment scripting.
• Understanding of testing methodologies for data pipelines including unit testing, integration testing, and user acceptance testing coordination.
• Exposure to reporting and visualization tools such as Power BI or similar business intelligence platforms.
• Experience managing deployments across complex multi-environment landscapes with clear separation between development, QA, UAT, and production tiers.
• Track record of implementing automated testing and validation within CI/CD pipelines to catch issues early and maintain high data quality standards.
• Strong problem-solving abilities with a mindset toward building reusable, maintainable automation solutions that can scale with project growth.
• Excellent communication skills and the ability to document technical processes clearly for knowledge transfer to QA teams and other stakeholders.
• Willingness to grow from a DevOps-focused role into broader data engineering responsibilities as the platform matures and pipeline automation stabilizes.
• Self-motivated approach to learning new technologies and adapting to the evolving needs of a large-scale data migration and analytics platform project.
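The GitLab CI/CD skills above (YAML configuration, environment management, quality gates, manual promotion) might look like the following minimal `.gitlab-ci.yml` sketch for moving dbt models through QA to production. Stage names, job names, and the dbt targets are illustrative assumptions, not the posting's actual pipeline; dev and UAT tiers would follow the same template.

```yaml
# Hypothetical multi-environment pipeline for a dbt + Snowflake project.
stages:
  - test
  - deploy_qa
  - deploy_prod

dbt_test:
  stage: test
  image: python:3.11
  script:
    - pip install dbt-snowflake
    - dbt deps
    - dbt build --target dev      # runs models and tests as a quality gate

.deploy_template: &deploy
  image: python:3.11
  script:
    - pip install dbt-snowflake
    - dbt run --target "$DBT_TARGET"

deploy_qa:
  <<: *deploy
  stage: deploy_qa
  variables:
    DBT_TARGET: qa
  environment: qa
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'

deploy_prod:
  <<: *deploy
  stage: deploy_prod
  variables:
    DBT_TARGET: prod
  environment: production
  when: manual                    # manual gate before production promotion
  rules:
    - if: '$CI_COMMIT_BRANCH == "main"'
```

The YAML anchor (`&deploy` / `<<: *deploy`) keeps the deploy jobs reusable across tiers, and the `when: manual` rule enforces a human approval step before production, matching the environment-promotion strategy described in the responsibilities.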
- Dice Id: RTX1d9f8f
- Position Id: 8899218
Company Info
About Coforge
Coforge is a global digital services and solutions provider that enables its clients to transform at the intersection of domain expertise and emerging technologies to achieve real-world business impact. A focus on select industries, a detailed understanding of the underlying processes of those industries, and partnerships with leading platforms provide us a distinct perspective. Coforge leads with its product-engineering approach and leverages Cloud, Data, Integration, and Automation technologies to transform client businesses into intelligent, high-growth enterprises. Coforge’s proprietary platforms power critical business processes across its core verticals. The firm has a presence in 21 countries, with 25 delivery centers across nine countries.