Datacenter Lift and Shift Project for Insurance Client
Top 5 Technical Skills:
- 8+ years total experience in Data Engineering / ETL / Data Warehousing.
- 3+ years hands-on experience with dbt (Core or Cloud) building production-grade pipelines.
- Proven experience leading an Informatica-to-dbt migration to Snowflake on Azure (or similar large-scale ETL modernization).
- Strong Snowflake experience: designing and developing schemas, views, warehouses, and performance optimization.
- Solid working knowledge of Azure data stack: Azure Data Factory, ADLS, Azure DevOps/GitHub.
Job Description:
Roles & Responsibilities:
- Define the target ELT architecture using dbt on Snowflake, integrated with Azure services (ADF, ADLS, Synapse/Databricks, Key Vault, Azure DevOps/GitHub).
- Translate legacy Informatica mappings, workflows, and sessions into modular dbt models (staging, core, mart layers).
- Establish modeling standards (naming conventions, layer design, folder/package structure) for staging, integration, and mart layers.
- Define and implement performance-optimized patterns in dbt and Snowflake (incremental models, clustering, partitioning logic, query tuning).
- Lead the migration strategy, roadmap, and wave planning for converting Informatica jobs to dbt on Snowflake.
- Analyze existing ETL logic, dependencies, and schedules in Informatica and design equivalent or improved logic in dbt.
- Design a repeatable migration factory: templates, accelerators, mapping spreadsheets, and conversion playbooks for Informatica-to-dbt conversion.
- Oversee conversion, unit testing, and parallel runs to validate that dbt models match legacy outputs (row counts, aggregates, business rules); an illustrative reconciliation sketch follows this list.
- Lead hands-on development of dbt models, seeds, snapshots, tests, macros, and documentation.
- Define and implement testing strategy using dbt tests (schema tests, data tests, custom tests) and integrate with broader data quality checks.
- Set up and maintain dbt environments (dev/test/prod), profiles, and connections to Snowflake on Azure; a sketch of environment-targeted builds also follows this list.
- Introduce and enforce code quality practices:
  - Code reviews & pull requests
  - Modular, reusable models and packages
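
For illustration, a minimal sketch of the parallel-run reconciliation described above, assuming the Snowflake Connector for Python; the table names, warehouse, and credential handling below are placeholders, and the real validation scope (aggregates, business rules) would be defined during the migration.

    # Minimal sketch: compare row counts for a legacy Informatica-loaded table
    # and its dbt-built replacement during a parallel run.
    # All connection parameters and table names are placeholders.
    import os

    import snowflake.connector  # assumes snowflake-connector-python is installed

    # Hypothetical pairs of (legacy table, dbt-built replacement)
    TABLE_PAIRS = [
        ("LEGACY_DB.STAGE.CLAIMS_FACT", "ANALYTICS.MART.FCT_CLAIMS"),
    ]

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="RECON_WH",  # hypothetical warehouse
    )
    try:
        cur = conn.cursor()
        for legacy, rebuilt in TABLE_PAIRS:
            cur.execute(f"SELECT COUNT(*) FROM {legacy}")
            legacy_count = cur.fetchone()[0]
            cur.execute(f"SELECT COUNT(*) FROM {rebuilt}")
            rebuilt_count = cur.fetchone()[0]
            status = "MATCH" if legacy_count == rebuilt_count else "MISMATCH"
            print(f"{status}: {legacy}={legacy_count:,} vs {rebuilt}={rebuilt_count:,}")
    finally:
        conn.close()

The same pattern extends to aggregate and business-rule comparisons, and such checks are commonly folded into dbt tests once cutover is complete.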
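
Likewise, a sketch of driving environment-targeted builds with dbt Core's programmatic runner (available in dbt-core 1.5+); the target names are assumptions about the project's profiles.yml, and in practice the equivalent commands would typically run from Azure DevOps or GitHub Actions pipelines.

    # Sketch only: run dbt builds (models, seeds, snapshots, and tests) against
    # each environment target. Target names are assumptions, not project facts.
    from dbt.cli.main import dbtRunner, dbtRunnerResult

    dbt = dbtRunner()

    for target in ("dev", "test", "prod"):  # hypothetical targets in profiles.yml
        result: dbtRunnerResult = dbt.invoke(["build", "--target", target])
        if not result.success:
            raise SystemExit(f"dbt build failed for target '{target}'")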
Minimum Skills Required:
- 8+ years total experience in Data Engineering / ETL / Data Warehousing.
- 3+ years hands-on experience with dbt (Core or Cloud) building production-grade pipelines.
- Proven experience leading an Informatica-to-dbt migration to Snowflake on Azure (or similar large-scale ETL modernization).
- Strong Snowflake experience: designing and developing schemas, views, warehouses, and performance optimization.
- Solid working knowledge of Azure data stack: Azure Data Factory, ADLS, Azure DevOps/GitHub.
Benefits:
SES hires W2 benefitted and non-benefitted consultants. Our contract employee benefits include group medical, dental, vision, life, and long-term and short-term disability insurance; 21 days of accrued paid time off; 401k; tuition reimbursement; performance bonuses; paid overtime; and more. Due to corporate regulations, we cannot work with outside companies on this opening; only direct, individual candidates representing themselves to SES, please.
About SES Systems Engineering Services Corporation:
SESC, founded in 1989, is a leading provider of technology solutions to Fortune 1000 companies and government organizations. Specializing in Accelerated Development (agile application development, mobile, systems integration, project and program management), Architecture (SOA, microservices, cloud), Data (analytics, DW, BI, big data), Testing (test architecture, manual, automation, data), Cyber Security (SSO, mobile, IAM), and DevOps (roadmap creation, assessments, CI/CD, tool evaluation, implementation), SESC is guided by a corporate mission to provide valuable solutions to our clients' technology needs through responsive, quality services.
Jim Murphy
jmurphy sec com