Overview
- Location: Remote
- Rate: $40
- Type: Contract - W2
- Duration: 6 month(s)
Skills
SQL
ETL
ADF
Job Details
ADF ETL Data Engineer - REMOTE WORK - 53036
We have an immediate long-term opportunity with one of our prime clients for the position of ADF ETL Data Engineer, working on a remote basis.
Project Scope:
- Contract-to-hire; the current ETL developer is leaving, so this role is essentially a backfill
- The position will be filled through the new Center of Excellence (CoE)
- The CoE currently has around 15 members
- State of Arkansas project: Medicaid Decision Support System
- ETL is currently in Informatica/SQL and will all be migrated to Azure Data Factory, with Snowflake as the database
Team Size & Breakdown:
- 100% remote; candidates just need to be located in the US
- ETL developers, modelers, architects, and managers currently work in the CoE
High-Level Individual Duties:
- Analyzing project requirements and developing specs for ETL requirements
- Building out data pipelines in Azure Data Factory
- Working in RDBMS (SQL + Snowflake)
Must-Haves (Concepts & Tools):
- 5+ years of data engineering experience
- Experience creating pipelines in ADF
- Understanding of other ETL tools (e.g., SSIS or Informatica)
- Strong SQL skills
What will win:
- Healthcare experience, specifically within state projects
- Snowflake experience (using it as a database)
- Python scripting skills
Job Description:
Minimum Qualifications:
- 5+ years of data engineering experience with a focus on data warehousing
- 2+ years of experience creating pipelines in Azure Data Factory (ADF)
- 5+ years developing ETL using Informatica PowerCenter, SSIS, Azure Data Factory, or similar tools
- 5+ years of experience with relational databases such as Oracle, Snowflake, SQL Server, etc.
- 3+ years of experience creating stored procedures with Oracle PL/SQL, SQL Server T-SQL, or Snowflake SQL
- 2+ years of experience with GitHub, SVN, or similar source control systems
- 2+ years of experience processing structured and unstructured data
- Experience with HL7 and FHIR standards, and with processing files in these formats
- 3+ years analyzing project requirements and developing detailed specifications for ETL requirements
- Excellent problem-solving and analytical skills, with the ability to troubleshoot and optimize data pipelines
- Ability to adapt to evolving technologies and changing business requirements
- Bachelor's or advanced degree in a related field such as Information Technology/Computer Science, Mathematics/Statistics, Analytics, or Business
Preferred Qualifications:
- 2+ years of batch or PowerShell scripting experience
- 2+ years of experience with Python scripting
- 3+ years of data modeling experience in a data warehouse environment
- Experience or familiarity with Informatica Intelligent Cloud Services (specifically Data Integration)
- Experience designing and building APIs in Snowflake and ADF (e.g., REST, RPC)
- Experience with state Medicaid/Medicare/healthcare applications
- Azure certifications related to data engineering or data analytics
**All successful candidates for this position are required to work directly for PRIMUS. No agencies, please; W2 only.**
For immediate consideration, please contact:
Rahul Kumar
PRIMUS Global Services
Direct
Desk x 259
Email: