Azure ETL Developer

  • Richmond, VA
  • Posted 20 hours ago | Updated 20 hours ago

Overview

Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 3 Month(s)
No Travel Required

Skills

Azure ETL Developer
DW
Kimball
Inmon
Snowflake
IBM Datastage
Erwin
SSIS
SSRS
SSAS
ORACLE
T-SQL
spatial data ingestion
MS Project
Visio
TFS
Data Factory
Lake
Synapse
Python
shell scripting
MOLAP
ROLAP
ODS
DM
EDW
Power BI
Tableau
Databricks
Snowflake Schema
Transact-SQL
Unix
Warehouse
XML
Reporting
SAS Display Manager
SQL
SQL Azure
Scripting
Shell
Storage
Modeling
OLAP
Operating Systems
Problem Solving
Purchasing
RTR
Microsoft SSIS
Microsoft SSRS
Microsoft TFS
Microsoft Visio
Microsoft Windows
Microsoft Power BI
Microsoft PowerPoint
Microsoft SQL Server
Microsoft SSAS
Information Technology
Management
Microsoft Azure
Microsoft Excel
Extract
Transform
Load
HDFS
IBM InfoSphere DataStage
Linux
Datastage
Dialog Manager
Dimensional Modeling
Documentation
Extraction
Data Marts
Data Storage
Data Warehouse
Database
Database Design
Conflict Resolution
Data Engineering
Data Lake
Data Management
Business Objects
Cloud Architecture
Cloud Computing
Communication
Agile
Analytical Skill
Analytics
Business Data
Business Intelligence
Data Analysis

Job Details

Job ID: VA-762420

Hybrid/Local (Richmond ONLY) Azure ETL Developer with DW/Kimball/Inmon, Snowflake, IBM Datastage, Erwin, SSIS/SSRS/SSAS, ORACLE, T-SQL, spatial data ingestion, MS Project/Visio/TFS, Data Factory/Lake, Synapse, Python, shell scripting, MOLAP/ROLAP/ODS/DM/EDW, Power BI, Tableau, Databricks experience

Location: Richmond, VA (VDOT)
Duration: 3 Months
*Local Richmond, VA candidates ONLY, due to the onsite requirement
**This position requires 3 days a week onsite, with 2 days remote
**Contractor will be responsible for purchasing parking through VDOT's Parking Management Office or procuring their own parking

Skills:
  • Designs and develops systems for the maintenance of the Data Asset Program, ETL processes, and business intelligence. (Required: 10 Years)
  • Designs and supports the DW database and table schemas for new and existing data sources for the data hub and warehouse; design and development of Data Marts. (Required: 10 Years)
  • Works closely with data analysts, data scientists, and other data consumers within the business to gather requirements and populate data hub and data warehouse table structures optimized for reporting. (Required: 10 Years)
  • Advanced understanding of data integrations; strong knowledge of database architectures; strong understanding of ingesting spatial data. (Required: 10 Years)
  • Ability to negotiate and resolve conflicts; ability to effectively prioritize and handle multiple tasks and projects. (Required: 10 Years)
  • Excellent computer skills; highly proficient in MS Word, PowerPoint, MS Excel, MS Project, MS Visio, and MS Team Foundation Server. (Required: 10 Years)
  • Experience with key data warehousing architectures, including Kimball and Inmon, and broad experience designing solutions using a broad set of data stores. (Required: 10 Years)
  • Expertise in Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, Azure Synapse. (Required: 10 Years)
  • IBM Datastage, Erwin, SQL Server (SSIS, SSRS, SSAS), ORACLE, T-SQL, Azure SQL Database, Azure SQL Data Warehouse. (Required: 10 Years)
  • Operating system environments (Windows, Unix, etc.); scripting experience with Windows and/or Python, Linux shell scripting. (Required: 10 Years)
  • Experience in Azure cloud engineering. (Required: 10 Years)
  • Experience with Snowflake. (Desired: 5 Years)

Job Description:

JOB DESCRIPTION: SR ETL Developer
The Virginia Department of Transportation (VDOT) Information Technology Division is seeking a senior ETL developer to ingest, transform, and load Data Assets and to implement a cloud-based data management platform that will support the agency.
The ETL developer will extract business data and load it into a data warehousing environment, and will design, program, and test the performance of the system. The role consults with various teams to understand the agency's data storage needs and to develop data warehousing options. The candidate should have deep knowledge of tools and languages such as Azure Data Factory, Databricks, Python, XML, and SQL, and be well-versed in warehousing architecture techniques such as MOLAP, ROLAP, ODS, DM, and EDW.
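As an illustration of the extract/transform/load pattern this role centers on (not part of the posting itself), a minimal sketch in plain Python follows. The table names and the transformation are hypothetical, and sqlite3 stands in for the Azure SQL / Synapse targets a real pipeline would use via Data Factory or Databricks:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract rows from a source table, transform them, load a fact table."""
    cur = conn.cursor()
    # Extract: pull raw business records from a hypothetical source table.
    rows = cur.execute("SELECT id, amount FROM src_orders").fetchall()
    # Transform: e.g. normalize amounts to whole dollars.
    transformed = [(rid, round(amount)) for rid, amount in rows]
    # Load: write the transformed rows into a warehouse-style fact table.
    cur.executemany(
        "INSERT INTO fact_orders (id, amount) VALUES (?, ?)", transformed
    )
    conn.commit()
    return len(transformed)

# Demo setup with an in-memory database and two sample source rows.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE fact_orders (id INTEGER, amount INTEGER);
    INSERT INTO src_orders VALUES (1, 9.75), (2, 3.25);
""")
loaded = run_etl(conn)
print(loaded)  # number of rows loaded into the warehouse table
```

A production version of this pattern would typically run each stage as a separate, restartable activity (e.g. a Data Factory pipeline), but the extract/transform/load shape is the same.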
Responsibilities:
Designs and develops integrations for the Enterprise Data Asset program, ETL processes, and business intelligence.
Develop data engineering processes that leverage a cloud architecture and will extend or migrate our existing data pipelines to this architecture as needed.
Designs and supports the DW database and table schemas for new and existing data sources for the data hub and warehouse. Design and development of Data Marts.
Works closely with data analysts, data scientists, and other data consumers within the business to gather requirements and populate data hub and data warehouse table structures optimized for reporting.
Partners with the data modeler and data architect to refine the business's data requirements, which must be met for building and maintaining Data Assets.
Understanding of Agile methodologies and processes
Preferred Skills:
Advanced understanding of data integrations.
Strong knowledge of database architectures
Strong analytical and problem solving skills
Ability to build strong relationships both internally and externally
Ability to negotiate and resolve conflicts
Ability to effectively prioritize and handle multiple tasks and projects
Strong written and verbal communication skills
Desire to learn, innovate and evolve technology
Computer Skills/Ms Office/Software:
Excellent computer skills and high proficiency in MS Word, PowerPoint, MS Excel, MS Project, MS Visio, and MS Team Foundation Server, all of which are necessary for creating visually and verbally engaging ETL and data designs and tables, as well as for communicating documentation and reporting.
Deep passion for data analytics technologies as well as analytical and dimensional modeling. The candidate must be extensively familiar with ETL (Extraction, Transformation & Load), data warehousing, and business intelligence tools such as Business Objects, Power BI, and Tableau.
The candidate must also have vast knowledge of database design and modeling in the context of data warehousing.
Experience with key data warehousing architectures, including Kimball and Inmon, and broad experience designing solutions using a broad set of data stores (e.g., HDFS, Azure Data Lake Store, Azure Blob Storage, Azure SQL Data Warehouse, Databricks).
Technologies Required:
Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, Azure Synapse, Snowflake
IBM Datastage, Erwin, SQL Server (SSIS, SSRS, SSAS), ORACLE, T-SQL, Azure SQL Database, Azure SQL Data Warehouse.
Operating System Environments (Windows, Unix, etc.).
Scripting experience with Windows and/or Python, Linux Shell scripting

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.