Azure Data Warehouse Developer @ Dauphin County, PA - Remote

Overview

Remote
$55 - $65 per hour
Contract - Independent
Contract - W2
Contract - 10 Month(s)

Skills

Microsoft Azure
Data Warehouse
Apache Spark
Business Intelligence
Cloud Computing
Computer Hardware
Computer Science
Continuous Delivery
Continuous Integration
Customer Satisfaction
Daptiv
Data Analysis
Data Engineering
Data Mining
Data Modeling
Data Processing
Databricks
DevOps
Dimensional Modeling
ELC
Extract, Transform, Load (ETL)
Extraction
Flowchart
File Systems
Health Care
IaaS
LIMS
Microsoft SQL Server
Microsoft SSIS
Microsoft SharePoint
Network
NextGen
Modeling
Onboarding
PaaS
Public Health
Python
Quality Control
Regulatory Compliance
Relational Databases
Reporting
Research
SQL
SaaS
Software Development Methodology
Software Quality Assurance
Star Schema
Status Reports
Storage
Supervision
Surveillance
Technical Writing
Test Cases
Test Plans
Testing
Transact-SQL
Use Cases
Statistics

Job Details

Azure Data Warehouse Developer

Dauphin County, PA - Remote

10 months Contract

We are currently hiring candidates who are authorized to work on our W2.

Candidates with previous state/government client experience are preferred.

The successful candidate must be located in PA and be able to work one day a month on-site.

This position supports a Data Modernization Initiative, with the vision that all public health policies and interventions are driven by data, and the mission to provide all internal and external public health decision makers with accessible, timely, reliable, and meaningful data to drive policies and interventions. The Enterprise Data Warehouse (EDW) is responding to DOH's need for centralized data and state-of-the-art data analysis services by modernizing its data portfolio, architecture, and statistical analysis capabilities, aimed at improving public health surveillance, interventions, future outbreak prevention, outcomes, and research.

The Architect / Azure DW Developer position will support both the existing business and reporting requirements of individual DOH / DDAP systems and program areas and the construction of a modern data warehouse that will serve DOH / DDAP from an enterprise perspective.

The primary objective of this engagement is for the selected candidate to serve as the data warehouse developer supporting the analysis and reporting needs of the DOH / DDAP, and the design and construction of a modern EDW in Azure.

This position's scope includes:

  • Modernizing DOH operations; planning, coordinating, and responding to data reporting needs; and setting standards and defining the framework.
  • Assisting with large-volume data processing and statistical analysis of large datasets.
  • Revamping the EDW in Microsoft's Azure cloud using Azure Databricks, Delta Lake, and Synapse, including compute, storage, and application fabric, as well as services for infrastructure as a service (IaaS), platform as a service (PaaS), software as a service (SaaS), and serverless technologies.
  • Creating a centralized data model.
  • Supporting DOH projects such as ELC Enhanced Detection Expansion, the Data Modernization Initiative, PA NEDSS NextGen, PA LIMS Replacement, Reporting Hub, Verato UMPI, and the COVID-19 response, and onboarding additional DOH systems into the EDW.
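
For illustration only, a minimal PySpark sketch of the kind of Delta Lake ingestion this modernization implies on Databricks; the source path and table names are hypothetical, not part of the DOH environment:

```python
# Minimal bronze-layer ingestion sketch (assumes a Databricks cluster with
# Delta Lake available; the path and table names below are illustrative only).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # on Databricks, `spark` is predefined

# Read a hypothetical raw extract landed by an ingestion pipeline.
raw = (spark.read
       .option("header", "true")
       .csv("/mnt/raw/case_reports/"))

# Stamp each row with its load time for auditability.
bronze = raw.withColumn("_ingested_at", F.current_timestamp())

# Persist as a Delta table so Synapse and BI layers can query it downstream.
(bronze.write
 .format("delta")
 .mode("append")
 .saveAsTable("edw_bronze.case_reports"))
```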

REQUIREMENTS

The Architect is a senior-level resource with advanced, specialized knowledge and experience in data warehousing, database, and programming concepts and technology. The selected contractor must have proven experience in the development, testing, and maintenance of Azure production systems and projects. This position designs, develops, tests, and implements data lakes, databases, extract-load-transform programs, applications, and reports. This position will work with business analysts, application developers, DBAs, and network and system staff to achieve project objectives: delivery dates, cost objectives, quality objectives, and program area customer satisfaction objectives.

  • Manage assignments and track progress against agreed upon timelines.
  • Plan, organize, prioritize, and manage work efforts, coordinating with the EDW and other teams.
  • Participate in status reviews, process reviews, deliverable reviews, and software quality assurance work product reviews with the appropriate stakeholders.
  • Participate in business and technical requirements gathering.
  • Perform research on potential solutions and provide recommendations to the EDW and DOH.
  • Develop and implement solutions that meet business and technical requirements.
  • Participate in testing of implemented solution(s).
  • Build and maintain relationships with key stakeholders and customer representatives.
  • Give presentations for the EDW, other DOH offices, and agencies involved with this project.
  • Develop and maintain processes and procedural documentation.
  • Ensure project compliance with relevant federal and commonwealth standards and procedures.
  • Conduct training and transfer of knowledge sessions for system and code maintenance.
  • Complete weekly timesheet reporting in PeopleFluent/VectorVMS by COB each Friday.
  • Complete weekly project status updates in Daptiv when applicable (dependent on the project being entered in Daptiv).
  • Provide weekly personal status reporting by COB Friday submitted on SharePoint.
  • Utilize a SharePoint site for project and operational documentation; review existing documentation.

The Architect can design, develop, and implement data and ELT application infrastructure in Azure to provide reliable and scalable applications and systems that meet the organization's objectives and requirements. The Architect is familiar with a variety of application and database technologies, environments, concepts, methodologies, practices, and procedures.

The candidate must have significant, hands-on technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory, Pipelines, Apache Spark, and Python.
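
As a concrete illustration of that stack, here is a short, hypothetical PySpark cleansing step of the sort an Azure Data Factory pipeline might orchestrate in Databricks; the table and column names are assumptions, not the department's actual schema:

```python
# Hypothetical silver-layer cleansing step: deduplicate to the latest record
# per case and drop rows missing a key. All names here are illustrative.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("edw_bronze.case_reports")

# Keep only the most recently ingested record for each case_id.
latest_first = Window.partitionBy("case_id").orderBy(F.col("_ingested_at").desc())

silver = (bronze
          .withColumn("_rn", F.row_number().over(latest_first))
          .filter(F.col("_rn") == 1)
          .drop("_rn")
          .dropna(subset=["case_id"]))

(silver.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("edw_silver.case_reports"))
```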

  • Significant, hands-on technical experience and expertise with the design, implementation and maintenance of business intelligence and data warehouse solutions, with expertise in using SQL Server and Azure Synapse.
  • Experience producing ETL/ELT using SQL Server Integration Services and other tools.
  • Experience with SQL Server, T-SQL, scripts, queries.
  • Experience as an Azure DevOps CI/CD Pipeline Release Manager who can design, implement, and maintain robust and scalable CI/CD pipelines; automate the build, test, and deployment processes for various applications and services; and troubleshoot and resolve pipeline issues and bottlenecks. Experience with monorepo-based CI/CD pipelines.
  • Experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering techniques.
  • Experience with data mining architecture, modeling standards, reporting and data analysis methodologies.
  • Experience with data engineering, database file systems optimization, APIs, and analytics as a service.
  • Experience analyzing and translating business requirements and use cases into optimized designs and developing sound solutions.
  • Advanced knowledge of relational databases, dimensional databases, entity relationships, data warehousing, facts, dimensions, and star schema concepts and terminology (a query sketch follows this list).
  • Create and maintain technical documentation, diagrams, flowcharts, instructions, manuals, test plans, and test cases. Follow established SDLC best practices, document code, and participate in peer code reviews.
  • Ability to balance work between multiple projects and possess good organizational skills, with minimal or no direct supervision.
  • Demonstrated ability to communicate and document clearly and concisely.
  • Ability to work collaboratively and effectively with colleagues as a member of a team.
  • Ability to present complex technical concepts and data to a varied audience effectively.
  • More than 5 years of relevant experience.
  • 4-year college degree in computer science or related field with advanced study preferred.
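
To make the star-schema expectation above concrete, here is a small hypothetical query joining a fact table to its dimensions, expressed through Spark SQL in Python; the gold-layer table and column names are invented for illustration:

```python
# Hypothetical star-schema query: one fact table joined to two dimensions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

report = spark.sql("""
    SELECT d.report_month,
           c.county_name,
           SUM(f.case_count) AS total_cases
    FROM   edw_gold.fact_cases  f
    JOIN   edw_gold.dim_date    d ON f.date_key   = d.date_key
    JOIN   edw_gold.dim_county  c ON f.county_key = c.county_key
    GROUP  BY d.report_month, c.county_name
    ORDER  BY d.report_month, c.county_name
""")
report.show()
```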


PREFERRED EXPERIENCE

  • Experience working in the public health or healthcare industry with various health data sets.

TIMEFRAMES

This will be a 10-month engagement beginning in September 2025.

LOCATION

The contractor must reside in PA and will be permitted to work from home.

The contractor is expected to be in the office at least one day per month, subject to additional days in the office at the manager's discretion.

In addition, DOH will supply all hardware and software needed for daily use to complete assigned work items.

Skill | Required Exp (years) | Candidate Exp

Technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory, Pipelines, Apache Spark, and Python | 5 |
Design, implementation, and maintenance of business intelligence and data warehouse solutions, with expertise in SQL Server and Azure Synapse | 5 |
Experience producing ETL/ELT using SQL Server Integration Services and other tools | 5 |
Experience with SQL Server, T-SQL, scripts, queries | 5 |
Experience as an Azure DevOps CI/CD Pipeline Release Manager who can design, implement, and maintain robust and scalable CI/CD pipelines | 5 |
Experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering | 5 |
Experience with data engineering, database file systems optimization, APIs, and analytics as a service | 5 |
Experience with data mining architecture, modeling standards, reporting and data analysis methodologies | 5 |
4-year college degree in computer science or related field with advanced study preferred | Required |

About KSN Technologies, Inc.