Job Details
Job: Data Warehouse Developer/Architect
Location: Harrisburg, PA (1 day onsite per month)
This position works with business analysts, application developers, DBAs, and network and system staff to achieve project objectives: delivery dates, cost, quality, and program-area customer satisfaction.
- Manage assignments and track progress against agreed upon timelines.
- Plan, organize, prioritize, and manage work efforts, coordinating with the EDW and other teams.
- Perform research on potential solutions and provide recommendations to the EDW and DOH.
- Develop and implement solutions that meet business and technical requirements.
- Participate in testing of implemented solution(s).
- Build and maintain relationships with key stakeholders and customer representatives.
- Give presentations for the EDW, other DOH offices, and agencies involved with this project.
- Complete weekly timesheet reporting in PeopleFluent/VectorVMS by COB each Friday.
- Provide weekly personal status reporting by COB Friday submitted on SharePoint.
- Utilize a SharePoint site for project and operational documentation; review existing documentation.
The Architect will design, develop, and implement data and ELT application infrastructure in Azure to provide reliable, scalable applications and systems that meet the organization's objectives and requirements. The Architect is familiar with a variety of application and database technologies, environments, concepts, methodologies, practices, and procedures.
The candidate must have significant, hands-on technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory, Pipelines, Apache Spark, and Python.
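To illustrate the kind of pipeline logic this role involves, here is a minimal sketch in plain Python of a typical cleansing/deduplication step (keep the latest version of each record) that a Databricks/Spark pipeline would express with DataFrame operations. The function and field names are hypothetical, not from this posting:

```python
from datetime import datetime

def clean_records(raw_rows):
    """Deduplicate rows by id, keeping the most recently updated
    version of each record -- a common ELT cleansing step before
    loading into a curated (silver) table."""
    latest = {}
    for row in raw_rows:
        key = row["id"]
        ts = datetime.fromisoformat(row["updated_at"])
        # Keep this row only if it is newer than what we have seen for this id.
        if key not in latest or ts > latest[key][0]:
            latest[key] = (ts, row)
    # Return the surviving rows ordered by their update timestamp.
    return [row for _, row in sorted(latest.values(), key=lambda t: t[0])]
```

In PySpark the same idea is usually written with a window function (`row_number` over a partition by id, ordered by timestamp descending), but the cleansing intent is identical.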
- Significant, hands-on technical experience and expertise with the design, implementation and maintenance of business intelligence and data warehouse solutions, with expertise in using SQL Server and Azure Synapse.
- Experience producing ETL/ELT using SQL Server Integration Services and other tools.
- Experience with SQL Server, T-SQL, scripts, queries.
- Experience as an Azure DevOps CI/CD pipeline release manager: designing, implementing, and maintaining robust, scalable CI/CD pipelines; automating build, test, and deployment processes for various applications and services; and troubleshooting and resolving pipeline issues and bottlenecks. Experience with monorepo-based CI/CD pipelines.
- Experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering techniques.
- Experience with data mining architecture, modeling standards, reporting and data analysis methodologies.
- Experience with data engineering, database file systems optimization, APIs, and analytics as a service.
- Experience analyzing and translating business requirements and use cases into optimized designs and sound solutions.
- Advanced knowledge of relational databases, dimensional databases, entity relationships, data warehousing, facts, dimensions, and star schema concepts and terminology.
- Create and maintain technical documentation, diagrams, flowcharts, instructions, manuals, test plans, and test cases. Follow established SDLC best practices, document code, and participate in peer code reviews.
- Ability to balance work across multiple projects, with good organizational skills and minimal or no direct supervision.
- More than 5 years of relevant experience.
- 4-year college degree in computer science or a related field; advanced study preferred.
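As a concrete illustration of the star-schema concepts listed above (facts, dimensions, dimensional joins), here is a minimal sketch using SQLite in place of SQL Server/Synapse; the table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimensions hold descriptive attributes; the fact table holds measures
# plus foreign keys into each dimension -- the classic star layout.
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product,
    date_key INTEGER REFERENCES dim_date,
    quantity INTEGER,
    amount REAL);
""")
cur.executemany("INSERT INTO dim_product VALUES (?,?,?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                [(20240101, 2024, 1), (20240201, 2024, 2)])
cur.executemany("INSERT INTO fact_sales VALUES (?,?,?,?)",
                [(1, 20240101, 3, 30.0), (2, 20240101, 1, 50.0), (1, 20240201, 2, 20.0)])

# A typical dimensional query: join the fact to a dimension and aggregate.
rows = cur.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.month
    ORDER BY d.month
""").fetchall()
```

The same schema and query shape carry over directly to T-SQL on SQL Server or a dedicated SQL pool in Azure Synapse.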