Data Integration Developer

Sacramento, CA, US • Posted 1 day ago • Updated 1 day ago
Contract Corp To Corp
Contract W2
Contract Independent
Travel Required
On-site
Depends on Experience


Job Details

Skills

  • Java
  • Oracle
  • PL/SQL
  • ETL
  • Change Management
  • SQL
  • Test Cases
  • Business Requirements
  • Data Analysis
  • PeopleSoft
  • Data Quality
  • Data Warehousing
  • Finance
  • Database
  • Automated Testing
  • DevOps
  • Groovy
  • Maintenance
  • Oracle Data Integrator
  • Business Intelligence
  • Operations
  • Star Schema
  • Data Analytics
  • Translating
  • Database Modeling
  • Deployment
  • Data Integration
  • Data Sources
  • Datamart
  • Operational Data Store
  • Dimensional Data
  • Project Lifecycle
  • Business Operations
  • Technical Specifications
  • Marketing Analysis
  • Collection
  • Data Services
  • Customer Service Oriented
  • Retail Sales

Summary

Position: Data Integration Developer

Duration: 12 months

Location: Sacramento, CA (Hybrid/Onsite work)

Client: State of California

 

Project Effective Dates:

7/1/2026, or upon Purchase Order execution, whichever date is later, through 6/30/2027.

 

Project Scope/Tasks

The Enterprise Analytics & Data Services team requires one (1) consultant proficient in data integration development. The consultant will develop dimensional models and transformation solutions and provide functional administration of the data integration environment to build and support data analytics solutions for CalPERS business users. The data integrator role provides technical expertise to perform data integration services that support the data warehouse, operational data store, and reporting. These services include new development, software changes, and environment maintenance and operations. New development includes expanding the data warehouse data sets, such as self-service data marts, facts and related dimension tables, and Extract/Transform/Load (ETL) job development using the Oracle Data Integrator product.


To quantify the need: maintenance and operations are required across the Data & Analytics foundation, whose major components are listed below.

Data Warehouse – a centralized collection of transformed business data sourced from transactional application databases across CalPERS. The data takes the form of dimensional models, historical snapshots (i.e., CAFR data), metric marts, and data marts. Hundreds of business unit users rely on this data being current and accurate. New development is required to adapt to changing business needs; maintenance and operations are required to monitor the data for freshness and accuracy. The data warehouse is key to enabling self-service data analytics across CalPERS.

Data Hub – a centralized collection of the raw data sourced from transactional application databases across CalPERS and the cloud. The Data Hub is used to fulfill data extract requests and any analytical requirement that is not directly supported by the data warehouse.

Source application change management coordination – Many CalPERS applications feed into the Data & Analytics foundation; a few of the larger ones are my CalPERS, PeopleSoft HCM, Wavetec, and the CalPERS Customer Education Center (CEC). When source applications are upgraded and patched, coordinated Data & Analytics maintenance is required for continued operations.

Modernization – Modernization of the BI data platform and architecting DevOps into the data & analytics software project lifecycle: architecting frameworks that increase automation capabilities, Java programming of add-on modules and cloud data source connectors that work with the existing data platform, Groovy programming for task automation and repeatable frameworks, and exploring new data warehousing tools.

The scope of this project encompasses the following tasks:

 

Deliverable 1: Design and development, including programming, testing, enhancements, and maintenance, of specialized software for a Business Intelligence (BI) data integration solution supporting advanced analysis of CalPERS business transactions.

Design, develop, and maintain specialized software for a Business Intelligence (BI) solution to analyze CalPERS business transactions. BI provides historical, current, and predictive insights into business operations by translating business data requirements into technical specifications and developing enterprise data warehouse star schemas. Each data warehouse deliverable focuses on a specific business subject area (star schema) and includes technical specifications, dimensional data models, and Extract, Transform, and Load (ETL) programs. Validation and testing ensure the data aligns with the subject area and integrates seamlessly into the enterprise data platform, which includes a data warehouse and scheduled ETL jobs. Tasks include:

  • Design multidimensional data models (star schemas) to represent business transactions, supporting data analysis and analytic dashboards.
  • Integrate data models and ETL jobs into the enterprise data warehouse following established standards.
  • Validate data records to ensure they meet business requirements.
  • Provide backup support for maintaining and operating the data platform, which includes over 3,000 ETL jobs, an operational data store (raw, unprocessed data from multiple applications), and a data warehouse (star schemas and data marts).
  • Address and troubleshoot issues related to deliverables during the assignment.
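To illustrate the dimensional modeling described above: a star schema pairs one fact table with its dimension tables, and an ETL job resolves raw source rows into surrogate keys before loading. The sketch below builds a toy version in SQLite; all table and column names are illustrative assumptions, not CalPERS' actual warehouse design.

```python
import sqlite3

# Toy star schema for one business subject area: a central fact table
# keyed to dimension tables. Names are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_member (
    member_key  INTEGER PRIMARY KEY,
    member_name TEXT,
    employer    TEXT
);
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,  -- YYYYMMDD surrogate key
    calendar_date TEXT,
    fiscal_year   INTEGER
);
CREATE TABLE fact_payment (
    member_key INTEGER REFERENCES dim_member(member_key),
    date_key   INTEGER REFERENCES dim_date(date_key),
    amount     REAL
);
""")

# Minimal ETL: extract raw source rows, transform them into surrogate
# keys, and load the fact table.
source_rows = [
    ("Ada",   "Agency A", "2026-07-15", 1200.00),
    ("Ada",   "Agency A", "2026-08-15", 1200.00),
    ("Grace", "Agency B", "2026-07-15",  950.50),
]
members = {}
for name, employer, date_str, amount in source_rows:
    if name not in members:
        members[name] = len(members) + 1
        cur.execute("INSERT INTO dim_member VALUES (?, ?, ?)",
                    (members[name], name, employer))
    date_key = int(date_str.replace("-", ""))
    cur.execute("INSERT OR IGNORE INTO dim_date VALUES (?, ?, ?)",
                (date_key, date_str, 2027))
    cur.execute("INSERT INTO fact_payment VALUES (?, ?, ?)",
                (members[name], date_key, amount))

# A typical analytic rollup over the star schema: total payments per member.
cur.execute("""
    SELECT m.member_name, SUM(f.amount)
    FROM fact_payment f JOIN dim_member m USING (member_key)
    GROUP BY m.member_name ORDER BY m.member_name
""")
totals = cur.fetchall()
print(totals)  # [('Ada', 2400.0), ('Grace', 950.5)]
```

In the real environment this transform-and-load step would be an Oracle Data Integrator job against Oracle Database, but the shape of the work, conforming raw rows to dimension keys and loading facts, is the same.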


Deliverable 2: Modernization of BI data platform. Architecting frameworks that increase automation capabilities.

Modernize the BI data platform and integrate DevOps practices into the data and analytics software project lifecycle. Architect frameworks to enhance automation capabilities, ensuring scalability and efficiency. Tasks include:

  • Implement Java programming for add-on modules and cloud data source connectors that integrate seamlessly with the existing data platform.
  • Use Groovy programming to automate tasks and create repeatable frameworks that streamline development and operational processes.
  • Develop a proof of concept (PoC) for CalPERS' data integration process using a modern data integration tool, demonstrating improved performance, flexibility, and scalability. This includes evaluating and implementing advanced data integration technologies to optimize data flow, reduce latency, and enhance data quality.
  • Establish automated testing, deployment pipelines, and monitoring frameworks to ensure faster delivery, higher quality, and reduced operational risk.
  • Architect solutions that support hybrid data environments, enabling seamless integration of on-premises and cloud data sources.

The modernization effort focuses on aligning the BI platform with current industry standards, leveraging cloud-based solutions, and ensuring the platform is future-ready.
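As a sketch of the automated-testing idea above: a common pipeline quality gate reconciles row counts and a column checksum between a source extract and its target load, failing the deployment when they disagree. The function below is hypothetical (the team's real frameworks would be Groovy or Java against Oracle), but it shows the shape of such a gate.

```python
# Hypothetical data-quality gate for an ETL pipeline: reconcile row counts
# and an amount checksum between source and target. Names and thresholds
# are illustrative assumptions, not an existing CalPERS framework.
def reconcile(source_rows, target_rows, amount_field="amount"):
    checks = {
        "row_count_match": len(source_rows) == len(target_rows),
        "amount_sum_match": abs(
            sum(r[amount_field] for r in source_rows)
            - sum(r[amount_field] for r in target_rows)
        ) < 0.01,  # tolerance for floating-point rounding
    }
    checks["passed"] = all(checks.values())
    return checks

source = [{"amount": 100.0}, {"amount": 250.5}]
target = [{"amount": 100.0}, {"amount": 250.5}]
result = reconcile(source, target)
print(result["passed"])  # True

# A deployment pipeline would fail the job when a load drops rows:
bad = reconcile(source, target[:1])
print(bad["passed"])  # False
```

Wired into a deployment pipeline, a failed check blocks promotion, which is how automated testing reduces the operational risk the deliverable calls out.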

 

Deliverable 3: Knowledge Transfer

On an ongoing basis through the end of the contract, all work products and deliverables (project status reports, business processes, triage incident reports with resolutions, meeting minutes, test cases, test outcomes) must be discussed with the contract manager to ensure that all of the information is documented and placed in a file share. The contract manager will schedule knowledge transfer sessions at regular intervals to ensure that all work product details have been documented and the knowledge has been transferred to State personnel. Knowledge transfer to State staff is part of each project when projects are completed and released into maintenance mode.

 

Required Skills

1. At least five (5) years of experience in some or all of the following areas: pension, health insurance, customer service, or finance.

2. At least five (5) years of experience performing data integration (ETL) tasks that transform disparate source data into target data stores, data warehouses, or data marts.

3. At least three (3) years of experience on data integration projects using Oracle Database 19c and Oracle Data Integrator or similar ETL tools.

4. At least five (5) years of experience interpreting requirements and developing complex PL/SQL, SQL, and data marts.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 10365731
  • Position Id: 3017-14481-
