Data Warehouse Architect

Overview

Remote
$50 - $70
Accepts corp to corp applications
Contract - W2
Contract - Independent
No Travel Required

Skills

Analytics
Data Warehouse
Continuous Integration
Data Governance
Metadata Management
HIPAA
Informatica
Microsoft Power BI
Microsoft SSIS
Amazon Redshift
Apache Airflow
Ansible
Apache Spark
Business Intelligence
Big Data
Virtual Private Cloud
SOC 2
Snowflake Schema
Query Optimization
Regulatory Compliance
Microsoft Azure
Dimensional Modeling
IaaS
Databricks
Data Marts
Encryption

Job Details

Job Title: Data Warehouse Architect

Client: State client

Location: Sacramento, California (Remote; local candidates preferred)

Duration: Long Term

Job Description

We are seeking an experienced Data Warehouse Architect to support a critical project for the State of California. The role is primarily remote, but candidates local to Sacramento, CA will be given strong preference.

Minimum Qualifications (MSQs):

1. Must have a minimum of eight (8) years of experience in dimensional modeling, data vault, and normalized/denormalized schema design, with the ability to architect data models that translate business requirements into scalable technical solutions, develop documentation and architectural diagrams, and lead data projects end to end.

2. Must have a minimum of eight (8) years of experience applying data partitioning, clustering, indexing, and query optimization techniques to enhance data warehouse performance.

3. Must have a minimum of eight (8) years of experience integrating with BI platforms such as Power BI, Tableau, Looker, or similar products, and designing data marts or semantic layers for analytics.

4. Must have a minimum of five (5) years of experience in designing and implementing scalable data warehouse solutions in Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse, or similar.

5. Must have a minimum of five (5) years of experience in building and orchestrating data pipelines using dbt, Apache Airflow, Talend, Informatica, Azure Data Factory, Databricks, or SSIS.

6. Must have a minimum of five (5) years of experience with big data ecosystems and distributed processing frameworks, using Databricks or other Spark-based tools to handle large-scale data transformation.

7. Must have a minimum of five (5) years of experience in data governance best practices, including metadata management, data lineage, cataloging, RBAC, encryption, and regulatory compliance such as GDPR, HIPAA, and SOC 2. Must have experience with at least one tool such as Alation, Collibra, Azure Purview, Informatica Axon, or OpenMetadata.

8. Must have a minimum of five (5) years of experience in cloud infrastructure (e.g., IAM, VPC, Storage) and CI/CD practices, including tools such as Git, Terraform, and Ansible.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.