Data Warehouse Architect

Overview

Remote
Depends on Experience
Full Time
Accepts corp to corp applications
10% Travel
Able to Provide Sponsorship

Skills

Data Warehouse
Cloud
Public Sector Experience

Job Details

Job Title: Data Warehouse Architect

Location: Sacramento, CA

Job Type: Long Term Contract

About R Systems:

R Systems is a global leader in technology and analytics services, delivering innovative solutions to drive digital transformation across industries like Telecom, Banking, Healthcare, and Public Services. With over 4,400 professionals in 25+ countries, we are committed to helping our clients achieve operational excellence and enhanced customer experiences.

We are proud to be Great Place to Work Certified in 10 countries, including India, the USA, Canada, and more. Our vibrant team culture fosters collaboration, innovation, and growth, making R Systems a fantastic place to work.

Join us in creating impactful solutions and advancing technology for a better tomorrow.

MINIMUM TECHNICAL QUALIFICATIONS

Mandatory Qualifications:

  • Must have a minimum of eight (8) years of experience in dimensional modeling, data vault, and normalized/denormalized schema design, with the ability to architect data models that translate business requirements into scalable technical solutions, develop documentation and architectural diagrams, and lead data projects end to end.
  • Must have a minimum of eight (8) years of experience performing data partitioning, clustering, indexing, and query optimization techniques to enhance data warehouse performance.
  • Must have a minimum of eight (8) years of experience integrating with BI platforms such as Power BI, Tableau, Looker, or other similar products, and designing data marts or semantic layers for analytics.
  • Must have a minimum of five (5) years of experience in designing and implementing scalable data warehouse solutions in Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse, or similar.
  • Must have a minimum of five (5) years of experience in building and orchestrating data pipelines using dbt, Apache Airflow, Talend, Informatica, Azure Data Factory, Databricks or SSIS.
  • Must have a minimum of five (5) years of experience in big data ecosystems and distributed processing frameworks, utilizing Databricks or other Spark-based tools for handling large-scale data transformation.
  • Must have a minimum of five (5) years of experience in data governance best practices, including metadata management, data lineage, cataloging, RBAC, encryption, and regulatory compliance such as GDPR, HIPAA, and SOC 2. Experience with at least one tool such as Alation, Collibra, Azure Purview, Informatica Axon, or OpenMetadata.
  • Must have a minimum of five (5) years of experience in cloud infrastructure (e.g., IAM, VPC, Storage) and CI/CD practices, including tools such as Git, Terraform, Ansible or Jenkins.

Desirable Qualifications:

  • Public Sector experience
  • Local to Sacramento, CA
  • Strong communication and collaboration skills
  • Experience with modeling and metadata management tools such as ER/Studio, Erwin, or similar