Data Architect || Onsite in Phoenix, AZ || W2 || Need Local to AZ

Phoenix, AZ, US • Posted 4 hours ago • Updated Just Now
Contract W2
Contract Corp To Corp
Travel Required
Able to Sponsor
On-site
$60 - $65/hr

Job Details

Skills

  • Amazon Web Services
  • Blueprint
  • Data Architecture
  • Data Engineering
  • Data Modeling
  • Continuous Delivery
  • Electronic Health Record (EHR)
  • Grafana
  • NoSQL
  • Neo4j
  • Apache Spark
  • Snowflake Schema
  • Kafka
  • RDBMS

Summary

Role: Enterprise Data Architect

Location: Phoenix, AZ

Experience: 12+ years

Role Summary

The Enterprise Data Architect is a hands-on technologist with enterprise-wide visibility and accountability for data architecture across the organization. The role goes beyond strategy and governance: it requires the ability to design, build, prototype, and operationalize architectures in code.

The architect is expected to demonstrate solutions, codify architectural standards as reusable assets, and work directly with engineering teams to ensure architectures are implemented, observable, secure, and scalable in practice, not just in theory.


Key Responsibilities

1. Enterprise Data Strategy & Architecture (with Execution Ownership)

  • Define and continuously evolve the enterprise data architecture blueprint, and, crucially, express it as Architecture-as-Code (reference implementations, templates, IaC, CI/CD patterns); a minimal sketch follows this list.

  • Own enterprise-wide visibility into data platforms, data products, pipelines, and usage patterns across domains.

  • Translate enterprise data strategy into working, deployable solutions, not static documentation.

  • Establish data product standards including domain ownership, contracts, schemas, and SLAs, validated through real implementations.

  • Lead data platform modernization by personally contributing to design reviews, PoCs, and production-grade builds.
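
For illustration, "Architecture-as-Code" here typically means publishing the blueprint as versioned, reviewable infrastructure definitions rather than diagrams. Below is a minimal sketch assuming AWS CDK for Python; the stack name, construct id, and bucket settings are hypothetical and not taken from the posting.

```python
from aws_cdk import App, Stack, RemovalPolicy
from aws_cdk import aws_s3 as s3
from constructs import Construct

class RawZoneStack(Stack):
    """Hypothetical reusable template for a data-lake raw-zone bucket."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self,
            "RawZone",
            versioned=True,  # retain object history for lineage and audit
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            removal_policy=RemovalPolicy.RETAIN,  # never delete data with the stack
        )

app = App()
RawZoneStack(app, "raw-zone-dev")
app.synth()
```

A template like this can be packaged and versioned so domain teams inherit encryption and access defaults instead of re-deciding them per project.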


2. Hands-on Data Platforms & Engineering

  • Architect and implement scalable data platforms using AWS, EMR, Kafka, Snowflake, Databricks, Iceberg, and lakehouse patterns.

  • Personally build and review reference pipelines for batch, streaming, real-time, and event-driven use cases (see the streaming sketch after this list).

  • Define and implement data modeling, metadata, lineage, and data quality frameworks using code-first approaches.

  • Create reusable enterprise accelerators (templates, libraries, patterns) that teams can adopt.
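
As a concrete example of such a reference pipeline, here is a minimal sketch of a streaming ingest path from Kafka into an Iceberg table using PySpark Structured Streaming. The broker address, topic, event schema, and table identifier are hypothetical, and the job assumes the Iceberg Spark runtime and a configured catalog.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

# Hypothetical event contract; in practice this would come from a schema registry.
event_schema = StructType([
    StructField("record_id", StringType()),
    StructField("event_type", StringType()),
    StructField("occurred_at", TimestampType()),
])

spark = SparkSession.builder.appName("reference-streaming-pipeline").getOrCreate()

# Read raw bytes from a Kafka topic as an unbounded stream.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder address
    .option("subscribe", "events.raw")                 # hypothetical topic
    .load()
)

# Parse the Kafka value payload against the contract schema.
events = (
    raw.select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

# Append parsed events to an Iceberg table, checkpointing for exactly-once recovery.
query = (
    events.writeStream.format("iceberg")
    .outputMode("append")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/events")
    .toTable("lakehouse.raw.events")  # hypothetical catalog.db.table
)
query.awaitTermination()
```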


3. AI, ML & GenAI Enablement (Practical, Not Conceptual)

  • Partner hands-on with AI and ML teams to enable feature stores, training pipelines, vector databases, and GenAI workflows.

  • Define and implement reference architectures for LLM integration, prompt orchestration, and retrieval-augmented generation (RAG); a bare-bones RAG sketch follows this list.

  • Ensure AI-readiness through automated lineage, observability, and governance controls embedded directly into pipelines.
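
The RAG pattern named above has a very small core: embed the question, retrieve the nearest documents, and ground the prompt in them. The sketch below is framework-free Python; `embed` and `generate` are hypothetical callables standing in for an embedding model and an LLM, and the in-memory corpus stands in for a vector database.

```python
import math
from typing import Callable, List, Sequence, Tuple

def cosine(a: Sequence[float], b: Sequence[float]) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query_vec: Sequence[float],
             corpus: List[Tuple[str, Sequence[float]]],
             k: int = 3) -> List[str]:
    """Return the k corpus texts whose vectors are closest to the query."""
    ranked = sorted(corpus, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def answer(question: str,
           corpus: List[Tuple[str, Sequence[float]]],
           embed: Callable[[str], Sequence[float]],
           generate: Callable[[str], str]) -> str:
    """Retrieval-augmented generation: ground the LLM prompt in retrieved context."""
    context = "\n".join(retrieve(embed(question), corpus))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)
```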


4. Governance, Security & Compliance Embedded in Code

  • Operationalize governance by embedding controls into pipelines, platforms, and CI/CD rather than relying on manual reviews (a minimal CI-stage check is sketched after this list).

  • Implement security, access controls, encryption, and privacy-by-design directly in infrastructure and data workflows.

  • Ensure regulatory compliance is provable through automation, telemetry, and audit artifacts.
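
To make "governance embedded in CI/CD" concrete, one common pattern is a pipeline stage that fails the build when a proposed schema violates the data contract. This sketch assumes a JSON schema file with a `fields` list; the required field names and file layout are hypothetical.

```python
import json
import sys

# Hypothetical contract: fields every published schema must expose.
REQUIRED_FIELDS = {"record_id", "event_type", "occurred_at"}

def check_contract(schema_path: str) -> list:
    """Return a list of violations; an empty list means the contract holds."""
    with open(schema_path) as f:
        schema = json.load(f)
    present = {field["name"] for field in schema.get("fields", [])}
    return [f"missing required field: {name}"
            for name in sorted(REQUIRED_FIELDS - present)]

if __name__ == "__main__":
    violations = check_contract(sys.argv[1])
    for violation in violations:
        print(violation)
    sys.exit(1 if violations else 0)  # nonzero exit fails the CI stage
```

Run as `python check_contract.py schema.json` in the pipeline; a nonzero exit blocks the merge, which is the sense in which the control lives in code rather than in a manual review.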


5. Leadership Through Doing

  • Act as a hands-on mentor who codes alongside teams when needed to unblock delivery.

  • Serve as an enterprise-wide advisor, with the credibility earned through demonstrated implementations.

  • Continuously evaluate and test emerging technologies before recommending enterprise adoption.

Required Skills & Experience

  • 10+ years in data architecture and data engineering roles with hands-on delivery experience.

  • Proven ability to code, build, and productionize enterprise data solutions.

  • Deep expertise in AWS, Snowflake, Spark, Kafka, EMR, Iceberg, and lakehouse architectures.

  • Strong experience across:

    • RDBMS, NoSQL, graph (Neo4j), vector databases, search platforms.

    • Streaming platforms (Kafka, Kinesis).

    • Observability tools (Prometheus, Grafana, Datadog, Splunk, CloudWatch).

  • Experience with data mesh and data products, implemented in practice.

  • Strong executive communication skills grounded in real system ownership.

  • Dice Id: 91165686
  • Position Id: 8952503

Company Info

About Value Spectrum Technologies LLC

Step into a future defined by empowerment at Value Spectrum Technologies. With leading-edge software solutions and strategic consulting, we're dedicated to shaping and elevating your digital tomorrow. Experience the synergy of innovation and collaboration as we unlock unparalleled opportunities for growth in the dynamic landscape of technology. Welcome to empowerment.

Join us in navigating the ever-evolving digital landscape with confidence, as we work together to unlock unprecedented opportunities and build a tomorrow that is truly empowered by the limitless possibilities of technology. Your digital future starts here.
