Data Solution Architect - Phoenix, AZ (Hybrid) (Arizona locals only; face-to-face interview required)

  • Phoenix, AZ
  • Posted 22 hours ago | Updated 5 hours ago

Overview

Work arrangement: Hybrid
Compensation: Depends on Experience
Employment type: Contract - W2 or Contract - Independent

Skills

Apache Kafka
.NET
ADF
Amazon Redshift
Amazon Web Services
Data Lake
Databricks
Continuous Delivery
Continuous Integration
Data Engineering
Data Governance
Apache Hadoop
Apache Spark
Management
Metadata Management
Microsoft Azure
Extract, Transform, Load (ETL)
IT Architecture
IT Management
Legacy Systems
Migration
Artificial Intelligence
Cloud Computing
DevOps
Documentation
Workflow
ELT
Git
Optimization
Prompt Engineering
Python
SQL
Warehouse

Job Details

Hello,

I hope you are doing well.

This is Surya from Humac Inc. Please review the following job description; if you are interested, or know someone who might be, please reply with your updated resume and the best way to reach you.

Role: Data Solution Architect

Location: Phoenix, AZ (Hybrid)

Note: Arizona locals only; a face-to-face interview is required.

Overview
The Data Solution Architect is responsible for contributing to the design, modernization, and optimization of enterprise-scale data systems, as well as to the maintenance and operations strategy for CHP. This role involves designing and implementing data systems that organize, store, and manage data within our cloud data platform.

The ideal candidate will have a proven track record of delivering end-to-end ETL/ELT pipelines across Databricks, Azure, and AWS environments.

Key Responsibilities
Design scalable data lake and data architectures using Databricks and cloud-native services.
Develop metadata-driven, parameterized ingestion frameworks and multi-layer data architectures.
Optimize data workloads and performance.
Define data governance frameworks for CHP.
Design and develop robust data pipelines.
Architect AI systems, including retrieval-augmented generation (RAG) workflows and prompt engineering.
Lead cloud migration initiatives from legacy systems to modern data platforms.
Provide architectural guidance, best practices, and technical leadership across teams.
Build documentation, reusable modules, and standardized patterns.
Required Skills and Experience
Strong expertise in cloud platforms, primarily Azure or AWS.
Hands-on experience with Databricks.
Deep proficiency in Python and SQL.
Expertise in building ETL/ELT pipelines and Azure Data Factory (ADF) workflows.
Experience architecting data lakes and implementing data governance frameworks.
Hands-on experience with CI/CD, DevOps, and Git-based development.
Ability to translate business requirements into technical architecture.
Technical Expertise
Programming: Python, SQL, R
Big Data: Hadoop, Spark, Kafka, Hive
Cloud Platforms: Azure (ADF, Databricks, Azure OpenAI), AWS
Data Warehousing: Redshift, SQL Server
ETL/ELT Tools: SSIS
Required Educational Background
Bachelor's degree in Computer Science, Information Technology, Information Systems, Engineering, or a related field.
6+ years of experience in data engineering or .NET development.

--

Best Regards,

Sai Surya Teja

US IT Recruiter

Humac Inc.

Phoenix, AZ 85027
