Job Details
Our client is looking for a Solutions Architect for a 6-month contract opportunity located in Phoenix, AZ.
location: Phoenix, Arizona
job type: Contract
salary: $120 - $135 per hour
work hours: 8am to 5pm
education: Bachelor's
responsibilities:
The Solution Architect - Data is responsible for contributing to the design, modernization, and optimization of enterprise-scale data systems, as well as the maintenance and operations strategy for our client. This role involves designing and implementing data systems that organize, store, and manage data within the client's cloud data platform.
The architect will perform ongoing maintenance and operations work in the client's cloud environment. They will review and analyze the client's data infrastructure, plan future database solutions, and implement systems that support data management for the client's users.
Additionally, this role is accountable for data integrity, ensuring the client's team adheres to data governance standards so that accuracy, consistency, and reliability are maintained across all systems. The architect will identify data discrepancies and quality issues and work to resolve them.
This position requires a strong blend of architectural leadership, technical depth, and the ability to collaborate with business stakeholders, data engineers, machine learning practitioners, and domain experts to deliver scalable, secure, and reliable AI-driven solutions.
The ideal candidate will have a proven track record of delivering end-to-end ETL/ELT pipelines across Databricks, Azure, and AWS environments.
Key Responsibilities
- Design scalable data lakes and data architectures using Databricks and cloud-native services.
- Develop metadata-driven, parameterized ingestion frameworks and multi-layer data architectures.
- Optimize data workloads and performance.
- Define data governance frameworks for the client.
- Design and develop robust data pipelines.
- Architect AI systems, including retrieval-augmented generation (RAG) workflows and prompt engineering.
- Lead cloud migration initiatives from legacy systems to modern data platforms.
- Provide architectural guidance, best practices, and technical leadership across teams.
- Build documentation, reusable modules, and standardized patterns.
Required Skills and Experience
- Strong expertise in cloud platforms, primarily Azure or AWS.
- Hands-on experience with Databricks.
- Deep proficiency in Python and SQL.
- Expertise in building ETL/ELT pipelines and ADF workflows.
- Experience architecting data lakes and implementing data governance frameworks.
- Hands-on experience with CI/CD, DevOps, and Git-based development.
- Ability to translate business requirements into technical architecture.
Technical Expertise
- Big Data: Hadoop, Spark, Kafka, Hive
- Cloud Platforms: Azure (ADF, Databricks, Azure OpenAI), AWS
- Data Warehousing: Redshift, SQL Server
- ETL/ELT Tools: SSIS
qualifications:
Required Education and Experience
Bachelor's degree in Computer Science, Information Technology, Information Systems, Engineering, or a related field.
6+ years of experience in data engineering or .NET development.
Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact
Pay offered to a successful candidate will be based on several factors including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including: medical, prescription, dental, vision, AD&D, and life insurance offerings, short-term disability, and a 401K plan (all benefits are based on eligibility).
This posting is open for thirty (30) days.