Data Architect

Overview

Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - W2
Contract - Independent
Contract - 12 Month(s)
No Travel Required

Skills

Data Architect
Power BI
Python
Foundation Models
AI
dashboards
Bedrock
SageMaker
ETL
AWS Bedrock
Data Lakes
AWS
AWS QuickSight
Kiro

Job Details

Job: Data Architect

Hybrid

Start: ASAP

End: Ongoing contract - no end date

Location: Hybrid in Trenton, NJ -- MUST BE LOCAL TO THE AREA; NO RELOCATION

Info: This person will perform data architecture work, using Power BI and Python to work with data. A strong understanding of data structures is required, along with an understanding of foundation models. The State of New Jersey is building a data platform that will integrate with an AI solution. The ideal candidate is a data analytics professional with strong Python and Power BI skills who has worked with the cloud.

Job Description

We are in need of a Data Architect to research, design, develop, and evaluate data that supports the creation of reports and dashboards used for analytics. This will include the use of visualization tools, artificial intelligence, and machine learning. This is a 6-month contract opportunity with the possibility of extension. The position is hybrid in Trenton, NJ.

The Data Architect will work with Power BI and Python and must have a strong understanding of data structures. This position will use Bedrock, SageMaker, or similar tools to support our client's AI solution. ETL and data analysis experience are key to success in this role.

This position does not offer work authorization sponsorship now or in the future. Candidates requiring sponsorship will not be considered.

What You'll Do

Build intuitive and insightful reports and dashboards using visualization platforms like Power BI

Integrate and process structured and unstructured data from various sources

Build, deploy or integrate intelligent models and generative AI solutions into business workflows using Amazon SageMaker and AWS Bedrock

Work with technical and non-technical stakeholders

What You'll Need

Required:

5 years of experience writing Python

5 years of experience building data pipelines

3 years of experience with data lakes in an AWS environment

3 years of experience with reporting tools such as Power BI

2 years of experience with SQL

Knowledge and experience with some of the major cloud service providers (AWS and Azure)

Prior experience with AWS Bedrock and AWS QuickSight or similar

2 years of experience using AWS SageMaker or similar

Experience designing reports, charts, and dashboards using tools such as Power BI and QuickSight

Proficient in Python, SageMaker, Bedrock, Kiro, and other AI services

Understands data ingestion from various data sources (APIs, databases, CSV files, etc.) and database structures

Data-driven professional with a strong analytical mindset and hands-on experience in the full data lifecycle, from ingestion and transformation to visualization and advanced analytics

Proficient in integrating and processing structured and unstructured data from various sources, leveraging tools such as SQL, Python, or ETL frameworks

Expertise in building intuitive and insightful reports and dashboards using visualization platforms like Power BI is essential

Ability to communicate findings effectively to both technical and non-technical stakeholders

Solid understanding of AI and machine learning concepts, with practical experience using platforms like Amazon SageMaker and AWS Bedrock

A combination of technical acumen, problem-solving ability, and business awareness is critical for success in this role

Preferred:

Prior experience with AWS Kiro
