Data Architect Hybrid in NYC, NY or Clinton, NJ Locals Only

Overview

Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 12 Month(s)
50% Travel
Able to Provide Sponsorship

Skills

Agile
Amazon Web Services (AWS)
Analytics Engineering
Cloud Computing
CI/CD (Continuous Integration / Continuous Delivery)
Data Architecture
Data Engineering
Data Governance
Data Modeling
Data Quality
Data Workflows
Databricks
dbt Cloud
Delta Lake
GitHub / GitHub Actions
Metadata Management
Python
Semantic Layers
SQL
Unity Catalog

Job Details

Data Architect

  • Job Title: Data Architect
  • Location: Hybrid in NYC, NY or Clinton, NJ (3 days onsite)
  • Notes: This is a local-only position for candidates in NJ or NYC.

Job Summary

We are seeking a Data Architect for a high-priority hybrid role based in NYC or Clinton, NJ. The ideal candidate has hands-on experience with modern data architecture, analytics engineering, and cloud-native data platforms. You will be responsible for designing and implementing scalable data pipelines using dbt Cloud, Databricks, and AWS. This role requires a professional with a product mindset who can bridge data modeling, transformation, and platform operations.

Key Responsibilities

  • Design and implement scalable analytics engineering pipelines using dbt Cloud.
  • Support the design of data workflows on Databricks using Delta Lake and Unity Catalog.
  • Partner with Data Product teams to translate business needs into data models and semantic layers.
  • Implement data quality checks and CI/CD pipelines.
  • Document lineage, metadata, and access patterns to ensure data governance and discoverability.

Required Skills & Experience

  • 5+ years of experience in data architecture, analytics engineering, or data engineering roles (client specifies 12+ years overall experience).
  • Practical experience building transformation pipelines using dbt Cloud.
  • Hands-on experience with Databricks and AWS.
  • Strong proficiency in SQL, with familiarity using Python in data workflows.
  • Solid understanding of data modeling principles and data quality techniques.
  • Experience with agile delivery practices and CI/CD pipelines using tools like GitHub Actions or dbt Cloud jobs.
  • dbt, Databricks, or AWS certifications are a strong plus.

Aditya Jain

Ext. 482

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.