Data Architect - NJ or NYC - Local

Overview

Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - W2
Able to Provide Sponsorship

Skills

Data Architecture
Amazon Web Services
Data Modeling
Databricks
Data Quality
Data Management
Data Engineering
Analytical Skills
Collaboration
Cloud Computing
DevOps
Python
Metadata Management
Macros
Orchestration
GitHub
Documentation
Continuous Integration
Testing
Communication
Design Review
Regulatory Compliance
SQL
Unity Catalog
Version Control

Job Details

We are seeking a Data Architect with hands-on experience in modern data architecture, analytics engineering, and cloud-native data platforms. You will help shape and deliver high-quality, scalable data products using dbt Cloud, Databricks, and AWS, supporting analytics and business initiatives across the enterprise.

This role is ideal for an experienced data professional who understands how to bridge data modeling, transformation, and platform operations with a product mindset. You will work closely with Data Engineers, Analytics Engineers, and Product Owners to design and build the next generation of governed, discoverable, and reusable data assets.

What You'll Do:

  • Design and implement the architecture for scalable analytics engineering pipelines using dbt Cloud, ensuring modularity, reusability, testing, and governance across domains.
  • Support the design of scalable and performant data workflows on Databricks using Delta Lake and Unity Catalog.
  • Partner with Data Product teams and Value Stream teams to translate business needs into data models, semantic layers, and shared assets.
  • Implement data quality checks, CI/CD pipelines, and version control processes across analytics engineering projects.
  • Contribute to architecture and design reviews across data initiatives to ensure alignment with best practices.
  • Document lineage, metadata, and access patterns to promote data discoverability and compliance.
  • Collaborate with infrastructure and platform teams on AWS services, Databricks, and DevOps to support scalable processing and automated workflows.
  • Promote a product-thinking mindset in the development and delivery of analytics-ready datasets.
  • Identify opportunities for improvement and innovation within the existing data infrastructure.
  • Promote best practices in data architecture and data management across the organization.
  • Stay up to date on modern data stack tooling and contribute to evaluating and implementing improvements.

What You'll Bring:

  • 5+ years of experience in data architecture, analytics engineering, or data engineering roles.
  • Practical experience building transformation pipelines using dbt Cloud (models, tests, documentation, macros).
  • Hands-on experience with Databricks and AWS.
  • Strong proficiency in SQL and familiarity with Python in data workflows.
  • Solid understanding of data modeling principles, data quality techniques, and pipeline orchestration.
  • Experience with agile delivery practices and CI/CD pipelines using tools like GitHub Actions, dbt Cloud jobs, or Databricks Repos.
  • Strong analytical and communication skills with a collaborative mindset.
  • Ability to lead with empathy, embrace diverse perspectives, and foster a culture of openness and continuous learning.

Certifications in dbt, Databricks, or AWS are a plus.
