Overview
On Site
Up to $120,000
Full Time
Skills
Data Architect
ETL
SQL
Data Governance
Job Details
Job Title: Data Architect
Location: Berkeley Heights, NJ
Position Summary:
We are seeking an experienced and hands-on Data Architect to lead the design, implementation, and optimization of modern, scalable data architecture solutions. The ideal candidate will have a strong foundation in distributed databases, data integration, cloud-based platforms, and modern data tools. This role will collaborate closely with engineering, DevOps, and analytics teams to support enterprise-wide data initiatives.
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or related discipline.
- 7+ years of experience in data architecture, engineering, or related technical roles.
- Proven hands-on experience with Snowflake and modern data warehouses.
- Strong knowledge and experience with distributed SQL databases such as YugabyteDB.
- Proficiency in ETL development, especially using Qlik Replicate or similar tools.
- Solid experience in relational data modeling and data normalization techniques.
- Strong understanding of data governance, security, and compliance practices.
- Experience working with cloud platforms (e.g., AWS, Azure, or Google Cloud Platform) for data solutions.
- Strong programming skills in Python and Java.
- Experience with API integrations for data ingestion and exposure.
- Familiarity with CI/CD practices and collaboration with DevOps teams.
- Experience using Sigma or similar reporting and analytics tools.
Preferred / Nice-to-Have Skills:
- Experience with Rocket ETL or similar ETL tools.
- Hands-on experience with PySpark or distributed data processing frameworks.
- Exposure to AI/ML workflows and tools such as Snowflake ML.
- Knowledge of Infrastructure as Code (IaC) tools like Terraform.
- Experience with version control systems and agile methodologies.
Key Responsibilities:
- Architect and implement scalable, secure, and high-performance data platforms using modern technologies.
- Design end-to-end data architecture solutions to support real-time, near-real-time, and batch data processing needs.
- Lead data modeling efforts including normalization, relational design, and support for distributed SQL databases.
- Integrate diverse data sources including internal systems, external APIs, and third-party feeds.
- Develop and manage ETL/ELT pipelines, including near real-time replication solutions using tools like Qlik Replicate.
- Apply best practices in data governance, metadata management, and data quality frameworks.
- Collaborate with DevOps teams to build and maintain automated CI/CD pipelines for data workloads.
- Optimize cloud-based data solutions leveraging platforms such as Snowflake and YugabyteDB.
- Work with API-based data ingestion and delivery for both internal and external applications.
- Support reporting and dashboarding using tools like Sigma.
- Contribute to the development of data-intensive applications using Python and Java.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.