Azure Data Architect

  • Raritan, NJ
  • Posted 4 hours ago | Updated 3 hours ago

Overview

  • On Site
  • Depends on Experience
  • Contract - W2
  • Contract - Independent
  • Contract - 12 Month(s)

Skills

  • Databricks
  • Azure Cloud

Job Details

Azure Data Architect

Raritan, NJ (Onsite/Hybrid). Candidates in the nearby tri-state area who are willing to relocate will also be considered.

The ideal candidate should have hands-on experience with Databricks and the Azure cloud; for this Architect position, comprehensive Databricks expertise is a critical requirement.

Must-Have Skills:

  • SAP data experience
  • Proficiency in Databricks and Azure Cloud, including Databricks Asset Bundles
  • Holistic vision of the overall data strategy
  • Proficiency in data streaming and data modeling
  • Experience architecting at least two large-scale big data projects
  • Strong understanding of data scaling and its complexities
  • Experience with data archiving and purging mechanisms

We have an immediate requirement for resources to support a critical project focused on improving our manufacturing site assessments and performance metrics. The objective of this initiative is to combine assessment scores with actual performance data to derive meaningful insights, identify key correlations, and recommend actionable steps for enhancing performance.

About Syren Cloud

Syren Cloud Technologies is a cutting-edge company specializing in supply chain solutions and data engineering. Its intelligent insights, powered by technologies like AI and NLP, give organizations real-time visibility and support proactive decision-making. From control towers to agile inventory management, Syren unlocks unparalleled success in supply chain management.

Role Summary

An Azure Data Architect is responsible for designing, implementing, and maintaining the data infrastructure within an organization. They collaborate with both business and IT teams to understand stakeholders' needs and unlock the full potential of data. They create conceptual and logical data models, analyze structural requirements, and ensure efficient database solutions.

Job Responsibilities

  • Act as a subject matter expert, providing best-practice guidance on data lake and ETL architecture frameworks suitable for handling both structured and unstructured big data.
  • Drive business and service-layer development with the customer by identifying new opportunities to expand existing solutions and create new ones.
  • Provide hands-on subject matter expertise to build and implement Azure-based Big Data solutions.
  • Research, evaluate, architect, and deploy new tools, frameworks, and patterns to build sustainable Big Data platforms for our clients.
  • Facilitate and/or conduct requirements workshops.
  • Collaborate on the prioritization of technical requirements.
  • Collaborate with peer teams and vendors on solution design and delivery.
  • Take overall accountability for project delivery.
  • Work collaboratively with Product Management, Data Management, and other architects to deliver the cloud data platform and Data as a Service.
  • Consult with clients to assess current problem states, define desired future states, define the solution architecture, and make solution recommendations.
  • Design and implement scalable and reusable data models to support analytical and operational use cases, ensuring compatibility with various business applications.
  • Optimize data models for performance and scalability, addressing key challenges in data latency, volume, and complexity in a big data environment.
  • Ensure data quality and consistency by establishing robust data validation and governance mechanisms during the modeling process.
  • Collaborate with data scientists, engineers, and business stakeholders to translate business requirements into effective and efficient data models.

Job Requirements

  • Degree in computer science or equivalent preferred
  • Demonstrable experience in architecture, design, implementation, and/or support of highly distributed applications with Azure cloud and Databricks.
  • 8+ years of hands-on experience with data modeling, database design, data mining, and segmentation techniques.
  • Working knowledge of and experience with cloud architectures (e.g., SaaS, PaaS, IaaS) and the ability to address the unique security considerations of cloud computing.
  • Experience architecting solutions for cloud environments such as Microsoft Azure and/or Google Cloud Platform.
  • Experience with debugging and performance tuning in distributed environments
  • Strong analytical skills with the ability to collect, organize, analyze, and broadcast significant amounts of information with attention to detail and accuracy
  • Experience working with both structured and unstructured data.
  • Must have Python and PySpark experience.
  • Experience in ML and/or graph analysis is a plus.