Azure Architect (Hybrid)

  • Boston, MA
  • Posted 10 hours ago | Updated 10 hours ago

Overview

Hybrid
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 12 Month(s)

Skills

Accessibility
Amazon Web Services
Apache Airflow
Cloud Computing
Collaboration
Command-line Interface
Continuous Improvement
Data Architecture
Data Engineering
Data Flow
Data Governance
Data Lake
Data Management
Data Processing
Data Security
Data Warehouse
Databricks
Decision-making
Emerging Technologies
Google Cloud Platform
Grafana
Innovation
Management
Microsoft Azure
Optimization
Orchestration
Performance Tuning
Process Automation
Productivity
PySpark
Python
Regulatory Compliance
SQL
Snowflake
Storage
System Monitoring
Unity Catalog
Workflow

Job Details

Title: Azure Architect

Location: Boston, MA (Hybrid)

Term: Contract

Job summary: Join our dynamic team as an Architect, where you'll leverage your 10 years of experience to design and implement cutting-edge data solutions.

You'll work with technologies like Azure Synapse, AWS, and Databricks to drive innovation and efficiency. This hybrid role offers the opportunity to make a significant impact on our data strategy and contribute to our company's success.

Experience: 10 years

Required Skills: Python, Grafana, Datadog, Snowflake, Azure Databricks, PySpark, Apache Airflow, AWS, Apache Hudi, Data Build Tool, Data Lake, Databricks CLI, Azure Synapse Data Warehouse, Snowflake SQL, Snowflake Tasks, Databricks Unity Catalog Admin, Databricks Delta Lake, Apache Iceberg, Fivetran, AWS / Google Cloud Platform / Azure Cloud, Databricks Assistant

Responsibilities:

- Design and implement scalable data architectures using Azure Synapse and Data Lake to optimize data processing and storage solutions.

- Collaborate with cross-functional teams to integrate AWS, Google Cloud Platform, and Azure Cloud services, ensuring seamless data flow and accessibility.

- Develop and maintain data pipelines using PySpark and Python, enhancing data processing efficiency and reliability.

- Utilize Grafana and Datadog for monitoring and performance tuning, ensuring optimal system performance and availability.

- Implement and manage Apache Airflow for orchestrating complex data workflows, improving automation and process efficiency.

- Leverage Apache Hudi and Apache Iceberg for data versioning and time travel capabilities, enhancing data management practices.

- Utilize Data Build Tool and Fivetran for data transformation and integration, streamlining data workflows.

- Optimize Snowflake Tasks and Snowflake SQL for efficient data querying and analysis, driving data-driven decision-making.

- Administer Databricks Unity Catalog and CLI for data governance and management, ensuring data security and compliance.

- Implement Databricks Delta Lake for reliable data lakes, improving data consistency and reliability.

- Utilize Databricks Assistant and Azure Databricks for collaborative data engineering, enhancing team productivity.

- Ensure data solutions align with business objectives, contributing to the company's strategic goals and societal impact.

- Stay updated with industry trends and emerging technologies, driving continuous improvement and innovation.

Qualifications:

- Possess extensive experience with Azure Synapse and Data Lake, demonstrating expertise in data architecture design.

- Have a strong background in AWS, Google Cloud Platform, and Azure Cloud services, showcasing proficiency in cloud integration.

- Demonstrate proficiency in PySpark and Python, highlighting skills in data processing and automation.

- Experience with Grafana and Datadog, emphasizing capabilities in system monitoring and performance tuning.

- Proficient in Apache Airflow, showcasing skills in workflow orchestration and process automation.

- Familiarity with Apache Hudi and Apache Iceberg, demonstrating knowledge in data versioning and management.

- Experience with Data Build Tool and Fivetran, highlighting skills in data transformation and integration.

- Proficient in Snowflake SQL and Snowflake Tasks, showcasing expertise in data querying and analysis.

- Experience with Databricks Unity Catalog and CLI, emphasizing skills in data governance and management.

- Familiarity with Databricks Delta Lake, demonstrating knowledge in data lake optimization.

- Experience with Azure Databricks, showcasing skills in collaborative data engineering.

Certifications Required: Azure Solutions Architect Expert, AWS Certified Solutions Architect, Databricks Certified Data Engineer Associate

Years of Experience: 10

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Digitive LLC