Technical Architect - Azure Data Services (Full-time Only)

Overview

On Site
$160,000 - $170,000
Full Time

Skills

SQL Server
Azure SQL Database
ADF
Databricks
PySpark
Data Lake
Data warehouse
Data Marts
Spark
Scala
Kafka
HIVE
HBase
Machine Learning
R
Python
WAF
Azure SQL DW
Synapse
Azure Data Lake

Job Details

Technical Architect

Santa Clara, CA 95054 (First Preference) / Remote (Second Preference)

Direct-Hire / FTE

Job/Role Description:

We are hiring a Technical Architect for our Enterprise Data Platform and Data Engineering (Data/AI) practice. The ideal candidate will have:

  • Experience working with customer Business/IT teams, with excellent verbal and written communication and the confidence to speak throughout client conversations
  • The ability to work independently as the overall technical lead for the project
  • The ability to understand requirements, proactively clarify doubts and assumptions with team members, and interact proficiently with clients
  • Strong team-player skills and the ability to guide leads and developers; self-motivated and passionate about becoming a technology expert
  • A proactive, adaptive approach and flexibility with working hours (e.g., during deployments and customer calls)

Expertise demonstrated across a minimum of 6 projects (10+ years of experience):

  • SQL Server, Azure SQL Database, ADF, Databricks, and PySpark; knowledge and experience in ETL architectures, ETL pipeline design, database schema design, data modeling techniques, and Data Quality stages across data environments such as Data Lakes, Data Warehouses, and Data Marts, leveraging the Medallion Architecture; Databricks Unity Catalog and Lakehouse
  • Designing DB schemas and data models on RDBMS and data warehouse platforms, leveraging the Medallion Architecture on the Azure Data Platform; programming and optimizing DB objects, views, stored procedures, and functions

Experience demonstrated across a minimum of 2 projects (3+ years of experience):

  • Azure SQL DW/Synapse and Azure Data Lake or Blob Storage; experience working with high-volume data and large objects
  • Deep knowledge of the various Data Engineering components within Microsoft Fabric is preferable (including Lakehouse)
  • Designing a storage and data pipeline solution using Azure Data Services

Other Must-Have Skills:

  • Experience working with high-volume data, large objects, and a variety of data types; leveraging the Azure Well-Architected Framework (WAF) for assessments is preferable
  • Following design and coding best practices; experience working in challenging environments with unclear requirements and collaborating with the team to refine them
  • Ability to offer innovative, modern, and effective ideas and solutions

Secondary Skills / Good to Have

  • DP-203 Certified and DP-700 Certified
  • Experience with Power BI; experience with Databricks using Scala or Python
  • Experience with Big Data technologies (Spark, Scala, Kafka, HIVE, HBase, etc.)
  • Experience with Stream Analytics, IoT, Machine Learning, and R/Python
  • Experience with C# development for web applications and services
  • Exposure to Reporting/Dashboard applications using Power BI or any other BI reporting tool
  • Exposure to Machine Learning techniques, R/Python in a Data Engineering context is preferable
  • Exposure to any Data Governance processes and tools is preferable
  • Solid knowledge of Azure cloud infrastructure and environments
  • Proven experience working in Agile and Waterfall development methodologies is preferable
  • Experience working in DevOps environments is preferable
