Solution Architect L3

Dallas, TX, US • Posted 9 hours ago • Updated 9 hours ago
Full Time
Travel Required
On-site
Depends on Experience

Job Details

Skills

  • Microsoft Azure
  • Machine Learning (ML)
  • Data Engineering
  • Data Modeling
  • Databricks
  • Kubernetes
  • Solution Architecture

Summary

As a Solution Architect (L3), you will be the technical authority bridging business requirements and engineering execution. You will design end-to-end data and application architectures on Microsoft Azure, leveraging Databricks for large-scale data engineering and Power BI for enterprise analytics and reporting. You will lead cross-functional technical discussions, define data platform strategy, and ensure that solutions are scalable, resilient, secure, and aligned with organisational technology goals. This is a high-impact senior individual contributor role requiring deep cloud and data engineering expertise alongside strong stakeholder communication skills.

Key Responsibilities

Solution Architecture & Design

  • Define and own end-to-end solution architecture for complex, data-intensive, multi-service systems on Microsoft Azure.
  • Create and maintain architecture blueprints, HLDs, LLDs, data flow diagrams, and Architecture Decision Records (ADRs).
  • Evaluate build-vs-buy decisions and drive technology selection with clear trade-off analysis across the Azure ecosystem.
  • Ensure solutions satisfy non-functional requirements: performance, scalability, reliability, data freshness SLAs, and security.

Data Engineering & Platform Architecture

  • Architect end-to-end data pipelines using Azure Databricks spanning ingestion, transformation (Bronze / Silver / Gold layers), and serving.
  • Design and govern data lakehouse solutions using Azure Data Lake Storage Gen2 (ADLS Gen2) and Delta Lake.
  • Define data modelling standards (star schema, data vault, One Big Table (OBT)) aligned to business reporting requirements in Power BI.
  • Architect real-time and batch ingestion patterns using Azure Event Hubs, Azure Data Factory (ADF), and Databricks Structured Streaming.
  • Drive adoption of Databricks Unity Catalog for data governance, lineage tracking, and fine-grained access control.
  • Design cost-efficient Databricks cluster strategies (job clusters, instance pools, spot instances) and monitor DBU consumption.
  • Establish data quality frameworks using Great Expectations, Databricks Lakehouse Monitoring, or equivalent tooling.
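In a Databricks pipeline the Bronze / Silver / Gold stages above would be PySpark transformations writing Delta tables; as a simplified, framework-free sketch of that layering contract (all field names and rules hypothetical):

```python
# Minimal sketch of Medallion-layer promotion logic. In production these
# functions would be PySpark/Delta transformations; the schema, the
# "order_id" business key, and the quality rules are illustrative only.

def to_silver(bronze_rows):
    """Bronze -> Silver: deduplicate on a business key and enforce types."""
    seen, silver = set(), []
    for row in bronze_rows:
        key = row.get("order_id")
        if key is None or key in seen:
            continue  # drop malformed or duplicate records
        seen.add(key)
        silver.append({"order_id": key, "amount": float(row["amount"])})
    return silver

def to_gold(silver_rows):
    """Silver -> Gold: aggregate into a reporting-ready summary."""
    total = sum(r["amount"] for r in silver_rows)
    return {"order_count": len(silver_rows), "total_amount": total}

bronze = [
    {"order_id": 1, "amount": "10.5"},
    {"order_id": 1, "amount": "10.5"},   # duplicate ingested twice
    {"order_id": 2, "amount": "4.0"},
    {"amount": "99.0"},                  # malformed: missing business key
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'order_count': 2, 'total_amount': 14.5}
```

The point of the contract is that each layer only trusts the guarantees of the layer below it: Silver guarantees uniqueness and typed columns, so Gold can aggregate without re-validating.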

Analytics & Reporting Architecture

  • Design scalable semantic layer and data models powering Power BI enterprise reports and dashboards.
  • Define DirectQuery vs Import vs Composite model strategies for optimal Power BI performance and data freshness.
  • Architect Power BI Premium / Fabric workspace strategies, including capacity planning and deployment pipelines (Dev / Test / Prod).
  • Guide teams on DAX optimisation, row-level security (RLS), and Power BI paginated reports for operational reporting.
  • Integrate Power BI with Azure Databricks via Partner Connect or Lakehouse connector for live analytics on Delta tables.

Azure Cloud & Infrastructure

  • Architect cloud-native solutions leveraging Azure services: Azure Synapse Analytics, Azure Purview, Azure Key Vault, Azure Monitor, and APIM.
  • Define Infrastructure-as-Code (IaC) standards using Terraform or Bicep for repeatable, auditable deployments.
  • Design Azure networking topology: VNets, Private Endpoints, NSGs, and hub-spoke patterns to secure data flows.
  • Guide teams on containerisation (Docker, Azure Kubernetes Service) and Databricks-native MLflow for ML model management.
  • Define disaster recovery, backup, and high-availability strategies across the Azure data platform stack.

Technical Leadership

  • Serve as the primary technical point of contact across data engineering, analytics, product, and business teams.
  • Provide technical mentorship to senior data engineers, analytics engineers, and tech leads.
  • Lead architecture reviews, proof-of-concepts, and data platform migration spikes.
  • Drive engineering best practices: modular pipeline design, idempotency, schema evolution, and CI/CD for data workflows.
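The idempotency practice named above means a replayed batch must leave the target in the same state; a minimal sketch of a keyed upsert (the pure-Python analogue of a Delta `MERGE`, with hypothetical table and key names):

```python
# Illustrative sketch of an idempotent upsert ("MERGE") step: re-running
# the same batch produces the same target state. In Databricks this would
# be a Delta MERGE keyed on the same business key.

def merge_upsert(target, batch, key="id"):
    """Upsert batch rows into target keyed on `key`; returns a new target."""
    merged = {row[key]: row for row in target}
    for row in batch:
        merged[row[key]] = row  # insert new keys, overwrite existing ones
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "status": "open"}]
batch = [{"id": 1, "status": "closed"}, {"id": 2, "status": "open"}]

once = merge_upsert(target, batch)
twice = merge_upsert(once, batch)  # replaying the batch changes nothing
print(once == twice)  # True
```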

Stakeholder Engagement & Governance

  • Partner with data owners, business analysts, and C-suite stakeholders to translate analytical requirements into robust data products.
  • Present platform architecture, roadmaps, and data strategy to executive leadership with clarity and confidence.
  • Establish data governance frameworks covering cataloguing (Azure Purview), data classification, and lineage.
  • Ensure compliance with data privacy regulations (PDPB, GDPR) and security standards (ISO 27001, SOC 2) across the data platform.

Required Qualifications

Experience

  • 8-12 years of overall IT experience, with at least 4 years in a Solution Architect, Principal Data Engineer, or Senior Data Architect role.
  • Proven track record of architecting and delivering large-scale data platforms on Microsoft Azure in production.
  • Hands-on experience with Azure Databricks including workspace administration, cluster management, Delta Live Tables, and Unity Catalog.
  • Strong experience delivering Power BI enterprise analytics solutions: semantic modelling, RLS, Premium capacities, and deployment pipelines.
  • Experience in Agile / SAFe delivery environments with strong DataOps and CI/CD practices for data pipelines.

Azure Data Platform Proficiency

  • Deep expertise in Azure Data Lake Storage Gen2, Delta Lake, and the Medallion (Bronze / Silver / Gold) architecture pattern.
  • Proficient in Azure Data Factory for pipeline orchestration, and Azure Event Hubs / Azure Service Bus for streaming ingestion.
  • Strong knowledge of Azure Synapse Analytics, Azure SQL Database / Managed Instance, and Cosmos DB.
  • Hands-on with Azure Purview (Microsoft Purview) for data cataloguing, classification, and lineage.
  • Experience with Azure Key Vault, Azure Active Directory / Entra ID, and Azure Private Link for platform security.
  • Familiarity with Azure Monitor, Log Analytics, and Databricks monitoring integrations for observability.

Databricks & Data Engineering

  • Expert-level Spark programming in PySpark and/or Scala for large-scale batch and streaming transformations.
  • Deep knowledge of Delta Lake features: ACID transactions, time travel, Z-ordering, liquid clustering, and schema enforcement.
  • Experience with Databricks Workflows (formerly Jobs), Delta Live Tables (DLT), and Databricks Asset Bundles for CI/CD.
  • Familiarity with Databricks MLflow for experiment tracking and model registry (ML workloads are a plus, not mandatory).
  • Ability to design cost-optimised cluster policies, autoscaling strategies, and spot-instance configurations.
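Cost-guardrail cluster policies like those described above are defined as JSON and registered through the Databricks Cluster Policies REST API; a hedged sketch of one such payload (the limits, policy name, and the specific attributes chosen are illustrative assumptions, not a recommended production policy):

```python
import json

# Sketch of a cost-guardrail cluster policy in the shape accepted by the
# Databricks Cluster Policies REST API
# (POST /api/2.0/policies/clusters/create). Values are illustrative only.

def build_cost_policy(max_workers=8):
    definition = {
        "autoscale.max_workers": {"type": "range", "maxValue": max_workers},
        "autotermination_minutes": {"type": "fixed", "value": 30},
        "azure_attributes.availability": {
            "type": "fixed",
            # prefer Azure spot instances, fall back to on-demand
            "value": "SPOT_WITH_FALLBACK_AZURE",
        },
    }
    # The API expects the policy `definition` serialised as a JSON string.
    return {"name": "cost-guardrail-policy", "definition": json.dumps(definition)}

payload = build_cost_policy()
# Registration would look roughly like (workspace_url and token supplied
# by the caller):
# requests.post(f"{workspace_url}/api/2.0/policies/clusters/create",
#               headers={"Authorization": f"Bearer {token}"}, json=payload)
print(payload["name"])
```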

Power BI & Analytics

  • Strong Power BI development skills: data modelling (star schema), DAX measures, calculated columns, and report/dashboard design.
  • Experience with Power BI Premium / Fabric capacity management, paginated reports, and Dataflows Gen2.
  • Proficiency in defining composite model strategies (DirectQuery + Import) and optimising query performance with aggregations.
  • Knowledge of Power BI REST APIs, XMLA endpoints, and Tabular Editor for advanced semantic model management.
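As one concrete use of the Power BI REST APIs mentioned above, a semantic-model refresh can be triggered programmatically; a sketch of building that call, where the workspace/dataset GUIDs and token acquisition (via Entra ID) are placeholders the caller would supply:

```python
# Sketch of triggering a dataset refresh via the Power BI REST API
# ("Refresh Dataset In Group"). IDs and the bearer token are placeholders.

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(workspace_id, dataset_id):
    """Build the POST URL for the refresh-dataset-in-group operation."""
    return f"{API_ROOT}/groups/{workspace_id}/datasets/{dataset_id}/refreshes"

url = refresh_url("<workspace-guid>", "<dataset-guid>")
# The actual call (requests assumed) would look roughly like:
# requests.post(url, headers={"Authorization": f"Bearer {token}"},
#               json={"notifyOption": "MailOnFailure"})
print(url)
```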

Soft Skills

  • Exceptional written and verbal communication: able to author crisp architecture documents and present confidently to diverse audiences.
  • Strong analytical and problem-solving mindset with the ability to navigate ambiguity and competing priorities.
  • Collaborative leader who can influence without authority across data engineering, analytics, product, and business stakeholders.

Education & Certifications

  • B.E. / B.Tech / M.Tech in Computer Science, Information Technology, or a related field (or equivalent industry experience).
  • Microsoft Certified: Azure Data Engineer Associate (DP-203) required.
  • Databricks Certified Associate / Professional Data Engineer strongly preferred.
  • Microsoft Certified: Azure Solutions Architect Expert (AZ-305) or Power BI Data Analyst (PL-300) a strong plus.

Preferred Qualifications

  • Experience with Microsoft Fabric (OneLake, Fabric Data Engineering, Real-Time Analytics), the strategic evolution of the Azure data platform.
  • Exposure to dbt (data build tool) for SQL-based transformation layers and analytics engineering workflows.
  • Familiarity with Apache Iceberg or Apache Hudi as alternative open table formats alongside Delta Lake.
  • Prior experience with Azure Machine Learning or Databricks Model Serving for operationalising ML models.
  • Background in FinTech, HealthTech, Retail, or Manufacturing verticals with complex regulatory data requirements.
  • Contributions to open-source data projects or published technical content (blogs, conference talks, white papers).
  • Exposure to data mesh principles and federated data ownership models.

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91095132
  • Position Id: 8943412

Company Info

About Zapcom Group

Zapcom bridges the gap between imagination and realization. With precision product engineering, we shape ideas into tangible solutions, driving your aspirations forward with every innovation.

We are passionate about building digital products and platforms that can bend revenue and cost curves. We design, build, operate and optimize technology for our clients by leading their digital transformation journey.
