Data Architect

Remote • Posted 5 hours ago • Updated 5 hours ago
Full Time
Remote
Depends on Experience

Job Details

Skills

  • Data Architect
  • Data Engineer
  • Data Lakes

Summary

Title: Data Architect
Location: Remote

Ekman Associates, Inc. is a Southern California based company focused on the following services: Management Consulting, Professional Staffing Solutions, Executive Recruiting and Managed Services.

Summary: As a Data Architect, you will lead the strategy, design, and implementation of a centralized data ecosystem that powers analytics and operational platforms across the organization. This role emphasizes Microsoft Fabric as the core platform to unify real-time event streams and historical/curated datasets across an enterprise landscape. You'll define target-state architectures, data modeling standards, and governance patterns while partnering with engineering, product, security, and analytics teams to ensure data is trusted, scalable, and actionable.

Responsibilities:
  • Microsoft Fabric Architecture & Strategy: Define the end-to-end architecture for Fabric (Lakehouse/Warehouse, OneLake, Data Factory, Real-Time Analytics/Eventstream, semantic models) to support cross-domain Operations & Technology use cases.
  • Enterprise Data Centralization: Drive consolidation of fragmented datasets into a centralized, discoverable platform—designing domain-aligned data products and shared datasets to reduce duplication and improve time-to-insight.
  • Real-Time + Historical Data Design: Architect solutions that blend streaming data (operational telemetry, events, logs) with historical data (transactions, master/reference data) to enable both operational visibility and long-term analytics.
  • Lakehouse & Warehouse Patterns: Establish patterns for bronze/silver/gold layers, Delta-based designs, data virtualization where appropriate, and performance strategies (partitioning, optimization) for scalable consumption.
  • Data Modeling & Semantic Standards: Define canonical models (dimensional, data vault, or hybrid), conformed dimensions, and semantic layer practices to ensure consistent reporting and self-service analytics across teams.
  • Integration & Pipeline Architecture: Guide ingestion and orchestration patterns across APIs, databases, files, SaaS platforms, and event streams—ensuring resiliency, observability, and cost-aware scaling within Fabric.
  • Data Governance, Security & Quality: Partner with security, privacy, and governance stakeholders to implement access controls, lineage, data contracts, quality frameworks, and lifecycle management to build trust and compliance.
  • Technical Leadership & Enablement: Serve as a hands-on architectural leader—reviewing designs, mentoring engineers, setting standards, and translating business needs into scalable technical blueprints and roadmaps.
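The bronze/silver/gold layering referenced above can be sketched roughly as follows — a minimal plain-Python illustration of the pattern, not a Fabric/Delta implementation (a real lakehouse would use Delta tables and Spark); all function and field names here are hypothetical:

```python
def bronze(raw_events):
    """Bronze: land raw events as-is, tagged with ingestion metadata."""
    return [{**e, "_layer": "bronze"} for e in raw_events]

def silver(bronze_rows):
    """Silver: cleanse and conform -- drop malformed rows, normalize types."""
    return [
        {"device": r["device"], "temp_c": float(r["temp_c"])}
        for r in bronze_rows
        if "device" in r and "temp_c" in r
    ]

def gold(silver_rows):
    """Gold: consumption-ready aggregate (average temperature per device)."""
    totals = {}
    for r in silver_rows:
        t = totals.setdefault(r["device"], [0.0, 0])
        t[0] += r["temp_c"]
        t[1] += 1
    return {dev: total / count for dev, (total, count) in totals.items()}

raw = [
    {"device": "sensor-1", "temp_c": "20.0"},
    {"device": "sensor-1", "temp_c": "22.0"},
    {"device": "sensor-2", "temp_c": "18.5"},
    {"malformed": True},  # dropped at the silver layer
]
curated = gold(silver(bronze(raw)))
print(curated)  # {'sensor-1': 21.0, 'sensor-2': 18.5}
```

The point of the layering is that each stage has one job: bronze preserves raw fidelity, silver enforces schema and quality, and gold serves curated, query-ready data products.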

Qualifications:
  • Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field, or equivalent experience.
  • 6+ years of experience in data engineering and/or data architecture roles, including ownership of enterprise-scale data platforms.
  • Strong experience designing lakehouse and/or warehouse architectures, including layered data patterns (raw → curated → consumable).
  • Expertise in SQL and strong proficiency with Python (and/or PySpark) for data engineering and automation.
  • Experience architecting data solutions on Microsoft Fabric and/or Azure (or equivalent cloud platforms), including security and operational best practices.
  • Proven ability to design for both real-time (streaming/event-driven) and batch pipelines, and to align architecture to operational and analytical outcomes.
  • Strong communication skills—able to align stakeholders, document architectures, and influence technical direction across multiple teams.
  • Deep hands-on experience with Microsoft Fabric capabilities (OneLake, Lakehouse/Warehouse, Data Factory, Eventstream/Real-Time Analytics, semantic modeling/Power BI).
  • Experience with event streaming and messaging architectures (e.g., Kafka, Kinesis, Event Hubs) and patterns for CDC and near-real-time ingestion.
  • Experience with data governance tooling and practices (cataloging, lineage, data contracts, privacy classification, retention).
  • Familiarity with CI/CD and infrastructure automation (e.g., GitHub Actions/Azure DevOps, Terraform/Bicep) and environment promotion strategies.
  • Experience building robust observability (logging, alerting, SLAs/SLOs, data quality monitoring) for enterprise pipelines.
  • Background supporting large-scale Operations & Technology environments (service operations, workplace technology, infrastructure, endpoint management, identity, security, network).
  • Media, entertainment, or consumer technology experience—especially where reliability, scale, and cross-domain data integration are critical.
  • Strong presentation skills, with the ability to create compelling data narratives.
Qualified Candidates Only: If you wish to learn more about this opportunity and additional qualifications/responsibilities, please submit your resume. To learn more about Ekman Associates, Inc., please visit our website.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
  • Dice Id: 91010724
  • Position Id: 26-00015