Job Description:
Job Title: Data Architect
Location: Snoqualmie, WA (onsite 5 days per week)
Mandatory Skills: Data Architecture, Data Pipelines, Data Lakes
Data Architect (7+ Years of Experience) - 1 Position
We are looking for a senior Data Architect with 7+ years of proven experience in designing, implementing, and optimizing data architecture for complex and large-scale systems. The ideal candidate will own the end-to-end architecture across structured, semi-structured, and unstructured data, and will work closely with engineering, analytics, and business teams to enable scalable data solutions.
This role combines strategic leadership, hands-on technical design, and cross-functional collaboration to deliver modern, secure, and performant data platforms.
________________________________________
Key Responsibilities
- Define and implement enterprise-level data architecture strategy
- Lead the design and implementation of data pipelines, data lakes, and data warehouses
- Build architectural patterns for data ingestion, transformation, governance, and cataloging
- Design schemas and optimize performance for analytical and operational workloads
- Define data quality, lineage, and metadata management frameworks
- Ensure compliance with privacy and security regulations (GDPR, CCPA, etc.)
- Collaborate with business stakeholders to gather data needs and translate them into scalable solutions
- Provide technical leadership to data engineers and guide best practices across systems
________________________________________
Required Skills & Experience
- 7+ years in Data Architecture, including hands-on work with modern data stacks
- Deep experience with:
  - Relational databases (PostgreSQL, SQL Server, MySQL, etc.)
  - Big Data platforms (Spark, Hadoop, Hive)
  - Cloud-native data tools (Azure Synapse, Databricks, Snowflake, Amazon Redshift)
  - Data lakehouse designs and lake formats (Delta Lake, Parquet, Avro)
- Strong background in ETL/ELT tools (e.g., Apache Airflow, Azure Data Factory, dbt)
- Solid understanding of API-based data integration, real-time streaming (Kafka, Kinesis), and batch pipelines
- Experience with data modeling techniques (3NF, star schema, data vault, etc.)
- Familiarity with data governance tools such as Collibra, Alation, or Microsoft Purview
- Proficient in at least one programming language (Python, Scala, Java)