Job Title: Data Architect (Snowflake & Enterprise Data Platform)
Location: Richmond, VA (Hybrid)
Duration: Long-Term Contract
Experience Required: 7+ Years
Job Overview
We are looking for a hands-on Data Architect with strong expertise in Snowflake, Oracle Exadata, and modern data engineering frameworks. In this role, you will lead the design and implementation of scalable, high-performance data platforms supporting enterprise analytics, BI, and AI/ML initiatives.
You will define architecture standards, build robust data pipelines, and ensure the data ecosystem is secure, governed, and optimized for performance and cost.
Key Responsibilities
Data Architecture & Engineering
- Design and implement scalable ETL/ELT pipelines for batch and incremental processing
- Architect end-to-end data solutions across on-prem and cloud (Azure/AWS)
- Build high-performance pipelines for structured and semi-structured data
- Lead data ingestion from Oracle Exadata to Snowflake
Snowflake Platform Leadership
- Define standards for RBAC, warehouse sizing, multi-cluster strategy, and performance tuning
- Implement advanced Snowflake features (Streams, Tasks, Data Sharing)
- Ensure platform governance, security, and cost optimization
Data Modeling & Analytics Enablement
- Design analytical data models (Star Schema, Snowflake Schema, Data Vault)
- Develop semantic layers and curated datasets for BI and self-service analytics
- Support advanced analytics, AI/ML pipelines, and data products
Governance & Best Practices
- Establish standards for:
  - Data transformation and schema evolution
  - Error handling and reprocessing
  - Metadata management and versioning
- Implement DataOps practices (CI/CD, testing, observability, data quality checks)
Collaboration & Leadership
- Act as technical authority for data engineering and ETL architecture
- Mentor engineers and review pipeline designs and performance
- Collaborate with business, analytics, and engineering teams
Required Skills
- 7+ years in Data Architecture / Data Engineering / Enterprise Data Platforms
- Strong hands-on experience with:
  - Snowflake (Streams, Tasks, RBAC, Performance Tuning, Data Sharing)
  - Oracle Exadata
- Expertise in SQL and ETL/ELT frameworks
- Experience with tools like ADF, Airflow, dbt, Databricks, Informatica, or Talend
- Strong knowledge of data modeling (Star/Snowflake Schema, Data Vault, Semantic Layer)
- Experience with Azure or AWS cloud platforms
Nice to Have
- Experience with Dataiku (ML/analytics pipelines)
- Certifications in Snowflake, Azure, Databricks, or Informatica