Technical Architect for Snowflake Data Pipelines (Real-Time Focus)


Overview

Remote
$100,000 - $110,000
Full Time

Skills

Amazon Web Services
Artificial Intelligence
Advanced Analytics
Agile
Analytics
Apache Kafka
Batch Processing
Cloud Computing
Collaboration
Communication
Data Architecture
Data Engineering
Data Integration
Data Loading
Data Modeling
Data Security
ELT
Evaluation
Extract, Transform, Load
FOCUS
MySQL
ODS
Oracle
POC
Performance Tuning
PostgreSQL
Privacy
Real-time
Regulatory Compliance
SQL
Scalability
Scrum
Snowflake Schema
Streaming
Technical Drafting
Fivetran

Job Details

Job Title:
Technical Architect for Snowflake Data Pipelines (Real-Time Focus)
Location, Length & Start Date:
Remote (U.S.-based only); September 1 start
Position Overview:
We are seeking a Technical Architect with strong expertise in designing and implementing data pipelines in Snowflake to lead the integration and transformation of data from various source systems, including Oracle ODS, Postgres, and MySQL. This role is critical in transitioning from batch processing to near real-time data pipelines, enabling AI agents and advanced analytics with minimal latency.
The ideal candidate has hands-on experience with Snowflake's real-time capabilities (Snowpipe Streaming, Dynamic Tables, Kafka Connector) and modern data integration tools (OpenFlow, Fivetran). You will own the technical design and oversight of end-to-end data ingestion and transformation pipelines, architecting for scalability, performance, security, reliability, and maintainability.
Key Responsibilities:
- Design and implement near real-time data ingestion pipelines in Snowflake, transitioning from batch processing.
- Evaluate and recommend solutions for real-time data loading (Snowpipe Streaming, Dynamic Tables, Kafka Connector, OpenFlow, Fivetran).
- Lead proof-of-concept (POC) efforts for real-time data integration using Fivetran, OpenFlow, or Snowflake-native streaming.
- Optimize ETL/ELT processes for low-latency data movement from Oracle ODS, Postgres, and MySQL.
- Define best practices for streaming data architecture in Snowflake.
- Collaborate with AI/analytics teams to ensure data pipelines support agentic AI and real-time analytics.
- Monitor and optimize pipeline performance, cost, and scalability.
- Ensure compliance with data security, governance, and privacy policies.
Required Qualifications:
- Must be located and authorized to work in the United States.
- 7+ years in data engineering/architecture, with 3+ years in Snowflake.
- Hands-on experience with real-time/near real-time data pipelines in Snowflake.
- Expertise in Oracle ODS, Postgres, and MySQL as source systems.
- Experience with Snowflake OpenFlow, Snowpipe Streaming, Kafka Connector, and Dynamic Tables.
- Knowledge of Fivetran (preferred for POC evaluation).
- Proficiency in SQL, data modeling, and performance tuning for low-latency workloads.
- Familiarity with cloud platforms (AWS) and modern data stack tools.
- Strong communication skills with the ability to lead technical discussions on streaming architectures.
- Experience working in Agile/Scrum environments.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Blue.Cloud