Data Solution Architect || Remote Work

Overview

Remote
Contract - W2

Skills

Data Lake
Databricks
Apache Kafka
Microsoft Power BI
Snowflake Schema
Data Flow
Documentation
Automated Testing
Onboarding
Legacy Systems
Data Governance
Enterprise Architecture
Technical Writing
Prototyping
Streaming
Analytics
Data Quality
Scalability
Risk Management
Auditing
Disaster Recovery
Computer Science
Solution Architecture
Data Engineering
Microsoft Azure
Amazon Web Services
Google Cloud Platform
Cloud Computing
Microservices
Kubernetes
API
ELT
Data Modeling
ETL (Extract, Transform, Load)
Orchestration
Metadata Management
Data Warehouse
Real-time Data Processing
Data Integration
Management
Privacy
Regulatory Compliance
Encryption
Access Control
Penetration Testing
Analytical Skills
Problem Solving
Conflict Resolution
Leadership
Agile
DevOps
Continuous Integration
Continuous Delivery
Project Management
Collaboration
JIRA
Financial Software
Payment Gateways
Blockchain

Job Details

Job Title: Data Solution Architect

Location: Buffalo, NY (Remote)

Duration: Long Term Contract

What You'll Be Doing:

  • Architect and deliver cloud-based data platforms and scalable data pipelines (e.g., Azure Data Lake, Databricks, Kafka).
  • Develop and implement ETL/ELT frameworks to ingest, transform, and integrate data from diverse sources.
  • Design and deploy data warehouses, lakehouses, and analytics environments supporting Power BI and Snowflake.
  • Define and enforce standards around data governance, quality, metadata, and compliance.
  • Collaborate with Enterprise Architecture to align solutions with strategic reference models and standards.
  • Lead evaluations and proof-of-concepts for emerging data technologies and architectures.
  • Produce deliverables including architecture diagrams, data flow maps, security models, and documentation.
  • Guide Agile/DevOps teams through CI/CD pipelines, automated testing, and infrastructure-as-code implementations.

Expected Deliverables:

  • End-to-end solution architectures for data platforms, including detailed diagrams, technology stack specifications, and integration patterns.
  • ETL/ELT pipeline designs and implementation plans for onboarding new data sources and modernizing legacy systems.
  • Data governance frameworks, including policies for data quality, lineage, and access control, aligned with Enterprise Architecture standards.
  • Technical documentation packages: architecture blueprints, operational runbooks, and security protocols.
  • Prototypes and proof-of-concept implementations demonstrating new data engineering capabilities (e.g., streaming analytics, automated data quality checks).
  • Performance benchmarks and scalability assessments for deployed solutions.
  • Risk mitigation plans, including security audit findings and disaster recovery strategies.

What You'll Need to Have:

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 5+ years of experience in solution architecture or technical implementation, including 2+ years in data engineering or enterprise data platforms.
  • Relevant certifications such as AWS/Azure Solutions Architect or Certified Data Engineer.
  • Hands-on experience with cloud platforms such as Azure, AWS, or Google Cloud Platform.
  • Proficiency in cloud-native development, including microservices and Kubernetes.
  • Experience with API design (REST) and integration frameworks (ETL/ELT).
  • Strong knowledge of data modeling, data pipeline orchestration, metadata management, and data lifecycle practices.
  • Familiarity with data lakes, data warehouses, and lakehouse architectures.
  • Experience with real-time and batch data processing, data integration, and automation.
  • Understanding of financial systems such as payment networks, liquidity management, and settlement processes.
  • Working knowledge of global data privacy and compliance standards (e.g., GDPR, CCPA).
  • Familiarity with security practices including data encryption, access controls, and penetration testing.
  • Strong analytical problem-solving abilities for complex data systems.
  • Proven leadership in Agile/DevOps environments, including CI/CD pipelines and infrastructure-as-code.
  • Proficiency with project management and collaboration tools such as Jira.
  • Demonstrated success deploying scalable financial systems such as digital wallets, payment gateways, or blockchain-based platforms.

I look forward to hearing from you.

Best Regards,

Faiz Ahmad | Sr. Resource Coordinator
590 Enterprise Dr | Lewis Center, OH 43035
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.