Mid-Level Data Architect

Overview

Remote
$60 - $65 per hour
Contract - W2

Skills

HVR installation
HVR configuration
SAP
HVR troubleshooting
CDC
HVR
Oracle
SQL Server
SAP HANA
PostgreSQL
MySQL
Snowflake
Databricks
BigQuery
Redshift
AWS
Azure
GCP

Job Details

Role Overview:

As a Big Data Architect contractor, you will design and implement large-scale data solutions tailored to business needs. This is a hands-on role spanning architecture design, data replication, performance optimization, and cloud-based integration to deliver secure, high-volume, and scalable data systems.


Key Responsibilities:

  • Design and develop scalable big data architectures to handle large volumes of structured and unstructured data.

  • Collaborate with business and technical stakeholders to gather data requirements and translate them into technical solutions.

  • Implement robust data integration, processing, and storage solutions using big data technologies.

  • Ensure compliance with data governance, quality, and security standards.

  • Optimize architectures for cost-efficiency, performance, and scalability.


Primary Skill:

SAP: Advanced level (6-9 years of experience required)


Platform Management & Technical Responsibilities:

  • Install, configure, and upgrade HVR (now Fivetran HVR), including hubs, agents, and all supporting components.

  • Set up data source connections for various systems (Oracle, SQL Server, SAP HANA, PostgreSQL, MySQL) and cloud platforms (Snowflake, Databricks, BigQuery, Redshift).

  • Configure data replication pipelines, transformation rules, and channel definitions.

  • Optimize system performance: batch sizes, parallel execution, and network tuning.

  • Manage high-volume data replication and Change Data Capture (CDC) strategies (a tool-agnostic CDC sketch follows this list).

  • Monitor system health and ensure high availability of all agents and endpoints.
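
For context on the replication and CDC work described above, here is a minimal, tool-agnostic sketch of timestamp-based change capture in Python. It uses the standard-library sqlite3 module purely for illustration; the table, column names, and watermark value are hypothetical, and HVR itself performs log-based CDC configured through its own tooling rather than application code like this.

import sqlite3

def fetch_changes(conn, table, last_sync_ts):
    """Return rows modified since the previous sync watermark.

    Timestamp-based CDC: the source table carries an updated_at column,
    and each poll reads only rows newer than the stored watermark.
    """
    cur = conn.execute(
        f"SELECT id, payload, updated_at FROM {table} "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_sync_ts,),
    )
    return cur.fetchall()

# Demo against an in-memory database: one row older than the watermark,
# one newer. Only the newer row is captured.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, payload TEXT, updated_at TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "shipped", "2024-01-01T00:00:00"),
     (2, "pending", "2024-06-01T00:00:00")],
)

watermark = "2024-03-01T00:00:00"  # watermark persisted from the last run
for row in fetch_changes(conn, "orders", watermark):
    print(row)

Log-based CDC, as HVR implements it, instead reads committed changes from the database transaction log, which avoids polling overhead and captures deletes that a timestamp query would miss.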


Required Skills & Qualifications:

  • HVR/Fivetran HVR expertise: 5+ years of hands-on experience with installation, configuration, and troubleshooting.

  • Strong proficiency with databases: Oracle, SQL Server, SAP HANA, PostgreSQL, MySQL.

  • Experience with cloud data platforms (AWS, Azure, Google Cloud Platform) and warehouses such as Snowflake, BigQuery, and Redshift.

  • Deep understanding of ETL/ELT, data warehousing, and data integration patterns.

  • Scripting skills (e.g., shell scripting, Python) for automation and operations (a sample health-check sketch follows this list).

  • Solid understanding of CDC mechanisms and network/security best practices.

  • Strong problem-solving skills and ability to work with cross-functional teams.

  • Excellent verbal and written communication skills.
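
As an illustration of the scripting expectations above, here is a minimal Python health-check sketch for replication endpoints. The hostnames and port are hypothetical placeholders (a real deployment would take its hub and agent inventory, and listener ports, from the site's HVR configuration); a production check would normally live in the monitoring stack rather than an ad-hoc script.

import socket

# Hypothetical hub/agent endpoints; real hostnames and listener ports
# would come from the site's HVR deployment inventory.
ENDPOINTS = {
    "hvr-hub": ("hvr-hub.example.com", 4343),
    "oracle-agent": ("ora-agent.example.com", 4343),
    "snowflake-agent": ("sf-agent.example.com", 4343),
}

def is_reachable(host, port, timeout=3.0):
    """TCP connect check: True if the endpoint accepts a connection."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for name, (host, port) in ENDPOINTS.items():
    status = "up" if is_reachable(host, port) else "DOWN"
    print(f"{name:16s} {host}:{port:<5d} {status}")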


About Nineteen Eleven Solutions