Overview
Remote
Depends on Experience
Contract - W2
Contract - Independent
Contract - 12 Month(s)
Skills
HVR
Oracle
SQL Server
SAP HANA
PostgreSQL
MySQL
AWS
Azure
GCP
Snowflake
BigQuery
Redshift
ETL
Shell
Python
Job Details
Role: Big Data Architect
Work Location: (Remote) 2200 Ross Avenue, Dallas, TX
Duration: 12 Months
Level Required for Primary Skill: Advanced (6-9 years of experience)
Big Data Architect Responsibilities:
Platform Management: Install, configure, and upgrade HVR (now Fivetran HVR) software, including the Hub, agents (HVAs), and associated components, across source and target systems.
Connection & Replication Setup: Establish and manage data source connections for diverse databases (e.g., Oracle, SQL Server, SAP HANA, PostgreSQL, MySQL) and cloud platforms (e.g., Snowflake, Databricks, BigQuery, Redshift). Configure robust replication channels, defining source/target locations, selecting tables, and setting up data transformations.
Performance Optimization: Fine-tune HVR configurations to maximize data replication performance, including adjusting batch sizes, parallelization, and network settings. Proactively identify system capacity bottlenecks (e.g., storage, CPU, memory, network bandwidth) and recommend improvements to address them.
High-Volume CDC Implementation: Develop and execute strategies for high-volume data replication and efficient log-based Change Data Capture (CDC).
Operational Health: Ensure HVR agents are running optimally and maintain reliable connectivity to all source and target endpoints (a monitoring sketch follows this list).
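To make the operational-health expectation concrete, below is a minimal sketch of an agent reachability check. The host names, port number, and agent inventory are illustrative assumptions (HVR agent listener ports are site-specific), not details from this posting; a check like this would typically run on a schedule and feed an alerting system.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: verify HVR agent endpoints are reachable over TCP."""
import socket

# Assumed inventory of agent endpoints (hypothetical hosts and ports;
# the actual listener port is defined per installation).
AGENTS = {
    "oracle-src-agent": ("src-db01.example.com", 4343),
    "snowflake-tgt-agent": ("tgt-dw01.example.com", 4343),
}

def is_reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections, DNS failures, timeouts
        return False

if __name__ == "__main__":
    for name, (host, port) in AGENTS.items():
        status = "OK" if is_reachable(host, port) else "UNREACHABLE"
        print(f"{name:<22} {host}:{port}  {status}")
```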
Key Skills & Qualifications
HVR/Fivetran Expertise: Minimum of 5 years of hands-on experience with HVR (or Fivetran HVR) administration, configuration, and advanced troubleshooting.
Database Proficiency: Strong command of various database technologies (e.g., Oracle, SQL Server, SAP HANA, PostgreSQL, MySQL) and their respective Change Data Capture (CDC) mechanisms.
Cloud Data Integration: Proven experience with major cloud platforms (AWS, Azure, Google Cloud Platform) and cloud data warehouses (Snowflake, BigQuery, Redshift).
Data Concepts: Solid understanding of data warehousing concepts, ETL/ELT processes, and common data integration patterns.
Problem-Solving: Excellent analytical and problem-solving abilities to diagnose and resolve complex data flow issues.
Communication: Strong communication and collaboration skills to work effectively with diverse technical and business teams.
Automation: Scripting proficiency (e.g., Shell scripting, Python) for automating routine tasks and enhancing operational efficiency; see the lag-monitoring sketch after this list.
Networking & Security: Fundamental knowledge of network configurations and security principles as they apply to data replication.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.