Senior Hybrid Cloud Data Engineer

Overview

On Site: Hybrid
Compensation: Depends on Experience
Contract - W2
Contract - 18 Month(s)

Skills

on prem
hybrid cloud
aws
amazon web services
terraform
python
docker
kubernetes
ci/cd
cloud data engineer

Job Details

Job Title: Senior Hybrid Cloud Data Engineer

Overview: We are looking for a highly skilled Hybrid Cloud Data Engineer to bridge the gap between legacy systems and modern cloud infrastructure. The ideal candidate will leverage the scalability and flexibility of the cloud while maintaining the control and reliability of on-prem systems, supporting critical decision-making and innovation.

Key Responsibilities:

Data Pipeline Development:

  • Build and manage ETL (Extract, Transform, Load) pipelines to move data between on-premises systems and cloud platforms.
  • Ensure pipelines are efficient, scalable, and capable of handling large volumes of data.

System Integration:

  • Design and implement solutions that enable interoperability between on-prem systems and cloud platforms in hybrid cloud models.
  • Facilitate data synchronization, ensuring consistency and availability across both environments.

Data Storage and Management:

  • Manage storage solutions for both on-prem and cloud systems, balancing performance, cost, and reliability.
  • Optimize the use of cloud-based storage (e.g., Amazon S3, Azure Blob Storage) with on-prem database systems.

Security and Compliance:

  • Implement robust security measures to safeguard data as it moves between on-prem and cloud environments.
  • Ensure compliance with regulatory and organizational policies, particularly in industries like finance or healthcare.

Performance Optimization:

  • Monitor and fine-tune data processing workflows to minimize latency and maximize efficiency.
  • Leverage cloud-native tools (e.g., AWS Glue, Azure Data Factory) alongside on-prem tools for streamlined operations.

Skills and Expertise:

  • Cloud Platforms: Proficiency in AWS, Azure, or Google Cloud, with knowledge of hybrid deployment patterns.
  • On-Prem Systems: Strong understanding of traditional database systems (e.g., SQL Server, Oracle) and data warehouses.

Programming:

  • Proficient in Python programming.

Containerization:

  • Familiarity with Docker and Kubernetes for managing hybrid workloads.

Networking:

  • Understanding of VPNs, firewalls, and other networking principles for hybrid connectivity.

Big Data Tools:

  • Knowledge of distributed processing tools like Hadoop, Spark, or Kafka.

*Please note: this position does not offer sponsorship, and candidates must be able to work on a W2 basis only.*

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.