Overview
Work Setting: On Site
Compensation: Depends on Experience
Employment Type: Contract - W2
Duration: Contract - 12 Month(s)
Skills
Change Data Capture
Terraform
Docker
Kubernetes
GCP
Data Modeling
SQL
Job Details
Role: Google Cloud Platform Developer
Location: Dallas, TX (Onsite)
Duration: 12 Months
Responsibilities:
Design, build, and maintain reliable batch and streaming data pipelines to support analytical and operational workloads.
Implement Change Data Capture (CDC) mechanisms to stream data from OLTP sources into data lakes or data warehouses.
Develop and optimize data workflows for ingestion, transformation, and storage using SQL, Spark, or cloud-native tools.
Engineer data platforms and infrastructure using Terraform, Docker, and Kubernetes in AWS, Azure, or Google Cloud Platform environments.
Integrate observability and logging into data pipelines using tools like New Relic, Datadog, or Prometheus for monitoring pipeline health and performance.
Ensure high test coverage using unit/integration tests to maintain data quality and system reliability.
Collaborate with data scientists, analysts, and platform engineers to deliver scalable and secure data infrastructure.
Educational & Required Qualifications:
Must have 5+ years of experience in application data modeling, stored procedures, entity-relationship diagrams, document models, SQL, and creating DDLs and DMLs.
Must have 2+ years of experience with cloud-native data engineering on AWS, Azure, or Google Cloud Platform (e.g., Glue, Synapse, BigQuery, Dataflow).
Strong hands-on experience with SQL, DDLs, DMLs, Data Migration, Database Management, and large-scale data ingestion pipelines.
Strong working experience with databases and data tools such as Microsoft SQL Server, SSMS, Azure Data Studio, PostgreSQL, and MongoDB.
Strong hands-on experience building data processing programs in languages such as Java, Python, Node.js, or C#.
Proficiency with SQL Server Change Data Capture (CDC) and with streaming platforms (e.g., Kafka, Azure Event Hubs, AWS Kinesis).
Experience working with data lakes, data warehouses, and structured/unstructured data.
Exposure to containerization using Docker and orchestration with Kubernetes.
Infrastructure provisioning and deployment automation using Terraform and CI/CD tools.
Familiarity with observability tools for pipeline performance and operational visibility.
Optional: Experience with Java, .NET, or Node.js for supporting data platform services or utilities.
Exposure to Google Maps ODRD or Orleans Framework is a plus.
Soft Skills:
Strong team collaboration and clear communication abilities.
Willingness to learn and adapt to new technologies and tools.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.