Overview
On Site
Contract - W2
Skills
Data Engineering
Analytics
Apache Kafka
Amazon Web Services
Microsoft Azure
Apache Flink
Apache Spark
Amazon Kinesis
Streaming
Google Cloud
Google Cloud Platform
Data Flow
Extract, Transform, Load (ETL)
ELT
Informatica
Cloud Computing
Management
CQRS
Event Sourcing
Grafana
Terraform
Continuous Integration
Continuous Delivery
Conflict Resolution
Problem Solving
Communication
Kubernetes
Orchestration
Database
SQL
NoSQL
Real-time
Analytical Skill
Job Details
Data Infrastructure Engineer
6 Month Contract to Hire
San Antonio, TX (Hybrid, 4 days onsite)
THIS IS A DATA ENGINEER ROLE REQUIRING CLOUD/PIPELINE AUTOMATION EXPERIENCE
Need:
Fivetran / Informatica IDMC / Azure Data Factory (or similar automation experience)
ETL/ELT
REQUIRED:
Data Engineering
Cloud Engineering
Infrastructure Engineering
Event Platforms/Data Streaming
IaC
Project Overview
Building and managing cloud-based data pipelines
Working with streaming platforms like Kafka, Kinesis, or Event Hubs
Using modern ETL/ELT tools like Fivetran and Informatica's IDMC
Supporting event-driven architectures and real-time analytics
Leveraging Infrastructure as Code, CI/CD, and Kubernetes
Evaluation Notes
Senior candidates should demonstrate depth across multiple tools and strong architectural understanding, and ideally have led delivery or designed resilient, scalable systems.
Preferred Qualifications:
* Extensive experience with cloud and data platform infrastructure engineering.
* Extensive experience with event platforms, on-prem or cloud (Kafka, Redpanda, AWS Kinesis, Azure Event Hubs, Google Cloud Pub/Sub), and stream processing frameworks (Flink, Spark Streaming, Amazon Kinesis Data Streams, Google Cloud Dataflow).
* Experience with ETL/ELT tools such as Fivetran and/or Informatica's cloud product, IDMC; demonstrated ability to design, implement, and optimize data pipelines, including automating connector configuration, data transformation, and schema management.
* Understanding of event-driven architectures, CQRS, and event sourcing.
* Proficiency in monitoring and observability tools (Datadog, Prometheus, Grafana, or similar).
* Strong experience with Infrastructure as Code (IaC) tools (e.g., Terraform, CloudFormation).
* Proven ability to automate tasks and implement CI/CD pipelines.
* Strong problem-solving and communication skills.
* Experience with Kubernetes and container orchestration.
* Experience with various database technologies (SQL and NoSQL).
* Experience supporting real-time analytical needs.
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.