Data Engineer -- Alpharetta, GA candidates only / No relocation

  • Alpharetta, GA
  • Posted 21 hours ago | Updated 20 hours ago

Overview

On Site
Depends on Experience
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 12 Month(s)
No Travel Required

Skills

Amazon Web Services
Apache Hadoop
Apache Kafka
Application Development
C
C++
Cloud Computing
Collaboration
Communication
Dashboard
Data Engineering
Database
Elasticsearch
Extract, Transform, Load (ETL)
Linux
Python
Real-time
SQL
Snowflake
Streaming
Systems Architecture
Writing
Kafka
Data Engineer

Job Details

We have built job frameworks to run large-scale ETL pipelines with Kafka, Elasticsearch (ELK), Snowflake, and Hadoop.
Our applications run both on-premises and in the cloud. Hundreds of dashboards built for business and operations provide insight and actionable items in real time.
We are looking for a streaming data engineer who can:
- Understand distributed systems architecture, design, and trade-offs.
- Design and develop ETL pipelines with a wide range of technologies.
- Work on the full development cycle, including requirements definition, design, implementation, testing, and deployment.
- Communicate well and collaborate with various teams.
- Learn new technologies and work independently.
Requirements:
- 5 years of application development experience, with at least 2 years of data engineering with Kafka
- Hands-on experience writing and running applications on Linux
- 5 years of coding experience in at least one of the following languages: Python, Ruby, Java, C/C++, Go
- SQL and database experience
Optional:
- AWS or other cloud technologies
- Elasticsearch (ELK)

Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.

About Zealogics