Overview
On Site
Accepts corp to corp applications
Contract - Long term
Skills
Python
Elasticsearch
Amazon Web Services
Terraform
EMR
Git
NoSQL
SQL
Database
Kafka
Continuous Integration/Delivery
Agile
Big Data
PostgreSQL
Docker
Redis
Amazon Kinesis
Data Pipelines
Streaming
Identity and Access Management
Real-Time
Datasets
Workflow Management
OWL
Job Details
Title: Data Engineer
Location: Chicago, IL (only locals - onsite interview)
What is in it for you?
You will work as a Data Engineer with deep expertise in Python, AWS, big data ecosystems, and SQL/NoSQL technologies, driving scalable, real-time data solutions with CI/CD and stream-processing frameworks.
Responsibilities:
- Proficient developer in multiple languages (Python is a must), with the ability to quickly learn new ones.
- Expertise in SQL (complex queries; relational databases, preferably PostgreSQL) and NoSQL databases (Redis and Elasticsearch).
- Extensive big data experience, including EMR, Spark, Kafka/Kinesis, and optimizing data pipelines, architectures, and datasets.
- AWS expert with hands-on experience in Lambda, Glue, Athena, Kinesis, IAM, EMR/PySpark, and Docker.
- Proficient in CI/CD development using Git, Terraform, and agile methodologies.
- Comfortable with stream-processing systems (Storm, Spark Streaming) and workflow management tools (Airflow).
- Exposure to knowledge graph technologies (graph databases, OWL, SPARQL) is a plus.
Experience:
- 8+ years
Location:
- Remote
Educational Qualifications:
- Engineering degree (BE/ME/BTech/MTech/BSc/MSc).
- Technical certification in multiple technologies is desirable.
Skills:
Mandatory skills:
- Python, PySpark, AWS
Good-to-have skills:
- EMR, Spark, Kafka/Kinesis