Chicago, Illinois
Today
Requirements
- 8 or more years of experience in Big Data architecture and distributed data warehousing solutions.
- Proficiency in Spark and Scala for high-performance data processing and framework development.
- Experience with Kafka and Aerospike for real-time data streaming and low-latency storage.
- Experience with AWS cloud infrastructure, including EKS, EMR, Redshift, Athena, and S3.
- Experience with Java and Python programming for building scalable data pipelines.
- Experience in Hadoop ecosystems, SQL
Third Party, Contract
Depends on Experience