Senior Big Data Engineer- Remote

Big Data, AWS, Kafka, Kubernetes
Full Time
Depends on Experience

Job Description

We are looking for a Senior/Staff Software/Platform Engineer who can design and code critical components of a platform that interconnects data pipelines, working in a team of 5-10 engineers. Our customer is one of the world's largest CPG companies, based in Beaverton, Oregon, with operations worldwide. The ideal candidate is a proactive, driven, and technology-proficient software engineer with a solid data engineering background, experience in event-messaging system design, and strong coding skills. The candidate must be able to quickly design, build, and extend services from scratch.



Responsibilities:

  • Develop and extend a recently started data platform to support big data pipelines in the consumer data space
  • Own end-to-end development of specific components
  • Contribute to project discussions, collaborate directly with the architect team, and present results to key stakeholders
  • Design, build, and continuously enhance the project codebase
  • Act as an onsite-timezone force multiplier for a distributed team of engineers and managers
  • Write detailed design documentation; present design decisions and the reasoning behind them
  • Work in a team of industry experts using cutting-edge Big Data technologies to develop solutions for deployment at massive scale
  • Design data infrastructure with privacy and security as cross-cutting concerns
  • Set coding and deployment best practices



Requirements:

  • 6+ years of experience designing and coding platform solutions for Big Data pipelines
  • 3+ years of experience working with event-messaging systems; Kafka is a big plus
  • 2+ years coding and deploying services running on Kubernetes
  • Python and Spark knowledge is required
  • Experience working with AWS
  • Experience with enterprise data warehouses
  • Strong understanding of the challenges in building end-to-end big data pipelines for a large variety of use cases at scale
  • Strong communication skills


What will be a big plus:

  • Experience with Scala
  • Experience with EMR
  • Experience with Snowflake
  • Understanding of microservices and how to architect/design scalable solutions on Kubernetes
  • Understanding of the challenges of working with many disparate big data technologies
  • Worked with big data pipelines at terabyte/petabyte scale
  • Worked with HDFS
  • Understanding of how to run Spark on Kubernetes
  • Experience working with Big Data scheduling technologies and their APIs (e.g., Airflow)
  • Experience with JVM build systems (Gradle, Maven)

We offer:

  • Opportunity to work on bleeding-edge projects
  • Work with a highly motivated and dedicated team
  • Competitive salary
  • Flexible schedule
  • Medical insurance
  • Benefits program
  • Corporate social events


Placement and Staffing Agencies need not apply. We do not work with C2C at this time.

At this moment, we are not able to process H1B transfers.


About us:

Grid Dynamics is an engineering services company known for transformative, mission-critical cloud solutions in the retail, finance, and technology sectors. We have architected some of the busiest e-commerce services on the Internet and have never had an outage during peak season. Founded in 2006 and headquartered in San Ramon, California, with offices throughout the US and Eastern Europe, we focus on big data analytics, scalable omnichannel services, DevOps, and cloud enablement.

Dice Id : RTX145791
Position Id : 7523594
Originally Posted : 2 weeks ago