Role: Google Cloud Platform Senior Data Engineer (Onsite Bentonville, AR)
Location: Onsite Bentonville, AR (remote work permitted for 1-2 months prior to relocating)
Duration: 6-month contract to hire
Interview Process: 2 rounds
Work Authorization: Independent visa holder
Engagement Type: Exclusive requirement
Pay Rate: $60-$65/Hr on W2
Open to candidates willing to relocate
Role Overview
We are seeking a highly skilled Senior Data Engineer to join a growing data engineering organization and contribute to the design, development, and scaling of modern, cloud-based data platforms. This role will work closely with engineering leadership and cross-functional teams to drive high-impact data solutions, supporting large-scale, low-latency, and data-intensive applications.
This position requires strong hands-on engineering expertise, architectural thinking, and an ownership mindset, with direct exposure to the final decision maker.
Key Responsibilities
- Design, build, and maintain scalable, high-performance data systems and services.
- Develop and support microservices-based applications using modern programming languages and frameworks.
- Architect and optimize data pipelines and storage solutions across SQL and NoSQL technologies.
- Collaborate with teams to support continuous integration and continuous delivery (CI/CD) best practices.
- Implement and manage high-throughput, low-latency messaging and streaming solutions.
- Leverage public cloud platforms and cloud-native services to deliver reliable, production-grade systems.
- Participate in technical discussions, reviews, and documentation to ensure best practices and knowledge sharing.
- Take ownership of deliverables while working effectively within a collaborative team environment.
Required Skills & Experience
- Strong experience with Java and Python, including application and service development.
- Solid understanding of microservices architecture and application design principles.
- Hands-on experience with SQL and NoSQL databases, such as:
  - BigQuery
  - Cosmos DB
  - Cassandra
  - PostgreSQL
- Experience with DevOps practices, including CI/CD pipelines and automated deployments.
- Experience working with Kafka or other high-volume, low-latency messaging systems.
- Deep experience with public cloud platforms and cloud-native services.
- Strong background in big data technologies, including Hadoop, Hive, and/or Spark.
- Excellent analytical and problem-solving skills.
- Strong technical writing, verbal communication, and presentation abilities.
- Demonstrated ability to take ownership and thrive in a fast-paced, team-oriented environment.
- Passion for technology, continuous learning, and engineering excellence.
Additional Notes
- This is an exclusive requisition with direct positioning to the client.
- Candidate will have exposure to the final decision maker.
- Onsite presence in Bentonville is required.