Overview
Remote
$60 - $70
Accepts corp to corp applications
Contract - Independent
Contract - W2
Contract - 18 Month(s)
Skills
React
Java
Kafka
Job Details
- Visa: USC
- Remote role
- Must have a LinkedIn page with a photo; LinkedIn experience must match the resume.
- 60/40 (React/Java) split or heavier on React: at least 2-3 years of Java, but React-heavy, with a minimum of 8 years of React experience.
- Must work CST hours.
- 8+ years of experience at a minimum.
Role Context & Strategic Importance
- Department is undergoing a major architectural shift:
- Rewriting a Salesforce-based client experience that was a proof of concept 3 years ago.
- Moving client data gathering functionality outside of Salesforce due to API limits and misalignment between product and engineering visions.
- Building new integrations between Salesforce, Java services, and Kafka.
- The new experience will likely involve a new database, React front-end, and Java/Kafka back-end.
- This hire is critical to executing that vision.
Ideal Candidate Profile
Hiring Manager is looking for someone who is:
- Technically Hands-On
- Not just a lead or architect; she wants someone who codes daily.
- Able to mentor a junior Java engineer who isn't yet senior-level.
- Comfortable being the only React developer on the team initially.
- 8+ years of experience preferred.
- Strong in React, Java, and Kafka; these are non-negotiable core skills.
- Experience with Avro (Kafka serialization) is a plus, but not required.
- Should be able to troubleshoot using Datadog and ELK logs.
- Willing to learn Salesforce enough to be on-call (no deep Salesforce experience needed).
- Flexible & Collaborative
- Willing to work across multiple platforms and sync data between them.
- Open to new patterns and modernization (e.g., moving away from WSDLs).
- Comfortable with DevOps practices, even if the team is still maturing in that area.
Technical Stack & Environment
- Frontend
- React: Crucial for the new client data gathering experience.
- UX is already handled; this role is about building components, not designing flows.
- Backend
- Java (Spring Boot): REST APIs, Kafka integrations.
- Kafka: JSON and Avro formats. RabbitMQ is used minimally.
- Integration Layer: Between Salesforce and other experiences.
- EIS (Enterprise Integration Services): Moving toward Kafka-based integrations.
- DevOps & Deployment
- OpenShift: Main deployment platform (Docker experience is acceptable).
- GitHub Actions: Migrating from Bitbucket.
- Playwright: Used for automation testing.
- CI/CD Pipelines: Already established; no need to reinvent.
- Monitoring & Logging
- Transitioning from ELK to Datadog.
- Candidate should be able to read logs and follow troubleshooting cookbooks.
- Cloud & Security
- Hybrid cloud: OpenShift on-prem + AWS (S3 buckets for future data).
- Azure: Used for REST API authentication (nice to have).
Interview Process
Step 1: Technical Interview
- Conducted by Principal Engineer and Salesforce Lead.
- Includes live coding exercises in Java and React.
- Focused on technical depth, not resume walkthrough.
Step 2: Interview with Department Head
- Focus on leadership competencies, team fit, and personality.
- Looking for someone who can mentor, collaborate, and drive modernization.