JOB DESCRIPTION: Only W2 (No C2C)
Job Title: Java Developer with Kafka Experience
Location: AZ, TX, and CA
Job Type: Contract W2
Experience: 8+ years required
Client is seeking a skilled Java Developer with expertise in Kafka to join our dynamic team. The ideal candidate will have strong experience in developing and maintaining high-performance, scalable Java applications and integrating them with Kafka for efficient real-time data streaming and processing.
Qualifications: Bachelor's degree i
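The Kafka integration described above typically starts with producer configuration. As a minimal sketch using the standard Apache Kafka client property names (the broker address is a placeholder, and the values shown are common defaults, not a prescription):

```java
import java.util.Properties;

// Minimal Kafka producer configuration sketch. Keys are the standard
// Apache Kafka client property names; the broker address is a placeholder.
public class ProducerConfigSketch {
    public static Properties producerProps(String bootstrapServers) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrapServers);
        // String serializers are the common choice for text payloads.
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        // "all" waits for the full in-sync replica set: more durability, more latency.
        props.put("acks", "all");
        // Idempotence avoids duplicate records when the producer retries.
        props.put("enable.idempotence", "true");
        return props;
    }

    public static void main(String[] args) {
        Properties props = producerProps("localhost:9092");
        props.forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```

Passing this `Properties` object to a `KafkaProducer` constructor (from the `kafka-clients` library) is the usual next step; it is omitted here so the sketch runs without a broker.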
Confluent Kafka Administrator
Ability to install and maintain all Confluent Platform components: Kafka clusters, Control Center, Schema Registry, security plug-ins
Kubernetes (preference is Google Kubernetes)
Strong background in the Terraform tool
Critical to be fluent in CI/CD pipelines and development environments
Requires a good understanding of overall Data Loss Prevention (DLP)
Overall strong DevOps background with the ability to support multiple initiatives and programs
Strong communication with abi
Hi, We are recruiting for one of our major clients:
Role: Java Developer with AWS/Docker/Jenkins/Kafka (W2 opportunity)
Location: 100% Remote
Duration: 4+ Months
We are seeking a highly skilled Java Developer to join our team on a remote basis. The ideal candidate will have a strong skill set in Java, Kafka, Elasticsearch, Jenkins, Docker, and DevOps with AWS. The client is primarily looking for a Java developer, but a candidate with a combination of Java and DevOps skills, particularly with Kafk
Job Description:
Performance testing experience of 7+ years
Good at understanding requirements from the client and estimating the work model
Expert in JMeter and LoadRunner and in performance test execution
Knowledge of Python
Experience working with Kafka
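Performance test execution of the kind described above ends in summary statistics (p90/p95 latency and similar figures that JMeter and LoadRunner report). As an illustrative sketch in plain Java, here is a nearest-rank percentile over a batch of latency samples; this is a generic formula, not any particular tool's API:

```java
import java.util.Arrays;

// Illustrative nearest-rank percentile over latency samples (in ms),
// similar in spirit to the summary statistics load-test tools report.
public class LatencyStats {
    public static double percentile(double[] samples, double pct) {
        double[] sorted = samples.clone();
        Arrays.sort(sorted);
        // Nearest-rank method: rank = ceil(p/100 * N), 1-indexed.
        int rank = (int) Math.ceil(pct / 100.0 * sorted.length);
        return sorted[Math.max(0, rank - 1)];
    }

    public static void main(String[] args) {
        double[] latencies = {12, 15, 11, 90, 14, 13, 200, 16, 12, 15};
        // With these samples the 90th percentile lands on the 90 ms outlier.
        System.out.println("p90 = " + percentile(latencies, 90)); // prints "p90 = 90.0"
    }
}
```

Different tools use slightly different percentile definitions (interpolated vs. nearest-rank), which is worth checking when comparing JMeter and LoadRunner numbers.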
Genesis10 is currently seeking a Solutions Engineer with our client in the financial industry located in Richmond, VA and Chandler, AZ. This is a 12+ month contract position.
Responsibilities:
Responsible for matching current technology with current needs
Engage in the evaluation and installation of software, hardware, and other types of support equipment into a workable network that supports a variety of functions
Responsibilities include providing problem determination and resolution, ins
Urgent Requirement - Confluent Kafka Architect - 100% Remote
Role: Confluent Kafka Architect
Location: 100% Remote
Client: Cognizant/Life Science Client
Job Type: Contract
Responsibilities:
Solid experience and knowledge in the deployment of Kafka (Apache/Confluent)
Physical deployment across multiple environments
Optimization and tuning based on performance metrics
Set up best practices, standards, and patterns
Develop and automate processes for onboarding
Develop playbooks for troubleshooting and sup
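The optimization-and-tuning responsibility above usually comes down to a small set of broker and topic settings. As a sketch, the keys below are standard Apache Kafka broker configs, but the values are illustrative starting points only, not tuning recommendations:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative broker-side tuning knobs. Keys are standard Kafka broker
// configuration names; values are example starting points, not advice.
public class BrokerTuningSketch {
    public static Map<String, String> tuningOverrides() {
        Map<String, String> cfg = new LinkedHashMap<>();
        cfg.put("num.network.threads", "8");        // threads handling request I/O
        cfg.put("num.io.threads", "16");            // threads handling disk I/O
        cfg.put("log.retention.hours", "72");       // how long segments are kept
        cfg.put("default.replication.factor", "3"); // durability baseline for new topics
        cfg.put("min.insync.replicas", "2");        // pairs with producer acks=all
        return cfg;
    }

    public static void main(String[] args) {
        tuningOverrides().forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```

In practice these land in `server.properties` (or the Confluent equivalent) and are adjusted iteratively against the performance metrics the role mentions.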
Job summary
We are seeking a dedicated Program Manager with a background as a Business Associate to join our team. The ideal candidate will have 6 to 10 years of experience and strong proficiency in Kafka administration. This role involves managing complex projects, ensuring timely delivery, and collaborating with cross-functional teams to drive business success. Experience in Python and shell scripting is a plus.
Roles & Responsibilities
Carry out SRE duties for the Kafka Streaming Platform
Have thorough unders
Job Title: Enterprise Architect, AWS with Kafka experience / AWS Kafka Architect
Client: Dell/Wells Fargo
Location: Dallas, TX, Hybrid (3 days a week); remote is also fine
Experience: 15+ years
Duration: 2+ years
This role involves collaborating with business leaders, application managers, and other architects to ensure data solutions align with the company's architecture standards and policies.
Key Responsibilities:
Design and implement Kafka architecture and data pipelines. Collaborate with business
Hello All,
Direct Client: Confluent Kafka Platform Engineer/Developer/Admin
Location: Tampa, FL
Duration: Long Term
Rate: Open
Senior Confluent Kafka Admin who has hands-on experience with Kafka connectors, establishing connectivity, sorting out firewall issues, setting up environments following industry standards, and guiding teams in some Python development.
Job Description: Confluent Kafka Platform installation, configuration, administration, and support. Troubleshoot the Kafka platform in
Senior Confluent Kafka Admin/Developer
Tampa, FL
Duration: Long Term, Hybrid
Hands-on experience with Kafka connectors, establishing connectivity, sorting out firewall issues, setting up environments following industry standards, and guiding the team's development in Python.
Thanks & Regards,
Srinivas, Astrosoft Technologies
Kafka Consultant
Pleasanton, CA or Plano, TX (Hybrid, 3 days in office)
Client is seeking a Kafka expert with strong experience in Kafka broker components and Confluent Platform 7.x. The role involves building clusters with CI/CD tools, automating repetitive tasks, and managing Kafka infrastructure.
Responsibilities:
Expertise in Kafka brokers and related components
Experience with Confluent Platform 7.x
Build and manage Kafka clusters using CI/CD tools
Automate tasks like topic and connector c
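The "automate repetitive tasks" responsibility above often starts with turning a declarative topic spec into tool invocations. As a sketch, the flags below match the standard `kafka-topics` CLI that ships with Kafka; the broker address and topic names are placeholders:

```java
// Sketch of topic-creation automation: render a declarative spec into a
// kafka-topics CLI invocation. Flags match the standard kafka-topics tool;
// the bootstrap address and topic name are placeholders.
public class TopicCommandSketch {
    public static String createTopicCommand(String bootstrap, String topic,
                                            int partitions, int replicationFactor) {
        return String.join(" ",
                "kafka-topics", "--bootstrap-server", bootstrap,
                "--create", "--topic", topic,
                "--partitions", String.valueOf(partitions),
                "--replication-factor", String.valueOf(replicationFactor));
    }

    public static void main(String[] args) {
        System.out.println(createTopicCommand("localhost:9092", "orders", 6, 3));
    }
}
```

In a CI/CD pipeline the same spec would more commonly drive the `AdminClient` API or a Terraform provider directly, but the shape of the automation (spec in, idempotent operation out) is the same.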
Golang Technical Lead (Position 3)
Location: Phoenix, AZ (Onsite from Day 1)
Rate: $60/hr
Job Description
Below are the MUST HAVE requirements:
8+ years working in OO programming, with strong hands-on experience in Golang
Experience and knowledge of event-driven architectures, Kafka, security frameworks/methods, reactive programming, data structures, etc.
Experience using NoSQL databases; well versed in cloud deployments
Experience working on products at s
Role: Java Developer with DevOps & Kafka
Duration: 6+ Months with extension
Location: Plano, TX is the preference
Interview: One round of technical coding
Except: H1B, OPT, TN
Kafka Stream migration (major migration)
Must have:
Java
Kafka
AWS: S3, clusters, security groups, Kafka clusters
Linux & Unix
DevOps, CI/CD, Jenkins (CI/CD knowledge is OK, but they will have to learn it in the project and should be willing to do so)
Microservices
Job Description:
Confluent Kafka Platform installation, configuration, administration, and support.
Troubleshoot the Kafka platform in multiple types of environments.
Design and implement streaming solutions on the Kafka platform.
Integrate Kafka connectors with various data sources.
Desired Candidate Profile:
Installation, configuration, and administration of Confluent Kafka in the Hortonworks Data Platform.
Installation and integration of Kafka connectors with various sources of data.
Experience in container
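Integrating Kafka connectors, as described above, means supplying a flat key/value configuration that is normally posted as JSON to the Kafka Connect REST API. As a sketch using the FileStreamSource connector that ships with Apache Kafka (the file path and topic name here are placeholders):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of a Kafka Connect connector configuration: a flat key/value map,
// normally submitted as JSON to the Connect REST API. FileStreamSource
// ships with Apache Kafka; the file path and topic are placeholders.
public class ConnectorConfigSketch {
    public static Map<String, String> fileSourceConfig(String file, String topic) {
        Map<String, String> cfg = new LinkedHashMap<>();
        cfg.put("connector.class", "org.apache.kafka.connect.file.FileStreamSourceConnector");
        cfg.put("tasks.max", "1"); // the file source runs a single task
        cfg.put("file", file);     // source file to tail
        cfg.put("topic", topic);   // destination Kafka topic
        return cfg;
    }

    public static void main(String[] args) {
        fileSourceConfig("/var/log/app.log", "app-logs")
                .forEach((k, v) -> System.out.println(k + "=" + v));
    }
}
```

Production connectors (JDBC, S3, and the like) use the same map shape with their own `connector.class` and connector-specific keys.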
Job Description:
Must-have skills: IBM ACE/IIB 10, IBM MQ
JD for IIB/ACE ADM (application development and support)
Competencies:
Hands-on experience with IBM Integration Bus 10.0 / ACE
Has worked on BAU and support projects
Strong development experience with message models, applications, and message flows using IIB/ACE
In-depth knowledge of ESQL
In-depth knowledge of standards and technologies that enable messaging, including XML, WSDL, SOAP, JMS, HTTP, and SSL
Understanding of communication protocols: FTP(S), sFTP, AS2, HTTP
Our enterprise-level client is seeking to add a Middleware Kafka Architect to their team in Irving, TX. Please see below for full details--
Job Notes:
-6 months contract
-ASAP start date
-Onsite in Irving, TX
-Drug & background check required
Scope Details: A Java/Python Architect is responsible for designing, implementing, and maintaining Java-based applications. They collaborate with cross-functional teams to define and execute project requirements and timelines.
You will: Developing and testing so
To design, develop, implement, and maintain new IT solutions, or changes/enhancements to existing solutions, that align with business initiatives and corporate strategies.
Required:
Established track record with Kafka technology, with hands-on production experience and a deep understanding of Kafka architecture and the internals of how it works
Design, develop, and manage Kafka-based data pipelines
Recent experience with large event-driven financial transactions to external sources (on-prem to