Overview
On Site
$100,000 - $120,000
Full Time
Skills
MuleSoft
Kafka
Hadoop or Informatica
API
DataWeave
ETL
Job Details
Job Description
Skill: MuleSoft/Kafka ETL Developer
Must Have Technical/Functional Skills:
Primary Skill: MuleSoft
Secondary: Kafka, Hadoop, or Informatica.
Experience: Minimum 8 years.
Roles & Responsibilities:
MuleSoft:
- Strong knowledge of the MuleSoft Anypoint Platform, including the Anypoint Studio IDE, Design Center, API Manager, and Runtime Manager.
- Experience with API-led connectivity and microservices architecture.
- Experience with MuleSoft connectors for various data sources and systems.
- Experience with DataWeave scripting and RAML specifications.
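As a hedged illustration of the RAML skills listed above (the API title, endpoint, and fields here are hypothetical examples, not taken from this posting), a minimal RAML 1.0 specification of the kind authored in Design Center might look like:

```raml
#%RAML 1.0
title: Customer API
version: v1
mediaType: application/json

/customers:
  get:
    description: Retrieve the list of customers.
    responses:
      200:
        body:
          example: |
            [{ "id": 1, "name": "Jane Doe" }]
```

In API-led connectivity, a spec like this is typically designed first, then implemented in Anypoint Studio with DataWeave handling the payload transformations.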
Kafka:
- Experience with Kafka brokers, producers, and consumers.
- Knowledge of Kafka concepts such as topics, partitions, and offsets.
- Experience reading and writing Kafka messages using an ETL tool such as Informatica or Hadoop.
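As a hedged sketch of the producer-side setup these skills imply (the broker address and serializer choices below are illustrative assumptions, not requirements from this posting), a minimal Kafka producer configuration might look like:

```properties
# Broker(s) to bootstrap from (hypothetical address)
bootstrap.servers=localhost:9092
# Serializers for message keys and values
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer
# Wait for all in-sync replicas to acknowledge each write
acks=all
```

Consumer configurations are analogous, adding a `group.id` and deserializers; offsets track each consumer group's position within a topic partition.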
Hadoop:
- Experience with HDFS, MapReduce, and Hadoop ecosystems.
- Understanding of data storage and processing techniques in Hadoop.
Informatica Integration:
- Work with Informatica PowerCenter or other Informatica products for ETL (Extract, Transform, Load) processes and data integration.
- API-Driven Connectivity: Build APIs that enable seamless data flow between systems, following API-led connectivity principles.
- Drive value delivery and continuously improve the product by effectively using data such as feedback and metrics (e.g., quality, delivery rate) to identify opportunities.