Senior IBM Master Data Management (MDM) & Integration Developer

  • Lansing, MI

Overview

On Site
Based on experience
Full Time
Contract - W2
Contract - Independent

Skills

MASTER DATA MANAGEMENT
MDM
DATA WAREHOUSE
J2EE
JAVA

Job Details

Job Title: Senior IBM Master Data Management (MDM) & Integration Developer
Location: Lansing, MI
Duration: 1 year with possible extension


Description
Top Skills & Years of Experience:
- Strong hands-on experience with IBM InfoSphere MDM v11.x, including MDM data models, entities, transactions, batch processing, configuration, customization, and troubleshooting
- Strong understanding of MDM concepts and IBM MDM product capabilities, with the ability to understand, maintain, and enhance an existing MDM implementation built by a third-party vendor
- 8+ years of software development experience supporting enterprise-scale systems
- 8+ years of Java / J2EE experience, including Java, JSP, and REST/SOAP-based services, with strong production troubleshooting skills
- 5+ years of experience working with Linux/Unix operating systems, including command-line usage, log analysis, and system-level troubleshooting

Skill Descriptions:

  • Hands-on experience with Apache Kafka, including topic creation and configuration, producer and consumer development, message flow troubleshooting, and an understanding of ZooKeeper and Kafka KRaft concepts, including clusters managed by the Strimzi operator
  • Experience with batch processing using WildFly and Kafka
  • Experience working with application servers such as WildFly, including application deployments, startup, configuration, health checks, and runtime issue resolution
  • Experience integrating Kafka with WildFly-based applications and IBM MDM
  • Proficiency in writing and optimizing SQL queries in Oracle and Microsoft SQL Server environments
  • Strong Linux/Unix shell scripting skills (Bash), including automation and operational scripting
  • Experience creating, scheduling, and maintaining CRON jobs for batch and scheduled processes
  • Experience with CI/CD pipelines using Azure DevOps, Azure Repos (Git), or similar tools
  • Experience deploying and supporting applications in cloud and OpenShift Container Platform (OCP) containerized environments
  • Familiarity with cloud platforms such as AWS and/or Azure
  • Exposure to OpenShift and Kubernetes is a strong plus
  • Experience with ELK / Elastic Stack for log monitoring and troubleshooting, including correlating logs across Kafka, WildFly, and MDM
  • Experience working with data and supporting mission-critical production systems
  • Ability to independently own, maintain, and support complex systems with minimal external dependency
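
For candidates unfamiliar with the Strimzi operator mentioned above: Strimzi manages Kafka on Kubernetes/OpenShift through custom resources, so topic creation and configuration is typically done declaratively rather than with command-line tools. A minimal sketch of a `KafkaTopic` resource is shown below; the topic name, cluster label value, and settings are illustrative assumptions, not values from this posting.

```yaml
# Hypothetical Strimzi KafkaTopic resource; "mdm-party-updates" and
# "my-cluster" are placeholder names, not from the job description.
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaTopic
metadata:
  name: mdm-party-updates
  labels:
    # Must match the name of the Strimzi-managed Kafka cluster
    strimzi.io/cluster: my-cluster
spec:
  partitions: 3
  replicas: 3
  config:
    # Retain messages for 7 days (example setting)
    retention.ms: 604800000
```

Applying a resource like this (e.g. `kubectl apply -f topic.yaml`) lets Strimzi's Topic Operator create and reconcile the topic on the cluster, which is the usual workflow in the OCP environments this role describes.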

About Syntricate