What are the responsibilities and job description for the Senior IBM Master Data Management & Integration Developer - Hybrid position in Lansing, MI at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Sita Consulting Services, LLC, is seeking the following. Apply via Dice today!
Hi Candidates,
I have the position below; if you qualify, please share your resume to
Note: This position is open to C2C, 1099, and W2 candidates.
Position: Senior IBM Master Data Management (MDM) & Integration Developer
Client: State of Michigan
Location: Lansing, MI
Work mode: Hybrid
Interview: Video
Hours: 40 hours/week
Note: This is a hybrid position, accepting local candidates and non-local candidates willing to relocate at their own expense without delay. The role will be in the office 2 days a week upon start.
REQUIRED SKILLS (RANK IN ORDER OF IMPORTANCE):
- 8 years of software development experience supporting enterprise-scale systems
- Strong hands-on experience with IBM InfoSphere MDM v11.x, including MDM data models, entities, transactions, batch processing, configuration, customization, and troubleshooting
- Strong understanding of MDM concepts and IBM MDM product capabilities, with the ability to understand, maintain, and enhance an existing MDM implementation built by a third-party vendor
- 8 years of Java / J2EE experience, including Java, JSP, and REST/SOAP-based services, with strong production troubleshooting skills
- 5 years of experience working with Linux/Unix operating systems, including command-line usage, log analysis, and system-level troubleshooting
- Hands-on experience with Apache Kafka, including topic creation and configuration, producer and consumer development, message flow troubleshooting, and an understanding of ZooKeeper and Kafka KRaft concepts using the Strimzi operator
- Experience with batch processing using WildFly and Kafka
- Experience working with application servers such as WildFly, including application deployments, startup, configuration, health checks, and runtime issue resolution
- Experience integrating Kafka with WildFly-based applications and IBM MDM
- Proficiency in writing and optimizing SQL queries in Oracle and Microsoft SQL Server environments
- Strong Linux/Unix shell scripting skills (Bash), including automation and operational scripting
- Experience creating, scheduling, and maintaining CRON jobs for batch and scheduled processes
- Experience with CI/CD pipelines using Azure DevOps, Azure Repos (Git), or similar tools
- Experience deploying and supporting applications in cloud and OpenShift Container Platform (OCP) containerized environments
- Familiarity with cloud platforms such as AWS and/or Azure
- Exposure to OpenShift and Kubernetes is a strong plus
- Experience with ELK / Elastic Stack for log monitoring and troubleshooting, including correlating logs across Kafka, WildFly, and MDM
- Experience working with data and supporting mission-critical production systems
- Ability to independently own, maintain, and support complex systems with minimal external dependency