What are the responsibilities and job description for the Programmer Senior IBM Master Data Management (MDM) & Integration Developer position at Global Information Technology?
Job Description
Job Location: Lansing, MI (Hybrid)
Job Type: Contract
Responsibilities:
- Provide production and non-production application support for MPI, MIDSP, and related systems
- Write high-level functional and technical requirements documents
- Participate in the full software development life cycle, including requirements gathering, design, development, testing, and deployment
- Work in Linux and Unix operating systems for development and troubleshooting
- Develop and maintain PowerShell and shell scripts
- Use Git and GitHub for source control and repository management
- Collaborate with database administrators to develop and execute SQL scripts
- Generate monthly reports for MDM applications
- Resolve client tickets and operational issues related to MPI and MIDSP
- Create, deploy, and manage certificates on Linux and Windows systems
- Work closely with infrastructure, server, and network teams on maintenance and issue resolution
- Assist with Enterprise Architecture Solution Assessments, hosting documentation, and firewall rule requests
- Design and execute test plans, test cases, and test reports
- Support UAT, post-implementation testing, and defect tracking
- Assist with Request for Change (RFC) processes
- Maintain and update system documentation
- Monitor applications across environments
- Support mission-critical production systems
Requirements:
- 8 or more years of enterprise software development experience
- Strong hands-on experience with IBM InfoSphere MDM 11.x
- Deep understanding of MDM concepts, data models, entities, transactions, batch processing, configuration, and customization
- Ability to support and enhance MDM solutions built by third-party vendors
- 8 or more years of Java and J2EE experience, including REST and SOAP services
- 5 or more years of Linux or Unix experience, including command-line usage and log analysis
- Hands-on experience with Apache Kafka, including producers, consumers, topic configuration, and message-flow troubleshooting
- Experience with Kafka integration using WildFly and IBM MDM
- Experience with batch processing using WildFly and Kafka
- Experience deploying and supporting applications on the WildFly application server
- Strong SQL skills in Oracle and Microsoft SQL Server
- Strong Bash shell scripting skills
- Experience creating and managing cron jobs
- Experience with CI/CD pipelines using Azure DevOps or similar tools
- Experience supporting cloud and containerized environments
- Familiarity with AWS and/or Azure
- Exposure to OpenShift and Kubernetes is a plus
- Experience using the ELK (Elastic) Stack for log monitoring and troubleshooting
- Strong ability to independently own and support complex production systems