What are the responsibilities and job description for the Senior Data & Integration Architect position at Jobs via Dice?
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Rapsys Technologies, is seeking the following. Apply via Dice today!
REQUIREMENT_CITY - New Jersey (White House Station). Candidates must be available for an in-person (F2F) interview and willing to work onsite 3-4 days a week.
Role Name - Data Architect
ROLE_DESCRIPTION -
We are seeking a Senior Data & Integration Architect to lead data model design and downstream integration strategy for a large-scale policy administration system modernization. You will reverse engineer legacy mainframe data structures, forward engineer them into modern SQL/MongoDB document schemas, and define how data flows to downstream consumers. You will leverage purpose-built AI agents to accelerate reverse engineering, model generation, and documentation — bringing human judgment to validate and refine AI-produced outputs.
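To give a flavor of the downstream distribution work involved, here is a minimal Python sketch of a CDC-style change event and a consumer applying it to a local read model. The topic name, op codes (Debezium-style), and payload fields are illustrative assumptions, not a confirmed client contract:

```python
import json

# Hypothetical CDC event shape for distributing policy changes to
# downstream consumers; every name here is an illustrative assumption.
cdc_event = {
    "topic": "policy.changes.v1",
    "op": "u",                                  # c=create, u=update, d=delete
    "key": {"policyNumber": "POL-0000123"},
    "after": {"policyNumber": "POL-0000123", "annualPremium": 1310.00},
    "ts_ms": 1717430400000,
}

# A downstream consumer maintaining its own read model from the event stream.
read_model: dict[str, dict] = {}

def apply_event(event: dict) -> None:
    key = event["key"]["policyNumber"]
    if event["op"] == "d":
        read_model.pop(key, None)
    else:
        read_model[key] = event["after"]

apply_event(cdc_event)
print(json.dumps(read_model["POL-0000123"]))
```

The key design point the sketch illustrates: consumers stay decoupled from the system of record and rebuild state purely from the ordered change stream.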
Key Responsibilities
- Design and maintain technology-agnostic Logical Data Models (entities, relationships, cardinality, PK/FK)
- Transform LDMs into modern physical schemas applying aggregate-oriented and DDD patterns
- Reverse engineer IMS hierarchical segments and DB2 tables — extract business entities from physical storage structures without original design documentation
- Interpret COBOL copybooks as data structure definitions and map legacy field types to modern equivalents
- Define embedding vs. referencing strategies, versioning patterns, and collection boundaries for target database platforms
- Design downstream integration patterns — REST APIs, event streaming (Kafka/MQ), Change Data Capture (CDC), and data distribution to consuming systems
- Direct and validate AI agent pipelines for automated reverse engineering, ERD generation, data dictionary synthesis, and schema artifact production
- Produce data dictionaries, ERD diagrams, ETL field mapping specifications, and integration contracts
- Collaborate with SMEs to validate models and integration flows against undocumented business logic
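To give a concrete sense of the copybook interpretation work described above, here is a minimal Python sketch that maps simplified COBOL PIC clauses to approximate SQL types. The copybook excerpt and the mapping rules are illustrative assumptions, not client artifacts, and cover only a small subset of real PIC syntax:

```python
import re

# Hypothetical copybook excerpt -- illustrative only, not a real client structure.
COPYBOOK = """\
01  POLICY-RECORD.
    05  POLICY-NUMBER    PIC X(10).
    05  EFFECTIVE-DATE   PIC 9(8).
    05  ANNUAL-PREMIUM   PIC S9(7)V99 COMP-3.
"""

def pic_to_sql(pic: str) -> str:
    """Translate a (simplified) COBOL PIC clause into an approximate SQL type."""
    if m := re.fullmatch(r"X\((\d+)\)", pic):
        return f"VARCHAR({m.group(1)})"                      # alphanumeric
    if m := re.fullmatch(r"S?9\((\d+)\)V(9+)", pic):
        scale = len(m.group(2))                              # implied decimal places
        return f"DECIMAL({int(m.group(1)) + scale},{scale})"
    if m := re.fullmatch(r"S?9\((\d+)\)", pic):
        return f"NUMERIC({m.group(1)})"                      # unscaled numeric
    return "VARCHAR(255)"  # fallback for clauses this sketch does not cover

# Extract level-05 field names and PIC clauses (ignoring usage like COMP-3).
fields = re.findall(r"05\s+(\S+)\s+PIC\s+(\S+?)(?:\s+COMP-3)?\.", COPYBOOK)
for name, pic in fields:
    print(f"{name:<16} -> {pic_to_sql(pic)}")
# POLICY-NUMBER  -> VARCHAR(10)
# EFFECTIVE-DATE -> NUMERIC(8)
# ANNUAL-PREMIUM -> DECIMAL(9,2)
```

In practice this type mapping is only the first step; the harder judgment call is deciding which fields are true business keys versus IMS/DB2 storage artifacts.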
Required Skills & Experience
- 8+ years of experience in data architecture and system integration within OLTP / transactional domains (insurance, banking, billing, or similar)
- Hands-on experience with IBM IMS, DB2 for z/OS, and COBOL copybooks — able to read a segment hierarchy or copybook independently
- 5 years designing physical data models for modern relational or document-oriented databases
- Strong grasp of logical modeling: ERD notation, composition vs. reference, cardinality, key design
- Proven experience designing integration architectures: REST APIs, event streaming (Kafka, MQ), CDC pipelines, and message-based data distribution
- Experience designing and implementing ELT workflows, including data transformation, cleansing, and validation
- Experience designing and implementing real-time data processing pipelines with event-driven architectures
- Comfort working in an AI-augmented workflow — directing LLM-based agents, reviewing AI-generated artifacts, and applying domain expertise to close gaps AI cannot resolve
- Scripting proficiency (Python or equivalent) for schema validation and artifact generation
- Ability to abstract legacy physical data structures into business-oriented target models — separating IMS/DB2 storage implementation details from true business keys and domain entities
- Insurance domain knowledge (policy lifecycle, coverages, LOBs, premium rating)
- Domain-Driven Design (DDD) - aggregates, bounded contexts, event-driven design
- Experience synthesizing a unified model from multiple heterogeneous sources (IMS, DB2, application logic)
- Prior experience working with AI coding assistants (Claude, GitHub Copilot, or similar) in a software engineering or data architecture context
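As an example of the embedding-vs-referencing and schema-validation skills listed above, here is a minimal Python sketch of a hypothetical Policy aggregate document and the kind of structural check a validation script might run. All field and collection names are illustrative assumptions, not a real client schema:

```python
# Hypothetical target document for a "Policy" aggregate: coverages are
# embedded (they share the policy's lifecycle), while the insured party is
# referenced by id (parties are shared across policies).
policy_doc = {
    "_id": "POL-0000123",
    "schemaVersion": 2,                 # versioning pattern: explicit version field
    "effectiveDate": "2024-01-01",
    "insuredPartyId": "PTY-987",        # reference: resolved via a parties collection
    "coverages": [                      # embedded: always read/written with the policy
        {"code": "BI", "limit": 300000},
        {"code": "PD", "limit": 100000},
    ],
}

REQUIRED = {"_id", "schemaVersion", "effectiveDate", "insuredPartyId", "coverages"}

def validate_policy(doc: dict) -> list[str]:
    """Minimal structural validation, the kind of check a schema script might run."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED - doc.keys())]
    for i, cov in enumerate(doc.get("coverages", [])):
        if "code" not in cov or "limit" not in cov:
            errors.append(f"coverages[{i}] missing code/limit")
    return errors

print(validate_policy(policy_doc))  # -> []
```

The embed/reference split shown here follows the usual aggregate-oriented rule of thumb: embed data that lives and dies with the parent document, reference data that other aggregates also own.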
Experience Required: 10 & Above