How To Migrate AS400 (IBM i) to Java or .NET

AS/400 (IBM i) Application and Data Migration to Java or .NET

IBM i (commonly referred to as AS/400) platforms have earned a reputation for stability, predictable batch processing, and durable business logic. Many organizations still run critical order management, finance, manufacturing, and claims systems on IBM i because the systems work and the risk of change feels high. 

The challenge is that the surrounding ecosystem has shifted. Integration expectations are now API-first, UI expectations are browser and mobile, and operational expectations include CI/CD, automated testing, observability, and rapid change. Meanwhile, IBM i estates often rely on tightly coupled application logic, unique data structures, and platform-specific artifacts that make modernization appear more complex than it needs to be. 

At Core, we approach AS/400 migration as a design preservation and forward engineering exercise. We capture the legacy intent, separate concerns cleanly, and rebuild the system into a modern stack, typically Java (Spring Boot, MyBatis, Angular/React) or .NET (C#, Dapper/EF where appropriate, React/Angular). The end state is a cloud-ready, supportable application with a modern data platform and a testable codebase. 

What makes AS/400 migration different

AS/400 modernization is not only about converting RPG or COBOL syntax. It is about migrating a set of platform-specific concepts that are often intertwined: 

  • Programs and service programs (RPG, COBOL, CL) and their call structures 
  • Display files and printer files (DDS based UI and reporting artifacts) 
  • DB2 for i physical files and logical files (including multi-member files) 
  • Data areas, data queues, message queues (inter-process coordination and integration patterns) 
  • Commitment control and journaling semantics 
  • Job and subsystem behavior (batch, interactive, spool, scheduling) 
  • Security model (profiles, authorities, adopted authority patterns) 

A successful migration preserves the business rules and behavior while translating these concepts into modern equivalents such as REST services, message queues, schedulers, relational schemas, and role-based security. 

Typical drivers for modernizing IBM i

Organizations usually modernize for one or more of these reasons: 

  • Skills risk: diminishing availability of RPG and CL developers 
  • Pressure to expose functionality via APIs and integrate with SaaS platforms 
  • Green-screen UX limitations and training overhead 
  • Difficulty scaling change delivery without automated builds and tests 
  • Data access constraints, reporting demands, and analytics modernization 
  • Infrastructure strategy (cloud, hybrid, standardization on Windows or Linux) 

Core’s migration philosophy

Core migrations are guided by three principles: 

  1. Preserve design and intent: capture business rules, validations, and workflow that matter. 
  2. Separate concerns: isolate data access, business logic, and presentation, then rebuild each cleanly. 
  3. Automate wherever possible: use a repository-driven approach to reduce risk, improve repeatability, and accelerate delivery. 

This is especially important on IBM i, where logic, data structures, and UI artifacts are often co-dependent. 

Target architectures: Java and .NET

Java target stack (common pattern) 

  • Spring Boot APIs and services 
  • Spring Batch for batch processing equivalents 
  • MyBatis for deterministic SQL mapping 
  • Angular or React front end 
  • DB2 LUW, Oracle, or SQL Server as the RDBMS 

.NET target stack (common pattern) 

  • .NET 10 APIs and services 
  • Background services or Hangfire/Quartz for batch equivalents 
  • Dapper for high performance SQL access, EF where it adds value 
  • React or Angular front end 
  • SQL Server or Oracle as the RDBMS 

High-level migration approach

Below is a typical end-to-end flow. This is intentionally similar in structure to how we frame OpenVMS and other platform migrations, but tailored to IBM i artifacts and data structures. 

[AS400 Diagram 1]

Migrating AS/400 data structures and data formats

1) DB2 for i physical files and logical files 

 IBM i “files” often blend storage and access strategy: 

  • Physical files map closely to tables, but may include member concepts, record formats, and legacy constraints not explicitly declared. 
  • Logical files can represent indexes, filtered views, joins, alternate keys, and access paths used directly by programs. 

 Migration strategy: 

  • Convert physical files to relational tables with explicit keys, constraints, and normalized types. 
  • Convert logical files to a combination of indexes, views, and query patterns. 
  • Identify program dependencies that rely on access-path behavior (read next, keyed reads, SETLL, READE) and preserve those semantics in the DAO layer (see the sketch below). 

Key considerations we address: 

  • Primary key inference where legacy designs rely on alternate keys 
  • Packed decimal and zoned decimal conversions to decimal/numeric types 
  • Date and time fields stored as numeric or character formats 
  • Record format changes over time and backward compatibility 
  • Referential integrity that exists in code rather than in the database 
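
To make the access-path point concrete, below is a minimal sketch (table, column, and result-type names are assumptions for illustration) of how an RPG keyed-read pattern, positioning with SETLL and looping with READE, can be preserved as an explicit, ordered query in a MyBatis DAO.

```java
// Hypothetical MyBatis mapper preserving keyed-read (SETLL/READE) semantics
// as an explicit, ordered SQL query. ORDER_LINES and its columns are assumed;
// OrderLine is an assumed result POJO mapped by column name.
import java.util.List;
import org.apache.ibatis.annotations.Mapper;
import org.apache.ibatis.annotations.Param;
import org.apache.ibatis.annotations.Select;

@Mapper
public interface OrderLineDao {

    // RPG pattern: SETLL (orderNo) ORDLINES, then READE (orderNo) ORDLINES in a loop.
    // Modern equivalent: fetch every row for the key, ordered the way the
    // legacy access path delivered them.
    @Select("""
        SELECT ORDER_NO, LINE_NO, ITEM_CODE, QTY, EXT_PRICE
        FROM ORDER_LINES
        WHERE ORDER_NO = #{orderNo}
        ORDER BY LINE_NO
        """)
    List<OrderLine> readEqualByOrder(@Param("orderNo") String orderNo);
}
```

Making the ordering explicit in SQL is what preserves the legacy read sequence; relying on incidental row order would reintroduce the implicit access-path dependency.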

 

[AS400 Diagram 2]
[AS400 Diagram 3]

2) DDS artifacts (Display files and Printer files) 

 DDS artifacts usually represent two modernization opportunities: 

  • User interface modernization: translate green-screen workflows into a modern web UI. 
  • Reporting modernization: translate spool and printer files into modern report generation. 

Approach: 

  • Model screens as workflows: inputs, validations, prompts, function keys, and navigation. 
  • Rebuild the UI as React/Angular pages that call APIs. 
  • Rebuild printed output as PDF templates, server-side report generation, or BI outputs. 
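
As a hedged illustration of the UI split, here is a minimal Spring Boot controller that replaces a display-file inquiry screen; the endpoint path, CustomerService, and CustomerView are assumed names, and the React or Angular page simply calls the endpoint and renders the response.

```java
// A minimal sketch (assumed names) of a green-screen customer inquiry
// rebuilt as a REST endpoint consumed by a React or Angular page.
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CustomerInquiryController {

    private final CustomerService customerService; // assumed service wrapping the migrated rules

    public CustomerInquiryController(CustomerService customerService) {
        this.customerService = customerService;
    }

    // Replaces the DSPF inquiry screen: the prompt-and-Enter flow becomes
    // a typed request and a JSON response the SPA renders.
    @GetMapping("/api/customers/{customerNo}")
    public CustomerView getCustomer(@PathVariable String customerNo) {
        return customerService.findCustomer(customerNo);
    }
}
```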

3) Packed, zoned, and legacy numeric formats 

 IBM i applications frequently use: 

  • Packed decimal (COMP-3 in COBOL terms) 
  • Zoned decimal 
  • Overpunch sign representations 
  • Character numeric fields with implied decimals 

We define a deterministic mapping layer: 

  • Preserve precision and scale exactly. 
  • Avoid floating point for money and high precision values. 
  • Use explicit conversion utilities and unit tests against real extracts. 
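
Below is a minimal sketch of such a conversion utility for packed decimal, using the common sign-nibble convention (0xD or 0xB negative, anything else positive); the field layout and scale are taken from the legacy DDS definitions rather than hard-coded.

```java
// A hedged sketch of a packed-decimal (COMP-3) to BigDecimal converter of
// the kind we unit test against real file extracts. Each byte holds two BCD
// digits; the final low nibble carries the sign.
import java.math.BigDecimal;

public final class PackedDecimal {

    private PackedDecimal() { }

    public static BigDecimal toBigDecimal(byte[] packed, int scale) {
        StringBuilder digits = new StringBuilder(packed.length * 2);
        for (int i = 0; i < packed.length; i++) {
            int hi = (packed[i] >> 4) & 0x0F;
            int lo = packed[i] & 0x0F;
            digits.append(hi);
            if (i < packed.length - 1) {
                digits.append(lo);          // ordinary digit byte
            } else if (lo == 0x0D || lo == 0x0B) {
                digits.insert(0, '-');      // final nibble carries the sign
            }
        }
        // Apply the implied decimal scale exactly; never route money through double.
        return new BigDecimal(digits.toString()).movePointLeft(scale);
    }
}
```

For example, the bytes 0x12 0x34 0x5D with a scale of 2 convert to -123.45.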

4) Data areas, data queues, and message handling 

Data areas and queues are frequently used for: 

  • Parameter passing between jobs 
  • Lightweight state storage 
  • Batch coordination and integration 

 

Modern equivalents: 

  • Database tables for durable state 
  • Redis or distributed cache for ephemeral state (where appropriate) 
  • Message queues (Azure Service Bus, RabbitMQ, Kafka) for asynchronous processing 
  • Structured logging and correlation IDs replacing message queues used as diagnostics 
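
As one hedged example of the data-queue replacement, the sketch below publishes a hand-off message with Spring Kafka; the topic name, payload shape, and consumer are assumptions, and the same intent maps onto RabbitMQ or Azure Service Bus.

```java
// A hedged sketch: a data queue used for batch hand-off replaced by a
// message publish. Topic and service names are assumptions for illustration.
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class OrderHandoffPublisher {

    private static final String TOPIC = "order-handoff"; // assumed topic name

    private final KafkaTemplate<String, String> kafkaTemplate;

    public OrderHandoffPublisher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Legacy pattern: QSNDDTAQ writes an entry that a waiting batch job reads.
    // Modern pattern: publish a message keyed by a correlation id; the consumer
    // is a separate service or batch job.
    public void publish(String correlationId, String payloadJson) {
        kafkaTemplate.send(TOPIC, correlationId, payloadJson);
    }
}
```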

5) Journaling and commitment control 

 IBM i journaling is often used for: 

  • Recovery 
  • Auditing 
  • Synchronization patterns 
  • Incremental extracts 

We preserve intent by mapping to: 

  • RDBMS transaction semantics 
  • Change tracking (where required) 
  • Audit columns and audit tables 
  • CDC tooling when needed 
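
A minimal sketch of how commitment-control intent carries over, assuming hypothetical DAO names: the business update and its audit record are written in one database transaction, so they commit or roll back together just as journaled files under commitment control did.

```java
// A minimal sketch of preserving commitment-control intent: the update and
// its audit row share one transaction. Repository names are assumptions.
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class InvoiceService {

    private final InvoiceDao invoiceDao;   // assumed DAO
    private final AuditDao auditDao;       // assumed audit-table DAO

    public InvoiceService(InvoiceDao invoiceDao, AuditDao auditDao) {
        this.invoiceDao = invoiceDao;
        this.auditDao = auditDao;
    }

    // On IBM i: journaled files under commitment control, COMMIT/ROLBK in RPG.
    // Here: one database transaction spanning the update and the audit write.
    @Transactional
    public void adjustInvoice(String invoiceNo, java.math.BigDecimal newTotal, String userId) {
        invoiceDao.updateTotal(invoiceNo, newTotal);
        auditDao.recordChange("INVOICE", invoiceNo, "TOTAL_ADJUSTED", userId);
    }
}
```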

Program migration: RPG, COBOL, and CL 

RPG and COBOL conversion 

 The goal is to preserve business rules while improving maintainability: 

  • Extract program structure, files used, and call chains. 
  • Preserve the decision logic, validations, and computations. 
  • Rebuild into service methods with explicit inputs and outputs. 

Core modernization typically results in: 

  • A service layer implementing the business rules 
  • A DAO layer implementing the database access patterns 
  • A web UI and/or API endpoints implementing user interaction
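
As an illustration of explicit inputs and outputs, here is a small, hypothetical pricing rule rebuilt as a pure service method; the discount logic shown is a placeholder, not a real business rule.

```java
// A hedged illustration of "explicit inputs and outputs": a calculation
// lifted out of an RPG calc block into a pure service method with no file
// I/O or display-file side effects. Thresholds are placeholders.
import java.math.BigDecimal;
import java.math.RoundingMode;
import org.springframework.stereotype.Service;

@Service
public class PricingService {

    public BigDecimal extendedPrice(BigDecimal unitPrice, int quantity, boolean preferredCustomer) {
        BigDecimal gross = unitPrice.multiply(BigDecimal.valueOf(quantity));
        BigDecimal discount = preferredCustomer
                ? new BigDecimal("0.05")   // placeholder rule, not a real rate
                : BigDecimal.ZERO;
        return gross.subtract(gross.multiply(discount))
                    .setScale(2, RoundingMode.HALF_UP);
    }
}
```

Because the method has no hidden state, it can be covered directly by golden-data tests and reused behind both the API and the batch layer.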

CL command language and job control

CL often encodes: 

  • Batch orchestration 
  • Environment setup 
  • File overrides 
  • Subsystem job behavior and scheduling 

We translate this into modern orchestration: 

  • Scheduler jobs (Quartz, Hangfire, Windows Task Scheduler, Kubernetes CronJobs) 
  • Parameterized batch runners 
  • Environment configuration and secrets management 
  • Repeatable deployment pipelines 
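
A minimal sketch of the scheduler pattern, assuming Spring's @Scheduled support (enabled with @EnableScheduling) and a hypothetical batch runner; the cron expression and job name are illustrative, and Quartz or Hangfire fill the same role in other stacks.

```java
// A hedged sketch of a CL-driven nightly job rebuilt as a cron-scheduled,
// parameterized batch runner. Names and schedule are assumptions.
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class NightlySettlementJob {

    private final SettlementBatchRunner runner; // assumed batch runner

    public NightlySettlementJob(SettlementBatchRunner runner) {
        this.runner = runner;
    }

    // Legacy: SBMJOB CMD(CALL PGM(SETTLE)) driven by the job scheduler.
    // Modern: cron-triggered runner with environment-driven parameters.
    @Scheduled(cron = "0 30 1 * * *") // 01:30 every day
    public void run() {
        runner.runFor(java.time.LocalDate.now().minusDays(1));
    }
}
```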

Data migration approaches

[AS400 Diagram 5]

There are several proven patterns, and selection depends on volume, downtime tolerance, and integration complexity. 

Option A: One-time extract and load (small to medium systems) 

  • Extract DB2 for i data to flat files or staging tables 
  • Transform and load into the target RDBMS 
  • Validate with reconciliation and functional tests 

 Option B: Parallel run with incremental synchronization (medium to large systems) 

  • Initial bulk load 
  • Ongoing incremental loads driven by journaling or CDC 
  • Cutover when parity is achieved 

Option C: Service strangler with phased domain cutover (complex systems) 

  • Wrap legacy with APIs where needed 
  • Migrate domain-by-domain 
  • Retire legacy components gradually 

Testing and validation that reduces risk

IBM i migrations fail when testing is treated as an afterthought. We build validation into the migration lifecycle: 

  • Unit tests for conversion utilities (packed decimal, date formats, keyed reads) 
  • Golden data tests using known input and expected outputs 
  • Batch reconciliation comparing totals, counts, and key financial metrics 
  • UI workflow tests replicating function-key behavior and navigation paths 
  • Performance benchmarking for high-throughput batch windows 

A critical technique is comparing the legacy and modern system against the same input datasets, then proving equivalence in outputs and side effects. 
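
A small, hedged example of a golden-data test: a conversion utility (here the packed-decimal converter sketched earlier) is run against bytes captured from a legacy extract and compared with the value the legacy system reported. The specific bytes and expected value are illustrative only.

```java
// A hedged golden-data unit test: convert a captured legacy value and
// compare it to the output the legacy program produced for the same record.
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.math.BigDecimal;
import org.junit.jupiter.api.Test;

class PackedDecimalGoldenTest {

    @Test
    void matchesLegacyExtractValue() {
        // Bytes captured from a DB2 for i extract (illustrative values).
        byte[] legacyBytes = { 0x12, 0x34, 0x5D };

        BigDecimal converted = PackedDecimal.toBigDecimal(legacyBytes, 2);

        // Expected value taken from the legacy report for the same record.
        assertEquals(new BigDecimal("-123.45"), converted);
    }
}
```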

Security and operational modernization

A modern solution must match or exceed IBM i operational maturity: 

  • Role-based security mapped from profiles and authorities 
  • Centralized identity (Azure AD, Okta, etc.) 
  • Auditing of key business events and data changes 
  • CI/CD pipelines, automated builds, and repeatable deployments 
  • Observability: logs, metrics, traces, and alerting 
  • Production support model aligned to your DevOps processes 
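
As a hedged sketch of the role mapping, the Spring Security configuration below protects API routes by role; the role names, URL patterns, and JWT resource-server setup are assumptions, with roles ultimately sourced from the central identity provider.

```java
// A minimal sketch of mapping IBM i authorities to role-based access in
// Spring Security. Roles and paths are assumptions for illustration.
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.web.SecurityFilterChain;

@Configuration
public class SecurityConfig {

    @Bean
    public SecurityFilterChain filterChain(HttpSecurity http) throws Exception {
        http
            .authorizeHttpRequests(auth -> auth
                // e.g. administrative profiles (*ALLOBJ-style) -> ADMIN role
                .requestMatchers("/api/admin/**").hasRole("ADMIN")
                // object authority on the order files -> ORDER_CLERK role
                .requestMatchers("/api/orders/**").hasAnyRole("ORDER_CLERK", "ADMIN")
                .anyRequest().authenticated())
            .oauth2ResourceServer(oauth -> oauth.jwt(jwt -> { }));
        return http.build();
    }
}
```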

A practical migration roadmap

Here is a typical sequence for an IBM i modernization project: 

1) Pre-Assessment

2) Assessment and Inventory (programs, files, LFs, DDS, CL, queues)

3) Design Preservation (neutral repository model)

4) Data Model + Data Strategy (PF/LF mapping, conversions, CDC)

5) Forward Engineering (DAO, services, UI, batch)

6) Testing and Reconciliation (golden data, batch parity)

7) UAT + Performance Tuning

8) Cutover + Go-Live + Stabilization

This roadmap supports iterative delivery. You can modernize one functional area at a time rather than waiting for a single big release. 

What you get at the end

A successful AS/400 migration produces: 

  • Modern Java or .NET application code with clean separation of concerns 
  • A relational data platform with explicit schema, constraints, and indexes 
  • API endpoints for integration and future extensibility 
  • Web-based UI replacing green-screen workflows 
  • Repeatable build and deployment pipelines 
  • A testing framework that proves functional equivalence and supports change 

How Core helps

Core brings an automation-first approach that is built for legacy complexity. We preserve your existing design, extract and model the rules, and forward engineer into a modern stack with less risk than manual rewrites. If your IBM i estate includes RPG, COBOL, CL orchestration, DDS screens, and DB2 for i data structures, we can help you modernize to Java or .NET in a controlled, auditable way. 
