Bringing AI to the Green Screen: How IBM i Can Talk to LLMs

By Gaurav Khanna posted 4 hours ago

  

Introduction: IBM i Isn’t Dead - It’s Evolving

When you think of AI, what likely comes to mind are cloud-native apps, fancy frontends, or Python-heavy data pipelines. But what if your reliable green-screen system, the IBM i (formerly AS/400), could also talk to AI? And not just talk, but get help, generate code, explain errors, or automate tasks?

IBM i has been the silent workhorse behind mission-critical systems in banking, logistics, insurance, and manufacturing. While it's often branded as a “legacy” system, that label is misleading. IBM i is alive and evolving, with support for open-source languages, APIs, and secure remote integrations.

This blog shows how IBM i can tap into the power of Large Language Models (LLMs) like ChatGPT, Claude, Gemini, or Watsonx to bring AI-driven enhancements, even to the most traditional 5250 systems.

Why This Blog Matters

Today, AI tools like ChatGPT are part of many professionals' daily workflows. But most IBM i shops still rely on manual processes, documentation digging, and traditional helpdesks.

This blog will show you:

  • How IBM i can connect directly to AI services.
  • Real use cases where LLMs save time and reduce cognitive load.
  • Examples you can try or build on, whether you are a developer, operator, or architect.

It’s not just about calling an API. It’s about modernizing your mindset while keeping your IBM i systems running strong.

How Can IBM i Talk to LLMs?

IBM i can talk to any LLM that offers a web API. Integration depends on your setup. Here are a few practical methods:

  • STRQSH + curl: requires the QSH shell with curl installed. Suitable for quick one-off REST API calls from IBM i.
  • Python on PASE: requires /QOpenSys/pkgs/bin/python3 and the requests library. Suitable for scripted automation from IBM i.
  • Java on IBM i: requires JT400 and a JDK. Suitable for Java-based business apps needing AI.
  • External microservice: Node.js, Flask, .NET, etc. on another server. Suitable for rich web apps or central API proxies.

You can trigger LLMs from IBM i (green screen), or use AI tools externally and bring insights back in. Both are valid and powerful.
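
For example, here is a minimal sketch of the Python-on-PASE approach. It assumes an OpenAI-style chat completions endpoint; the URL, model name, and the LLM_API_KEY environment variable are placeholders for whatever provider you use, and the requests package must be installed (for example via pip3 install requests).

#!/QOpenSys/pkgs/bin/python3
# Minimal sketch: call an OpenAI-style chat endpoint from PASE.
# The endpoint URL and model name are placeholders; the API key is read from
# an environment variable so it is never hardcoded in source.
import os
import requests

API_URL = "https://api.your-llm-provider.com/v1/chat/completions"
API_KEY = os.environ["LLM_API_KEY"]

def ask_llm(prompt, model="gpt-4"):
    """Send a single-turn prompt and return the model's reply text."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": model, "messages": [{"role": "user", "content": prompt}]},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_llm("Explain the CPF9898 error in IBM i."))

Saved to the IFS (say, /home/yourprofile/askllm.py), it can be run from an SSH or QSH session, or kicked off from CL with QSH CMD('/QOpenSys/pkgs/bin/python3 /home/yourprofile/askllm.py').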

AI Use Cases for IBM i – with Workflows and Getting Started Tips

Here’s a list of 10 high-value ideas where LLMs can assist various roles, from developers to operators to architects. Each idea notes who it helps, the manual and automated workflows, and how to get started, so you know exactly what happens where.

1. Error Message Explainer
Who it helps: Developers, Ops
Manual: User sees CPF9898 in the job log → opens a browser → asks an LLM for a fix.
Automated: An IBM i script parses job logs → sends the message to the LLM API → gets an explanation and fix → saves it to a spool file, log, or email.
Getting started: Try a test query from a browser: “What is CPF9898 in IBM i?” Or build a CL + curl script to POST the message to an LLM.
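
To experiment with the automated flow, a rough sketch like the one below could pull recent messages from the current job's log with the QSYS2.JOBLOG_INFO SQL service and ask an LLM to explain them. It assumes the python3-ibm_db and requests packages are installed on IBM i and reuses the same placeholder endpoint and LLM_API_KEY environment variable as the earlier sketch; treat it as a starting point, not a finished tool.

#!/QOpenSys/pkgs/bin/python3
# Sketch: explain recent job log messages with an LLM.
# Assumes python3-ibm_db (local Db2 for i access) and requests are installed.
import os
import requests
import ibm_db_dbi

API_URL = "https://api.your-llm-provider.com/v1/chat/completions"
API_KEY = os.environ["LLM_API_KEY"]

# Pull the last few messages from the current job's log via an IBM i SQL service.
conn = ibm_db_dbi.connect()  # connects to the local database when run on IBM i
cur = conn.cursor()
cur.execute("""
    SELECT MESSAGE_ID, MESSAGE_TEXT
    FROM TABLE(QSYS2.JOBLOG_INFO('*')) AS J
    ORDER BY ORDINAL_POSITION DESC
    FETCH FIRST 10 ROWS ONLY
""")
log_text = "\n".join(f"{mid or ''} {text}" for mid, text in cur.fetchall())

prompt = "Explain these IBM i job log messages and suggest fixes:\n" + log_text
resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"model": "gpt-4", "messages": [{"role": "user", "content": prompt}]},
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])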

2. Code Modernizer
Who it helps: Developers
Manual: Copy RPG/CL code to a browser → ask the LLM to convert it to free-format or Java.
Automated: A script sends the source to the LLM via API → gets updated/refactored code → writes it to an IFS file.
Getting started: Try the prompt “Convert this fixed-format RPG to free-format.” Set up Python + the API on IBM i for automation.
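
A hypothetical automation for this could look like the sketch below: copy the source member to the IFS first (for example with CPYTOSTMF), then let a Python script send it to the LLM and write the converted code back to the IFS. The file paths, member name, endpoint, and model are illustrative only.

#!/QOpenSys/pkgs/bin/python3
# Sketch: send fixed-format RPG source to an LLM and save the free-format result.
# Assumes the member was copied to the IFS first, for example:
#   CPYTOSTMF FROMMBR('/QSYS.LIB/MYLIB.LIB/QRPGLESRC.FILE/MYPGM.MBR')
#             TOSTMF('/home/yourprofile/mypgm.rpgle') STMFCCSID(1208)
import os
import requests

API_URL = "https://api.your-llm-provider.com/v1/chat/completions"
API_KEY = os.environ["LLM_API_KEY"]

with open("/home/yourprofile/mypgm.rpgle", encoding="utf-8") as f:
    source = f.read()

messages = [
    {"role": "system",
     "content": "You convert fixed-format RPG to fully free-format RPG. Return only code."},
    {"role": "user", "content": source},
]
resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"model": "gpt-4", "messages": messages},
    timeout=120,
)
resp.raise_for_status()

with open("/home/yourprofile/mypgm_free.rpgle", "w", encoding="utf-8") as f:
    f.write(resp.json()["choices"][0]["message"]["content"])

The converted source should, of course, be reviewed and compiled like any other change before it goes anywhere near production.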

3. SQL Query Assistant
Who it helps: Business Analysts, Devs
Manual: User types “Show me all customers in NY” into a browser → LLM returns SQL → paste it into STRSQL.
Automated: Prompt from the 5250 command line → LLM returns SQL → run it with dynamic SQL (for example, EXECUTE IMMEDIATE) or a script.
Getting started: Try natural-language queries in ChatGPT or Claude. Use curl or Python to test integration with Db2 for i.
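
For the automated half, a hypothetical follow-on step could run the SQL that came back from the LLM locally using the python3-ibm_db package. The library and table names below are placeholders, and generated SQL should always be reviewed before it is executed.

#!/QOpenSys/pkgs/bin/python3
# Sketch: run SQL text returned by an LLM against Db2 for i.
# Assumes python3-ibm_db is installed; only run statements you have reviewed.
import ibm_db_dbi

llm_sql = "SELECT CUSTNO, CUSTNAME FROM MYLIB.CUSTOMER WHERE STATE = 'NY'"  # placeholder

conn = ibm_db_dbi.connect()    # local connection when run on IBM i
cur = conn.cursor()
cur.execute(llm_sql)
for row in cur.fetchmany(20):  # show a small sample of the result
    print(row)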

4. Job Log Summarizer
Who it helps: Operators
Manual: User downloads the job log → uploads it to an LLM in a browser → gets a summary.
Automated: Job-end exit point → script parses the log → sends it to the LLM → summary emailed to the admin.
Getting started: Use a curl or Python script to POST the job log text. Prompt: “Summarize this IBM i job log.”

5. Spool File Summarizer
Who it helps: Admins, QA
Manual: Copy spool output to a PC → ask the LLM to summarize it.
Automated: MONMSG triggers a parser → sends the spool text → receives a summary.
Getting started: Use CPYSPLF to copy the spool file to the IFS, then send it to the LLM.

6. Command Helpdesk
Who it helps: Users, Devs
Manual: Type “What does CHGUSRPRF do?” into a browser.
Automated: User types a custom command such as CALLAI ‘Explain CHGUSRPRF’ → output shown in a display file or the job log.
Getting started: Create a wrapper CL program that posts prompts to an API. Use hardcoded or dynamic prompts.

7. Script Generator
Who it helps: Sys Admins
Manual: Ask “Give me a script to restore a SAVF from one LPAR to another.”
Automated: LLM generates the script → it is saved into a source member.
Getting started: Prompt: “Create a CL script to copy and restore a SAVF.” Test with IFS-based file read/write.

8. System Health Checker
Who it helps: Ops, Admins
Manual: Type “How do I check the top CPU-consuming jobs on IBM i?”
Automated: Script runs WRKACTJOB → sends the output → LLM highlights the top jobs.
Getting started: Create automation to run WRKSYSSTS or WRKACTJOB → export → analyze.

9. Security Scanner Assistant
Who it helps: Security Admins
Manual: Ask “How do I list users with *ALLOBJ?”
Automated: Script runs DSPUSRPRF → sends the list to the LLM → gets a summary of risky profiles.
Getting started: Prompt: “Explain how to identify powerful users on IBM i.”

10. PTF Advisor
Who it helps: Architects
Manual: Ask “What’s the latest PTF for 7.5?”
Automated: A script pulls PTF data from IBM Fix Central (or a cached database) → LLM compares it and lists missing or new PTFs.
Getting started: Manually query IBM Fix Central and try an LLM summary. Use scheduled curl + JSON response parsing.

Security Best Practices

If you're building LLM integrations into IBM i, keep these in mind:

  • Never hardcode API keys in source code. Store securely using environment variables or encrypted configs.
  • Use HTTPS for all endpoints.
  • Use a proxy service (Python/Node) between IBM i and the LLM API to isolate and secure interactions.
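
As one sketch of the proxy idea, the minimal Flask service below (run on a separate server, or even in PASE) keeps the provider API key off IBM i entirely; IBM i programs just POST a prompt to the proxy over HTTPS. The route name, port, endpoint, and model are placeholders, and Flask and requests are assumed to be installed.

# Minimal sketch of an LLM proxy service using Flask.
# IBM i calls this service instead of the LLM provider, so the provider
# API key lives only on the proxy host (as an environment variable).
import os
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
LLM_URL = "https://api.your-llm-provider.com/v1/chat/completions"
LLM_KEY = os.environ["LLM_API_KEY"]

@app.route("/ask", methods=["POST"])
def ask():
    """Accept {"prompt": "..."} and return {"answer": "..."}."""
    prompt = request.get_json(force=True).get("prompt", "")
    resp = requests.post(
        LLM_URL,
        headers={"Authorization": f"Bearer {LLM_KEY}"},
        json={"model": "gpt-4", "messages": [{"role": "user", "content": prompt}]},
        timeout=60,
    )
    resp.raise_for_status()
    return jsonify(answer=resp.json()["choices"][0]["message"]["content"])

if __name__ == "__main__":
    # Put this behind HTTPS (for example a reverse proxy) and add authentication
    # before using it for real.
    app.run(host="0.0.0.0", port=8080)

From IBM i, a single POST to the proxy's /ask route with a JSON body is then all that's needed, and no provider API key ever touches the IBM i side.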

Example: Calling an LLM from IBM i Using curl

Here’s a simple way to send a prompt to an LLM using curl (assuming you're using an OpenAI-style endpoint):

curl -X POST https://api.your-llm-provider.com/v1/chat/completions \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "gpt-4",
        "messages": [{"role": "user", "content": "Explain CPF9898 error in IBM i"}]
      }'

Try This On:

  • Your laptop terminal to test first.
  • Then from IBM i using QSH or PASE, if curl is installed (/QOpenSys/pkgs/bin/curl).

Important: Replace "YOUR_API_KEY" with a real API key and secure it properly (don’t hardcode it in production scripts).

Final Thoughts: IBM i + AI Is a Power Combo

AI isn’t just for modern frontend apps. Your green screen can talk to ChatGPT, Claude, or Watsonx, whether it’s via curl, CL, Python, or external services. The result? Faster fixes, smarter logs, easier modernization.

You don’t need to rewrite your whole platform. You just need the right hooks and a shift in mindset.

So don’t just read this blog-try one idea this week. Ask your IBM i to explain something. Automate one script. Add an AI helper. You'll be amazed at what’s possible.
