Decision Management & Intelligence (ODM, DI)


Decision Intelligence Series: Key UI Features and User Guide

By Horo Zhang posted 23 hours ago

  

Abstract: This post explores the key UI features of the Decision Assistant (DA) and serves a dual purpose: as a practical user guide for prospects and clients, and as a showcase of UI design practices for developers and designers. Building on the introduction from Blog #1, we walk through the modular layout, reusable components, conversational interface, and personalization options that make the Assistant approachable. By grounding the experience in the IBM Carbon Design System, we highlight how these features support clarity, interactivity, and accessibility for end-users, while also reflecting the design principles that guide our frontend development. This article helps readers understand not only how to engage with the Decision Assistant day to day but also the craft that went into creating it. 
 

In Blog #1, we introduced the Decision Assistant (DA) — its purpose, design philosophy, and the technologies powering its frontend. In this post, we’ll take a guided tour of the key UI features that make the Assistant approachable and effective for users working with complex decision models. Think of this article as a practical user guide to help you understand the main areas of the interface and how they work together. 
 
A Modular UI Built for Clarity 

One of the most common challenges in decision automation is information overload. In Decision Intelligence, users often work with complex decision models that involve hundreds of rules, data mappings, and interrelated components. Without a thoughtfully designed interface, it’s easy to lose track of context and progress. That’s why the information architecture is carefully structured to promote clarity and usability, helping users stay oriented and focused as they build and manage decision logic. 

 

  • Left navigation panel: Acts as the anchor point for switching between major features such as decision models, data models, and the conversational assistant. Inspired by IDEs and productivity tools, it allows users to move between contexts without losing orientation. 

  • Central workspace (canvas): This is where users spend most of their time. Here, they visualize decision models or explore generated recommendations. The canvas is optimized to minimize visual noise and highlight only what’s relevant to the current task. 

  • Right contextual panel: Provides on-demand access to supporting details, advanced configurations, and conflict resolution tools. Instead of crowding the main workspace with advanced settings, these are tucked into a flexible side panel that appears only when needed. 

  • Global header: Offers universal access to search, theme switching, and user account preferences. It also provides breadcrumbs, helping users track their current location within the model hierarchy. 

Fig.1. Canvas View of Decision Assistant 

Where the Decision Assistant stands out is its seamless integration within Decision Intelligence. Rather than being a separate application, the Decision Assistant is available as a dedicated panel inside the same environment. Users can open it alongside their workspace to ask questions, review recommendations, or generate new rules without breaking focus. 

 

  • Assistant panel: A dedicated conversational space that users can expand or collapse as needed. It hosts both the chat thread and decision cards, allowing users to interact naturally with the Assistant without leaving the Designer. 

  • Chat view: Displays a running conversation between the user and the Assistant. Styled message bubbles differentiate prompts from responses, and the chat history doubles as a lightweight navigation log of what has been asked and answered. 

  • Decision card view: Presents AI-driven outputs such as proposed rules, validation checks, or conflict resolutions. Cards are compact, scannable, and actionable, with buttons to apply or reject changes directly. 

  • Contextual awareness: The Assistant is aware of the user’s current workspace context. For example, if the user is editing a decision model, suggestions will be tailored to that model rather than generic recommendations. 

This modularity ensures that the Assistant scales gracefully with the environment; even beginners can use the Assistant to create a decision service from a policy. 
 

Smart Components That Drive Interactivity 

The interface is powered by reusable, interactive components that provide consistency and efficiency. In practice, this means that whether a user is reviewing a decision card, opening a conflict resolution modal, or interacting with a dropdown, the experience feels familiar. 

 

  • Cards and decision cards: Summarize insights in a compact, visual format. For example, when the Assistant suggests changes to a decision model, a card might present the key recommendation alongside options to Apply or Reject. This avoids burying important insights in long text blocks. 

  • Modals: Provide focused environments for specific tasks. A good example is conflict resolution: when two users propose different changes to a model, the modal presents both sides and guides the user toward resolving the conflict. 

  • Dropdowns, tooltips, and walkthroughs: Support quick actions and contextual help. Imagine a new user exploring the canvas for the first time: a guided walkthrough introduces them to navigation, editing tools, and decision card interactions. 

For the Assistant specifically, smart components take on an even more conversational form: 

 

  • Message bubbles differentiate between user prompts and Assistant responses, creating a clear and scannable dialogue history. 

  • Inline actions are embedded directly into the Assistant’s replies — for example, buttons like Regenerate, Edit, or Apply appear in context, so the user can take action immediately without navigating elsewhere. 

  • Decision cards generated by the Assistant not only summarize AI-driven suggestions but also link directly back to the relevant part of the model, making the connection between natural language input and structured logic explicit. 

  • Workspace tabs (Data Model, Decision Model, Timeline bar): The Assistant integrates tightly with the broader UI by surfacing the correct workspace view in context. If a response involves creating new entities, the Assistant can prompt users to switch to the Data Model tab; if it’s about adding rules, it points to Decision Model; and if reviewing history or conflicts, the Timeline tab highlights relevant changes. This ensures navigation remains effortless. 

  • “View details” button: Some Assistant responses include a View details button that opens the full canvas view. Instead of overloading the chat with complexity, the Assistant offers a quick preview and then lets users explore the full structure, hierarchy, or rule logic inside the dedicated workspace. 

Behind the scenes, these components rely on React state management and dynamic rendering. When a user accepts a recommendation from the Assistant, the UI updates instantly across the chat, the decision card view, and the model workspace. This creates a feedback loop where conversation, visualization, and action remain perfectly aligned. 
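To make this feedback loop concrete, here is a minimal TypeScript sketch of the kind of state transition described above. The type and function names are illustrative assumptions, not the product's actual API: the point is that applying a recommendation updates the chat, the decision card, and the model rules in a single, consistent state change.

```typescript
// Hypothetical state shape for the Assistant panel (names are illustrative).
interface Recommendation {
  id: string;
  ruleText: string;
  status: "pending" | "applied" | "rejected";
}

interface AssistantState {
  messages: string[];     // chat transcript
  cards: Recommendation[]; // decision card view
  modelRules: string[];   // the model workspace
}

// One reducer-style transition keeps all three UI regions in sync:
// accepting a card appends a chat confirmation, marks the card applied,
// and adds the rule to the model, in a single immutable update.
function applyRecommendation(state: AssistantState, cardId: string): AssistantState {
  const card = state.cards.find(c => c.id === cardId);
  if (!card || card.status !== "pending") return state; // no-op if already handled
  return {
    messages: [...state.messages, `Applied rule: ${card.ruleText}`],
    cards: state.cards.map(c =>
      c.id === cardId ? { ...c, status: "applied" as const } : c
    ),
    modelRules: [...state.modelRules, card.ruleText],
  };
}
```

Because the update is a pure function over one state object, React can re-render the chat, the card, and the canvas from the same source of truth, which is what keeps conversation, visualization, and action aligned.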

Fig.2. Chat View of Decision Assistant 

  

Conversational Interface 

Perhaps the most distinctive feature of the DA is its conversational interface. Instead of forcing users to write rules or queries directly, the Assistant enables them to speak in natural language. 

The chat panel includes several key capabilities: 

 

  • Message history: Users can scroll back through previous interactions, giving them a narrative of how their decision model evolved over time. 

  • Actionable responses: When the Assistant generates a rule, the user doesn’t just read it — they get actionable buttons to regenerate, edit, or accept the rule. 

  • Visual cues: Icons, highlights, and confirmations help users understand the outcome of each interaction. If the Assistant regenerates a rule, the interface highlights the updated section, making the change explicit. 

For instance, imagine a business analyst asks: “Generate a decision model for loan approval based on income and credit score.” The Assistant replies with a draft model, displayed as decision cards on the canvas. The analyst can then refine it through follow-up questions like “Add an additional rule for employment history” or “Show me conflicts between income thresholds.” 

This conversational design transforms technical complexity into approachable dialogue, bridging the gap between natural language and structured decision logic. 
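As a toy illustration of that bridge, the sketch below routes a follow-up prompt to a structured action. This is deliberately simplistic (keyword patterns rather than the LLM-backed understanding the real Assistant uses), and the `Intent` shape is an assumption made for the example.

```typescript
// Illustrative only: a keyword-based router mapping natural-language
// follow-ups to structured actions. The real Assistant relies on an
// AI backend, not regular expressions.
type Intent =
  | { kind: "addRule"; subject: string }
  | { kind: "showConflicts"; subject: string }
  | { kind: "unknown" };

function routePrompt(prompt: string): Intent {
  const p = prompt.toLowerCase();
  const addMatch = p.match(/add (?:an? )?(?:additional )?rule for (.+)/);
  if (addMatch) return { kind: "addRule", subject: addMatch[1] };
  const conflictMatch = p.match(/show me conflicts between (.+)/);
  if (conflictMatch) return { kind: "showConflicts", subject: conflictMatch[1] };
  return { kind: "unknown" };
}
```

The value of a structured `Intent` is that downstream UI code can react deterministically: an `addRule` intent opens the Decision Model tab, while `showConflicts` surfaces the relevant cards.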

Fig.3. Conversational Interface 

Guidance, Feedback, and Personalization 

The Decision Assistant is not just a passive tool — it actively guides users through the decision modeling process as a step-by-step conversational partner. Its dialogue adapts to each stage of building a decision service, ensuring that users remain oriented and confident. 
 

  • Step 1: Setting the scope 
    The Assistant begins by helping users define the boundaries of their project. Through natural language prompts, it may ask clarifying questions such as, “What decision are you trying to automate?” or “Which inputs should be considered?” The chat history becomes a narrative of the scope definition, making it easy to revisit decisions later. 

  • Step 2: Building the structure 
    Once the scope is clear, the Assistant augments the data model drafted in Step 1 and suggests a structure for the discovered decision model.  

  • Step 3: Defining the rules 
    As users move deeper, the Assistant adds rules in (near-)natural language to accomplish the mission of each decision node. 

  • Step 4: Finalizing the service 
    In the final stage, it formalizes the rules and finalizes the specification of the decision service. 

  • Personalization throughout the journey 
    The Assistant adapts its tone and level of detail to the user. Beginners receive more explanation and context, while experienced users see streamlined confirmations and shortcuts. Preferences such as theme, font size, and panel layout extend this personalization beyond conversation, making the experience both efficient and comfortable. 
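The adaptation described above can be sketched as a simple mapping from experience level to response style. The levels and fields here are hypothetical; the Assistant's actual personalization heuristics are not documented in this post.

```typescript
// Hypothetical sketch of level-based personalization.
type ExperienceLevel = "beginner" | "experienced";

interface ResponseStyle {
  explainSteps: boolean;   // include context and rationale
  showShortcuts: boolean;  // surface power-user shortcuts
  maxDetailLines: number;  // cap on verbosity
}

function styleFor(level: ExperienceLevel): ResponseStyle {
  return level === "beginner"
    ? { explainSteps: true, showShortcuts: false, maxDetailLines: 12 } // fuller explanations
    : { explainSteps: false, showShortcuts: true, maxDetailLines: 4 }; // streamlined confirmations
}
```

Keeping this mapping in one place means every Assistant response can be rendered through the same style object, so tone stays consistent across the chat, decision cards, and walkthroughs.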

Fig.4. Step-by-step Conversational Partner of Decision Assistant 

By embedding guidance, feedback, and personalization directly into the conversational journey, the Assistant transforms the four-step modeling process into a collaborative dialogue. Instead of navigating a complex tool, users feel like they are partnering with an AI coach that explains, suggests, and validates along the way. 

 

Accessibility in the Assistant

Accessibility is not an afterthought — it is a core principle in the design of the Decision Assistant. Because the Assistant relies heavily on conversational interaction, it must be usable by everyone, regardless of ability. 
 

  • Semantic HTML and ARIA roles: Chat messages, buttons, and decision cards are built with semantic markup and ARIA attributes so that screen readers can interpret the dialogue flow accurately. 

  • Keyboard navigation: All Assistant interactions — from typing prompts to applying decision card actions — can be performed using only the keyboard, ensuring that users who cannot rely on a mouse are fully supported. 

  • Screen reader compatibility: The chat transcript is accessible to screen readers, allowing visually impaired users to follow the conversation history and act on Assistant recommendations. 
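To make the first two points concrete, here is a minimal sketch of how a chat transcript can be rendered as an accessible live region. The markup and class names are illustrative assumptions, not the product's actual DOM; the ARIA attributes shown are the standard ones for this pattern.

```typescript
// Illustrative markup sketch: a chat transcript rendered as a live region
// so assistive technology announces new Assistant messages.
interface ChatMessage {
  author: "user" | "assistant";
  text: string;
}

function renderTranscript(messages: ChatMessage[]): string {
  const items = messages
    .map(m => `<li class="bubble bubble--${m.author}">${m.text}</li>`)
    .join("");
  // role="log" identifies the list as an append-only transcript, and
  // aria-live="polite" lets screen readers announce appended messages
  // without interrupting the user mid-task.
  return `<ul role="log" aria-live="polite" aria-label="Assistant conversation">${items}</ul>`;
}
```

(A production implementation would also escape message text and manage focus for inline action buttons; this sketch only shows the live-region pattern.)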

By ensuring that its conversational UI is fully accessible, the Assistant makes decision automation inclusive and usable for all audiences. This commitment to accessibility aligns with the broader mission of enterprise-grade software: empowering every user, not just the technically advanced. 

 

What’s next? 

This post gave us a guided tour of the core UI features of the Decision Assistant. From modular panels and smart components to chat-driven workflows and personalization, each piece of the interface is designed to balance power with usability. 

In the next installment, we’ll step behind the scenes with a Tech Stack and Architecture Overview — exploring the frameworks, design systems, and architectural principles that power the Assistant’s frontend. 
