Global AI & Data Science

Train, tune and distribute models with generative AI and machine learning capabilities

  • 1.  Function Models - a new approach

    Posted 15 days ago

    The function model family includes models of almost any type: neural nets, DNNs, transforms, etc. These models learn by modifying the function(s) in the model rather than by updating parameters.

    The result is considerable savings in energy usage; one-shot streaming learning is a natural capability; cost is very low for these models; and transparency/audit capabilities increase. Rollbacks are created and logged for any learning, so learning can be reversed if needed. For many of these models, Kafka is a great option for the streaming input.
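    Below is a minimal sketch of how such a rollback-logged patch mechanism could look. The class and method names are illustrative assumptions, not the implementation from the linked documents; it only shows the two ideas above: learning as modification of the function (via patches) and an append-only log that lets any learning step be reversed.

    import time

    class FunctionModel:
        def __init__(self, base_fn):
            self.base_fn = base_fn      # f0: the original, unmodified function
            self.patches = {}           # learned overrides keyed by input/signature
            self.log = []               # append-only log that enables rollback

        def learn(self, signature, output, reason=""):
            # One-shot streaming learning: record an override, no gradient steps.
            prev = self.patches.get(signature)
            self.patches[signature] = output
            self.log.append((time.time(), signature, prev, output, reason))

        def rollback(self):
            # Reverse the most recent learning event.
            _, signature, prev, _, _ = self.log.pop()
            if prev is None:
                self.patches.pop(signature, None)
            else:
                self.patches[signature] = prev

        def __call__(self, x, signature=None):
            key = signature if signature is not None else x
            if key in self.patches:
                return self.patches[key]
            return self.base_fn(x)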

    See:

    https://zenodo.org/uploads/18056053

    Two documents: one is an overview; the other is detailed, with mathematical justification.



    ------------------------------
    John Harby
    CEO
    Autonomic AI, LLC
    Temecula CA
    9513835000
    ------------------------------


  • 2.  RE: Function Models - a new approach

    Posted 11 days ago
      |   view attached
     
    Two direct mappings (healthcare, agent systems) using the Function Model's two-layer structure (base function f₀ + patch layer g), so we can see exactly how to apply it in each domain.

    Healthcare: "Guideline engine + patient-specific overrides"
    What f₀ is
    A stable clinical logic layer:
    • evidence-based guidelines (e.g., HTN/DM algorithms)
    • drug–drug interaction rules
    • risk calculators (ASCVD, CHA₂DS₂-VASc)
    • your organization's standing orders / pathways
    What g (patches) are
    Deterministic overrides learned from real cases, expert review, or patient-specific facts:
    • "For this patient, ACE inhibitor caused angioedema → never suggest again."
    • "This patient's baseline creatinine trend is stable despite high value → don't auto-flag as AKI unless delta criteria met."
    • "This clinic uses pathway v3.2; prior rule v3.1 no longer applies."
    Input signature (key idea)
    Instead of raw "x," use a context signature:
    • (patient_id, problem_id, setting, guideline_version, meds_hash, labs_hash, timeframe)
      This keeps patches precise and auditable.
    How learning happens (streaming)
    Each reviewed decision becomes an event:
    • (time, signature, output, clinician_id, reason_code, evidence_link)
      Stored as a patch. No retraining. Immediate effect. (A code sketch of this mapping appears after the next list.)
    Why it's valuable
    • Eliminates "drift" from silent model updates
    • Supports "never forget this contraindication" memory
    • Creates an auditable trail for QA, medico-legal review, and safety governance
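    A minimal sketch of this healthcare mapping, assuming the field names listed above; the class and function names are hypothetical illustrations, not taken from the Function Model documents:

    from collections import namedtuple

    Signature = namedtuple("Signature",
        "patient_id problem_id setting guideline_version meds_hash labs_hash timeframe")
    PatchEvent = namedtuple("PatchEvent",
        "time signature output clinician_id reason_code evidence_link")

    class GuidelineEngine:
        # f0 + g: stable guideline logic plus deterministic patient-specific overrides.
        def __init__(self, guideline_fn):
            self.guideline_fn = guideline_fn   # f0: guidelines, interaction rules, risk calculators
            self.patches = {}                  # g: Signature -> PatchEvent

        def record_review(self, event):
            # A reviewed decision becomes a patch: no retraining, immediate effect.
            self.patches[event.signature] = event

        def recommend(self, signature, patient_state):
            patch = self.patches.get(signature)
            if patch is not None:
                # Auditable: the event carries clinician_id, reason_code, evidence_link.
                return patch.output
            return self.guideline_fn(patient_state)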

    Agent systems: "Policy/skills + exact state-action corrections"
    What f₀ is
    The agent's general policy stack:
    • planner (task decomposition)
    • tool-use policy (when to call a tool)
    • general heuristics (prioritization, safety checks)
    • base model prompting / templates
    What g (patches) are
    Hard corrections to stop repeated failure modes:
    • "When user asks X, do NOT do Y; do Z."
    • "If tool call fails with error E, use fallback F."
    • "If this customer environment has limitation L, skip action A."
    Input signature
    Use a compact, hashed state representation:
    • (agent_goal, tool_context, user_intent, environment_flags, failure_code)
      Optionally add a "customer_id / deployment_id" to prevent cross-tenant leakage.
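    A hedged sketch of one way that compact, hashed state representation could be built; the field names mirror the list above, while the hashing scheme itself is an assumption:

    import hashlib
    import json

    def state_signature(agent_goal, tool_context, user_intent,
                        environment_flags, failure_code, deployment_id=None):
        # Hash the agent state into a stable patch-lookup key. Including
        # deployment_id keys patches per tenant, so one customer's corrections
        # never leak into another deployment.
        payload = {
            "agent_goal": agent_goal,
            "tool_context": tool_context,
            "user_intent": user_intent,
            "environment_flags": sorted(environment_flags),
            "failure_code": failure_code,
            "deployment_id": deployment_id,
        }
        blob = json.dumps(payload, sort_keys=True)
        return hashlib.sha256(blob.encode("utf-8")).hexdigest()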
    Learning events



    Attachment(s)

    pdf
    FunctionModel_Simple.pdf   172 KB 1 version


  • 3.  RE: Function Models - a new approach

    Posted 11 days ago

    I have established a contract with IBM as a partner but am waiting to connect with the representative on the IBM side. I'll be able to provide all the changes I've made that yield six-sigma accuracy in code generation. With the entire architecture in place, I've found on a preliminary basis that it also resolves issues with mQSAR (multidimensional Quantitative Structure-Activity Relationship), which had become something of a shelved idea. I think many of the issues with mQSAR prior to function models were due to its complex parameters, which caused processing issues in the parametric approach. The function model avoids all of this. QSAR, of course, has enjoyed much success, having played a role in drug discovery; examples include the COVID vaccine and cancer drugs.

    I have contacted Schrodinger, a maker of QSAR software, about possible partnering and further development. As I am a sole proprietor, this sounds like an effort IBM is best positioned to lead.



    ------------------------------
    John Harby
    CEO
    Autonomic AI, LLC
    Temecula CA
    9513835000
    ------------------------------



  • 4.  RE: Function Models - a new approach

    Posted 10 days ago
      |   view attached

    FYI, this is the more formal document pertaining to the function model, with mathematical justifications. Migration from a traditional model is possible, and I have example migration plans. We can also consider a hybrid approach, mixing parametric and function-model updates for different training units.



    ------------------------------
    John Harby
    CEO
    Autonomic AI, LLC
    Temecula CA
    9513835000
    ------------------------------


  • 5.  RE: Function Models - a new approach

    Posted 3 days ago

    ChatGPT output when asked about IBM and GPUs:

    IBM

    • CPUs: POWER remains CPU-first

    • AI: Relies on external accelerators (GPUs, AI cards)

    • Client AI: Not a focus

    • Reality: Classic high-end CPU plus attach strategy

    Function Models Are Not GPU-Dependent

    A recurring assumption in AI architecture is that high-performance inference and learning require GPUs.

    That assumption is tightly coupled to parameter-centric models, not to intelligence itself.

    Function Models break that coupling. In traditional neural architectures:

    • Learning is implemented as repeated parameter updates

    • Inference depends on dense linear algebra

    • GPUs are essential because they accelerate large-scale tensor operations

    In Function Models:

    • Learning is expressed as function transformation, not parameter churn

    • Parameters are externalized, sparse, or eliminated

    • Inference evaluates a stable function, often with structured control flow

    • Updates are applied as Δf (function deltas), not gradient sweeps
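    As a rough illustration of "updates applied as Δf", the sketch below composes behavioral changes as guarded overrides using plain control flow; nothing in it requires dense linear algebra or a GPU. The helper names are illustrative, not part of the Function Model specification.

    def apply_delta(f, condition, override):
        # Return a new function: use the override where the condition holds,
        # otherwise defer to the existing function.
        def patched(x):
            return override(x) if condition(x) else f(x)
        return patched

    # Base function f0, then two behavioral changes applied as function deltas.
    f = lambda x: x * 2                                   # f0
    f = apply_delta(f, lambda x: x < 0, lambda x: 0)      # Δf 1: clamp negatives to zero
    f = apply_delta(f, lambda x: x == 7, lambda x: 99)    # Δf 2: exact-case correction

    assert f(-3) == 0 and f(7) == 99 and f(5) == 10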

    Once parameter churn is removed, the primary computational bottleneck disappears. Dense SIMD parallelism is no longer the dominant requirement, which makes GPUs optional rather than mandatory.

    This maps naturally onto:

    • CPUs (branching, control flow, symbolic execution)

    • NPUs (bounded, low-latency inference)

    • Edge and offline systems where energy and determinism matter

    The energy reduction follows directly from the model, not from optimization:

    • No backprop loops

    • No repeated tensor sweeps

    • No retraining cycles for small behavioral changes

    GPUs remain valuable where dense numeric workloads are intrinsic. But for architectures based on function change rather than parameter change, GPU dependence is a modeling artifact, not a necessity.

    Curious how others are thinking about this shift, especially in the context of edge inference, energy efficiency, and heterogeneous compute.



    ------------------------------
    John Harby
    CEO
    Autonomic AI, LLC
    Temecula CA
    9513835000
    ------------------------------



  • 6.  RE: Function Models - a new approach

    Posted 6 days ago
      |   view attached

    This is an addendum to the Function Model. It concerns continuity and differentiability of the function. As you can likely see from the example I gave in the overview document, the originally continuous function sin(x) becomes something like { sin(x) if x != 2; 3 if x = 2 }, which is only piecewise continuous. In traditional models the function remains continuous. This document explores possibilities for handling this. Using Functor Models stood out to me as the best option, but smoothing may also be established in a domain and be desirable there.
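    As a hedged numerical illustration of the difference (the Gaussian blend below is just one possible smoothing choice, not the Functor Model construction from the addendum):

    import math

    def hard_patch(x):
        # { sin(x) if x != 2; 3 if x = 2 }  -- only piecewise continuous at x = 2
        return 3.0 if x == 2 else math.sin(x)

    def smooth_patch(x, center=2.0, target=3.0, width=0.1):
        # Blend weight w is 1 at the patch point and decays to 0 away from it,
        # so the result equals sin(x) far from x = 2 but passes through 3 at x = 2,
        # keeping the learned function continuous.
        w = math.exp(-((x - center) / width) ** 2)
        return (1 - w) * math.sin(x) + w * target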



    ------------------------------
    John Harby
    CEO
    Autonomic AI, LLC
    Temecula CA
    9513835000
    ------------------------------

    Attachment(s)

    pdf
    ContinuousFunctor.pdf   177 KB 1 version