SPSS Statistics

Your hub for statistical analysis, data management, and data documentation. Connect, learn, and share with your peers! 


Mutual Information or Entropy value calculation

  • 1.  Mutual Information or Entropy value calculation

    This message was posted by a user wishing to remain anonymous
    Posted Fri November 15, 2024 12:36 PM
    This post was removed


  • 2.  RE: Mutual Information or Entropy value calculation

    Posted Fri November 15, 2024 02:35 PM

    I posted this yesterday, but I don't see the post now.

    If your goal is to pick out a small number of the best variables out of 1000 purely on statistical grounds, remember that you will get a lot of false positives checking each one. You are likely to find a set by chance that will not generalize, so you need estimation and testing samples, or at least cross-validation. How much data do you have?
    For starters, with so many variables, I suggest using the naive Bayes procedure to pick out a small number of the best individual variables. From there, there are many statistical procedures to choose from, depending on the nature of the variables and the amount of data you have.
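
    As a rough illustration of that screen-then-validate idea, here is a Python/scikit-learn sketch (not SPSS syntax) on simulated data: it scores each candidate variable by its mutual information with the target, keeps the top few, and then checks a simple naive Bayes model on a held-out sample.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.model_selection import train_test_split
        from sklearn.naive_bayes import GaussianNB
        from sklearn.metrics import accuracy_score

        # Simulated stand-in for "1000 candidate variables"; only a few are truly informative.
        X, y = make_classification(n_samples=2000, n_features=1000, n_informative=10,
                                   n_redundant=0, random_state=0)

        # Estimation/testing split so that chance findings do not survive unchecked.
        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                            random_state=0)

        # Mutual information of each variable with the target, estimated on the training half only.
        mi = mutual_info_classif(X_train, y_train, random_state=0)
        top = np.argsort(mi)[::-1][:10]  # keep the 10 highest-scoring variables

        # Fit naive Bayes on the selected variables and evaluate on the held-out sample.
        model = GaussianNB().fit(X_train[:, top], y_train)
        print("Selected variables:", top)
        print("Held-out accuracy:", accuracy_score(y_test, model.predict(X_test[:, top])))

    The same pattern applies with real data: whatever screening score you use (mutual information, entropy-based measures, or a naive Bayes screen), compute it on the estimation sample only and judge the selected set on data it has never seen.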


    ------------------------------
    Jon Peck
    Data Scientist
    JKP Associates
    Santa Fe
    ------------------------------



  • 3.  RE: Mutual Information or Entropy value calculation

    Posted Tue November 19, 2024 08:57 AM

    Thank you, Jon! I received your reply.

    Kameron



    ------------------------------
    Kameron Yiu
    ------------------------------