SPSS Statistics

  • 1.  Mutual Information or Entropy value calculation

    This message was posted by a user wishing to remain anonymous
    Posted 17 days ago
    This post was removed


  • 2.  RE: Mutual Information or Entropy value calculation

    Posted 17 days ago

    I posted this yesterday, but I don't see the post now.

    If your goal is to pick out the best small number of variables out of 1000 purely on statistical grounds, remember that you will get a lot of false positives checking each one. You are likely to find a set by chance that will not generalize, so you need to have estimation and testing samples or at least do cross-validation. How much data do you have?
    For starters, with so many variables, I suggest using the naive Bayes procedure to pick out a small number of the best individual variables. From there, there are many statistical procedures available, depending on the nature of the variables and the amount of data you have.
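
    Since the thread topic is mutual information, here is a minimal sketch of the screen-then-validate idea described above, written in Python with scikit-learn rather than SPSS syntax. The simulated data, the 80/20 split, the cutoff of 10 variables, and the Gaussian naive Bayes check are all illustrative assumptions, not a recipe from this thread.

        # Screen 1000 candidate predictors by mutual information, then check
        # that the selected set generalizes on a held-out sample.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.naive_bayes import GaussianNB
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 1000))      # 1000 candidate predictors (simulated)
        y = (X[:, 0] + X[:, 1] + rng.normal(size=500) > 0).astype(int)

        # Split first so the screening step is judged on data it never saw.
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.2, random_state=0, stratify=y)

        # Rank predictors by estimated mutual information with the target.
        mi = mutual_info_classif(X_train, y_train, random_state=0)
        n_keep = 10                           # keep only a small set, as suggested above
        top = np.argsort(mi)[::-1][:n_keep]

        # Fit a simple naive Bayes model on the selected variables and score it
        # on the held-out sample; a large drop here signals chance selections.
        model = GaussianNB().fit(X_train[:, top], y_train)
        print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test[:, top])))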


    ------------------------------
    Jon Peck
    Data Scientist
    JKP Associates
    Santa Fe
    ------------------------------



  • 3.  RE: Mutual Information or Entropy value calculation

    Posted 14 days ago

    Thank you, Jon! I received your reply.

    Kameron



    ------------------------------
    Kameron Yiu
    ------------------------------