I apologize for the delay in responding. Large rulesets (26K+ rules) require significantly more memory for parsing and for storing the rules within the JVM. This is especially true when multiple rulesets (or multiple versions of the same ruleset) are executed on the same RES, because each ruleset is kept in a local cache for as long as there are connections within the pool's timeout value.
This behavior is expected. One option is to run on a 64-bit JVM, which allows a larger heap; another is to profile memory usage carefully so the heap is sized from real data. You might also consider dedicating an RES instance to particularly large or memory-intensive rulesets.
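As a starting point, a 64-bit JVM hosting memory-intensive rulesets is typically configured through standard JVM options. The heap sizes below are purely illustrative assumptions, not recommendations; size them from your own profiling data:

```shell
# Illustrative 64-bit JVM options for an application server hosting RES.
# -Xms/-Xmx set initial and maximum heap; -verbose:gc logs garbage
# collection activity so you can observe memory behavior under load.
JAVA_OPTS="-Xms2g -Xmx8g -verbose:gc"
```

Apply these through your application server's generic JVM arguments so they take effect for the server hosting the execution unit.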
Memory usage also depends on other factors, such as input and output data size, other applications running in the same JVM, and the pool size and timeout values (which should be tuned to match your usage patterns). Additional information on this subject is available in the following Redpaper:
www.redbooks.ibm.com/redpapers/pdfs/redp....
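When profiling, a quick way to approximate the heap cost of loading a ruleset is to sample used heap before and after the load. This is a minimal sketch using the standard `Runtime` API; the large byte array stands in for a real ruleset load, which is an assumption for illustration:

```java
// Minimal sketch: sampling JVM heap usage around a memory-intensive
// operation. The byte array below is a stand-in for a real ruleset
// parse/load; substitute your actual loading call when profiling.
public class HeapSample {
    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        // Used heap = currently committed heap minus the free portion.
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        long before = usedHeap();
        // Placeholder allocation simulating a large ruleset in memory.
        byte[] simulatedRuleset = new byte[16 * 1024 * 1024];
        long after = usedHeap();
        System.out.println("Approx. bytes allocated: " + (after - before)
                + " (payload length: " + simulatedRuleset.length + ")");
    }
}
```

Numbers obtained this way are approximate (GC can run between samples), so take several measurements; a profiler gives a more precise picture.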
IBM continues to research improvements to rule execution performance (for example, around very large decision tables), and each release includes a focus on rule engine performance. It is always advantageous to move to the latest release and test performance there.