"if it reached a level where a reasonably idle server produces a consistent flat-line graph, then it gives a sharper indication of usage patterns and peak/memory-leak situations"
IME, it gives no such indication. I think assumptions are being made about the relative “badness” of a sawtooth graph and the relative “goodness” of a smooth graph. In isolation, both are meaningless. The only useful evaluations, IME, are those done while the server is doing real work. If memory issues are not encountered during peak loads, then don’t fiddle with the settings. More often than not, what intuitively seems like it should make things better doesn’t materially make things better, and can actually make things worse.
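To see why a sawtooth on an idle server is not a red flag, here is a minimal sketch of the kind of heap sampler that produces these graphs. The class name and intervals are my own invention, not anything from Integration Server; it just polls `java.lang.Runtime`, where the periodic dips you'd plot are ordinary garbage collection, not a leak or its absence.

```java
// Hypothetical heap sampler (illustration only, not an IS component).
// Polls the JVM's own Runtime; on an idle JVM the plotted values still
// rise and dip as minor GCs run -- a sawtooth that tells you nothing
// about leaks by itself. Only behavior under real load is diagnostic.
public class HeapSampler {
    public static void main(String[] args) throws InterruptedException {
        Runtime rt = Runtime.getRuntime();
        for (int i = 0; i < 5; i++) {
            // used = total allocated to the heap minus what is currently free
            long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
            System.out.println("used heap: " + usedMb + " MB");
            Thread.sleep(500); // sample interval; real tools poll far less often
        }
    }
}
```

Graphing this output over hours shows the shape people worry about; whether that shape matters can only be judged while the server is processing production-like traffic.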
I’m reminded of a tech support story I heard years ago from a consultant/analyst. The policy at the company he was advising was that whenever a server encountered some sort of error, the first step was to grab another memory card from the closet and put it in the server. If the symptom disappeared, ticket closed. If the symptom persisted, then additional triage was performed. The lesson: it was far cheaper to put more memory in the box than for the tech to spend time investigating root cause and tweaking the box to achieve maximum performance with the least hardware.
I don’t mention this to discourage experimenting and gaining additional understanding. My note of caution is just to be careful about what is inferred from the observations made, and to be wary of spending time on a problem that may not be a problem at all. In all the projects I’ve worked on at various companies, memory-config tweaking of the IS has never been a necessary activity. The approach is usually just max it out and go.
#webMethods#Integration-Server-and-ESB#webMethods-Archive