COBOL

COBOL is responsible for the efficient, reliable, secure, and unseen day-to-day operations of the world's economy.

  • 1.  ZoneCheck

    Posted Thu July 14, 2016 03:15 PM

    I have created a new document intended to help programmers and sites who are migrating from Enterprise Cobol V4.2 to V5 or V6.  It discusses issues related to invalid data and ZoneCheck.  I would appreciate any comments or suggestions from those of you who have either dealt with this or are planning on doing so.  Please see:  http://www.wmk2.com/IBM/Zonecheck%20Info.pdf
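
    For anyone who has not hit the problem yet, here is a rough sketch of the kind of invalid zoned data involved (names are made up, and the exact behaviour depends on the compiler level and on options such as NUMPROC and ZONECHECK):

        01  WS-AMOUNT     PIC 9(3).                  *> zoned decimal (USAGE DISPLAY)
        01  WS-AMOUNT-X   REDEFINES WS-AMOUNT
                          PIC X(3).
        ...
            MOVE SPACES TO WS-AMOUNT-X               *> x'404040': the zones are not F,
                                                     *> so the "numeric" field now holds
                                                     *> invalid data
            IF WS-AMOUNT = ZERO                      *> as I understand it, ZONECHECK(MSG)
               DISPLAY 'ZERO'                        *> under V5/V6 flags this use of the
            END-IF                                   *> field at run time

    Under older compilers and certain NUMPROC settings that sort of data often slipped through quietly, which is exactly the behaviour difference the document discusses.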

    wmklein


  • 2.  Re: ZoneCheck

    Posted Thu July 14, 2016 03:47 PM

    Hi William,

     

    Overall, fantastic document, and in line with what I've been recommending to customers. A couple thoughts for you:

     

    1) I don't recommend migrating to V4 first and then to V5/V6, especially for code that's already compiled with an earlier version of Enterprise COBOL. This is partly a matter of personal preference, but moving to V4 first adds extra work without much benefit (you'll want to test under V4 before moving to V5/V6, right? That adds a second round of testing). FLAGMIG4 doesn't provide a lot of utility; it warns about new reserved words in V5/V6, should you happen to use them already in a V4-or-earlier program (e.g. as a data item name; there's a small sketch of this case after these points). If you skip the FLAGMIG4 step, V5/V6 aren't going to happily compile code that misuses the new reserved words anyway, so you're protected there. In our experience, 99.9% of customer code written to the '85 COBOL Standard (VS COBOL II and later, compiled with NOCMPR2) compiles cleanly with V5/V6, and for the programs that don't, V5/V6 itself can be used to identify old language that isn't supported.

     

    2) On page 4, I believe you meant to recommend RULES(NOEVENPACK) instead of RULES(EVENPACK).
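
    To illustrate the reserved-word case in point 1, a made-up example, on the assumption that ALLOCATE (which gained a statement of its own in V5) is among the newly reserved words:

        01  ALLOCATE      PIC X(8).       *> fine as a data name under V4.2 and earlier
        ...
            MOVE 'PENDING' TO ALLOCATE

    FLAGMIG4 under V4 would warn about that name; skip the step and V5/V6 simply rejects it with an error instead, so nothing slips through either way.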

    Mike Chase


  • 3.  Re: ZoneCheck

    Posted Fri July 15, 2016 01:28 AM

    At first reading I can only add one thing to what Mike has said.

     

    I never advise changing compiler options beyond the "unit testing" stage. Beyond that, for me, they must be the same as in Production.

    A person from another team approached me many years ago and explained that a program in System Testing was failing, yet back in the Unit Test set-up it worked, with exactly the same data. Same source. Same data. One fails, one works. That's not computing, is it?

    The obvious culprit was compile options.

    Now, before Mike has a heart attack, it is not the compile options in themselves that are the problem, it is that different options (those options that affect the generated code) generate different code. If you have a CALLed program or a wild subscript that is hitting executable code, you need to be hitting the same executable code (so that the program "works" or "doesn't work" in each environment).
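
    A contrived sketch of the "wild subscript" case, with invented names, for anyone who hasn't been bitten by one (SSRANGE would catch it, but that is yet another option you want kept the same everywhere):

        01  WS-TABLE.
            05  WS-ENTRY   OCCURS 10 TIMES PIC X(8).
        01  WS-SUB         PIC S9(4) COMP.
        ...
            MOVE 11 TO WS-SUB
            MOVE 'GARBAGE' TO WS-ENTRY (WS-SUB)   *> the 11th entry does not exist; the
                                                  *> MOVE lands on whatever the compiler
                                                  *> put next, and "next" changes with
                                                  *> options like OPT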

    The classic is the IBM advice "test without OPT, turn OPT on and chuck it into Production". Sorry, but no way. Not because I have any fear of OPT (Mike's started breathing again) but because the generated code is different. Anything that is overwriting generated code is now overwriting something else (perhaps even no longer in this program).

    Separately, at least some sites follow "no recompiles when promoting up the line", so you have no choice anyway.

    It was late in the day when I was reading your document, and I'll try to read it another couple of times over the weekend, but so far it scores well with me. I would put more emphasis on "if the data is correct, there are no issues with whatever compiler options you use" - too often people put the blame on the compiler rather than facing the obvious: the compiler didn't mess up their data.

    BillWoodger


  • 4.  Re: ZoneCheck

    Posted Mon July 18, 2016 03:04 AM

    I just want to say that I totally agree with @BillWoodger. A recompile of any source will disqualify all tests done on the old executable, especially if any compiler option is changed.

    Having said that, I must sadly confess that we have been forced to use the NOOPT option to be able to execute tests, because the (third-party) debugger could not always cope with optimized code. Always as a last resort when trying to find an elusive error.

    The above is actually one of the reasons that we are using a change management product that does not force recompiles when moving executables between different test and production environments but instead always uses a (hopefully verified) load module as the source of the copy.

    Regards Lars

    lbjerges


  • 5.  Re: ZoneCheck

    Posted Thu July 21, 2016 07:15 PM

    I have a comment on the DIAGTRUNC recommendation. While it sounds like a great idea, I find it gives way too many "false positives". There are many cases where a numeric input field is defined to be a size that is larger than the maximum value it will ever actually hold. Just think about the whole "evenpack" situation. Let's say there is a case where you have a packed-decimal field that will never hold more than two decimal digits, and you move it to a PIC 99 DISPLAY item. Should I code the sending item as PIC 99 PACKED-DECIMAL and get the "evenpack" warning, or should I code it as PIC 999 PACKED-DECIMAL and get the "diagtrunc" warning? A conundrum!
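
    Here is the conundrum with made-up names (as Mike noted above, RULES(NOEVENPACK) is what flags the even-digit declaration; DIAGTRUNC flags the MOVE from the odd-digit one):

        01  WS-SEND-2    PIC 99  PACKED-DECIMAL.   *> even digit count: flagged under
                                                   *> RULES(NOEVENPACK)
        01  WS-SEND-3    PIC 999 PACKED-DECIMAL.   *> odd digit count, but now larger
                                                   *> than the receiver
        01  WS-RECEIVE   PIC 99.                   *> USAGE DISPLAY receiver
        ...
            MOVE WS-SEND-3 TO WS-RECEIVE           *> flagged under DIAGTRUNC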

    I compiled our largest program using DIAGTRUNC and got 397 (!!!) warnings, almost all of which were "IGYPA3228-W   High order digit positions in the sender may be truncated in the move to receiver".  Yikes!

    I just thought of a "solution", and it appears to "work". If you use "COMPUTE r = s" instead of "MOVE s TO r", you don't get the warning, and the compiler (from what I can see) generates exactly the same code. Interesting! But is that just "rearranging the deck chairs"? In any case I will not be changing those 390+ lines of code. Smile
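
    Side by side, with the same made-up names as above (from my comparison the generated code is identical; only the message differs):

            MOVE    WS-SEND-3 TO WS-RECEIVE        *> draws the DIAGTRUNC warning
            COMPUTE WS-RECEIVE = WS-SEND-3         *> no warning, same object code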

    Frank

    fswarbrick


  • 6.  Re: ZoneCheck

    Posted Sat July 23, 2016 07:17 AM

    Good point on DIAGTRUNC. There is an RFE to apply the DIAGTRUNC rules to arithmetic, so another reason not to go and change all those lines of code :-).

    COBOL is a language of truncation. Quiet truncation, high-order or low-order, is intrinsic. It is what is supposed to happen, with the presumption that the programmer knows what they are doing.
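
    A trivial made-up example of what I mean:

        01  WS-FOUR   PIC 9(4) VALUE 1234.
        01  WS-TWO    PIC 99.
        ...
            MOVE WS-FOUR TO WS-TWO    *> WS-TWO is now 34: the high-order digits
                                      *> quietly disappear, exactly as the language
                                      *> rules say they should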

    I don't really see much use for DIAGTRUNC. If people want to use it and "fix" everything, fair enough. But the reality is that messages tend to get left lying around, then people "learn" that a compile of a program always produces RC=4, so they don't look at the messages when they get an RC=4, and there's something else in there... which they never see.


    BillWoodger