COBOL

Recommended COBOL compiler options

  • 1.  Recommended COBOL compiler options

    Posted Wed November 13, 2024 01:40 PM

    Given the restriction that you should only compile once (unless you need to change the code, obviously), so that the exact same executable module is used both for all testing and in Production, and given a general principle that operability is to be preferred over the fastest possible code, what compiler options would you recommend with Enterprise COBOL 6.3, and why? For example, producing formatted dumps and something that tells you the failing line of the COBOL program is essential in the event of an abend, because of the reduction in Mean Time To Recover that this provides.

    Obviously, some options are required for specific programs in order for them to behave as intended, e.g. ARITH(EXTEND) or DYNAM/NODYNAM, because of the way they have been coded. But beyond that, would you aim to have the same core compiler options for all compiles, or set them on a case-by-case basis?



    ------------------------------
    Charles Gaskell
    ------------------------------


  • 2.  RE: Recommended COBOL compiler options

    Posted Wed November 13, 2024 02:07 PM
    Edited by Mike Chase Thu November 14, 2024 01:04 PM

    My #1 recommendation is that you don't change compiler options when migrating between versions, and that you generally don't change compiler options without careful consideration. Some definitely change the behaviour of your programs, such as TRUNC. Others, notably OPT and ARCH, do not, provided you are using valid data.

    The compiler assumes your data is valid (unless you use INVDATA) and that your sign codes match your NUMPROC setting (only preferred signs with NUMPROC(PFD); preferred or non-preferred signs with NUMPROC(NOPFD)). When invalid data is used, or non-preferred sign codes are used with NUMPROC(PFD), the compiler may choose instructions that behave differently than in previous versions or at different OPT or ARCH levels. But as long as your data is valid and your sign codes conform to your NUMPROC setting, changing OPT or ARCH does not change behaviour.

    I do recommend that programs in the same application all use the same compiler options, and for many clients the options are fixed site-wide anyway. The only real exception, outside of the cases you mentioned where the code requires specific options, is OPTIMIZE. For clients who generally use OPT(2), if you find that specific programs are taking a long time to compile, dropping the OPT level to OPT(1) or even OPT(0) for those programs may help (though if compilation time is excessive, please open a Case). That said, some clients only use OPT(0) anyway, as that can provide a better debugging experience in some non-IBM debugging tools.

    I suggest using LIST instead of OFFSET, using ADATA, and, for COBOL 6.4 users, using SMARTBIN. watsonx Code Assistant for Z Code Optimization Advice (the Optimize step in wca4z) requires ADATA files and listings compiled with LIST to do its analysis; even if you don't currently intend to use wca4z Optimize, using options that support it will make things easier if you want to adopt it in the future. A caveat: many clients use OFFSET instead of LIST, and I assume they do so for good reason, since the output from LIST is the more useful of the two in troubleshooting. Perhaps some non-IBM tools require listings compiled with OFFSET? Definitely make sure it's safe to switch.

    SMARTBIN stores additional data in a no-load segment along with the executable. This increases the size of your executables on disk but doesn't affect runtime memory usage or performance. That data is useful for programs that may use Automatic Binary Optimizer in the future, and z Code Optimization Advice may take advantage of it in the future as well. But again, this was only added in COBOL 6.4.
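
    To make that concrete, here is a minimal sketch of what a fixed core option set could look like on a CBL (PROCESS) statement at the top of a program. The OPT, ARCH, NUMPROC and TRUNC values shown are placeholders rather than a recommendation for any particular site, and SMARTBIN would only be added on COBOL 6.4:

        CBL LIST,ADATA,OPT(2),ARCH(12),NUMPROC(NOPFD),TRUNC(STD)
        IDENTIFICATION DIVISION.
        PROGRAM-ID. OPTDEMO.
        PROCEDURE DIVISION.
            DISPLAY 'COMPILED WITH THE SITE-WIDE CORE OPTIONS'
            GOBACK.

    Whether those particular values are right for a given site is a separate question; the point is simply that everything other than OPTIMIZE stays identical from program to program, whether it is set this way, in the JCL, or as an installation default.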



    ------------------------------
    Mike Chase
    Enterprise COBOL Developer
    mike.chase@ca.ibm.com
    ------------------------------



  • 3.  RE: Recommended COBOL compiler options

    Posted Wed November 13, 2024 02:12 PM

    We've always used OFFSET instead of LIST simply because it saved space (lines of print) on a printed listing. Of course we don't print listings anymore, so I imagine we could change it to LIST.

     

    Frank







  • 4.  RE: Recommended COBOL compiler options

    Posted Thu November 14, 2024 02:35 AM

    Hi,

    In our case we compile twice:

    • First compilation, for testing while the code is being written: LIST + OPTIMIZE(0) + some other options that help with debugging (like SSRANGE).
    • Second compilation, for every other step in the promotion workflow: OFFSET + OPTIMIZE(1) + some other options for performance (like NOSSRANGE), so the same binary is used for qualification and production.

    We also use NOTEST(DWARF,NOSEPARATE) for our debug tools (Macro4) instead of the listing, covering trace, dump and profiling: debug information is always available, even in production.
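
    As a rough sketch, showing only the options named above (the full lists are obviously site-specific), the two compiles look something like this:

        First compile (development):
           CBL LIST,OPTIMIZE(0),SSRANGE,NOTEST(DWARF,NOSEPARATE)
        Second compile (promotion workflow through to production):
           CBL OFFSET,OPTIMIZE(1),NOSSRANGE,NOTEST(DWARF,NOSEPARATE)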



    ------------------------------
    Denis FALLAI
    BPCE SI, BPCE group.
    ------------------------------



  • 5.  RE: Recommended COBOL compiler options

    Posted Wed November 20, 2024 06:14 PM

    Compiling twice is an excellent choice!  OPT(0) for unit test, then OPT(2) for quality assurance test and production!  This method is extremely common, and many use SSRANGE and NUMCHECK in the dev/unit test compile as well, then turn those off for production.
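
    As a small, invented illustration of what the dev/unit-test compile buys you: with NUMCHECK, the comparison below on a zoned field that actually contains spaces is checked at run time and reported (or abended, depending on the MSG/ABD suboption), instead of producing an unpredictable result.

        CBL NUMCHECK,SSRANGE,OPTIMIZE(0)
        IDENTIFICATION DIVISION.
        PROGRAM-ID. NUMDEMO.
        DATA DIVISION.
        WORKING-STORAGE SECTION.
        01  WS-AMOUNT-X       PIC X(5)  VALUE SPACES.
        01  WS-AMOUNT REDEFINES WS-AMOUNT-X PIC 9(5).
        PROCEDURE DIVISION.
       *    Spaces are not valid zoned-decimal digits, so NUMCHECK
       *    flags this comparison when it runs.
            IF WS-AMOUNT = ZERO
               DISPLAY 'AMOUNT IS ZERO'
            END-IF
            GOBACK.

    Recompiling for production with NONUMCHECK and NOSSRANGE then removes the checking overhead, which is the point of the two-compile approach.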



    ------------------------------
    Tom Ross
    ------------------------------



  • 6.  RE: Recommended COBOL compiler options

    Posted Wed November 20, 2024 07:59 PM

    For those that recommend compiling twice, how do you ensure that the source used for the second compile is the same as the source used for the first compile? Even if you can guarantee that the main source hasn't changed (because you've stored it away in your SCM tool, for example), I don't see how it's possible to ensure that the copybooks and DCLGENs which are pulled in haven't changed between the two compiles, potentially in ways that affect how the load module behaves functionally. At what point in the development lifecycle is the second compile normally done - is it after all testing has been completed?

    Also, my understanding is that whilst options such as SSRANGE and NUMCHECK will pick up out-of-range subscripts and invalid data at run time, this is very much dependent on the quality of the test data. Even though these options are set in the original compile, a condition will only be caught if the test data actually triggers it, so a clean set of tests doesn't mean that you won't get problems when you run in Production, using Production data - it could simply be that the test data doesn't include invalid values, or doesn't drive a subscript out of range. Therefore, unless they are also kept on in the Production version, SSRANGE and NUMCHECK are of only limited use when used solely in "testing" versions of the load module.
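
    A contrived sketch of that point (names invented): with SSRANGE, the subscripted reference below is only reported when the input actually drives the subscript past the end of the table, so a test file that never supplies more than 100 entries tells you nothing about what production data will do.

        CBL SSRANGE,OPTIMIZE(0)
        IDENTIFICATION DIVISION.
        PROGRAM-ID. SSRDEMO.
        DATA DIVISION.
        WORKING-STORAGE SECTION.
        01  WS-IN-COUNT       PIC 9(4) COMP.
        01  WS-TABLE.
            05  WS-ENTRY      PIC X(10) OCCURS 100 TIMES.
        01  IX                PIC 9(4) COMP.
        PROCEDURE DIVISION.
       *    In a real program WS-IN-COUNT would come from the input;
       *    SSRANGE only raises an error here when the data pushes IX
       *    beyond 100.
            MOVE 150 TO WS-IN-COUNT
            PERFORM VARYING IX FROM 1 BY 1 UNTIL IX > WS-IN-COUNT
               MOVE SPACES TO WS-ENTRY (IX)
            END-PERFORM
            GOBACK.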



    ------------------------------
    Charles Gaskell
    ------------------------------



  • 7.  RE: Recommended COBOL compiler options

    Posted Thu November 21, 2024 04:55 PM

    Charles,

    Wow, if people have to worry about their code staying safe in their SCM, they need a new SCM!

    Of course the code will not change between compiles, unless someone changes it! That would be dumb, of course. Many COBOL users have changed to a 2-compile, 2-test development process after learning about it when migrating to COBOL V6. It is a beautiful thing! It allows a more robust development environment and results in better code. I don't think it is relevant to bring up the quality of testing data here, since that has always been a challenge, more for some clients than others. Regardless of how much or how high-quality the data, it is still a good idea to develop with SSRANGE and NUMCHECK, and especially with the new LSACHECK compiler option. None of these help until tests are run.

    By the way, copybooks and DCLGENs are also protected by source control, so they won't change without users being notified and aware. I am surprised that you have not heard of this 2-compile development process; it is VERY common!

    Tom



    ------------------------------
    Tom Ross
    ------------------------------



  • 8.  RE: Recommended COBOL compiler options

    Posted Thu November 21, 2024 09:09 PM

    I didn't say that I hadn't heard of the 2-compile development process, rather that where I work we have a policy that what you test is exactly what you put live. A corollary of this is that if you need to recompile, then you either need to do sufficient retesting to prove that the program still functions the way it did when you originally tested it, or you sign off to accept the risk associated with not retesting it.

    I agree that it's a very small risk, especially if you use a very well-established SCM (as we do), but that risk is still non-zero.

    Our developers work in a series of shared environments containing multiple applications (continuous integration, if you like), with copybooks typically representing the interfaces between different applications, each of which is being developed separately as well as being tested to ensure interoperability. If application B is not allowed to start development of a copybook which they own, but which is used by application A, until application A has done their recompiles at the end of their testing, then this impacts how fast application B can develop. If you have a principle that you only recompile when you have to (and recompilation triggers retesting), then as soon as application A has compiled at the start of their testing, application B is free to start development of a new version of the copybook.

    You also don't say what happens if your SCM tool flags up that something has changed between the "testing" compile and the "production-ready" compile because of a legitimate change that another team has made. Do you force everything to roll back to the earlier versions of copybooks, DCLGENs and linked-in artefacts, and fail the process if there are differences? Or force some level of retesting? Or another approach? How does it help if users are notified and aware that something has changed between the first compile and the second compile? What do they do differently as a result of the notification?

    You talk about a "2-compile, 2-test development process". What do you mean by "2-test"? Depending on the size of the development, one may well have at least 3 or 4 levels of 'functional' testing - Unit, Unit-regression, System, Integration, (Integrated) Regression, User Acceptance - each with a different focus. At which stage do you recommend doing the 2nd compile?

    I'm not saying that it's not a development paradigm that can work. As you say, plenty of people use it, but I would like to understand better the pros and cons of the approach, as it's radically different to how we currently work, which is why I started my original question with "Given the restriction that you should only compile once". It's a bit like asking the question "how do I get from A to B by public transport?" and being told "you should use a private car". That may well be a valid way to get from A to B, but it's not answering the question that was asked.

    I've not come across the LSACHECK COBOL compiler option before. Is it brand new? Or a typo? Where can I find out more information about it?



    ------------------------------
    Charles Gaskell
    ------------------------------



  • 9.  RE: Recommended COBOL compiler options

    Posted Fri November 22, 2024 08:42 AM

    Charles,

    Not unlike you, I could not find LSACHECK in my April 2024 edition of the Programming Guide.  I did find it as an APAR with a CLOSED date of 21 November, which on my calendar is yesterday.

    I read the description, and to me it is not dissimilar to PARMCHECK in that it would generate a lot of overhead that can be avoided by testing. When PARMCHECK was introduced in COBOL 6 back in 2017, we tried it because it sounded like a great idea. But in fact it only reports the problem; it does not prevent it. For me, both of the suboptions, MSG and ABD, have the same effect in practice, because if a subroutine has overrun your storage there is not much you can do about it. And maybe I'm missing something, but you would be very fortunate if the offending parameter happened to be the last item in your Working-Storage and not in the middle.
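
    For anyone who hasn't tried it, this is roughly the scenario PARMCHECK is aimed at (a contrived sketch with invented names): the caller passes a 10-byte item, the subprogram treats it as 100 bytes, and the overrun is reported after the CALL only because the item happens to sit at the end of WORKING-STORAGE, right next to the check buffer PARMCHECK allocates - which is exactly the limitation described above.

        CBL PARMCHECK(MSG)
        IDENTIFICATION DIVISION.
        PROGRAM-ID. CALLER1.
        DATA DIVISION.
        WORKING-STORAGE SECTION.
        01  WS-PARM           PIC X(10).
        PROCEDURE DIVISION.
       *    On return from the CALL, the buffer PARMCHECK places after
       *    the end of WORKING-STORAGE is inspected and the overwrite
       *    is reported (MSG) or abended (ABD).
            CALL 'CALLEE1' USING WS-PARM
            GOBACK.
        END PROGRAM CALLER1.

        IDENTIFICATION DIVISION.
        PROGRAM-ID. CALLEE1.
        DATA DIVISION.
        LINKAGE SECTION.
        01  LS-PARM           PIC X(100).
        PROCEDURE DIVISION USING LS-PARM.
            MOVE ALL '*' TO LS-PARM
            GOBACK.
        END PROGRAM CALLEE1.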

    This is a case where the two-compile method may be useful and I recommend it.  First, have the programmers test with SSRANGE and the other high-overhead options, and then when the problems are fixed, recompile without them.   Of course you need good test data.  I agree that the Integration Test version of the program is the one that should be moved up the line to Production.



    ------------------------------
    Jon Butler
    ------------------------------



  • 10.  RE: Recommended COBOL compiler options

    Posted Mon November 25, 2024 07:40 AM

    LSACHECK is a new option to detect the use of linkage section items that do not have addressability. It is planned to be included in the November monthend 6.3 compiler PTF, and in December for the 6.4 compiler.

    Unlike PARMCHECK, it has essentially no overhead. With LSACHECK, the BLL (base locator for linkage) cells which point to linkage section items are initialized to 7FFFF000 (rather than 00000000), a page that's not mapped. So if the program tries to read a linkage section item whose addressability has not been established, it will get an 0C4, rather than perhaps blundering on with bad values. (Writing to such an item will typically get an 0C4 in any case because low memory is write protected.)

    An LSACHECK 0C4 includes a message noting it's a BLL addressability failure. Like PARMCHECK, it doesn't fix things up.
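
    A tiny made-up example of the failure mode this targets: LS-AREA below is never passed in on PROCEDURE DIVISION USING and no SET ADDRESS OF LS-AREA is ever done, so its BLL is never set. If the option works as described above, compiling with LSACHECK turns the DISPLAY into an immediate, clearly flagged 0C4 instead of a read of whatever low storage happens to contain.

        CBL LSACHECK
        IDENTIFICATION DIVISION.
        PROGRAM-ID. LSADEMO.
        DATA DIVISION.
        LINKAGE SECTION.
        01  LS-AREA           PIC X(80).
        PROCEDURE DIVISION.
       *    No addressability has been established for LS-AREA, so
       *    this reference fails immediately under LSACHECK.
            DISPLAY LS-AREA
            GOBACK.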

    Bernie



    ------------------------------
    Bernie Rataj
    Technical Support Professional
    IBM Canada Ltd.
    Markham

    https://www.ibm.com/products/cobol-compiler-zos
    https://www.ibm.com/products/pli-compiler-zos
    ------------------------------



  • 11.  RE: Recommended COBOL compiler options

    Posted Wed December 11, 2024 09:34 AM

    For CICS programs, I recommend the WORD(CICS) parm, whereby COBOL words that are not supported under CICS are flagged with an error message. This can save time with tricky errors down the line.
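
    As a sketch of how that plays out (invented names, and assuming the supplied CICS reserved-word table flags COBOL file I/O, which is not supported under CICS), the OPEN/READ/CLOSE below would be flagged at compile time rather than being left to fail in the region:

        CBL WORD(CICS)
        IDENTIFICATION DIVISION.
        PROGRAM-ID. CICSDEMO.
        ENVIRONMENT DIVISION.
        INPUT-OUTPUT SECTION.
        FILE-CONTROL.
            SELECT CUST-FILE ASSIGN TO CUSTFILE.
        DATA DIVISION.
        FILE SECTION.
        FD  CUST-FILE.
        01  CUST-REC          PIC X(80).
        PROCEDURE DIVISION.
       *    In a CICS program this I/O would have to be done with
       *    EXEC CICS commands; WORD(CICS) surfaces that at compile
       *    time.
            OPEN INPUT CUST-FILE
            READ CUST-FILE
               AT END CONTINUE
            END-READ
            CLOSE CUST-FILE
            GOBACK.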



    ------------------------------
    Kenny Smith
    Principal Consultant
    Strongback Consulting
    Lake Helen
    3862328746
    ------------------------------