I am pondering migrating our shop's applications from (legacy) dynamic calls to DLL calls. The first thing I am wondering is whether this is a worthwhile endeavor. I can think of several small advantages, but I don't know if it's worth the effort in the end.
With legacy dynamic calls you must either do a "CALL identifier" or use the DYNAM compile option (which converts "CALL literal" to a dynamic call). In neither case is there a bind-time check to make sure that the called module actually exists.
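For illustration, here are both legacy forms in one caller (program and data names are hypothetical):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. CALLDEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-PGM-NAME        PIC X(8)  VALUE 'SUBPGM1'.
       01  WS-PARM            PIC X(80).
       PROCEDURE DIVISION.
      * "CALL identifier": always a dynamic call, resolved only at
      * run time, so the binder never checks that SUBPGM1 exists.
           CALL WS-PGM-NAME USING WS-PARM
      * "CALL literal": static by default; under the DYNAM compile
      * option it too becomes a dynamic call, with no bind-time check.
           CALL 'SUBPGM1' USING WS-PARM
           GOBACK.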
With DLL calls you can do "CALL literal" and a "standard" DLL call is done. At bind time there is a mapping of the literal name to its corresponding load module (the DLL module). While this does not guarantee that the load module (DLL) is actually available at runtime, the binder does warn at bind time if it cannot resolve the called literal name to a DLL.
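As a sketch of the caller's side (names hypothetical; the option list is just one plausible combination, since DLL requires NODYNAM and RENT):

       CBL DLL,NODYNAM,RENT
       IDENTIFICATION DIVISION.
       PROGRAM-ID. DLLCALLR.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-PARM            PIC X(80).
       PROCEDURE DIVISION.
      * Under DLL,NODYNAM this literal call is compiled as a DLL call;
      * the binder resolves 'NEWSUB' through the IMPORT statements in
      * the called DLL's definition side-deck, and warns if it can't.
           CALL 'NEWSUB' USING WS-PARM
           GOBACK.

The literal-to-module mapping comes from the definition side-deck produced when the called DLL itself was bound; that side-deck is INCLUDEd when binding the caller.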
With DLL calls you can use "program" names longer than 8 characters [PGMNAME(LONGUPPER) compile option] and also mixed-case names [PGMNAME(LONGMIXED)].
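For example (name hypothetical), with PGMNAME(LONGMIXED) the name is coded as a literal so that case is preserved:

       CBL DLL,NODYNAM,RENT,PGMNAME(LONGMIXED)
       IDENTIFICATION DIVISION.
      * With PGMNAME(LONGMIXED) the program name must be a literal;
      * its case is preserved in the DLL's export name.
       PROGRAM-ID. 'CustomerBalanceInquiry'.
       DATA DIVISION.
       LINKAGE SECTION.
       01  LS-PARM            PIC X(80).
       PROCEDURE DIVISION USING LS-PARM.
           GOBACK.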
Are there any other advantages to using DLLs? Does a DLL "implicit call" have any significant performance advantage over a legacy dynamic call?
The first major pain point I've come upon with a gradual migration to DLLs is that:
- You cannot do a legacy dynamic call to a DLL.
- You cannot do a DLL dynamic call (CALL identifier) to a non-DLL.
The only workarounds I've found for these limitations are:
- A "big bang" conversion of:
- All subroutines to be DLL routines
- All calls to these routines being DLL calls
There's no way we're going to take this path.
- Migrate all subroutines to DLLs (with new names), and also create "stub" (non-DLL) subroutines with the legacy names, where the stubs simply do DLL calls to the "new" DLLs.
  - The stubs make those calls using >>CALLINTERFACE DLL, because the stubs themselves have to be compiled with NODLL (so that legacy dynamic calls can still reach them).
An example of option 2 (a code sketch follows the steps):
- The old program (non-DLL) performs some business logic.
- The old program is "converted" to a "new" program (with a new module name, compiled as a DLL) containing the existing business logic.
- A new stub program is created with the old program's name. It simply does a DLL call to the new DLL (containing the old business logic).
While this all works, it seems to me to be a bit of overkill. To my mind it would be ideal if a legacy dynamic call could be made to a DLL. This would allow us to, after converting all subroutines to DLLs, continue to call them using legacy dynamic calls, until such point as all "legacy callers" have been converted to DLL applications.
Currently trying to do a legacy dynamic call to a DLL results in the following condition:
IGZ0175S A dynamic call to module-name failed because the entry point
is a COBOL program compiled with the DLL compiler option.
But I have to wonder if this is truly a technical limitation, or whether it could be allowed. I have been successful with an LE-enabled assembler main using a CEEFETCH macro, followed by BASR 14,15, to call both non-DLL and DLL routines. This, even though it is explicitly documented "Do not use CEEFETCH for DLLs" (https://www.ibm.com/docs/en/zos/2.3.0?topic=macros-ceefetch-macro-dynamically-load-routine). Neither the reason behind this restriction nor why it nevertheless seems to work (in my simple test case) appears to be documented.
I guess I should mention that I do realize we can use the >>CALLINTERFACE feature to do legacy dynamic calls from DLL applications and DLL calls from NODLL applications. While useful in some contexts (such as the one described earlier), wrapping CALL statements with >>CALLINTERFACE is not a very scalable solution.
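For completeness, a sketch of the other direction (hypothetical names again): a DLL-compiled caller using the directive to reach a not-yet-converted (NODLL) routine with a legacy dynamic call:

       CBL DLL,NODYNAM,RENT
       IDENTIFICATION DIVISION.
       PROGRAM-ID. NEWCALLR.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-PARM            PIC X(80).
       PROCEDURE DIVISION.
      * Legacy dynamic call to a routine still compiled with NODLL
           >>CALLINTERFACE DYNAMIC
           CALL 'LEGACY1' USING WS-PARM
      * Back to the DLL convention for already-converted routines
           >>CALLINTERFACE DLL
           CALL 'NEWSUB' USING WS-PARM
           GOBACK.

Repeating this around every affected CALL statement in every caller is, of course, exactly the scalability problem.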
All thoughts are appreciated. Experiences are appreciated even more.
------------------------------
Frank Swarbrick
------------------------------