I have a somewhat curious problem.
I have a set of SPSS macros that write and then read many data files. When the target directory is on a cloud drive, the program occasionally reports that a file does not exist, even though one or two lines earlier I have confirmation that the file was written successfully. All of this happens within SPSS. When it happens and I rerun the program with no changes whatsoever, it usually completes without a problem. To avoid the issue I point to a directory on my local machine, and then I never run into it. But some users of the macros still target a cloud drive and hit this same problem: the program looks for a file that should be there but is not.
More recently, in a newer macro that uses the HOST command to start an R session that creates a CSV file, SPSS read the CSV two or three commands after HOST finished and picked up only 206 cases, yet when I inspect the CSV file I see 5,548 cases that should have been read. No error or warning was reported by either R or SPSS. I suspect this is related to the issue described above.
Is there a way to prevent "this" from happening? My suspicion is that the file is handed off to be written but sits in some cached space waiting to be flushed, while subsequent commands execute and fail to find (or fully read) a file that has not yet been completely written.
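If the cause really is the sync client flushing the file asynchronously, one workaround is to poll until the file exists and its size has stopped changing before reading it. Since SPSS can run Python via BEGIN PROGRAM, a minimal sketch of such a helper (the name `wait_for_file` and all parameters are my own, not an SPSS or standard API) might look like this:

```python
import os
import time

def wait_for_file(path, timeout=30.0, poll=0.5, stable_checks=2):
    """Poll until `path` exists and its size is unchanged for
    `stable_checks` consecutive polls; give up after `timeout` seconds.

    Returns True if the file appeared and its size stabilized,
    False if the deadline passed first.
    """
    deadline = time.monotonic() + timeout
    last_size = -1
    stable = 0
    while time.monotonic() < deadline:
        if os.path.exists(path):
            size = os.path.getsize(path)
            if size == last_size:
                stable += 1
                if stable >= stable_checks:
                    return True
            else:
                # Size changed (or first sighting): restart the count.
                stable = 0
                last_size = size
        time.sleep(poll)
    return False
```

Calling this just before GET DATA (or the equivalent read) would also guard against the partial-read symptom, since a file still being flushed would fail the stable-size check. A stable size is not an absolute guarantee the sync client is done, so the timeout and poll interval may need tuning for a given cloud setup.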