webMethods


How to get on-prem IS to upload a file into Azure 'Blob' storage

  • 1.  How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 14, 2022 02:44 AM

    Hi – I need help having Integration Server 10.1 upload a simple CSV data file to an Azure ‘Blob’ storage container.

    That is, I need to make IS upload file.csv to https://<storage-account-name>.blob.core.windows.net/<container-name>. The IS version is 10.1 on-prem (extended support).

    I thought it would be easy, but Microsoft offers these two options to upload files to Azure blob storage:

    1. A horribly complicated REST API
      It comes complete with requirements for custom headers with hashed and digitally signed content – see here for a simple script-based query. This seems unworkable.
    2. The uploading client (IS 10.1) uses a Microsoft-authored Java package to upload files
      For this, Microsoft asks we add these entries to pom.xml. I know pom.xml is par for the course for Java ninjas. However, to a webMethods dev like me it’s an Alice in Wonderland-level rabbit hole.

    I favor option #2. So I tried something like this:

    1. Identify and download a suitable JAR (is the one below suitable? I have no idea - I just went clicking around).
      https://repo1.maven.org/maven2/com/azure/azure-storage-blob/12.14.2/azure-storage-blob-12.14.2.jar
    2. Add it to a package’s /code/jars folder (I also added it to /code/classes)
    3. In the new package, add Java code like the following
      (adapted from this Microsoft sample code)
    import com.azure.storage.blob.*;

    /* Create a new BlobServiceClient using the account key */
    BlobServiceClient blobServiceClient = new BlobServiceClientBuilder()
            .endpoint("https://<storage-account-name>.blob.core.windows.net")
            .credential(new StorageSharedKeyCredential("<accountName>", "<accountKey>"))
            .buildClient();

    /* Create a new container client */
    try {
        containerClient = blobServiceClient.createBlobContainer("<container-name>");
    } catch (BlobStorageException ex) {
        // The container may already exist, so don't throw an error
        if (!ex.getErrorCode().equals(BlobErrorCode.CONTAINER_ALREADY_EXISTS)) {
            throw ex;
        }
    }

    /* Upload the file to the container */
    BlobClient blobClient = containerClient.getBlobClient("file.csv");
    blobClient.uploadFromFile("file.csv");
    

    Unfortunately, the sample Java code fails to compile – I get this error:

    Has anyone got IS to upload to Azure ‘Blob’ storage before?

    Any thoughts on the error? Does my approach above sound reasonable?

    Other options:

    • I heard of some ‘Azure adapter’ for IS, but my site’s not licensed for it (and it may not be available for 10.1)
    • Microsoft has introduced an SFTP transfer option for Azure storage. That would be ideal, but it’s beta and only new subscriptions can use it (so my company’s containers cannot enable SFTP)


    #azure
    #webMethods
    #Integration-Server-and-ESB


  • 2.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 14, 2022 09:30 AM

    Hi Sonam,

    You can use the Azure Storage APIs to upload data to Azure Storage from on-premise.

    For the Azure Storage API, you need to handle authentication properly. There are multiple approaches to authentication; in our scenario, for example, we authorize with Azure Active Directory via OAuth 2.0 tokens.

    We have an app registered on Azure AD and associated with a user, which we use to get a token. The app has been granted the required access on Azure Storage to do read/write operations.

    If you do not want to use the Azure Storage APIs, you can map Azure Storage as an SMB/CIFS share and then write files to it.

    To achieve the OAuth 2.0 based authentication above, we used Flow services with pub.client:http calls.

    We use both of the above approaches to read/write files to Azure Storage, and they work fine for us.

    Let me know if further details are required.




  • 3.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 14, 2022 09:59 AM

    Thanks Rakesh – you seem to have this well architected in your infrastructure. My company’s IS servers run on Linux. Does your IS run on Windows? Would that make a difference in approach?

    I didn’t see any other option except the REST and Java APIs. When I checked out the REST API, it looked terribly convoluted (e.g., coding up the necessary content signing, etc). That’s why I went down the Java client route. I made some headway (see above) – I intend to find out next week if it actually works.

    An SMB/CIFS share sounds like a good third option. Even if an SMB/CIFS share is not directly mounted on Linux, it can be exposed via FTP/SFTP.

    Somehow this option never came up in discussions on my side. Looking at the link you shared, it refers to 'Azure Files', which may be different from 'Azure Blob storage' - the service I'm dealing with.

    Can Azure Blob storage be exposed over SMB/CIFS? This may not be possible, based on what I read in this link.

    Thanks again for your input - it gives me a few things to think about…




  • 4.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 14, 2022 10:52 AM

    Yes, our servers are on Linux as well. Be it Windows or Linux, it makes no difference to the approach.
    In my view it's better to use the API; via Java code you end up doing the same thing anyway, e.g. passing the same set of parameters as required in the API call.

    If you go via the app registration approach on Azure AD, you can reuse it for authentication/authorization requests against any other Azure/Microsoft service.

    See the link below for registering an app on Azure and getting an OAuth authentication token.

    The above link has a section on registering an app on Azure AD.

    Use the client-credentials OAuth grant, which is generally preferred for any server-to-server authentication.

    Yes, via NFS (Network File System) you can do it for Azure Blob storage as well.

    [General comment] CIFS shares can be mounted directly on Linux servers; we have used this for a long time for different integration requirements.

    This approach should work as well. We try to avoid Java services unless unavoidable, and mostly depend on Flow services and APIs in our landscape.

    Also, since we have different Microsoft/Azure services in our environment, we have built a common Azure OAuth framework which we use across the board for authentication/authorization.




  • 5.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 14, 2022 09:38 AM

    Progress (well, got it compiling at least)!

    Initially, I’d downloaded azure-storage-blob-12.14.2.jar. Later, I found its webpage lists four additional JAR packages as dependencies. I downloaded these into the /code/jars folder and reloaded the package:

    azure-core-1.22.0.jar 
    azure-core-http-netty-1.11.2.jar 
    azure-storage-common-12.14.1.jar  
    azure-storage-internal-avro-12.1.2.jar
    

    …then kept fixing compile errors until I got this code to compile:

    import com.azure.storage.blob.*;
    import com.azure.storage.blob.models.*;
    import com.azure.storage.*;
    import com.azure.storage.common.*;
    import com.azure.core.exception.*;
    
    ...
    
    /* Create a new BlobServiceClient with a Shared Key */
    BlobServiceClient blobServiceClient = new BlobServiceClientBuilder()
            .endpoint("https://<storage-account-name>.blob.core.windows.net")
            .credential(new StorageSharedKeyCredential("<accountName>", "<accountKey>"))
            .buildClient();

    /* Create a new container client */
    BlobContainerClient containerClient = null;
    try {
        containerClient = blobServiceClient.createBlobContainer("<container-name>");
    } catch (BlobStorageException ex) {
        // The container may already exist, so don't throw an error
        if (!ex.getErrorCode().equals(BlobErrorCode.CONTAINER_ALREADY_EXISTS)) {
            throw ex;
        }
    }

    /* Upload the file to the container */
    BlobClient blobClient = containerClient.getBlobClient("file.csv");
    blobClient.uploadFromFile("file.csv");
    

    Next week I plan to progress further. One essential change I need is to make blobClient accept a stream as input.




  • 6.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 14, 2022 06:05 PM

    Thanks for your guidance, Rakesh. Your approach, with its registered app and AD authentication, seems well integrated.

    What concerns me about using the REST API is the complexity of constructing the special signature-based Authorization header that MS demands - here someone else built one for a simple query that just lists objects in storage (no upload). And this is for a simple GET; for a PUT, we’d hash and sign the payload too.

    # Build the signature string
    canonicalized_headers="${x_ms_date_h}\n${x_ms_version_h}"
    canonicalized_resource="/${storage_account}/${container_name}"
    
    string_to_sign="${request_method}\n\n\n\n\n\n\n\n\n\n\n\n${canonicalized_headers}\n${canonicalized_resource}\ncomp:list\nrestype:container"
    
    # Decode the Base64 encoded access key, convert to Hex.
    decoded_hex_key="$(echo -n $access_key | base64 -d -w0 | xxd -p -c256)"
    
    # Create the HMAC signature for the Authorization header
    signature=$(printf "$string_to_sign" | openssl dgst -sha256 -mac HMAC -macopt "hexkey:$decoded_hex_key" -binary |  base64 -w0)
    
    authorization_header="Authorization: $authorization $storage_account:$signature"
    
    • I am curious: did you do something equivalent to the above steps in your custom Azure app? Are equivalent HMAC cryptography calls available in WmPublic?

    • Yes, CIFS and SMB should be mountable on Linux. But I doubt Azure Blob storage (which is an object store) exposes its content via the CIFS/SMB file-sharing protocols - I couldn’t find any documentation for this. I’d be happy to be wrong though.
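    For what it’s worth, the HMAC step in the script above maps directly onto the standard JDK javax.crypto API, so an IS Java service could compute the same signature without any extra JARs. Below is a minimal sketch of just the crypto step - the class and method names are mine, the string-to-sign construction is omitted, and this is not a WmPublic or Azure library facility:

    ```java
    import java.util.Base64;
    import javax.crypto.Mac;
    import javax.crypto.spec.SecretKeySpec;

    public class SharedKeySignatureSketch {

        /**
         * HMAC-SHA256 the string-to-sign with the Base64-decoded storage account
         * key, then Base64-encode the digest - the same operation the shell
         * script performs with 'openssl dgst -sha256 -mac HMAC'.
         */
        public static String sign(String stringToSign, String base64AccountKey) throws Exception {
            byte[] key = Base64.getDecoder().decode(base64AccountKey);
            Mac hmac = Mac.getInstance("HmacSHA256");
            hmac.init(new SecretKeySpec(key, "HmacSHA256"));
            byte[] digest = hmac.doFinal(stringToSign.getBytes("UTF-8"));
            return Base64.getEncoder().encodeToString(digest);
        }

        /** Assemble the Authorization header value for a Shared Key request. */
        public static String authorizationHeader(String account, String signature) {
            return "SharedKey " + account + ":" + signature;
        }
    }
    ```

    Since the SHA-256 digest is always 32 bytes, the Base64 signature is always 44 characters. Whether WmPublic exposes an equivalent built-in service, I can’t say - the JDK route above sidesteps the question.
    
    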




  • 7.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 14, 2022 10:46 PM

    We didn’t use Shared Access Signature based authorization, but it’s possible.

    App registration is the mechanism used to tap into the authentication and authorization provided by the Azure AD identity platform. There are other use cases too, but for this scenario let’s concentrate on authentication and authorization.

    So, we created a representational app on Azure AD for our webMethods servers, with which we can leverage OAuth features and connect to different Microsoft and Azure services seamlessly.

    I have used OAuth 2.0 based authentication with the client credentials grant. It’s a different authentication method than Shared Access Signature, and it’s very simple: you call the Azure AD OAuth 2.0 token endpoint to get the authorization token, then use that token for the Azure Storage API call.

    Step by step, this is what we followed:

    1. Register the application on Azure AD
    2. Grant the app access on the Azure Storage account, according to the operations you will be performing
    3. Call the OAuth 2.0 token endpoint to get the OAuth 2.0 token
    4. Use the OAuth token to call the Azure Storage API and do the required operation

    Token API call:

    Once you get the token, you can use it for the Azure Storage API call.
    Azure Storage API call [the token from the above API call is passed as a Bearer Authorization token]

    Azure Blob supports NFS, which you can use to mount on Linux servers - link below.
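    To make steps 3 and 4 above concrete, here is a hedged sketch of the two calls. The token endpoint and form fields follow the standard Azure AD v2.0 client-credentials shape, and the x-ms-* header names come from Microsoft’s Put Blob documentation; the helper names and placeholder values are mine. It only builds the request pieces - the actual POST/PUT would be done with pub.client:http, as described above:

    ```java
    import java.io.UnsupportedEncodingException;
    import java.net.URLEncoder;

    public class AzureOAuthSketch {

        /** Azure AD v2.0 token endpoint for a given tenant (step 3). */
        public static String tokenEndpoint(String tenantId) {
            return "https://login.microsoftonline.com/" + tenantId + "/oauth2/v2.0/token";
        }

        /**
         * Form-encoded body for the client-credentials grant. The scope
         * 'https://storage.azure.com/.default' requests a token usable
         * against the Azure Storage REST API.
         */
        public static String tokenRequestBody(String clientId, String clientSecret)
                throws UnsupportedEncodingException {
            return "grant_type=client_credentials"
                    + "&client_id=" + URLEncoder.encode(clientId, "UTF-8")
                    + "&client_secret=" + URLEncoder.encode(clientSecret, "UTF-8")
                    + "&scope=" + URLEncoder.encode("https://storage.azure.com/.default", "UTF-8");
        }

        /** Target URL for the blob PUT (step 4). */
        public static String blobPutUrl(String account, String container, String blobName) {
            return "https://" + account + ".blob.core.windows.net/" + container + "/" + blobName;
        }

        /** Minimum request headers for the blob PUT, besides Content-Length. */
        public static String[] blobPutHeaders(String accessToken) {
            return new String[] {
                    "Authorization: Bearer " + accessToken,
                    "x-ms-version: 2020-08-04",
                    "x-ms-blob-type: BlockBlob"
            };
        }
    }
    ```

    POST the body to the token endpoint with Content-Type application/x-www-form-urlencoded, pull access_token out of the JSON response, then PUT the file bytes to the blob URL with those headers.
    
    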




  • 8.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Sat January 15, 2022 12:39 AM

    Thank you @Rakesh_Kumar3 - This is an excellent illustration and you have been very kind in documenting it so thoroughly.

    Summarising my understanding: your technique requires that the organisation’s AD admins first register the integration platform with its Azure AD identity provider. Then at runtime, IS obtains an OAuth 2.0 token (using its registered identity) and presents it as a ‘Bearer’ Authorization header (which simply passes the token) on its blob storage calls.

    Thanks also for the NFS3 link!

    I plan to discuss your approach with my AD admins next week. I also plan to give my Java code a push to see if it works, then update back here.

    Thanks again!




  • 9.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Sat January 15, 2022 01:22 AM

    OK, sure - thanks.

    Keep us posted on how it goes.

    I (or someone else from the community) will be happy to help.




  • 10.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Sun January 16, 2022 11:34 PM

    Update: while Rakesh’s answer seems like the correct way to do things, I’m proceeding with the Java service I wrote for now. It compiles, but won’t run, because the five azure-*.jar files I downloaded into <package>/code/jars are not enough: the code has indirect dependencies on packages like org.reactivestreams.*.

    Could not run 'deliverDataToAzureBlobStorageService'
    java.lang.reflect.InvocationTargetException: org/reactivestreams/Publish
    

    So poking around Maven Central and manually noting dependencies is not enough - I need to download and use Maven to consolidate all the JARs necessary for Azure blob storage operations.




  • 11.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 21, 2022 12:36 PM

    Would you be able to share the rationale? I’m a bit surprised at this choice.




  • 12.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 21, 2022 01:00 PM

    Yes, @Nagendra_Prasad 's suggestion was to recreate post #12 (How to get on-prem IS to upload a file into Azure ‘Blob’ storage - #12 by Sonam_Chauhan) as Sonam understood it :slight_smile:
    I think both solutions are very useful depending on the use case, but since Rakesh’s answer was selected as the Answer, it will show up first here.

    Btw, after this is done I’ll remove that part of the discussion as it got off-topic.




  • 13.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Mon January 31, 2022 08:50 PM

    @Nagendra_Prasad , @toni.petrov - Just did that below. Thanks again.




  • 14.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Sun January 23, 2022 07:21 AM

    The rebuke of the wise! Thanks Rob.

    Look - you’re probably right: Rakesh’s approach seems architecturally more modern and may be better overall. Mine may boil down to taste - it’s reminiscent of how old-style ‘adapters’ work, complete with 40 JAR files in a package’s code/jars folder.

    Another excuse: my Microsoft admin resource was busy, so I plugged away with what I had already obtained from him: a Shared Access Signature (‘SAS’) token and access URL.

    One more consideration, which may or may not be valid:

    At runtime, Rakesh’s REST approach does this:

    • POST login.microsoft.com... to get an OAuth token
    • PUT <account>.blob.core.windows.net... using the OAuth token in a Bearer Authorization token

    My approach:

    • Java code uses stored token (‘SAS’ token) to execute operations on <account>.blob.core.windows.net

    Now both options may operate the same way behind the scenes. But from my (limited) perspective, the second approach has one less moving part (no call to login.microsoft.com). It also sidesteps concerns about OAuth token lifetime and renewal (though, of course, requesting a new OAuth token per operation may be perfectly feasible).
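    A side note on token lifetime: a SAS token doesn’t escape the renewal question either - its validity window and permissions travel inside the token itself as query parameters (st/se are start/expiry and sp is permissions, per Microsoft’s SAS documentation), so renewal just moves to whoever issues the SAS. A small sketch (the helper is mine, not an Azure API) to pull those fields out:

    ```java
    import java.util.HashMap;
    import java.util.Map;

    public class SasTokenSketch {

        /** Split a SAS token such as 'sp=...&st=...&se=...&sig=...' into its parameters. */
        public static Map<String, String> parse(String sasToken) {
            Map<String, String> params = new HashMap<String, String>();
            for (String pair : sasToken.split("&")) {
                int eq = pair.indexOf('=');
                if (eq > 0) {
                    params.put(pair.substring(0, eq), pair.substring(eq + 1));
                }
            }
            return params;
        }
    }
    ```

    For example, parse(token).get("se") yields the expiry timestamp - the moment after which uploads would presumably start failing with authorization errors.
    
    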




  • 15.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Mon January 24, 2022 11:28 AM

    True, but it’s an entire pile of JARs and a version dependency that may come back to bite you in the future.

    Everything is a trade-off. :slight_smile: Glad you were able to get your solution working - which is the main thing that counts.




  • 16.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Mon January 24, 2022 05:30 PM

    Yes - true: 40 JARs to upload a file! Even the SAP adapter uses only 3 JARs. It makes me appreciate terms like ‘dependency bloat’ :stuck_out_tongue: Hopefully, with the JARs isolated in their own package, it will be many upgrade cycles before SAG, Java (or Microsoft) have a problem with the codebase.

    Out of curiosity, I went googling and found this paper: “We study the evolution and impact of bloated dependencies in a single software ecosystem: Java/Maven”




  • 17.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Mon January 24, 2022 11:48 PM

    Personal opinion :slight_smile: :
    While the growth of dependencies looks ominous, it is perhaps Separation of Concerns (Separation of concerns - Wikipedia) in action - but that works perfectly only if all the moving parts behave as expected.
    -NP




  • 18.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Tue January 25, 2022 01:51 AM

    Nagendra - that’s a great point to consider.

    But the graph actually shows increasing dependency “bloat” - a term the researchers define narrowly to mean dependencies the calling code can never actually use (as analysed by their special tool).

    I suspect (but cannot prove) that my Java code above contributes to this effect: 40 JARs to upload a file is just :-1:




  • 19.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Tue January 25, 2022 05:59 AM

    Hi Sonam,

    the JAR dependencies of the product adapters like SAP, JDBC, WebSphere MQ et al. are mainly dictated by the corresponding systems and should not be considered bloat in this case, as these JARs just provide the necessary classes to connect to the respective servers.
    These APIs/libs are usually provided by the target-system vendors (SAP, IBM, the DB vendor) and usually have no external dependencies, apart from native libraries delivered together with the provided JARs.

    Regards,
    Holger




  • 20.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Tue January 25, 2022 06:56 AM

    Oh, I absolutely agree, Holger. For example, the SAP adapter uses just 3 JARs, downloaded from sdn.sap.com (so one vendor: SAP). I suspect those JARs are tightly optimised and there is little bloat there.

    In contrast, the Azure Java API code I just created references one Microsoft JAR, which pulled 39 more JARs (from various vendors) into code/jars - all just to upload a file into Azure Blob Storage. So I suspect bloat in my solution.




  • 21.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Mon January 17, 2022 02:15 AM

    (I’ve made this a new forum question)

    I’ve managed to get all the required JARs (39 of them!) into the package’s code/jars folder. But the Java service now fails with the dependency error below. Any ideas how to resolve it?

    Basically, I need to compel my Java code (and the classes it calls) to use the newly packaged Jackson 2.x JARs instead of the Jackson 1.x JARs shipped with IS.

    Could not run 'deliverDataToAzureBlobStorageService'
    java.lang.reflect.InvocationTargetException: Package versions: jackson-annotations=2.10.1, jackson-core=2.10.1, jackson-databind=2.10.1, jackson-dataformat-xml=unknown, jackson-datatype-jsr310=unknown, azure-core=1.22.0, Troubleshooting version conflicts: https://aka.ms/azsdk/java/dependency/troubleshoot
    

    Below, I’ve outlined the steps I took to package up the JARs.

    
    
    1. # Install Maven and check the version
       # Adapted from https://tecadmin.net/install-apache-maven-on-fedora/
    
    wget https://dlcdn.apache.org/maven/maven-3/3.8.4/binaries/apache-maven-3.8.4-bin.tar.gz
    sudo tar xzf apache-maven-3.8.4-bin.tar.gz -C /opt
    cd /opt && sudo ln -s apache-maven-3.8.4 maven
    sudo vi /etc/profile.d/maven.sh
    # Add this content:
    --------------------------
    export M2_HOME=/opt/maven
    export PATH=${M2_HOME}/bin:${PATH}
    --------------------------
    source /etc/profile.d/maven.sh
    mvn -version
    
    2. # Go to the home folder, initialize Maven and have it generate a dummy pom.xml
       # Adapted from https://maven.apache.org/guides/getting-started/maven-in-five-minutes.html
    
    cd
    mvn archetype:generate -DgroupId=com.mycompany.app -DartifactId=my-app -DarchetypeArtifactId=maven-archetype-quickstart -DarchetypeVersion=1.4 -DinteractiveMode=false
    
    3. # Configure Maven to download all required JARs for Azure Blob storage locally
       # Adapted from https://technology.amis.nl/software-development/java/download-all-directly-and-indirectly-required-jar-files-using-maven-install-dependencycopy-dependencies/
    
    cd my-app
    vi pom.xml
    # Add the following to the dependencies section.
    # Dependency adapted from https://docs.microsoft.com/en-us/java/api/overview/azure/storage?view=azure-java-stable
    # Only the first dependency is taken, and the current version is used
    # (otherwise you can expect a "Could not resolve dependencies" error):
    -----------------------
    <dependency>
        <groupId>com.azure</groupId>
        <artifactId>azure-storage-blob</artifactId>
        <version>12.4.0</version>
    </dependency>
    -----------------------
    mvn install dependency:copy-dependencies
    
    4. # Maven now downloads 39 JAR files to the 'my-app/target/dependency/' folder.
       # Move these into the IS <package>/code/jars folder, then reload the package.
    cd /home/<user>/my-app/target/dependency/
    
    5. # At this point, running the service gets past the missing-JAR errors, but returns the new dependency error below.
    -----------------------------
    java.lang.reflect.InvocationTargetException: Package versions: jackson-annotations=2.10.1, jackson-core=2.10.1, jackson-databind=2.10.1, jackson-dataformat-xml=unknown, jackson-datatype-jsr310=unknown, azure-core=1.22.0, Troubleshooting version conflicts: https://aka.ms/azsdk/java/dependency/troubleshoot
    -----------------------------
    
    # This is probably because IS 10.1 lists a 1.x Jackson JAR on its 'About' page, while the Jackson version used by Azure is 2.x:
    # /opt/SoftwareAG101/IntegrationServer/lib/jars/jackson-coreutils-1.8.jar
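    When chasing this kind of version conflict, one diagnostic that helps is asking the JVM where a contested class was actually loaded from: if the Jackson classes resolve to an IS lib/jars path rather than the package’s code/jars folder, the older copy is winning. A hedged sketch using only standard JDK calls (a generic JVM trick, not a webMethods facility):

    ```java
    import java.security.CodeSource;

    public class JarLocationSketch {

        /**
         * Report the JAR (or directory) a class was loaded from. Classes loaded
         * by the bootstrap loader have no code source and report "(bootstrap)".
         */
        public static String jarLocation(Class<?> clazz) {
            CodeSource src = clazz.getProtectionDomain().getCodeSource();
            return (src == null) ? "(bootstrap)" : src.getLocation().toString();
        }
    }
    ```

    Inside the failing service, logging jarLocation(Class.forName("com.fasterxml.jackson.databind.ObjectMapper")) would show which jackson-databind JAR the service actually picked up.
    
    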
    



  • 22.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 21, 2022 08:31 AM

    I’ve finally managed to get the Java-based solution running.

    Here’s the code. Note that a large number of dependencies need to be installed in the package’s code/jars folder; this is documented further below.

    import com.wm.data.*;
    import com.wm.util.Values;
    import com.wm.app.b2b.server.Service;
    import com.wm.app.b2b.server.ServiceException;
    import com.azure.storage.blob.*;
    import com.azure.storage.blob.models.*;
    import com.azure.storage.*;
    import com.azure.storage.common.*;
    import com.azure.core.exception.*;
    import com.azure.core.util.BinaryData;
    
    public final class deliverDataToAzureBlobStorageService_SVC {
    
        /**
         * The primary method for the Java service
         *
         * @param pipeline
         *            The IData pipeline
         * @throws ServiceException
         */
        public static final void deliverDataToAzureBlobStorageService(IData pipeline) throws ServiceException {
            // pipeline
            IDataCursor pipelineCursor = pipeline.getCursor();
            Object sourceBytes = IDataUtil.get(pipelineCursor, "sourceBytes");
            String sourceFilename = IDataUtil.getString(pipelineCursor, "sourceFilename");
            String destinationBlobName = IDataUtil.getString(pipelineCursor, "destinationBlobName");
            String accountName = IDataUtil.getString(pipelineCursor, "accountName");
            String accountSharedAccessSignature = IDataUtil.getString(pipelineCursor, "accountSharedAccessSignature");
            String destinationContainerName = IDataUtil.getString(pipelineCursor, "destinationContainerName");
            String createContainer = IDataUtil.getString(pipelineCursor, "createContainer");
            pipelineCursor.destroy();
    
            /* Input handling */
            // Check that either the sourceBytes or sourceFilename input was specified (sourceBytes has preference)
            BinaryData binaryData = null;
            if (sourceBytes != null) {
                binaryData = BinaryData.fromBytes((byte[]) sourceBytes);
            } else if (sourceFilename == null) {
                throw new ServiceException("Error: sourceBytes and sourceFilename cannot both be null");
            }
    
            // Default createContainer to false
            if (createContainer == null) createContainer = "false";
    
            /* Authenticate and upload data */
            // Create a new BlobServiceClient with the input SAS token
            BlobServiceClient blobServiceClient = new BlobServiceClientBuilder()
                    .endpoint("https://" + accountName + ".blob.core.windows.net")
                    .sasToken(accountSharedAccessSignature)
                    .buildClient();
    
            // Create a new container client: either create a container or attach to an existing one
            BlobContainerClient containerClient = null;
            if (createContainer.equals("true")) {
                containerClient = blobServiceClient.createBlobContainer(destinationContainerName);
            } else {
                containerClient = blobServiceClient.getBlobContainerClient(destinationContainerName);
            }
    
            // Upload data to the container. If the sourceBytes input is available, data is sourced from it.
            // If it is null, data is instead sourced from the sourceFilename input (representing a local file).
            BlobClient blobClient = containerClient.getBlobClient(destinationBlobName);
            if (binaryData == null) {
                blobClient.uploadFromFile(sourceFilename);
            } else {
                blobClient.upload(binaryData);
            }
    
            // There is no output pipeline because the Microsoft BlobClient.upload* methods above do not return status.
            // The transfer is deemed successful if no exception was thrown while running this service.
        }
    }
    

    Here’s the documentation accompanying this service. The NOTES section documents how the dependencies that must be installed in the package’s code/jars folder (40 JAR files in all) are sourced.

    INPUT
    ========
    sourceBytes - Optional. A byte array with content to be uploaded to the Azure Blob Storage service. Either this input or the sourceFilename input must be provided.
    If both are specified, the sourceBytes input is preferred.
    sourceFilename - Optional. Path to a local file to be uploaded to the Azure Blob Storage service. Either this input or the sourceBytes input must be provided.
    If both are specified, the sourceBytes input is preferred.
    destinationBlobName - The name of the blob (containing the source data) to be created in the Azure Blob Storage service.
    accountName - The account name used to authenticate to the Azure Blob Storage service.
    accountSharedAccessSignature - The Shared Access Signature (SAS) token used to authenticate to the Azure Blob Storage service.
    E.g., 'sp=asdfghj&st=2022-01-20T01:55:17Z&se=2024-01-20T09:55:17Z&sv=2020-08-04&sr=c&sig=asdfghj%2Basdfghjk%2Basdfghjk%3D'
    destinationContainerName - The name of the Azure Blob Storage container where the blob will be created. This can include a path within the container.
    E.g. 'inbound/webmethods'
    createContainer - true/false value (default is false). Whether the destination storage container must be created before blob upload is attempted.
    
    
    OUTPUT
    =======
    (None)
    
    
    PROCESS
    =======
    This service accepts input data and uploads it to an Azure Blob Storage container, using the Azure Blob Storage Java library. It uses a Shared Access Signature
    (SAS) token to authenticate to the Azure Blob Storage service. For details, please see the notes.
    
    If the sourceBytes input is specified, data is preferentially sourced from this input. Otherwise, data is sourced from the sourceFilename input (representing a local file).
    
    This service returns no output, because the Microsoft Azure Blob Storage API BlobClient.upload* methods do not return status. Instead, data transfer is deemed
    successful if no exception was thrown while running this service.
    
    
    NOTES
    ========
    1. Installing the Azure Blob Storage Java libraries
    This Java service uses the external open-source Azure Blob Storage Java libraries published by Microsoft, plus other related open-source dependencies.
    For this service to work, several external JARs (40 of them!) must be sourced and installed into the package's code/jars folder. You can use the Maven
    build-management tool for this. Follow the procedure below.
    
    1.1. Install Maven 
    (Adapted from https://tecadmin.net/install-apache-maven-on-fedora/ )
    --------------------------------------------------------------------------------------------------
    wget https://dlcdn.apache.org/maven/maven-3/3.8.4/binaries/apache-maven-3.8.4-bin.tar.gz
    sudo tar xzf apache-maven-3.8.4-bin.tar.gz -C /opt
    cd /opt && sudo ln -s apache-maven-3.8.4 maven
    sudo vi /etc/profile.d/maven.sh
    # Add this content:
    _______________________________________________
    export M2_HOME=/opt/maven
    export PATH=${M2_HOME}/bin:${PATH}
    _______________________________________________
    source /etc/profile.d/maven.sh
    mvn -version
    --------------------------------------------------------------------------------------------------
    
    1.2. Initialize Maven and generate a dummy project
    (Adapted from https://maven.apache.org/guides/getting-started/maven-in-five-minutes.html )
    Carry out the following steps in a <work-folder> 
    --------------------------------------------------------------------------------------------------
    cd <work-folder>
    mvn archetype:generate -DgroupId=com.mycompany.app -DartifactId=my-app -DarchetypeArtifactId=maven-archetype-quickstart -DarchetypeVersion=1.4 -DinteractiveMode=false
    --------------------------------------------------------------------------------------------------
    
    1.3. Configure Maven to download all required JARs for Azure Blob storage locally
    (Adapted from https://technology.amis.nl/software-development/java/download-all-directly-and-indirectly-required-jar-files-using-maven-install-dependencycopy-dependencies/ )
    The dependency list was adapted from: https://docs.microsoft.com/en-us/java/api/overview/azure/storage?view=azure-java-stable
    Only the 'azure-storage-blob' dependency was chosen. Use the current version (otherwise you can expect a "Could not resolve dependencies" error).
    --------------------------------------------------------------------------------------------------
    cd my-app
    vi pom.xml
    # Add the following to the dependencies section:
    _______________________________________________
    <dependency>
    <groupId>com.azure</groupId>
    <artifactId>azure-storage-blob</artifactId>
    <version>12.4.0</version>
    </dependency>
    _______________________________________________
    --------------------------------------------------------------------------------------------------
    
    1.4. Download all JARs 
    Run this Maven command in <work-folder>/my-app (where pom.xml lives)
    --------------------------------------------------------------------------------------------------
    mvn install dependency:copy-dependencies
    --------------------------------------------------------------------------------------------------
    Maven now downloads 39 JAR files to the 'my-app/target/dependency/' subfolder
    
    1.5.  Move the 39 downloaded JAR files into the IS <package>/code/jars folder.
    --------------------------------------------------------------------------------------------------
    cd <work-folder>/my-app/target/dependency/
    ...
    cp *.jar <IS>/<package>/code/jars
    --------------------------------------------------------------------------------------------------
    1.6. Reload your package
    Reload the package and test the service. At this point, running the service in IS 10.1 gets past the errors about missing JARs but returns the new dependency error below.
    --------------------------------------------------------------------------------------------------
    java.lang.reflect.InvocationTargetException: Package versions: jackson-annotations=2.10.1, jackson-core=2.10.1, jackson-databind=2.10.1, jackson-dataformat-xml=unknown, jackson-datatype-jsr310=unknown, azure-core=1.22.0, Troubleshooting version conflicts: https://aka.ms/azsdk/java/dependency/troubleshoot
    --------------------------------------------------------------------------------------------------
    This is probably because IS 10.1 shows a 1.x version of a Jackson JAR loaded on its 'About' page:
    --------------------------------------------------------------------------------------------------
    <SAG-10.1-folder>/IntegrationServer/lib/jars/jackson-coreutils-1.8.jar
    --------------------------------------------------------------------------------------------------
    However, the JAR version used by Azure is 2.x.
    
    1.7 Configure the webMethods package to prioritise local JARs
    Basically, the package's Java code (and the classes it calls) needs to be persuaded to use the newly packaged Jackson 2.x JARs
    (instead of the Jackson 1.x JARs shipped with IS). For this, edit the package's manifest.v3 file to use the package class loader, as
    described in the documentation excerpt below.
    -------------------------------------------------------------------------------------------------
    [From 'webMethods Integration Server Administrator’s Guide Version 10.1' page 46]
    
    A package's manifest.v3 file controls a number of characteristics of a package,
    including whether the package's class loader defers to its parent class loader. The
    default is to defer to the parent class loader. However, Integration Server will use the
    package class loader instead, if the following is specified in the manifest.v3 file:
    ____________________________________________________________________
    <value name='classloader'>package</value>
    ____________________________________________________________________
    If a package uses its own class loader, the jar files containing the classes you
    want to make available must be in the Integration Server_directory\instances
    \instance_name \packages\packageName \code\jars directory.
    -------------------------------------------------------------------------------------------------
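    For illustration, a manifest.v3 with this override in place might look like the fragment below. This is a sketch only: keep your package's existing value entries unchanged and add just the classloader line.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Values version="2.0">
  <!-- existing entries such as 'enabled' and 'version' stay as they are -->
  <value name="enabled">yes</value>
  <value name="version">1.0</value>
  <!-- this line makes IS use the package class loader first -->
  <value name="classloader">package</value>
</Values>
```

    After editing manifest.v3, reload the package so the class-loader change takes effect.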
    
    1.8. Work around the slf4j class-loader conflict.
    At this point, running the service returns the following error:
    --------------------------------------------------------------------------------------------------
    Could not run 'deliverDataToAzureBlobStorageService'
    java.lang.reflect.InvocationTargetException: loader constraint violation: when resolving method "org.slf4j.impl.StaticLoggerBinder.getLoggerFactory()Lorg/slf4j/ILoggerFactory;" 
    the class loader (instance of com/wm/app/b2b/server/PackageClassLoader) of the current class, org/slf4j/LoggerFactory, 
    and the class loader (instance of java/net/URLClassLoader) for the method's defining class, org/slf4j/impl/StaticLoggerBinder, have different Class objects for the type org/slf4j/ILoggerFactory used in the signature
    --------------------------------------------------------------------------------------------------
    
    Thankfully, I could wing it after coming across this article:
    https://documentation.tribefire.com/tribefire.cortex.documentation/concepts-doc/features/tribefire-modules/troubleshooting/slf4j-api-linkage-error.html
    The article suggests installing slf4j-jdk14-1.7.32.jar. This is an unlisted dependency that accompanies the slf4j-api-1.7.32 JAR
    that Maven had automatically downloaded. Adding this JAR resolves the class-loader conflict in the slf4j packages.
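    With the JARs installed and the slf4j conflict resolved, the core upload logic itself is short. The sketch below (adapted from the Microsoft sample pattern mentioned earlier) shows the shape of the service code; the endpoint, account name, account key, container name, and file paths are placeholders you must supply, and it requires the Azure JARs installed above.

```java
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;
import com.azure.storage.common.StorageSharedKeyCredential;

public class AzureBlobUpload {
    public static void main(String[] args) {
        // Build a service client for the storage account (placeholder credentials)
        BlobServiceClient serviceClient = new BlobServiceClientBuilder()
            .endpoint("https://<storage-account-name>.blob.core.windows.net")
            .credential(new StorageSharedKeyCredential("<accountName>", "<accountKey>"))
            .buildClient();

        // Point at an existing container and the target blob name
        BlobContainerClient container = serviceClient.getBlobContainerClient("<container-name>");
        BlobClient blob = container.getBlobClient("file.csv");

        // Upload the local file; 'true' overwrites an existing blob of the same name
        blob.uploadFromFile("/path/to/file.csv", true);
    }
}
```

    In an actual IS Java service, the placeholder values would come from the pipeline inputs rather than being hard-coded literals.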
    
    1.9 Acknowledgements
    I was assisted greatly by these two forum posts. The first post suggests a possibly better architecture for this solution. The second suggested 
    the workaround in point 1.8 above.
    --------------------------------------------------------------------------------------------------
    https://tech.forums.softwareag.com/t/how-to-get-on-prem-is-to-upload-a-file-into-azure-blob-storage/254459/10
    https://tech.forums.softwareag.com/t/can-is-java-service-use-a-different-version-jar-than-one-provided-by-is/254533
    --------------------------------------------------------------------------------------------------
    
    
    2. SAS Token Expiry
    The 'Shared Access Signature' (SAS) input to this service is a token that enables it to exchange data with Azure Blob Storage infrastructure. A SAS token has a defined lifetime.
    When it expires, the integration breaks. To prevent this, a new token must be generated and configured on both systems (Azure Blob Storage and the webMethods integration) before
    the expiry date.
    
    For convenience, a SAS token can be made to expire far in the future (possibly as far as the year 9999). The reasons this is acceptable are:
    1. Instead of the sensitive Storage Shared Key credential, the integration uses a SAS token signed by that key. The SAS token is designed to be a limited-access artifact.
    2. The authorization rights granted to the SAS token used by this service can be revoked without affecting other applications.
    3. System-to-system integration with Azure Blob Storage can then operate indefinitely, without an absolute future date before which the token must change.
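    Even with a far-future expiry, it is worth failing fast with a clear error when the token has lapsed. The self-contained sketch below (plain JDK, no Azure libraries; the helper names are my own) parses the 'se' (signed expiry) parameter out of a SAS query string and compares it with the current time:

```java
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;
import java.time.OffsetDateTime;

public class SasExpiryCheck {

    // Extract the 'se' (signed expiry) parameter from a SAS query string.
    static String signedExpiry(String sasToken) {
        for (String pair : sasToken.replaceFirst("^\\?", "").split("&")) {
            if (pair.startsWith("se=")) {
                // The value is URL-encoded, e.g. 2030-01-01T00%3A00%3A00Z
                return URLDecoder.decode(pair.substring(3), StandardCharsets.UTF_8);
            }
        }
        return null;
    }

    // True if the token's expiry lies strictly after 'now'.
    static boolean isStillValid(String sasToken, OffsetDateTime now) {
        String se = signedExpiry(sasToken);
        return se != null && OffsetDateTime.parse(se).isAfter(now);
    }

    public static void main(String[] args) {
        String sas = "?sv=2020-08-04&ss=b&se=2030-01-01T00%3A00%3A00Z&sig=REDACTED";
        OffsetDateTime now = OffsetDateTime.parse("2022-01-14T00:00:00Z");
        System.out.println(isStillValid(sas, now)); // prints "true"
    }
}
```

    A check like this, run at the start of the service, turns a cryptic 403 from Azure into an actionable 'token expired' message.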
    
    
    

    #azure
    #webMethods
    #Integration-Server-and-ESB


  • 23.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 21, 2022 08:46 AM

    @toni.petrov, the previous post has complete details on how to get a file onto Azure Blob storage. Would it be possible to extract the post into a KB article? It is sure to help somebody.

    -Nagendra




  • 24.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 21, 2022 10:24 AM

    Thanks Nagendra!
    @Sonam_Chauhan would you mind creating a new topic with that info only as Nagendra suggested? I’ll then move it to the Knowledge base section.




  • 25.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Fri January 21, 2022 11:15 AM

    @toni.petrov - Happy to help, but unsure how I can. There are two solutions:




  • 26.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Tue January 25, 2022 12:03 AM

    Hi Sonam, Nagendra,

    Did you check and try the webMethods CloudStreams connector for Azure Storage, provided with the webMethods CloudStreams product (Microsoft Azure Storage)?

    Is there any particular reason for not trying this approach?

    regards,
    Suresh P.N.V.S Ganta
    Software AG Product Management




  • 27.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Tue January 25, 2022 12:22 AM

    From the first post in this topic, I think perhaps it was a licensing thing?




  • 28.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Tue January 25, 2022 02:08 AM

    @Nagendra_Prasad - You are correct. I ran Installer against Empower, and didn’t see any “Azure” adapters available. So I figured I’d write something basic. I didn’t figure on the rabbit hole this would take me down last week. This info from Suresh is news to me.

    @Suresh_Ganta - Thank you - I didn't know about this. To someone like me with no knowledge of CloudStreams, what does the 'CloudStreams Provider for Microsoft Azure Storage' package do? As far as I can make out from the zip file and documentation, the package provides two flow services and dozens of data structures.

    Is the functionality it provides something akin to method signatures and connection pools? Would the code by @Rakesh_Kumar3 in post #7 use it?

    Also, what is the licensing model for WmMicrosoftAzureStorageProvider.zip?




  • 29.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Tue January 25, 2022 02:50 AM

    Hi @Sonam_Chauhan

    webMethods CloudStreams is a patented cloud-connectivity framework that allows developers to build connectors by configuration (instead of coding). It offers the following capabilities:

    • Connection Management/Pooling
    • Multi-Auth & Session Management
    • Runtime Governance
    • Streaming Listeners & Replay Events
    • Security
    • Multiple Content Types Support
    • MIME/Multi-part
    • MTOM
    • Send/Receive Messages as Form Data

    For a complete list of the capabilities of CloudStreams, please refer to the post Why do I need to go for CloudStreams - #2 by Suresh_Ganta.

    You would need a license for webMethods CloudStreams; the more than 80 connectors offered for the most widely used SaaS applications in the market are FREE. For a complete list of connectors, please refer to webMethods CloudStreams Connectors - Software AG Tech Community & Forums

    The Microsoft Azure Storage connector package provides out-of-the-box (OOB) services for the Microsoft Azure Storage REST APIs exposed by Microsoft, and the CloudStreams framework provides capabilities like connection pooling and OOB support for the most common authentication and authorization mechanisms.

    We can plan a session if your customer is interested in knowing more about this product and its capabilities, to see whether it meets their business requirements.

    regards,

    Suresh P.N.V.S Ganta

    Software AG Product Management




  • 30.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Tue January 25, 2022 03:03 AM

    Thanks for the good explanation, Suresh.

    I am a coder for the customer. They are licensed for on-prem but not for CloudStreams (as far as I know). So going down this road won't help (at least not for my little project). But it's a capability to keep an eye on.




  • 31.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Tue January 25, 2022 06:36 AM

    @Sonam_Chauhan @Suresh_Ganta: We do not have a license for webMethods CloudStreams and had to write our own custom framework for authentication and for connecting to different Azure services.

    It was not a major effort for us to build the services that connect to Azure via custom code.

    (Just a thought) CloudStreams connectors should be a core part of the Software AG offering, instead of a separately licensed product. As more and more enterprise products are SaaS-based, making it part of the core integration product would benefit both Software AG and its clients (win-win situation :slight_smile: ).




  • 32.  RE: How to get on-prem IS to upload a file into Azure 'Blob' storage

    Posted Tue January 25, 2022 09:29 AM

    Hi @Rakesh_Kumar3

    The major points to consider when weighing custom solutions against webMethods CloudStreams are below.

    We offer the key capabilities below out of the box, taking away the need to develop, maintain, and support custom solutions for connecting to various SaaS apps. That work takes considerable time and effort, which in turn is cost; CloudStreams lets you focus on building business applications for your customers rather than on building connectivity to SaaS apps.

    Hence webMethods CloudStreams is offered as a separately licensed product, while the connectors we have today, as well as the ones we will create in the future, are offered for FREE.

    • SaaS apps are updated with new features and enhancements, and in turn API changes, quite frequently. SaaS vendors' release cadence varies from weekly to monthly, bi-monthly, quarterly, etc.

    • Connection management and authentication & authorization support: as technology evolves, SaaS vendors adopt new authentication and authorization standards. Writing custom code to adopt these for each SaaS app, and to manage connections and connection pools, requires a lot of effort.

    • Support for different connectivity patterns through a configuration-driven approach without coding, which considerably reduces connector development effort, both for Software AG's out-of-the-box connectors and for anyone who develops custom connectors using webMethods CloudStreams:

      • REST & OData Connectivity Patterns

        • Metadata Lookup
        • Connector Creation from Open API & Swagger
        • OData v2.0 and v4.0 connectors to connect to any OData/REST based integration SaaS Vendors
      • Event Based Patterns

        • Streaming – CometD based, HTTP Streaming etc.
      • SOAP Connectivity Patterns

        • Everything in the WSDL
        • SOAP Metadata Lookup
        • WSDL with Schemas Included
        • Multiple WSDL by functional Area
    • The ability to leverage the 80+ connectors for the most widely used SaaS apps, as well as to create custom connectors with a configuration-driven approach rather than coding, with all the key features above.

    Regards,

    Suresh P N V S Ganta
    Software AG Product Management

