CX Works

A single portal for curated, field-tested and SAP-verified expertise for your SAP C/4HANA suite. Whether it's a new implementation, adding new features, or getting additional value from an existing deployment, get it here, at CX Works.

Get the Most out of Your Cloud Hot Folders and Azure Blob Storage

Hot Folders is a feature that allows you to integrate files into SAP Commerce quickly and efficiently. Traditionally, this has required you to provide a local or shared directory to which you push your files through SFTP. With SAP Commerce Cloud in the Public Cloud, Cloud Hot Folders replace the classic Hot Folders, Azure Blob Storage replaces the local or shared directories, and you push blobs instead of files. For more on the architecture of Cloud Hot Folders, see the "Cloud Hot Folders" section in SAP Commerce Cloud Architecture.

In this article, we will explain how to migrate your connectivity from Hot Folders to Cloud Hot Folders and describe the different ways to connect and push files/blobs to Cloud Hot Folders. We will also explain how to emulate Azure Storage locally, which can be very useful for developers. Finally, we will explain how to upload product media/images to SAP Commerce Cloud using Cloud Hot Folders.


Migrate the Connectivity from On-Premise to Cloud Hot Folders

To push/read data to/from the On-Premise Hot Folders, the classic options are:

  • Use FTP/SFTP to transfer files 
  • Use NFS driver

Since Cloud Hot Folders use Azure Blob Storage, the options above are no longer available.

In order to migrate the connectivity from On-Premise to the Azure Blob Storage, the new options are:

  1. Blob Services REST API
  2. Blob Storage SDK
  3. Explorers for Blob Storage
  4. Blobfuse Virtual File System Driver
  5. Azure CLI
  6. AzCopy Command Line Tool

However, it is always possible to create a bridge system that receives files through SFTP and transfers them to Azure Blob Storage. If this is critical, there are third-party systems that provide paid services for this feature (for example, https://docevent.io/). For the purposes of this article, we will not consider these third-party options. It is also possible to build such a bridge yourself using Azure CLI scripts.

We will now go through the options above and describe the pros and cons of each.

Connect to Azure Cloud Hot Folders

Blob Services REST API

The Azure Blob Service offers several REST operations over the HTTP protocol. Below are some examples of the available operations:

  • List Containers
  • Create Container
  • Delete Container
  • List Blobs
  • Get Blob
  • Create Blob
  • Delete Blob
  • and more

For more details, please read https://docs.microsoft.com/en-us/rest/api/storageservices/blob-service-rest-api.
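As an illustration, the List Blobs operation can be consumed from any HTTP-capable language. The sketch below (in Python, with a hypothetical account name and SAS token, and a trimmed sample response modeled on the documented XML shape) builds the request URL and parses the body the service returns:

```python
# Sketch: consuming the "List Blobs" REST operation.
# The account name and SAS token are hypothetical placeholders;
# the sample XML mirrors the documented List Blobs response shape.
import xml.etree.ElementTree as ET

def list_blobs_url(account: str, container: str, sas_token: str) -> str:
    """Build the List Blobs request URL (HTTP GET) for a container."""
    return (
        f"https://{account}.blob.core.windows.net/{container}"
        f"?restype=container&comp=list&{sas_token}"
    )

def parse_blob_names(xml_body: str) -> list:
    """Extract blob names from a List Blobs XML response."""
    root = ET.fromstring(xml_body)
    return [name.text for name in root.iter("Name")]

# Trimmed example response for a "hybris" container:
sample_response = """<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ContainerName="hybris">
  <Blobs>
    <Blob><Name>master/hotfolder/product-00001.zip</Name></Blob>
    <Blob><Name>master/hotfolder/product-00002.zip</Name></Blob>
  </Blobs>
</EnumerationResults>"""

print(list_blobs_url("myaccount", "hybris", "sv=2019-02-02&sig=REDACTED"))
print(parse_blob_names(sample_response))
```

Issuing the GET request itself (for example, with urllib) and handling authentication is where the "custom tool" effort noted below comes in.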


Pros:

  • Compatibility. HTTP is a standard protocol, compatible across systems/platforms.
  • Support. No upgrade needed; the latest API version is always used.

Cons:

  • Synchronous. Only synchronous operations are available.
  • Usability. Not very practical to use; a custom tool must be implemented to consume the REST operations.


Blob Storage SDK

The other option Azure proposes is the SDK, or Client API. The SDK is available for several programming languages.


Pros:

  • Asynchronous. The Java SDK provides asynchronous operations.
  • Integration. Integrates well with SAP Commerce Cloud, which already uses the Java SDK.

Cons:

  • Upgrade. Each time a new version is released, an upgrade is needed.
  • Usability. To perform basic operations (add blob, remove blob), custom code is needed.


Explorers for Blob Storage

Similar to SFTP clients like FileZilla, there are several clients/explorers for Azure Blob Storage.

To manage blobs, Microsoft proposes:

  • A web portal: Microsoft Azure Portal (not in scope, because access isn't available for SAP Commerce Cloud customers)
  • A client: Microsoft Azure Storage Explorer

The screenshot below shows Microsoft's Azure Storage Explorer. The export folder (on the left side) contains some zip files (on the right side).



There are also some third-party clients available.


For more information about the Client Tools, please read: https://docs.microsoft.com/en-us/azure/storage/common/storage-explorers


Pros:

  • Usability. The client tool is very easy to use for manual operations.

Cons:

  • Automation. Pushing or reading blobs automatically is not possible, as the client is a GUI (graphical user interface) tool.


Blobfuse Virtual File System Driver

Blobfuse is a virtual file system driver that gives you access to Azure Blobs through the Linux file system. Blobfuse uses the Blob Service REST APIs to translate the basic operations (read, write, list, and more).

First, install Blobfuse. Then, your Blob container can be mounted in the folder of your choice.

For more information on how to install Blobfuse, mount a container, and perform read/write operations, please refer to: https://github.com/Azure/azure-storage-fuse.
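As a sketch (assuming the blobfuse v1 configuration format, with placeholder credentials), a minimal fuse_connection.cfg pointing at the Cloud Hot Folders container could look like:

```
accountName <storage_account_name>
accountKey <storage_account_key>
containerName hybris
```

The container can then be mounted with a command along the lines of blobfuse /mnt/hybris --tmp-path=/mnt/blobfusetmp --config-file=fuse_connection.cfg; see the repository linked above for the authoritative options.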


For performance reasons, files are cached in the Blobfuse temporary directory. If a blob is modified in Azure Storage, Blobfuse will wait for the cache timeout to elapse before downloading the latest version.

Pros:

  • Usability. Blob containers can be accessed as a directory, and standard file system operations like ls and cp can be used.

Cons:

  • Reading delay. Files are cached; if a blob is modified in Azure Storage, the latest version is only downloaded by Blobfuse after a timeout.
  • Non-optimized update. To update a blob, Blobfuse downloads the entire file to the local cache, modifies it, and then uploads it to Azure Storage.
  • Concurrent write. If multiple nodes try to write the same file, the last writer wins.
  • Limitations. Some operations are not supported by Blobfuse: symbolic links, permissions, and synchronization (readlink, symlink, link, chmod, chown, fsync, lock).


Azure CLI

Azure CLI is a command line interface for managing an Azure subscription, including Azure Storage. Install the latest Azure CLI for your operating system via Install the Azure CLI. Azure CLI scripts can be a powerful way to connect to Cloud Hot Folders and automate ETL integrations between Cloud Hot Folders and other interfacing systems.

Two environment variables, AZURE_STORAGE_ACCOUNT and AZURE_STORAGE_KEY, need to be set in your script (or in your shell environment in interactive mode) to access Cloud Hot Folders. You can obtain these values from the Commerce Cloud Portal (only authorized users can see them).

The following operations are available in script and/or interactive mode via the Azure CLI. Please refer to https://docs.microsoft.com/en-us/azure/storage/common/storage-azure-cli for the specific command details. Since Commerce Cloud in the Public Cloud only provides role-based access to Azure Blob Storage (for authorized users), and not to the Azure subscription or other Azure storage types, only the corresponding commands are applicable.

  • Create/Manage/List blobs and/or container
  • Upload/Download blobs in container 
  • Copy/Delete blobs
  • Set content type
  • and many more

Below is a sample Azure CLI script. It first creates a new container in your storage account, then uploads an existing file (as a blob) to that container. It then lists all blobs in the container and, finally, downloads the file to a destination on the server/computer on which you run the Azure CLI. Please replace <placeholder text> with appropriate content.

#!/bin/bash
# A simple Azure Storage example script
export AZURE_STORAGE_ACCOUNT=<storage_account_name>
export AZURE_STORAGE_KEY=<storage_account_key>
export container_name=<container_name>
export blob_name=<blob_name>
export file_to_upload=<file_to_upload>
export destination_file=<destination_file>
echo "Creating the container..."
az storage container create --name "$container_name"
echo "Uploading the file..."
az storage blob upload --container-name "$container_name" --file "$file_to_upload" --name "$blob_name"
echo "Listing the blobs..."
az storage blob list --container-name "$container_name" --output table
echo "Downloading the file..."
az storage blob download --container-name "$container_name" --name "$blob_name" --file "$destination_file" --output table
echo "Done"


To check that you have successfully connected to Cloud Hot Folders, issue the command az storage blob list --output table -c hybris. You should see tabular output of your blob storage, as below.

ShellCommandPrompt$ az storage blob list --output table -c hybris


ShellCommandPrompt$


Note: If you create a new container with a name other than hybris, it will be outside of the Cloud Hot Folder processes. All security best practices and governance need to be followed for script automation to be secure. Follow the naming conventions for Azure Blob Storage (hybris/master/hotfolder) so that commands are meaningful for Commerce Cloud in the Public Cloud.


Pros:

  • Usability. More robust commands and options are available than with AzCopy alone, so different and unique automation use cases can be realized.
  • Automation. Use cases are nearly limitless with shell scripting, within the bounds that an Azure subscription login is not available.
  • Automation. Monitoring of other systems can be scripted and coupled with Azure Blob Storage management and Cloud Hot Folders.
  • Compatibility. Azure CLI is available for Linux, macOS, and Windows, and can also be run in Docker containers.

Cons:

  • Usability. For interactive, one-time operations with Azure Blob Storage, AzCopy is easier to use.
  • Automation. Scripts need to be written, tested, and secured to realize business value.
  • Limitation. Not all Azure CLI commands are available, because an Azure subscription login is not available. If a customer has Azure subscriptions outside of Commerce Cloud in the Public Cloud, this limitation may prevent some unique automation use cases.
  • Upgrade. Each time a new version of Azure CLI is released, an upgrade is needed.

AzCopy Command Line Tool

AzCopy is a command line tool designed to copy data to and from Azure Blob Storage and File Storage, in either direction.

The AzCopy command is used as follows:


azcopy copy "[source]" "[destination]?sv=SAStoken" --recursive=true [Options]


Source is the source file or folder/container; it can be in File Storage or Blob Storage.

Destination is the target file or folder/container; it can be in File Storage or Blob Storage.

For more information about all supported options and command details, please read: https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-linux.
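When AzCopy is driven from a larger automation script, it can help to assemble the command programmatically. The Python sketch below builds the azcopy copy invocation in the form shown above; the account name, container path, and SAS token are hypothetical placeholders, and the command is only assembled, not executed:

```python
# Sketch: building an "azcopy copy" command for upload to Blob Storage.
# The account, container path, and SAS token below are placeholders.

def azcopy_upload_cmd(source_dir, account, container, path, sas_token):
    """Assemble the argv for copying a local folder into a Blob container."""
    destination = (
        f"https://{account}.blob.core.windows.net/{container}/{path}?{sas_token}"
    )
    return ["azcopy", "copy", source_dir, destination, "--recursive=true"]

cmd = azcopy_upload_cmd(
    "/data/export", "mystorageaccount", "hybris", "master/hotfolder",
    "sv=2019-02-02&sig=REDACTED",
)
print(" ".join(cmd))
# The list form can be passed to subprocess.run(cmd, check=True) once
# azcopy is on the PATH and a valid SAS token is supplied.
```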


Pros:

  • Usability. Simple commands to copy blobs to/from Azure Storage.
  • Usability. Simpler than the Azure CLI, if it meets the needs of your use case.
  • Compatibility. The AzCopy tool is available for Linux, macOS, and Windows.

Cons:

  • Automation. Pushing or reading blobs automatically is not possible (or a custom tool must be implemented).
  • Upgrade. The executable must be kept in sync each time a new version is released.


Run Blob Storage Locally

Running a Blob Storage locally can be very useful, especially during development. Reading/writing data to Blob Storage during feature implementation is easier when you have a dedicated Blob Storage, so you don't run into conflicts. That said, a dedicated Azure Blob Storage instance for each developer is not possible with SAP Commerce Cloud. You could, however, create and support your own Blob Storage separate from your SAP Commerce Cloud instance. We have found that running a local Blob Storage is a good compromise for development.

Install and Run the Emulator

First, you need to install Azurite, an Azure Blob emulator providing most of the supported Blob operations.


npm install -g azurite


Then you need to run the emulator.


azurite -l /BlobStorageTmp


Here, we used the /BlobStorageTmp directory to run the emulator; all uploaded blobs will be stored in this directory. You can use another directory.

The output of the azurite command above should be as follows:


Azurite, Version 2.7.0

A lightweight server clone of Azure Storage

Azure Queue Storage Emulator listening on port 10001

Azure Table Storage Emulator listening on port 10002

Azure Blob Storage Emulator listening on port 10000


You can also use the Microsoft Blob Storage emulator, however, it's compatible with Microsoft Windows only: https://docs.microsoft.com/en-us/azure/storage/common/storage-use-emulator.


Connect to Local Blob Storage

Microsoft Azure Storage Explorer

In the left navigation panel, right-click on "Storage Accounts".



Select "Attach to local emulator" and click "Next".



Under "Storage Accounts", a "local" storage should appear.


Configure SAP Commerce Cloud

Under the config folder, edit local.properties and add the following:


# Azure Blob Store: Local (Azurite)
azure.hotfolder.storage.account.connection-string=UseDevelopmentStorage=true
# general setting
cluster.node.groups=integration,yHotfolderCandidate
azure.hotfolder.storage.container.hotfolder=master/hotfolder


Restart the platform. SAP Commerce Cloud should now poll the local Blob storage instead of the remote Azure Blob Storage.
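For reference, the UseDevelopmentStorage=true shorthand points the platform at the emulator's well-known development account. The Python sketch below shows the equivalent full connection string (built from the development credentials that Azurite and the Microsoft emulator publish) and a small helper to split it into settings:

```python
# Sketch: expanding UseDevelopmentStorage=true into its full form.
# These are the publicly documented development-storage credentials
# shared by Azurite and the Microsoft storage emulator.

def parse_connection_string(conn: str) -> dict:
    """Split an Azure Storage connection string into key/value pairs."""
    return dict(part.split("=", 1) for part in conn.split(";") if part)

dev_connection_string = (
    "DefaultEndpointsProtocol=http;"
    "AccountName=devstoreaccount1;"
    "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;"
    "BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;"
)
settings = parse_connection_string(dev_connection_string)
print(settings["AccountName"])   # devstoreaccount1
print(settings["BlobEndpoint"])  # http://127.0.0.1:10000/devstoreaccount1
```

Note that the blob endpoint matches the port Azurite reported when it started (10000).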

How to Upload Media

In this section, we will describe two options that allow you to upload media through the Cloud Hot Folders:

  • The URL media file
  • The Zip media file

Zip Media File

First, you need to create a product-with-media folder as follows:



The images folder contains the images in different formats.

In product.csv, you will import the products:


product-00001,,Product 00001,Product 0001 Description of product 00001,ean,manufacturer,manufacturerAID,pieces,approved,jp-vat-full,
product-00002,,Product 00002,Product 0002 Description of product 00002,ean,manufacturer,manufacturerAID,pieces,approved,jp-vat-full,


In zip_media.csv, you will import the media:


product-00001, product-00001.png
product-00002, product-00002.png


The compressed zip file is attached here; you can use it to test the media import: product-with-media.zip.
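For testing, an archive with this layout can also be assembled programmatically. The Python sketch below builds an in-memory zip with the same structure; the CSV rows are trimmed versions of the examples above, and the image entries contain placeholder bytes rather than real PNG data:

```python
# Sketch: building a product-with-media style archive in memory.
# CSV content is abbreviated; image entries are placeholder bytes.
import io
import zipfile

buffer = io.BytesIO()
with zipfile.ZipFile(buffer, "w") as archive:
    archive.writestr(
        "product.csv",
        "product-00001,,Product 00001\n"
        "product-00002,,Product 00002\n",
    )
    archive.writestr(
        "zip_media.csv",
        "product-00001, product-00001.png\n"
        "product-00002, product-00002.png\n",
    )
    archive.writestr("images/product-00001.png", b"placeholder png bytes")
    archive.writestr("images/product-00002.png", b"placeholder png bytes")

# List the entries, mirroring the folder structure described above:
with zipfile.ZipFile(buffer) as archive:
    print(archive.namelist())
```

The resulting buffer could then be written to disk and pushed to the hybris container with any of the upload options described earlier.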


Conclusion

In this article, we explained how to push/read blobs to/from Cloud Hot Folders, including the available options. We explained how to run an Azure Storage emulator locally, which is useful for local development. Finally, we demonstrated how to upload product media/pictures using Cloud Hot Folders. You should now be able to perform all these operations and take full advantage of the new Cloud Hot Folders.

For more details about the Cloud Hot Folder, you can consult the Cloud Hot Folders section in the product documentation.