Migrating from Hot Folders to Cloud Hot Folders
Overview

For many years, Hot Folders have been a standard import option for getting structured data into SAP Commerce. If you're migrating from on-premise to SAP Commerce Cloud in the Public Cloud, you'll be able to leverage Cloud Hot Folder capabilities. In this article we cover the process of migrating from standard SAP Commerce Hot Folders to Cloud Hot Folders for SAP Commerce Cloud in the Public Cloud.
Introduction
This article assumes that you already have a very good working knowledge of Hot Folders as part of the SAP Commerce core offering, and that you have read the technical information on Cloud Hot Folders to get a technical understanding of the new environment. You can also look at Get the Most out of Your Cloud Hot Folders and Azure Blob Storage to understand more about how Cloud Hot Folders work.
We will go through the basics of getting your local environment up and running so that you can test Cloud Hot Folders locally, then run through migrating some real-world examples so that you have a reference implementation. Finally, we will dive deeper into some error handling topics and how you can monitor the process at run time.
Get Cloud Hot Folders Up and Running
Run Azure Locally
Set up a local development environment to test Cloud Hot Folders locally. This is already covered in the SAP Help Portal article Working with a Local Setup. If you have done everything correctly you'll find:
- In the Azurite console you will see that the file is copied to the processing folder and then to the archive folder (if the ImpEx import was successful) or to the error folder (if the ImpEx import failed).

```
"PUT /devstoreaccount1/hybris/master/hotfolder/processing/test.csv HTTP/1.1" 202 -
"HEAD /devstoreaccount1/hybris/master/hotfolder/test.csv HTTP/1.1" 200 108
"DELETE /devstoreaccount1/hybris/master/hotfolder/test.csv HTTP/1.1" 202 -
"PUT /devstoreaccount1/hybris/master/hotfolder/archive/test.csv.2019-12-19T00-02-44.812Z HTTP/1.1" 202 -
"HEAD /devstoreaccount1/hybris/master/hotfolder/processing/test.csv HTTP/1.1" 200 108
"DELETE /devstoreaccount1/hybris/master/hotfolder/processing/test.csv HTTP/1.1" 202 -
```
- In your console/terminal running SAP Commerce you will see the test.csv file being processed with the steps:
  - DOWNLOADED
  - HEADER_SETUP
  - HEADER_INIT
  - HEADER_TRANSFORMED
  - HEADER_EXECUTED
  - HEADER_CLEANUP
Migrating
Now that you have Cloud Hot Folders up and running, it is time to have a look at the actual migration process. You will start by configuring local.properties, then go through some real-world examples of migrating Hot Folders to Cloud Hot Folders.
local.properties
To use Cloud Hot Folders for your project locally, you will need to add some additional configuration to local.properties, as described below. Adjust the values as needed. See Microsoft Azure Blob Storage Connection Properties for more information.
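As a sketch, a minimal local.properties for running against a local Azurite instance might look like the following. The two property names below are taken from the local-setup documentation, but should be verified against Microsoft Azure Blob Storage Connection Properties for your Commerce version; the account key placeholder must be replaced with Azurite's published development key or your own credentials.

```properties
# Connection string for Azurite's well-known local development account
# (replace <account-key> with the published Azurite development key)
azure.hotfolder.storage.account.connection-string=DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=<account-key>;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;

# Ensure this node belongs to the node group that processes Cloud Hot Folders
cluster.node.groups=integration,yHotfolderCandidate
```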
System Update
If you haven't done this already, you need to run a System Update so that the type system definitions (*-items.xml) for Cloud Hot Folders are updated in the database.
- hac > Platform > Update
- Select the following extensions
- azurecloud
- cloudcommons
- cloudhotfolder
- azurecloudhotfolder
- Press Update
Examples
The following examples show the Hot Folder code and its corresponding Cloud Hot Folder code. The file content is abbreviated, mostly by removing the namespace header. The converterMapping and converter beans are the same for both and are not shown here.
Example: Has its own sub folder and the .csv file can be dropped into any folder
In this example, "ABCBlog" has its own sub-folder that the .csv file can be dropped into; however, the setup does not restrict which folder the .csv file can be dropped into. In theory, the .csv file can be dropped into any hot folder.
Notes:
- Define the azureBlobInboundSynchronizer and azureBlobSynchronizingMessageSource beans to set up the sub-folder.
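A hypothetical sketch of those two bean definitions for a "blog" sub-folder is shown below. The parent bean ids come from the azurecloudhotfolder extension, while the abcBlog* ids, the blog path, and the exact property names are assumptions that should be checked against that extension's Spring configuration:

```xml
<!-- Synchronizer that watches the blog sub-folder of the storage container
     (the remoteDirectory value and property name are assumptions) -->
<bean id="abcBlogAzureBlobInboundSynchronizer" parent="azureBlobInboundSynchronizer">
    <property name="remoteDirectory" value="${tenantId}/hotfolder/blog"/>
</bean>

<!-- Message source that downloads the synchronized blobs to a local directory -->
<bean id="abcBlogAzureBlobSynchronizingMessageSource" parent="azureBlobSynchronizingMessageSource">
    <property name="synchronizer" ref="abcBlogAzureBlobInboundSynchronizer"/>
</bean>
```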
Example: Has its own sub-folder and the .csv file must be dropped into the sub-folder
In this example, "ABCReviews" has its own sub-folder that the .csv file must be dropped into.
Notes:
- The batchFilesCloudHotFolderProc bean sets up the file name pattern that this Cloud Hot Folder will accept.
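Hypothetically, restricting the accepted file names could look like the snippet below. The abcReviews* id and the pattern are illustrative, and the assumption is that the processor bean exposes a fileNamePattern property, as the classic batchFilesProc bean does:

```xml
<!-- Only accept review files such as reviews-01.csv in this sub-folder -->
<bean id="abcReviewsBatchFilesCloudHotFolderProc" parent="batchFilesCloudHotFolderProc">
    <property name="fileNamePattern" value="^reviews-[0-9]+\.csv$"/>
</bean>
```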
Example: File priority order (sorting)
In this example, "ABCCamera" has sorting specific to this Cloud Hot Folder instead of on a global level (see Synchronizing, Sorting, and Filtering for more information).
Notes:
- The property cloud.hotfolder.storage.file.sort.name.prefix.priority controls sorting on a global level (for all Cloud Hot Folders).
- Here you set up the comparator with namePrefixPriority to process .csv files in a specific order.
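A per-folder comparator could be wired roughly as follows; the parent bean id is an assumption (check the comparator beans shipped with the cloudhotfolder extension), and the prefix list is illustrative. The global alternative is to set cloud.hotfolder.storage.file.sort.name.prefix.priority in local.properties instead.

```xml
<!-- Process product files first, then stock, then price, for this folder only
     (parent bean id is an assumption) -->
<bean id="abcCameraNamePrefixComparator" parent="cloudHotfolderNamePrefixComparator">
    <property name="namePrefixPriority" value="product,stock,price"/>
</bean>
```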
Example: headerSetupTask set catalog
In this example, "ABCReviews" has its own sub-folder that the .csv file must be dropped into, and it also shows how to change the header catalog. The default catalog is ABCMasterProductCatalog, as defined in local.properties by cloud.hotfolder.default.zip.header.catalog.
Notes:
- The default catalog to use when importing .csv files in Cloud Hot Folders is set by the property cloud.hotfolder.default.mapping.header.catalog in local.properties: cloud.hotfolder.default.mapping.header.catalog=ABCMasterProductCatalog
- In this example you override the default catalog in the ABCReviewsCloudHotFolderHeaderSetupTask (see lines 26-30).
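The override could be sketched as below; the parent bean id and the catalog property name are assumptions based on the classic HeaderSetupTask, and ABCReviewsCatalog is an illustrative value:

```xml
<!-- Override the default header catalog for this Cloud Hot Folder only;
     other folders keep the catalog configured in local.properties -->
<bean id="ABCReviewsCloudHotFolderHeaderSetupTask" parent="cloudHotFolderHeaderSetupTask">
    <property name="catalog" value="ABCReviewsCatalog"/>
</bean>
```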
Error Handling
Impex Load Error
Out of the box, Cloud Hot Folders are supposed to move the .csv file to the error folder when there is a problem processing the file. However, when an ImpEx file has load errors the CronJob still finishes successfully, so the .csv file gets copied to the archive folder instead.
synchronizeToLocalDirectory() Exceptions
The out-of-the-box (OOTB) code uses a monitor service to log while processing the Cloud Hot Folder. This may not be completely reliable and may throw a NullPointerException in the method synchronizeToLocalDirectory(). The .csv file does eventually get processed successfully; however, these exceptions can produce a lot of noise.
Fixes
Here are some fixes that you can apply to your own code base for the two problems above. When there is an ImpEx import error, the import will fail, the .csv file will be copied to the error folder instead of the archive folder, and an email with the name of the failed file will be sent. This doesn't cover all situations, so use it at your own risk. The default monitor service has also been replaced, since it was throwing exceptions during synchronizeToLocalDirectory() too often.
Notes:
- Added importResult to get the status (success/fail) from the ImpEx import.
Notes:
- Extended aroundExecute() to check importResult from the ImpEx import.
Notes:
- Extended execute() to fail fast when there is an ImpEx import error.
- New method processFile(final File file, final ABCBatchHeader ABCBatchHeader) to return ImportResult.
Notes:
- ABCMonitorService simply writes directly to the logger.
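The routing decision at the heart of these fixes can be sketched independently of the platform: after the ImpEx import completes, inspect the result and pick the target folder. The hybris ImportResult type is replaced here with a minimal stand-in, and routeProcessedFile is an illustrative name, not a method from the extension.

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class HotFolderRouting {

    /** Minimal stand-in for de.hybris.platform.servicelayer.impex.ImportResult. */
    interface ImportResult {
        boolean isError();
    }

    /**
     * Mirror of the fix's routing decision: a failed ImpEx import sends the
     * processed file to the error folder, a successful one to archive.
     */
    static Path routeProcessedFile(final Path processingFile, final ImportResult result) {
        final String target = result.isError() ? "error" : "archive";
        // e.g. .../processing/test.csv -> .../<target>/test.csv
        return processingFile.getParent()
                             .resolveSibling(target)
                             .resolve(processingFile.getFileName());
    }

    public static void main(final String[] args) {
        final Path file = Paths.get("hybris/master/hotfolder/processing/test.csv");
        System.out.println(routeProcessedFile(file, () -> false));
        System.out.println(routeProcessedFile(file, () -> true));
    }
}
```

In the real fix the same check sits in the batch runner, where the ImportResult returned by the import step decides between the two folders before the email notification is sent.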
Azure Console
You can monitor the Cloud Hot Folder progress in the Azure console, which shows the status of the .csv file as it is processed. The happy path goes through the following steps:
- upload file to hybris/master/hotfolder/blog
- copy file to hybris/master/hotfolder/blog/processing
- delete file from hybris/master/hotfolder/blog
- copy file to hybris/master/hotfolder/blog/archive
- delete file from hybris/master/hotfolder/blog/processing
Unmapped
If the file does not match the mapping pattern (cloud.hotfolder.default.mapping.file.name.pattern) or one of the converters, the message "was not routed as didn't match any configurations" is logged:

```
Message [File [stock-03.csv] modified [1571352719000] was not routed as didn't match any configurations]
```

Fix this by defining the file name pattern in the property cloud.hotfolder.default.mapping.file.name.pattern. Also check your converterMapping bean(s).
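For example, a pattern accepting product, stock, and price files with a numeric suffix might look like this (the pattern itself is illustrative; note that backslashes in .properties files are treated as escapes, which is why a character class is used and the dot before csv is left unescaped):

```properties
# Route files such as product-01.csv, stock-03.csv, price-02.csv
cloud.hotfolder.default.mapping.file.name.pattern=^(product|stock|price)-[0-9]+.csv$
```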
Console
You can monitor the Cloud Hot Folder progress in your console/terminal, where you will see the .csv file being processed with the steps:
- DOWNLOADED
- HEADER_SETUP
- HEADER_INIT
- HEADER_TRANSFORMED
- HEADER_EXECUTED
- HEADER_CLEANUP
Conclusion
This article walked you through the concrete steps of migrating Hot Folders to Cloud Hot Folders, to make your migration process easier and faster.
The articles below will help you dive deeper on this topic and provide additional information:
- Cloud Hot Folders - SAP Help Portal > Cloud Hot Folders
- Cloud Hot Folder Extension - SAP Help Portal > Cloud Hot Folder Extension
- Get the Most out of Your Cloud Hot Folders and Azure Blob Storage - CX Works article that gives more background on Cloud Hot Folders. In this article, you will learn how to migrate your connectivity from Hot Folders to Cloud Hot Folders and the different ways to connect and push files/blobs to Cloud Hot Folders. It also explains how to emulate Azure Storage locally, which can be very useful for developers, and finally how to upload product media/images to SAP Commerce Cloud using Cloud Hot Folders.
- Mastering Cloud Hot Folders - This webinar will walk you through the concept of Cloud Hot Folders, show you how to enable and configure them, and provide an overview of their file processing channels, custom-mapping features, and monitoring capabilities.
- There is a PDF version of the webinar available for download as well, highly recommended for the technical details of Cloud Hot Folders.
- Data Importing - SAP Help Portal > Commerce B2C Accelerator - for information on Hot Folder Data Importing
- Your Guide to Developing SAP Commerce Cloud Applications Locally - SAP Commerce Cloud is built to allow for customizations, but how can you make these changes without being connected to the cloud? In this video you will discover tips and tricks for translating local development of your application to successfully deploy with SAP Commerce Cloud.