CX Works

CX Works brings the most relevant leading practices to you.
It is a single portal of curated, field-tested, and SAP-verified expertise for SAP Customer Experience solutions.

Migrating from Hot Folders to Cloud Hot Folders

27 min read


For many years, Hot Folders have been a standard option for importing structured data into SAP Commerce. If you're migrating from on-premise to SAP Commerce Cloud in the Public Cloud, you'll be able to leverage Cloud Hot Folder capabilities. In this article, we cover the process of migrating from standard SAP Commerce Hot Folders to Cloud Hot Folders for SAP Commerce Cloud in the Public Cloud.


Introduction

This article assumes that you already have a good working knowledge of the Hot Folders available as part of the SAP Commerce core offering, and that you have read the technical documentation on Cloud Hot Folders to get an understanding of the new environment. You can also read Get the Most out of Your Cloud Hot Folders and Azure Blob Storage to learn more about how Cloud Hot Folders work.

We will first cover the basics of getting your local environment up and running so you can test Cloud Hot Folders locally, then walk through migrating some real-world examples so that you have a reference implementation. Finally, we will dive deeper into error handling and how you can monitor the process at run time.

Get Cloud Hot Folders Up and Running

Run Azure Locally

Set up a local development environment to test Cloud Hot Folders locally. This is already covered in the SAP Help Portal article Working with a Local Setup. If you have done everything correctly, you'll see the following:

  • In the Azurite console you will see that the file is copied to the processing folder and then to the archive folder (if the ImpEx import was successful) or to the error folder (if the ImpEx import failed).

"PUT /devstoreaccount1/hybris/master/hotfolder/processing/test.csv HTTP/1.1" 202 -
"HEAD /devstoreaccount1/hybris/master/hotfolder/test.csv HTTP/1.1" 200 108

"DELETE /devstoreaccount1/hybris/master/hotfolder/test.csv HTTP/1.1" 202 -
"PUT /devstoreaccount1/hybris/master/hotfolder/archive/test.csv.2019-12-19T00-02-44.812Z HTTP/1.1" 202 -
"HEAD /devstoreaccount1/hybris/master/hotfolder/processing/test.csv HTTP/1.1" 200 108
"DELETE /devstoreaccount1/hybris/master/hotfolder/processing/test.csv HTTP/1.1" 202 -

  • In your console/terminal running SAP Commerce you will see the test.csv file being processed through the following steps:
    • DOWNLOADED
    • HEADER_SETUP
    • HEADER_INIT
    • HEADER_TRANSFORMED
    • HEADER_EXECUTED
    • HEADER_CLEANUP
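The Azurite log above also illustrates that blob storage has no atomic move: each hop is a copy (PUT) to the target path followed by a delete (DELETE) of the source. A minimal Python sketch of that copy-then-delete pattern, using a dictionary as a stand-in for the blob container:

```python
# Simulate the copy-then-delete "move" visible in the Azurite log.
# The dict stands in for the blob container; keys are blob paths.

def move_blob(store, source, target):
    """Move a blob by copying it to the target path and deleting the source."""
    store[target] = store[source]   # PUT  .../processing/test.csv
    del store[source]               # DELETE .../test.csv

store = {"hotfolder/test.csv": b"id;name\n1;widget\n"}
move_blob(store, "hotfolder/test.csv", "hotfolder/processing/test.csv")
print(sorted(store))  # only the copy under processing/ remains
```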

Migrating

Now that you have Cloud Hot Folders up and running, it is time to look at the actual migration process. You will start by configuring local.properties, then go through some real-world examples of migrating Hot Folders to Cloud Hot Folders.

local.properties

To use Cloud Hot Folders for your project locally, you will need to add some additional configuration to local.properties, as described below. Adjust the values as needed. See Microsoft Azure Blob Storage Connection Properties for more information.

local.properties - Cloud Hot Folders
#-------------------------------------------------------------------------#
# Cloud Hot Folders properties - taken from azurecloudhotfolder/project.properties
#-------------------------------------------------------------------------#
azure.hotfolder.storage.account.connection-string=DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;TableEndpoint=http://127.0.0.1:10002/devstoreaccount1;
azure.hotfolder.storage.account.name=devstoreaccount1
# Name of the Blob Container in Azure
azure.hotfolder.storage.container.name=hybris
# Automatically create the Blob Container when Hybris starts up.
azure.hotfolder.storage.container.create=true
# Name of the hotfolder in the Blob Container, tenantId defaults to master in local development.
azure.hotfolder.storage.container.hotfolder=${tenantId}/hotfolder
# A pattern to govern remote file names that should be synchronized
azure.hotfolder.storage.container.match.pattern=^((?!ignore).)*$
# Required to process hot folder files.
cluster.node.groups=integration,yHotfolderCandidate
# Properties from cloudhotfolder/project.properties
# Specify what store folder, catalog, price config and default file names mapped onto hotfolder processes
cloud.hotfolder.default.mapping.root.dir=ABC
cloud.hotfolder.default.mapping.header.catalog=ABCMasterProductCatalog
cloud.hotfolder.default.mapping.header.net=false
cloud.hotfolder.default.mapping.file.name.pattern=^(customer|product|url_media)-\\d+.*
# Specify how files should be imported
cloud.hotfolder.storage.file.sort.name.prefix.priority=coredata,sampledata,product,url_media
cloud.hotfolder.storage.file.sort.name.sequence=^(?<filename>.*)-(?<sequence>\\d*)(?<extension>.*)$
# Default values for catalog, net, store.
cloud.hotfolder.default.zip.header.catalog=ABCMasterProductCatalog
cloud.hotfolder.default.zip.header.net=false
cloud.hotfolder.default.zip.mapping.product.catalog=ABCMasterProductCatalog
cloud.hotfolder.default.zip.mapping.content.catalogs=ABCContentCatalog
cloud.hotfolder.default.zip.mapping.store.names=ABC-ww
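These regular expressions are easy to get wrong, so it helps to check them against sample file names before deploying. The following Python sketch does exactly that; the file names are made up for illustration, and Java's (?&lt;name&gt;...) named-group syntax is rewritten as Python's (?P&lt;name&gt;...):

```python
import re

# azure.hotfolder.storage.container.match.pattern: only synchronize blobs
# whose names do not contain the word "ignore".
match_pattern = re.compile(r"^((?!ignore).)*$")
assert match_pattern.match("product-0001.csv")
assert not match_pattern.match("product-ignore.csv")

# cloud.hotfolder.default.mapping.file.name.pattern: file names handled by
# the default mapping.
name_pattern = re.compile(r"^(customer|product|url_media)-\d+.*")
assert name_pattern.match("url_media-0001.csv")
assert not name_pattern.match("blog-0001.csv")

# cloud.hotfolder.storage.file.sort.name.sequence: Java's (?<name>...) named
# groups become (?P<name>...) in Python; the matching behavior is the same here.
seq_pattern = re.compile(r"^(?P<filename>.*)-(?P<sequence>\d*)(?P<extension>.*)$")
m = seq_pattern.match("product-0002.csv")
print(m.group("filename"), m.group("sequence"), m.group("extension"))
```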


System Update

If you haven't done this already, then you need to run a System Update to update the type system definitions (*-items.xml) in the database for Cloud Hot Folders.

  • hac > Platform > Update
  • Select the following extensions
    • azurecloud
    • cloudcommons
    • cloudhotfolder
    • azurecloudhotfolder
  • Press Update

Examples

The following examples show the Hot Folder code and its corresponding Cloud Hot Folder code. The file content is abbreviated; in most cases only the namespace headers have been removed. The converterMapping and converter beans are the same for both and are not shown here.

Example: Has its own subfolder and the .csv file can be dropped into any folder

In this example, "ABCBlog" has its own subfolder that the .csv file can be dropped into; however, the setup does not restrict which folder the .csv file can be dropped into. In theory, the .csv file can be dropped into any hot folder.

ABCBlog - Hot Folder - OLD
<beans>
    <context:annotation-config/>
	<bean id="baseDirectoryABCBlog" class="java.lang.String">
		<constructor-arg value="#{baseDirectoryABC}/blog" />
	</bean>
	<file:inbound-channel-adapter id="batchFilesABCBlog" directory="#{baseDirectoryABCBlog}"  filter="filterDoneABCGeneric">
		<int:poller fixed-rate="1000" />
	</file:inbound-channel-adapter>
	<file:outbound-gateway request-channel="batchFilesABCBlog" reply-channel="batchFilesABCBlogProc"
		directory="#{baseDirectoryABCBlog}/processing" delete-source-files="true" />
	<int:service-activator input-channel="batchFilesABCBlogProc" output-channel="batchFilesHeaderInit" ref="ABCBlogHeaderSetupTask"
		method="execute" />
	<bean id="ABCBlogHeaderSetupTask" class="de.hybris.platform.acceleratorservices.dataimport.batch.task.HeaderSetupTask">
		<property name="catalog" value="ABCMasterProductCatalog" />
		<property name="net" value="false" />
		<property name="storeBaseDirectory" ref="baseDirectoryABCBlog" />
	</bean>
</beans>
ABCBlog - Cloud Hot Folder - NEW
You need to add the file name pattern to the property cloud.hotfolder.default.mapping.file.name.pattern in your local.properties file:
    cloud.hotfolder.default.mapping.file.name.pattern=^(blog)-\\d+.*
ABCBlog-spring.xml
<beans>
	<!-- Has its own subfolder & the .csv file can be dropped into any folder. -->
	<!-- Blog :: Azure Inbound File Synchronizer and Channel Adapter -->
	<bean id="ABCBlogAzureBlobInboundSynchronizer" class="de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobInboundSynchronizer">
		<constructor-arg name="sessionFactory" ref="azureBlobSessionFactory"/>
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/blog"/>
		<property name="moveToRemoteDirectory" value="#{azureHotfolderRemotePath}/blog/processing"/>
		<property name="deleteRemoteFiles" value="${azure.hotfolder.storage.delete.remote.files}"/>
		<property name="preserveTimestamp" value="true"/>
		<property name="filter" ref="azureHotfolderFileFilter"/>
		<property name="comparator" ref="azureHotFolderFileComparator"/>
	</bean>
	<bean id="ABCBlogAzureBlobSynchronizingMessageSource" class="de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobSynchronizingMessageSource">
		<constructor-arg name="synchronizer" ref="ABCBlogAzureBlobInboundSynchronizer"/>
		<property name="autoCreateLocalDirectory" value="true"/>
		<property name="localDirectory" value="#{azureHotfolderLocalDirectoryBase}/#{azureHotfolderRemotePath}"/>
		<property name="maxFetchSize" value="${azure.hotfolder.storage.polling.fetch.batch-size}"/>
	</bean>
	<int:inbound-channel-adapter id="ABCBlogAzureInboundChannelAdapter" 
		auto-startup="false" 
		role="${cloud.hotfolder.storage.services.role}" 
		phase="50"
		ref="ABCBlogAzureBlobSynchronizingMessageSource"
		channel="hotfolderInboundFileHeaderEnricherChannel">
		<int:poller fixed-rate="${azure.hotfolder.storage.polling.fixed.rate}"
			task-executor="azureChannelAdapterTaskExecutor"
			max-messages-per-poll="${azure.hotfolder.storage.polling.fetch.batch-size}">
			<int:transactional synchronization-factory="ABCBlogAzureSynchronizationFactory"
			transaction-manager="azurePsuedoTxManager"/>
		</int:poller>
	</int:inbound-channel-adapter>
    <int:transaction-synchronization-factory id="ABCBlogAzureSynchronizationFactory">
        <int:after-commit channel="ABCBlogAzureArchiveOutboundChannelAdapter"/>
        <int:after-rollback channel="ABCBlogAzureErrorOutboundChannelAdapter"/>
    </int:transaction-synchronization-factory>
	<!-- Blog :: Azure Outbound Channel adapters for moving remote files into archive/error folders -->
	<bean id="ABCBlogAzureArchiveMessageHandler" parent="abstractAzureMoveMessageHandler">
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/blog/archive"/>
	</bean>
	<bean id="ABCBlogAzureErrorMessageHandler" parent="abstractAzureMoveMessageHandler">
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/blog/error"/>
	</bean>
	<int:outbound-channel-adapter id="ABCBlogAzureArchiveOutboundChannelAdapter"
		ref="ABCBlogAzureArchiveMessageHandler">
	</int:outbound-channel-adapter>
	<int:outbound-channel-adapter id="ABCBlogAzureErrorOutboundChannelAdapter"
		ref="ABCBlogAzureErrorMessageHandler">
	</int:outbound-channel-adapter>
</beans>

Notes:

  • Define the ABCBlogAzureBlobInboundSynchronizer and ABCBlogAzureBlobSynchronizingMessageSource beans to set up the subfolder.

Example: Has its own subfolder and the .csv file must be dropped into the subfolder

In this example, "ABCReviews" has its own subfolder that the .csv file must be dropped into.

ABCReviews - Hot Folder - OLD
<beans>
	<!-- Set up Hot Folder for Reviews. -->
	<bean id="baseDirectoryABCReviews" class="java.lang.String">
		<constructor-arg value="#{baseDirectoryABC}/reviews" />
	</bean>
	<file:inbound-channel-adapter id="batchFilesABCReviews" directory="#{baseDirectoryABCReviews}" filter="filterDoneABCGeneric">
		<int:poller fixed-rate="1000" />
	</file:inbound-channel-adapter>
	<file:outbound-gateway request-channel="batchFilesABCReviews" reply-channel="batchFilesABCReviewsProc"
		directory="#{baseDirectoryABCReviews}/processing" delete-source-files="true" />
	<int:service-activator input-channel="batchFilesABCReviewsProc" output-channel="batchFilesHeaderInit" ref="ABCReviewsHeaderSetupTask"
		method="execute" />
	<bean id="ABCReviewsHeaderSetupTask" class="de.hybris.platform.acceleratorservices.dataimport.batch.task.HeaderSetupTask">
		<property name="catalog" value="ABCProductCatalogUS" />
		<property name="net" value="false" />
		<property name="storeBaseDirectory" ref="baseDirectoryABCReviews" />
	</bean>	
</beans>
ABCReviews - Cloud Hot Folder - NEW
<beans>
	<!-- Has its own subfolder & the .csv file must be dropped into the subfolder. -->
	<!-- reviews uses the catalog=ABCProductCatalogUS. -->
	<!-- Map files to channel based on name pattern: reviews. -->
	<bean class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
		<property name="targetObject" ref="hotfolderInboundFileChannelMappings"/>
		<property name="targetMethod" value="put"/>
		<property name="arguments">
			<list>
				<bean class="java.util.regex.Pattern" factory-method="compile">
					<constructor-arg value="^(reviews)-\d+.*" />
				</bean>
				<ref bean="ABCReviewsBatchFilesCloudHotFolderProc"/>
			</list>
		</property>
	</bean>
	<int:channel id="ABCReviewsBatchFilesCloudHotFolderProc"/>
	<int:service-activator input-channel="ABCReviewsBatchFilesCloudHotFolderProc" 
		output-channel="batchFilesHeaderInit"
		ref="ABCReviewsCloudHotFolderHeaderSetupTask" 
		method="execute">
	</int:service-activator>
	<bean id="ABCReviewsCloudHotFolderHeaderSetupTask" class="de.hybris.platform.acceleratorservices.dataimport.batch.task.HeaderSetupTask">
		<property name="catalog" value="ABCProductCatalogUS" />
		<property name="net" value="${cloud.hotfolder.default.mapping.header.net}" />
		<property name="storeBaseDirectory" ref="baseDirectoryCloudHotFolder" />
	</bean>
	<!-- Reviews :: Azure Inbound File Synchronizer and Channel Adapter -->
	<bean id="ABCReviewsAzureBlobInboundSynchronizer" class="de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobInboundSynchronizer">
		<constructor-arg name="sessionFactory" ref="azureBlobSessionFactory"/>
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/reviews"/>
		<property name="moveToRemoteDirectory" value="#{azureHotfolderRemotePath}/reviews/processing"/>
		<property name="deleteRemoteFiles" value="${azure.hotfolder.storage.delete.remote.files}"/>
		<property name="preserveTimestamp" value="true"/>
		<property name="filter" ref="azureHotfolderFileFilter"/>
		<property name="comparator" ref="azureHotFolderFileComparator"/>
	</bean>
	<bean id="ABCReviewsAzureBlobSynchronizingMessageSource" class="de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobSynchronizingMessageSource">
		<constructor-arg name="synchronizer" ref="ABCReviewsAzureBlobInboundSynchronizer"/>
		<property name="autoCreateLocalDirectory" value="true"/>
		<property name="localDirectory" value="#{azureHotfolderLocalDirectoryBase}/#{azureHotfolderRemotePath}"/>
		<property name="maxFetchSize" value="${azure.hotfolder.storage.polling.fetch.batch-size}"/>
	</bean>
	<int:inbound-channel-adapter id="ABCReviewsAzureInboundChannelAdapter" 
		auto-startup="false" 
		role="${cloud.hotfolder.storage.services.role}" 
		phase="50"
		ref="ABCReviewsAzureBlobSynchronizingMessageSource"
		channel="hotfolderInboundFileHeaderEnricherChannel">
		<int:poller fixed-rate="${azure.hotfolder.storage.polling.fixed.rate}"
			task-executor="azureChannelAdapterTaskExecutor"
			max-messages-per-poll="${azure.hotfolder.storage.polling.fetch.batch-size}">
			<int:transactional synchronization-factory="ABCReviewsAzureSynchronizationFactory"
			transaction-manager="azurePsuedoTxManager"/>
		</int:poller>
	</int:inbound-channel-adapter>
    <int:transaction-synchronization-factory id="ABCReviewsAzureSynchronizationFactory">
        <int:after-commit channel="ABCReviewsAzureArchiveOutboundChannelAdapter"/>
        <int:after-rollback channel="ABCReviewsAzureErrorOutboundChannelAdapter"/>
    </int:transaction-synchronization-factory>
	<!-- Reviews :: Azure Outbound Channel adapters for moving remote files into archive/error folders -->
	<bean id="ABCReviewsAzureArchiveMessageHandler" parent="abstractAzureMoveMessageHandler">
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/reviews/archive"/>
	</bean>
	<bean id="ABCReviewsAzureErrorMessageHandler" parent="abstractAzureMoveMessageHandler">
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/reviews/error"/>
	</bean>
	<int:outbound-channel-adapter id="ABCReviewsAzureArchiveOutboundChannelAdapter"
		ref="ABCReviewsAzureArchiveMessageHandler">
	</int:outbound-channel-adapter>
	<int:outbound-channel-adapter id="ABCReviewsAzureErrorOutboundChannelAdapter"
		ref="ABCReviewsAzureErrorMessageHandler">
	</int:outbound-channel-adapter>
</beans>

Notes:

  • The mapping onto the ABCReviewsBatchFilesCloudHotFolderProc channel sets up the file name pattern that this Cloud Hot Folder will accept.
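Conceptually, hotfolderInboundFileChannelMappings is a map from a compiled pattern to a message channel: an incoming file name is matched against each pattern and routed to the first matching channel, and unmatched files fall through to the default mapping. A simplified Python sketch of that routing logic (the channel names are illustrative, and the real implementation lives in the cloudhotfolder extension):

```python
import re

# Pattern -> channel name, mirroring the hotfolderInboundFileChannelMappings
# entries added via MethodInvokingFactoryBean. Channel names are illustrative.
channel_mappings = {
    re.compile(r"^(reviews)-\d+.*"): "ABCReviewsBatchFilesCloudHotFolderProc",
}
DEFAULT_CHANNEL = "batchFilesCloudHotFolderProc"  # stand-in for the default mapping

def route(file_name):
    """Return the channel whose pattern matches, or the default channel."""
    for pattern, channel in channel_mappings.items():
        if pattern.match(file_name):
            return channel
    return DEFAULT_CHANNEL

print(route("reviews-001.csv"))   # routed to the reviews channel
print(route("product-001.csv"))   # falls through to the default channel
```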

Example: File priority order (sorting)

In this example, "ABCCamera" has sorting specific to this Cloud Hot Folder instead of on a global level (see Synchronizing, Sorting, and Filtering for more information).

ABCCamera - Hot Folder - OLD
<beans>
	<!-- Set up Hot Folder for Camera. -->	
     <bean id="ABCCameraComparator" class="de.hybris.platform.acceleratorservices.dataimport.batch.FileOrderComparator">
		<property name="prefixPriority">
			<map>
				<!-- default priority is 0 -->
				<entry key="flash" value="5" />
				<entry key="tripod" value="4" />
				<entry key="lens" value="3" />
				<entry key="model" value="2" />
				<entry key="manufacturer" value="1" />
			</map>
		</property>
	</bean>
	<!-- Set up Hot Folder for Camera. -->
	<bean id="baseDirectoryABCCamera" class="java.lang.String">
		<constructor-arg value="#{baseDirectoryABC}/camera" />
	</bean>
	<file:inbound-channel-adapter id="batchFilesABCCamera" directory="#{baseDirectoryABCCamera}"
		filter="filterDoneABCGeneric" comparator="ABCCameraComparator" >
		<int:poller fixed-rate="1000" />
	</file:inbound-channel-adapter>
	<file:outbound-gateway request-channel="batchFilesABCCamera" reply-channel="batchFilesABCCameraProc"
		directory="#{baseDirectoryABCCamera}/processing" delete-source-files="true" />
	<int:service-activator input-channel="batchFilesABCCameraProc" output-channel="batchFilesHeaderInit" ref="ABCCameraHeaderSetupTask"
		method="execute" />
	<bean id="ABCCameraHeaderSetupTask" class="de.hybris.platform.acceleratorservices.dataimport.batch.task.HeaderSetupTask">
		<property name="catalog" value="ABCProductCatalogUS" />
		<property name="net" value="false" />
		<property name="storeBaseDirectory" ref="baseDirectoryABCCamera" />
	</bean>	
</beans>
ABCCamera - Cloud Hot Folder - NEW
<beans>
	<!-- Has its own subfolder & the .csv file must be dropped into the subfolder. -->
	<!-- camera uses the catalog=ABCProductCatalogUS. -->
	<!-- Map files to channel based on name pattern: camera = manufacturer, model, lens, tripod, flash. -->
	<bean class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
		<property name="targetObject" ref="hotfolderInboundFileChannelMappings"/>
		<property name="targetMethod" value="put"/>
		<property name="arguments">
			<list>
				<bean class="java.util.regex.Pattern" factory-method="compile">
					<constructor-arg value="^(manufacturer|model|lens|tripod|flash)-\d+.*" />
				</bean>
				<ref bean="ABCCameraBatchFilesCloudHotFolderProc"/>
			</list>
		</property>
	</bean>
	<int:channel id="ABCCameraBatchFilesCloudHotFolderProc"/>
	<int:service-activator input-channel="ABCCameraBatchFilesCloudHotFolderProc" 
		output-channel="batchFilesHeaderInit"
		ref="ABCCameraCloudHotFolderHeaderSetupTask" 
		method="execute">
	</int:service-activator>
	<bean id="ABCCameraCloudHotFolderHeaderSetupTask" class="de.hybris.platform.acceleratorservices.dataimport.batch.task.HeaderSetupTask">
		<property name="catalog" value="ABCProductCatalogUS" />
		<property name="net" value="${cloud.hotfolder.default.mapping.header.net}" />
		<property name="storeBaseDirectory" ref="baseDirectoryCloudHotFolder" />
	</bean>
	<!-- Camera :: Sorting to handle camera files in priority order. -->
    <bean id="ABCCameraCloudHotFolderNamePrefixComparator"
          class="de.hybris.platform.cloud.commons.spring.integration.file.comparators.NamePrefixComparator">
        <constructor-arg name="namePrefixPriority" value="manufacturer,model,lens,tripod,flash"/>
    </bean>
    <bean id="ABCCameraAzureHotFolderFileNamePrefixComparator" class="de.hybris.platform.cloud.azure.hotfolder.remote.file.comparators.AzureBlobNameComparatorAdapter">
        <constructor-arg name="comparator" ref="ABCCameraCloudHotFolderNamePrefixComparator"/>
    </bean>
    <util:list id="ABCCameraAzureHotFolderFileComparatorList">
        <ref bean="ABCCameraAzureHotFolderFileNamePrefixComparator"/>
        <ref bean="azureHotFolderFileNameComparator"/>
        <ref bean="azureHotFolderFileNameSequenceComparator"/>
        <ref bean="azureHotFolderFileModifiedComparator"/>
    </util:list>
    <bean id="ABCCameraAzureHotFolderFileComparator" class="org.apache.commons.collections4.comparators.ComparatorChain">
        <constructor-arg ref="ABCCameraAzureHotFolderFileComparatorList"/>
    </bean>
	<!-- Camera :: Azure Inbound File Synchronizer and Channel Adapter -->
	<bean id="ABCCameraAzureBlobInboundSynchronizer" class="de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobInboundSynchronizer">
		<constructor-arg name="sessionFactory" ref="azureBlobSessionFactory"/>
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/camera"/>
		<property name="moveToRemoteDirectory" value="#{azureHotfolderRemotePath}/camera/processing"/>
		<property name="deleteRemoteFiles" value="${azure.hotfolder.storage.delete.remote.files}"/>
		<property name="preserveTimestamp" value="true"/>
		<property name="filter" ref="azureHotfolderFileFilter"/>
		<property name="comparator" ref="ABCCameraAzureHotFolderFileComparator"/>
	</bean>
	<bean id="ABCCameraAzureBlobSynchronizingMessageSource" class="de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobSynchronizingMessageSource">
		<constructor-arg name="synchronizer" ref="ABCCameraAzureBlobInboundSynchronizer"/>
		<property name="autoCreateLocalDirectory" value="true"/>
		<property name="localDirectory" value="#{azureHotfolderLocalDirectoryBase}/#{azureHotfolderRemotePath}"/>
		<property name="maxFetchSize" value="${azure.hotfolder.storage.polling.fetch.batch-size}"/>
	</bean>
	<int:inbound-channel-adapter id="ABCCameraAzureInboundChannelAdapter" 
		auto-startup="false" 
		role="${cloud.hotfolder.storage.services.role}" 
		phase="50"
		ref="ABCCameraAzureBlobSynchronizingMessageSource"
		channel="hotfolderInboundFileHeaderEnricherChannel">
		<int:poller fixed-rate="${azure.hotfolder.storage.polling.fixed.rate}"
			task-executor="azureChannelAdapterTaskExecutor"
			max-messages-per-poll="${azure.hotfolder.storage.polling.fetch.batch-size}">
			<int:transactional synchronization-factory="ABCCameraAzureSynchronizationFactory"
			transaction-manager="azurePsuedoTxManager"/>
		</int:poller>
	</int:inbound-channel-adapter>
    <int:transaction-synchronization-factory id="ABCCameraAzureSynchronizationFactory">
        <int:after-commit channel="ABCCameraAzureArchiveOutboundChannelAdapter"/>
        <int:after-rollback channel="ABCCameraAzureErrorOutboundChannelAdapter"/>
    </int:transaction-synchronization-factory>
	<!-- Camera :: Azure Outbound Channel adapters for moving remote files into archive/error folders -->
	<bean id="ABCCameraAzureArchiveMessageHandler" parent="abstractAzureMoveMessageHandler">
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/camera/archive"/>
	</bean>
	<bean id="ABCCameraAzureErrorMessageHandler" parent="abstractAzureMoveMessageHandler">
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/camera/error"/>
	</bean>
	<int:outbound-channel-adapter id="ABCCameraAzureArchiveOutboundChannelAdapter"
		ref="ABCCameraAzureArchiveMessageHandler">
	</int:outbound-channel-adapter>
	<int:outbound-channel-adapter id="ABCCameraAzureErrorOutboundChannelAdapter"
		ref="ABCCameraAzureErrorMessageHandler">
	</int:outbound-channel-adapter>
</beans>


Notes:

  • The property cloud.hotfolder.storage.file.sort.name.prefix.priority controls sorting on a global level (for all Cloud Hot Folders).

  • Here you set up the comparator with namePrefixPriority to process the .csv files in a specific order.
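The effect of the comparator chain can be sketched in a few lines of Python. This assumes that prefixes appearing earlier in namePrefixPriority are processed first, and that ties fall back to the next comparator in the chain (plain name order here stands in for the sequence and modified-time comparators); check NamePrefixComparator in the cloudcommons extension for the exact semantics:

```python
PREFIX_PRIORITY = ["manufacturer", "model", "lens", "tripod", "flash"]

def sort_key(file_name):
    """Rank by the first matching prefix; unknown prefixes sort last.
    Ties fall back to plain name order, standing in for the rest of
    the comparator chain (sequence number, modified time)."""
    for rank, prefix in enumerate(PREFIX_PRIORITY):
        if file_name.startswith(prefix):
            return (rank, file_name)
    return (len(PREFIX_PRIORITY), file_name)

files = ["flash-01.csv", "model-01.csv", "manufacturer-01.csv",
         "tripod-01.csv", "lens-02.csv", "lens-01.csv"]
print(sorted(files, key=sort_key))
```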

Example: Setting the catalog in the HeaderSetupTask

In this example, "ABCReviews" has its own subfolder that the .csv file must be dropped into, and it also shows how to change the header catalog. The default catalog is ABCMasterProductCatalog, as defined in local.properties (cloud.hotfolder.default.mapping.header.catalog).

ABCReviews - Hot Folder - OLD
<beans>
	<!-- Set up Hot Folder for Reviews. -->
	<bean id="baseDirectoryABCReviews" class="java.lang.String">
		<constructor-arg value="#{baseDirectoryABC}/reviews" />
	</bean>
	<file:inbound-channel-adapter id="batchFilesABCReviews" directory="#{baseDirectoryABCReviews}" filter="filterDoneABCGeneric">
		<int:poller fixed-rate="1000" />
	</file:inbound-channel-adapter>
	<file:outbound-gateway request-channel="batchFilesABCReviews" reply-channel="batchFilesABCReviewsProc"
		directory="#{baseDirectoryABCReviews}/processing" delete-source-files="true" />
	<int:service-activator input-channel="batchFilesABCReviewsProc" output-channel="batchFilesHeaderInit" ref="ABCReviewsHeaderSetupTask"
		method="execute" />
	<bean id="ABCReviewsHeaderSetupTask" class="de.hybris.platform.acceleratorservices.dataimport.batch.task.HeaderSetupTask">
		<property name="catalog" value="ABCProductCatalogUS" />
		<property name="net" value="false" />
		<property name="storeBaseDirectory" ref="baseDirectoryABCReviews" />
	</bean>	
</beans>
ABCReviews - Cloud Hot Folder - NEW
<beans>
	<!-- Has its own subfolder & the .csv file must be dropped into the subfolder. -->
	<!-- reviews uses the catalog=ABCProductCatalogUS. -->
	<!-- Map files to channel based on name pattern: reviews. -->
	<bean class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
		<property name="targetObject" ref="hotfolderInboundFileChannelMappings"/>
		<property name="targetMethod" value="put"/>
		<property name="arguments">
			<list>
				<bean class="java.util.regex.Pattern" factory-method="compile">
					<constructor-arg value="^(reviews)-\d+.*" />
				</bean>
				<ref bean="ABCReviewsBatchFilesCloudHotFolderProc"/>
			</list>
		</property>
	</bean>
	<int:channel id="ABCReviewsBatchFilesCloudHotFolderProc"/>
	<int:service-activator input-channel="ABCReviewsBatchFilesCloudHotFolderProc" 
		output-channel="batchFilesHeaderInit"
		ref="ABCReviewsCloudHotFolderHeaderSetupTask" 
		method="execute">
	</int:service-activator>
	<bean id="ABCReviewsCloudHotFolderHeaderSetupTask" class="de.hybris.platform.acceleratorservices.dataimport.batch.task.HeaderSetupTask">
		<property name="catalog" value="ABCProductCatalogUS" />
		<property name="net" value="${cloud.hotfolder.default.mapping.header.net}" />
		<property name="storeBaseDirectory" ref="baseDirectoryCloudHotFolder" />
	</bean>
	<!-- Reviews :: Azure Inbound File Synchronizer and Channel Adapter -->
	<bean id="ABCReviewsAzureBlobInboundSynchronizer" class="de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobInboundSynchronizer">
		<constructor-arg name="sessionFactory" ref="azureBlobSessionFactory"/>
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/reviews"/>
		<property name="moveToRemoteDirectory" value="#{azureHotfolderRemotePath}/reviews/processing"/>
		<property name="deleteRemoteFiles" value="${azure.hotfolder.storage.delete.remote.files}"/>
		<property name="preserveTimestamp" value="true"/>
		<property name="filter" ref="azureHotfolderFileFilter"/>
		<property name="comparator" ref="azureHotFolderFileComparator"/>
	</bean>
	<bean id="ABCReviewsAzureBlobSynchronizingMessageSource" class="de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobSynchronizingMessageSource">
		<constructor-arg name="synchronizer" ref="ABCReviewsAzureBlobInboundSynchronizer"/>
		<property name="autoCreateLocalDirectory" value="true"/>
		<property name="localDirectory" value="#{azureHotfolderLocalDirectoryBase}/#{azureHotfolderRemotePath}"/>
		<property name="maxFetchSize" value="${azure.hotfolder.storage.polling.fetch.batch-size}"/>
	</bean>
	<int:inbound-channel-adapter id="ABCReviewsAzureInboundChannelAdapter" 
		auto-startup="false" 
		role="${cloud.hotfolder.storage.services.role}" 
		phase="50"
		ref="ABCReviewsAzureBlobSynchronizingMessageSource"
		channel="hotfolderInboundFileHeaderEnricherChannel">
		<int:poller fixed-rate="${azure.hotfolder.storage.polling.fixed.rate}"
			task-executor="azureChannelAdapterTaskExecutor"
			max-messages-per-poll="${azure.hotfolder.storage.polling.fetch.batch-size}">
			<int:transactional synchronization-factory="ABCReviewsAzureSynchronizationFactory"
			transaction-manager="azurePsuedoTxManager"/>
		</int:poller>
	</int:inbound-channel-adapter>
    <int:transaction-synchronization-factory id="ABCReviewsAzureSynchronizationFactory">
        <int:after-commit channel="ABCReviewsAzureArchiveOutboundChannelAdapter"/>
        <int:after-rollback channel="ABCReviewsAzureErrorOutboundChannelAdapter"/>
    </int:transaction-synchronization-factory>
	<!-- Reviews :: Azure Outbound Channel adapters for moving remote files into archive/error folders -->
	<bean id="ABCReviewsAzureArchiveMessageHandler" parent="abstractAzureMoveMessageHandler">
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/reviews/archive"/>
	</bean>
	<bean id="ABCReviewsAzureErrorMessageHandler" parent="abstractAzureMoveMessageHandler">
		<property name="remoteDirectory" value="#{azureHotfolderRemotePath}/reviews/error"/>
	</bean>
	<int:outbound-channel-adapter id="ABCReviewsAzureArchiveOutboundChannelAdapter"
		ref="ABCReviewsAzureArchiveMessageHandler">
	</int:outbound-channel-adapter>
	<int:outbound-channel-adapter id="ABCReviewsAzureErrorOutboundChannelAdapter"
		ref="ABCReviewsAzureErrorMessageHandler">
	</int:outbound-channel-adapter>
</beans>

Notes:

  • The default catalog to use when importing .csv files in Cloud Hot Folders is set by the property cloud.hotfolder.default.mapping.header.catalog in local.properties. 

    cloud.hotfolder.default.mapping.header.catalog=ABCMasterProductCatalog
  • In this example you override the default catalog in the ABCReviewsCloudHotFolderHeaderSetupTask bean, where the catalog property is set to ABCProductCatalogUS.

Error Handling

Impex Load Error

Out of the box, Cloud Hot Folders are supposed to move the .csv file to the error folder when there is a problem processing the file. However, when an ImpEx file has load errors, the CronJob still finishes successfully, and because the CronJob finished successfully, the .csv file gets copied to the archive folder instead.
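The behavior you usually want instead boils down to a routing decision on the import result: archive only a clean, finished import, and send everything else to the error folder. A plain Python sketch of that decision (FakeImportResult is a simplified stand-in for the hybris ImportResult, not the real API):

```python
from dataclasses import dataclass, field

@dataclass
class FakeImportResult:
    """Simplified stand-in for the hybris ImportResult."""
    finished: bool = True
    errors: list = field(default_factory=list)

def target_folder(result):
    """Archive only a clean, finished import; anything else goes to error."""
    if result.finished and not result.errors:
        return "archive"
    return "error"

assert target_folder(FakeImportResult()) == "archive"
# A CronJob that finished but had ImpEx load errors should still land in error:
assert target_folder(FakeImportResult(errors=["line 12: missing catalog"])) == "error"
```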

synchronizeToLocalDirectory() Exceptions

The out-of-the-box (OOTB) code uses a monitor service to log progress while processing the Cloud Hot Folder. This is not completely reliable and may throw a NullPointerException in the method synchronizeToLocalDirectory(). The .csv file does eventually get processed successfully; however, these exceptions can generate a lot of noise.

synchronizeToLocalDirectory() Exception
ERROR [task-scheduler-2] [ErrorHandler] unexpected exception caught
org.springframework.messaging.MessagingException: Problem occurred while synchronizing 'master/hotfolder/blog' to local directory; nested exception is org.springframework.messaging.MessagingException: 
Failed to execute on session; nested exception is java.lang.NullPointerException
	at de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobInboundSynchronizer.synchronizeToLocalDirectory(AzureBlobInboundSynchronizer.java:414) ~[classes/:?]
	at de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobSynchronizingMessageSource.synchronizeFiles(AzureBlobSynchronizingMessageSource.java:196) ~[classes/:?]
	at de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobSynchronizingMessageSource.poll(AzureBlobSynchronizingMessageSource.java:188) ~[classes/:?]
	at de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobSynchronizingMessageSource.doReceive(AzureBlobSynchronizingMessageSource.java:170) ~[classes/:?]
	at de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobSynchronizingMessageSource.doReceive(AzureBlobSynchronizingMessageSource.java:36) ~[classes/:?]
	at org.springframework.integration.endpoint.AbstractMessageSource.receive(AbstractMessageSource.java:134) ~[spring-integration-core-4.3.19.RELEASE.jar:4.3.19.RELEASE]
	at org.springframework.integration.endpoint.SourcePollingChannelAdapter.receiveMessage(SourcePollingChannelAdapter.java:224) ~[spring-integration-core-4.3.19.RELEASE.jar:4.3.19.RELEASE]

Fixes

Here are some fixes that you can apply to your own code base for the two problems above. With these changes, when there is an ImpEx import error the import fails, the .csv file is copied to the error folder instead of the archive folder, and an email is sent with the name of the failed file. This does not cover all situations, so use it at your own risk. The default monitor service has also been replaced, since it was throwing exceptions during synchronizeToLocalDirectory() too often.

ABC-spring.xml
<!-- Override monitorService.  -->
<!-- See cloudcommons/cloudcommons-monitoring.xml -->
<alias name="ABCMonitorService" alias="monitorService"/>
<bean id="ABCMonitorService" class="com.ABC.b2c.core.cloud.commons.services.monitor.ABCMonitorService">
	<property name="emailService" ref="emailService" />
	<property name="configurationService" ref="configurationService" />
</bean>
<!-- Override batchRunnerTask (AbstractImpexRunnerTask) for error handling. --> 
<!-- See cloudhotfolder/cloudhotfolder-spring.xml -->
<bean id="batchRunnerTask" class="com.ABC.b2c.core.dataimport.batch.task.ABCImpexRunnerTask">
	<property name="sessionService" ref="sessionService" />
	<property name="importService" ref="aopMonitoringImportService" />
	<lookup-method name="getImportConfig" bean="importConfig" /> 
</bean>
<!-- Override batchHeaderAspectBean for better error handling. -->
<!-- See cloudhotfolder/hot-folder-aop.xml -->
<bean id="batchHeaderAspectBean" class="com.ABC.b2c.core.cloud.hotfolder.aop.ABCBatchHeaderAspect" parent="abstractMonitoringAspect">
	<property name="fileNameHeaderKey" ref="fileNameHeaderKey"/>
</bean>
ABCBatchHeader.java
package com.ABC.b2c.core.dataimport.batch;
import de.hybris.platform.acceleratorservices.dataimport.batch.BatchHeader;
import de.hybris.platform.servicelayer.impex.ImportResult;
public class ABCBatchHeader extends BatchHeader {
	private ImportResult importResult;
	public ABCBatchHeader(BatchHeader batchHeader) {
		this.setCatalog(batchHeader.getCatalog());
		this.setEncoding(batchHeader.getEncoding());
		this.setFile(batchHeader.getFile());
		this.setLanguage(batchHeader.getLanguage());
		this.setNet(batchHeader.isNet());
		this.setSequenceId(batchHeader.getSequenceId());
		this.setStoreBaseDirectory(batchHeader.getStoreBaseDirectory());
		this.setTransformedFiles(batchHeader.getTransformedFiles());
	}
	public ImportResult getImportResut() {
		return importResult;
	}
	public void setImportResult(final ImportResult importResult) {
		this.importResult = importResult;
	}
}

Notes:

  • Added importResult to get the status (success/fail) from the impex import.


ABCBatchHeaderAspect.java
package com.ABC.b2c.core.cloud.hotfolder.aop;
import static org.slf4j.LoggerFactory.getLogger;
import de.hybris.platform.acceleratorservices.dataimport.batch.BatchHeader;
import de.hybris.platform.cloud.commons.aop.exception.StepException;
import de.hybris.platform.cloud.hotfolder.aop.BatchHeaderAspect;
import de.hybris.platform.cloud.commons.services.monitor.Status;
import de.hybris.platform.cloud.commons.services.monitor.Step;
import de.hybris.platform.cloud.hotfolder.dataimport.batch.zip.ZipBatchHeader;
import java.util.Date;
import org.slf4j.Logger;
import org.aspectj.lang.ProceedingJoinPoint;
import de.hybris.platform.servicelayer.impex.ImportResult;
import com.ABC.b2c.core.cloud.commons.services.monitor.ABCMonitorService;
import com.ABC.b2c.core.dataimport.batch.ABCBatchHeader;
public class ABCBatchHeaderAspect extends BatchHeaderAspect {
	private static final Logger LOG = getLogger(ABCBatchHeaderAspect.class);
	/**
	 * AOP Around implementation to monitor the execution of a transformed ImpEx associated with a Header
	 *
	 * @param pjp    Pointcut to be executed
	 * @param header the Header being executed
	 * @return the value from the execution
	 * @throws Throwable any exception thrown by the method called
	 */
	@Override
	public Object aroundExecute(final ProceedingJoinPoint pjp, final BatchHeader header) throws Throwable {
		if (LOG.isDebugEnabled()) {
			LOG.debug("about to run method [{}] on target [{}]", pjp.getSignature().getName(), pjp.getTarget());
		}
		final String fileName = getFileName(header);
		final Date started = new Date();
		Object proceed = null;
		try {
			proceed = pjp.proceed();
		}
		catch (final Exception e) {
			monitorFailedStep(Step.HEADER_EXECUTED, started, e, "Failed to process file [{}]", fileName);
			endMonitor(Status.FAILURE);
			throw new StepException(Step.HEADER_EXECUTED, e);
		}
		// Check the importResult from the ImpEx import, since ImpEx import errors fail quietly.
		if (proceed instanceof ABCBatchHeader) {
			final ABCBatchHeader ABCBatchHeader = (ABCBatchHeader) proceed;
			final ImportResult importResult = ABCBatchHeader.getImportResut();
			// Guard against null: processFile() can return null when no import config is found.
			if (importResult != null && importResult.isSuccessful()) {
				monitorSuccessfulStep(Step.HEADER_EXECUTED, started, "Successfully processed file [{}]. " +
						"SequenceId [{}]", fileName, header.getSequenceId());
				return proceed;
			} else if (importResult != null && importResult.isError()) {
				monitorFailedStep(Step.HEADER_EXECUTED, started, null, "Failed to process file [{}]", fileName);
				endMonitor(Status.FAILURE);
				throw new StepException(Step.HEADER_EXECUTED, null);
			}
		}
		}
		return proceed;
	}
	private String getFileName(final BatchHeader header) {
		return header instanceof ZipBatchHeader
				? ((ZipBatchHeader) header).getOriginalFileName()
				: header.getFile().getName();
	}
}

Notes:

  • Extended aroundExecute() to check importResult from impex import.


ABCImpexRunnerTask.java
package com.ABC.b2c.core.dataimport.batch.task;
import de.hybris.platform.acceleratorservices.dataimport.batch.BatchHeader;
import de.hybris.platform.acceleratorservices.dataimport.batch.HeaderTask;
import de.hybris.platform.acceleratorservices.dataimport.batch.task.AbstractImpexRunnerTask;
import de.hybris.platform.servicelayer.impex.ImpExResource;
import de.hybris.platform.servicelayer.impex.ImportConfig;
import de.hybris.platform.servicelayer.impex.ImportResult;
import de.hybris.platform.servicelayer.impex.ImportService;
import de.hybris.platform.servicelayer.impex.impl.StreamBasedImpExResource;
import de.hybris.platform.servicelayer.session.Session;
import de.hybris.platform.servicelayer.session.SessionService;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileNotFoundException;
import java.io.IOException;
import org.apache.commons.collections.CollectionUtils;
import org.apache.log4j.Logger;
import org.springframework.beans.factory.annotation.Required;
import org.springframework.util.Assert;
import com.ABC.b2c.core.dataimport.batch.ABCBatchHeader;
public class ABCImpexRunnerTask extends AbstractImpexRunnerTask {
	private static final Logger LOG = Logger.getLogger(ABCImpexRunnerTask.class);
	@Override
	public BatchHeader execute(final BatchHeader header) throws FileNotFoundException {
		Assert.notNull(header);
		Assert.notNull(header.getEncoding());
		final ABCBatchHeader ABCBatchHeader = new ABCBatchHeader(header);
		if (CollectionUtils.isNotEmpty(ABCBatchHeader.getTransformedFiles())) {
			final Session localSession = getSessionService().createNewSession();
			try {
				for (final File file : ABCBatchHeader.getTransformedFiles()) {
					final ImportResult importResult = processFile(file, ABCBatchHeader);
					ABCBatchHeader.setImportResult(importResult);
					if (importResult.isError()) {
						return ABCBatchHeader;
					}
				}
			}
			finally {
				getSessionService().closeSession(localSession);
			}
		}
		return ABCBatchHeader;
	}
	/**
	 * Process an ImpEx file using the encoding from the batch header
	 *
	 * @param file the transformed ImpEx file to import
	 * @param ABCBatchHeader the batch header carrying the file encoding
	 * @return the ImportResult of the import, or null if no import config was found
	 * @throws FileNotFoundException
	 */
	protected ImportResult processFile(final File file, final ABCBatchHeader ABCBatchHeader) throws FileNotFoundException {
		ImportResult importResult = null;
		try (final FileInputStream fis = new FileInputStream(file)) {
			final ImportConfig config = getImportConfig();
			if (config == null) {
				LOG.error(String.format("Import config not found. The file %s won't be imported.",
						file == null ? null : file.getName()));
				return null;
			}
			final ImpExResource resource = new StreamBasedImpExResource(fis, ABCBatchHeader.getEncoding());
			config.setScript(resource);
			importResult = getImportService().importData(config);
			if (importResult.isError() && importResult.hasUnresolvedLines())
			{
				LOG.error(importResult.getUnresolvedLines().getPreview());
			}
		}
		catch (final IOException | IllegalStateException e) {
			LOG.error("Error occurred while processing file: " + file, e);
		}
		return importResult; 
	}
	/**
	 * Lookup method is configured in spring config to return the import config.
	 * @see de.hybris.platform.acceleratorservices.dataimport.batch.task.AbstractImpexRunnerTask#getImportConfig()
	 */
	@Override
	public ImportConfig getImportConfig()
	{
		return null;
	}
}

Notes:

  • Extended execute() to fail fast when there is an impex import error.
  • New method processFile(final File file, final ABCBatchHeader ABCBatchHeader) to return ImportResult.


ABCMonitorService.java
package com.ABC.b2c.core.cloud.commons.services.monitor;
import de.hybris.platform.acceleratorservices.email.EmailService;
import de.hybris.platform.acceleratorservices.model.email.EmailAddressModel;
import de.hybris.platform.acceleratorservices.model.email.EmailMessageModel;
import de.hybris.platform.cloud.commons.services.monitor.MonitorHistory;
import de.hybris.platform.cloud.commons.services.monitor.MonitorService;
import de.hybris.platform.cloud.commons.services.monitor.Status;
import de.hybris.platform.cloud.commons.services.monitor.Step;
import de.hybris.platform.cloud.commons.services.monitor.SystemArea;
import de.hybris.platform.servicelayer.config.ConfigurationService;
import org.slf4j.helpers.MessageFormatter;
import org.springframework.beans.factory.annotation.Required;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Optional;
import org.apache.log4j.Logger;
public class ABCMonitorService implements MonitorService
{
	private static final Logger LOGGER = Logger.getLogger(ABCMonitorService.class);
	private static final String SEND_EMAIL_ON_ERROR = "ABC.hotFolder.sendMailOnError";
	private static final String EMAIL_FROM = "ABC.hotFolder.emailFromAddress";
	private static final String EMAIL_FROM_DEFAULT = "hotfolder.import@ABC.com";
	private static final String EMAIL_FROM_DISPLAY_NAME = "HotFolder";
	private static final String EMAIL_TO = "ABC.hotFolder.emailToAddress";
	private static final String EMAIL_TO_DISPLAY_NAME = "ABC";
	private static final String EMAIL_SUBJECT = "HotFolder Import Error";
	private static final String ERROR_FILE_PREFIX = "error_";
	private ConfigurationService configurationService;
	private EmailService emailService;
	private MonitorHistory monitorHistory;
	/**
	 * @see de.hybris.platform.cloud.commons.services.monitor.MonitorService#begin(de.hybris.platform.cloud.commons.services.monitor.SystemArea, java.lang.String)
	 */
	@Override
	public MonitorHistory begin(SystemArea area, String key) {
		return lazyMonitorHistory();
	}
	/**
	 * @see de.hybris.platform.cloud.commons.services.monitor.MonitorService#current()
	 */
	@Override
	public Optional<MonitorHistory> current() {
		return Optional.ofNullable(lazyMonitorHistory());
	}
	/**
	 * @see de.hybris.platform.cloud.commons.services.monitor.MonitorService#resume(de.hybris.platform.cloud.commons.services.monitor.SystemArea, java.lang.String)
	 */
	@Override
	public MonitorHistory resume(SystemArea area, String key) {
		return lazyMonitorHistory();
	}
	public boolean getSendEmailOnError() {
		return getConfigurationService().getConfiguration().getBoolean(SEND_EMAIL_ON_ERROR, false);
	}
	public String getEmailFrom() {
		return getConfigurationService().getConfiguration().getString(EMAIL_FROM, EMAIL_FROM_DEFAULT);
	}
	public EmailAddressModel getFromEmailAddress() {
		return getEmailService().getOrCreateEmailAddressForEmail(getEmailFrom(), EMAIL_FROM_DISPLAY_NAME);
	}
	public String getEmailTo() {
		return getConfigurationService().getConfiguration().getString(EMAIL_TO);
	}
	public EmailAddressModel getToEmailAddress() {
		return getEmailService().getOrCreateEmailAddressForEmail(getEmailTo(), EMAIL_TO_DISPLAY_NAME);
	}
	public List<EmailAddressModel> getToEmailAddressList() {
		List<EmailAddressModel> emailToAddresses = new ArrayList<>();
		emailToAddresses.add(getToEmailAddress());
		return emailToAddresses;
	}
	public ConfigurationService getConfigurationService() {
		return configurationService;
	}
	@Required
	public void setConfigurationService(ConfigurationService configurationService) {
		this.configurationService = configurationService;
	}
	public EmailService getEmailService() {
		return emailService;
	}
	@Required
	public void setEmailService(EmailService emailService)
	{
		this.emailService = emailService;
	}
	public MonitorHistory getMonitorHistory() {
		return monitorHistory;
	}
	public MonitorHistory lazyMonitorHistory() {
		if (getMonitorHistory() == null) {
			setMonitorHistory(new ABCMonitorHistory());
		}
		return getMonitorHistory();
	}
	public void setMonitorHistory(final MonitorHistory monitorHistory) {
		this.monitorHistory = monitorHistory;
	}
	class ABCMonitorHistory implements MonitorHistory {
		/**
		 * @see de.hybris.platform.cloud.commons.services.monitor.MonitorHistory#addAction(java.lang.String, de.hybris.platform.cloud.commons.services.monitor.Status, java.util.Date, java.util.Date, java.lang.String, java.lang.Object[])
		 */
		@Override
		public MonitorHistory addAction(String code, Status status, Date start, Date end, String message, Object... args) {
			LOGGER.info("addAction() code=" + code + 
					" status=" + status + 
					" start=" + formatDate(start) + 
					" end=" + formatDate(end) + 
					" message=" + formatMessage(message, args));
			return this;
		}
		/**
		 * @see de.hybris.platform.cloud.commons.services.monitor.MonitorHistory#stepSucceeded(de.hybris.platform.cloud.commons.services.monitor.Step, java.util.Date, java.util.Date, java.lang.String, java.lang.Object[])
		 */
		@Override
		public MonitorHistory stepSucceeded(Step step, Date start, Date end, String message, Object... args) {
			LOGGER.info("stepSucceeded() step=" + step + 
					" start=" + formatDate(start) + 
					" end=" + formatDate(end) + 
					" message=" + formatMessage(message, args));
			return this;
		}
		/**
		 * @see de.hybris.platform.cloud.commons.services.monitor.MonitorHistory#stepFailed(de.hybris.platform.cloud.commons.services.monitor.Step, java.util.Date, java.util.Date, java.lang.Throwable, java.lang.String, java.lang.Object[])
		 */
		@Override
		public MonitorHistory stepFailed(Step step, Date start, Date end, Throwable throwable, String message, Object... args) {
			LOGGER.info("stepFailed() step=" + step + 
					" start=" + formatDate(start) + 
					" end=" + formatDate(end) + 
					" message=" + formatMessage(message, args));
			LOGGER.error(throwable);
			stepFailedEmail(formatMessage(message, args));
			return this;
		}
		protected void stepFailedEmail(final String message) {
			if (!getSendEmailOnError()) {
				return;
			}
			final EmailMessageModel emailMessage = emailService.createEmailMessage(getToEmailAddressList(), 
					null, 
					null, 
					getFromEmailAddress(), 
					null, 
					EMAIL_SUBJECT,
					message, 
					null);
			emailService.send(emailMessage);
		}
		/**
		 * @see de.hybris.platform.cloud.commons.services.monitor.MonitorHistory#checkpoint()
		 */
		@Override
		public void checkpoint() {
			LOGGER.info("checkpoint()");
		}
		/**
		 * @see de.hybris.platform.cloud.commons.services.monitor.MonitorHistory#end(de.hybris.platform.cloud.commons.services.monitor.Status)
		 */
		@Override
		public void end(Status status) {
			LOGGER.info("end() status=" + status);
		}
		protected String formatDate(final Date date) {
			if (date == null) { 
				return "";
			}
			final LocalDateTime localDateTime = LocalDateTime.ofInstant(date.toInstant(), ZoneId.systemDefault());
			return localDateTime.toString();
		}
		protected String formatMessage(final String message, final Object[] params) {
			return MessageFormatter.arrayFormat(message, params).getMessage();
		}
	}
}

Notes:

  • ABCMonitorService simply writes directly to the logger and, when enabled, sends an email on step failure.
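The email notification in ABCMonitorService is driven by the three configuration keys read in the class above (ABC.hotFolder.sendMailOnError defaults to false, and ABC.hotFolder.emailToAddress has no default). A minimal local.properties sketch to switch the notifications on; the recipient address below is a placeholder you must replace:

```properties
# Enable failure-notification emails from ABCMonitorService (defaults to false)
ABC.hotFolder.sendMailOnError=true
# Optional: defaults to hotfolder.import@ABC.com
ABC.hotFolder.emailFromAddress=hotfolder.import@ABC.com
# Required when notifications are enabled; placeholder address, replace with your own
ABC.hotFolder.emailToAddress=alerts@example.com
```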

Azure Console

You can monitor Cloud Hot Folder progress in the Azure console, which shows the status of the .csv file as it is processed. The happy path goes through the following steps:

  1. upload file to hybris/master/hotfolder/blog
  2. copy file to hybris/master/hotfolder/blog/processing
  3. delete file from hybris/master/hotfolder/blog
  4. copy file to hybris/master/hotfolder/blog/archive
  5. delete file from hybris/master/hotfolder/blog/processing


Azure Console - Happy Path
Upload file to blob:hybris/master/hotfolder/blog
172.17.0.1 - - [21/Oct/2019:21:22:04 +0000] "PUT /devstoreaccount1/hybris/master/hotfolder/blog/blog-01.csv HTTP/1.1" 201 -
172.17.0.1 - - [21/Oct/2019:21:22:05 +0000] "HEAD /devstoreaccount1/hybris/master/hotfolder/blog/blog-01.csv HTTP/1.1" 200 64
172.17.0.1 - - [21/Oct/2019:21:22:05 +0000] "GET /devstoreaccount1/hybris/master/hotfolder/blog/blog-01.csv HTTP/1.1" 200 64
172.17.0.1 - - [21/Oct/2019:21:22:06 +0000] "PUT /devstoreaccount1/hybris/master/hotfolder/blog/processing/blog-01.csv HTTP/1.1" 202 -
172.17.0.1 - - [21/Oct/2019:21:22:06 +0000] "HEAD /devstoreaccount1/hybris/master/hotfolder/blog/blog-01.csv HTTP/1.1" 200 64
172.17.0.1 - - [21/Oct/2019:21:22:06 +0000] "DELETE /devstoreaccount1/hybris/master/hotfolder/blog/blog-01.csv HTTP/1.1" 202 -
172.17.0.1 - - [21/Oct/2019:21:22:06 +0000] "HEAD /devstoreaccount1/hybris/master/hotfolder/blog/archive/ HTTP/1.1" 404 -
172.17.0.1 - - [21/Oct/2019:21:22:06 +0000] "PUT /devstoreaccount1/hybris/master/hotfolder/blog/archive/blog-01.csv.2019-10-21T21-22-06.303Z HTTP/1.1" 202 -
172.17.0.1 - - [21/Oct/2019:21:22:06 +0000] "HEAD /devstoreaccount1/hybris/master/hotfolder/blog/processing/blog-01.csv HTTP/1.1" 200 64
172.17.0.1 - - [21/Oct/2019:21:22:06 +0000] "DELETE /devstoreaccount1/hybris/master/hotfolder/blog/processing/blog-01.csv HTTP/1.1" 202 -
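Note the archive step in the log above: the blob is archived under its original name plus a UTC timestamp suffix (blog-01.csv.2019-10-21T21-22-06.303Z). A small sketch of that naming scheme, assuming the format shown in the log (the time-of-day parts are separated by hyphens rather than colons):

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class ArchiveName {
    // Mirrors the suffix format seen in the Azurite log,
    // e.g. blog-01.csv.2019-10-21T21-22-06.303Z
    private static final DateTimeFormatter SUFFIX = DateTimeFormatter
            .ofPattern("yyyy-MM-dd'T'HH-mm-ss.SSS'Z'")
            .withZone(ZoneOffset.UTC);

    public static String archiveName(final String fileName, final Instant archivedAt) {
        return fileName + "." + SUFFIX.format(archivedAt);
    }

    public static void main(final String[] args) {
        System.out.println(archiveName("blog-01.csv", Instant.parse("2019-10-21T21:22:06.303Z")));
        // prints blog-01.csv.2019-10-21T21-22-06.303Z
    }
}
```

Because the suffix is appended on every run, repeated uploads of the same file name produce distinct archive blobs instead of overwriting earlier ones.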

Unmapped

If the file does not match the mapping pattern (cloud.hotfolder.default.mapping.file.name.pattern) or one of the converters, the message "was not routed as didn't match any configurations" is logged:

Message [File [stock-03.csv] modified [1571352719000] was not routed as didn't match any configurations] 

Fix this by defining the file name pattern in the property cloud.hotfolder.default.mapping.file.name.pattern. Also check your convertMapping bean(s).
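A quick way to verify whether a given file name would be routed is to test it against the configured regular expression. The pattern below is purely illustrative, not the OOTB default; substitute your project's actual cloud.hotfolder.default.mapping.file.name.pattern value:

```java
import java.util.regex.Pattern;

public class HotFolderPatternCheck {
    // Illustrative pattern only -- compare against your project's actual
    // cloud.hotfolder.default.mapping.file.name.pattern property value.
    private static final Pattern FILE_NAME_PATTERN =
            Pattern.compile("^(customer|product|stock)-\\d+.*\\.csv");

    public static boolean matches(final String fileName) {
        return FILE_NAME_PATTERN.matcher(fileName).matches();
    }

    public static void main(final String[] args) {
        System.out.println(matches("stock-03.csv"));   // true: "stock" is in the pattern
        System.out.println(matches("reviews-01.csv")); // false: would not be routed
    }
}
```

If a file name prints false here, it would trigger the "was not routed" message above until you extend the pattern (or add a matching converter mapping).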

Console

You can monitor Cloud Hot Folder progress in your console/terminal. There you will see the .csv file being processed through the following steps:

  1. DOWNLOADED
  2. HEADER_SETUP
  3. HEADER_INIT
  4. HEADER_TRANSFORMED
  5. HEADER_EXECUTED
  6. HEADER_CLEANUP

Conclusion

This article walked you through the concrete steps of migrating from Hot Folders to Cloud Hot Folders, making your migration process faster and easier.

The articles below will help you dive deeper on this topic and provide additional information:

  • Cloud Hot Folders - SAP Help Portal > Cloud Hot Folders
  • Cloud Hot Folder Extension - SAP Help Portal > Cloud Hot Folder Extension
  • Get the Most out of Your Cloud Hot Folders and Azure Blob Storage - CX Works article that gives more background on Cloud Hot Folders. In it, you will learn how to migrate your connectivity from Hot Folders to Cloud Hot Folders and the different ways to connect and push files/blobs to Cloud Hot Folders. It also explains how to emulate Azure Storage locally, which can be very useful for developers, and how to upload product media/images to SAP Commerce Cloud using Cloud Hot Folders.

  • Mastering Cloud Hot Folders - This webinar will walk you through the concept of Cloud Hot Folders, show you how to enable and configure them, and provide an overview of their file processing channels, custom-mapping features, and monitoring capabilities. 

    • There is a PDF version of the webinar available for download as well, highly recommended for the technical details of Cloud Hot Folders.
  • Data Importing - SAP Help Portal > Commerce B2C Accelerator - for information on Hot Folder Data Importing
  • Your Guide to Developing SAP Commerce Cloud Applications Locally - SAP Commerce Cloud is built to allow for customizations, but how can you make these changes without being connected to the cloud? In this video you will discover tips and tricks for translating local development of your application to successfully deploy with SAP Commerce Cloud.