SAP Hybris Tutorial – Cronjob | Task Schedule | Export Order Data to CSV | Unit 8


Hi everyone. In this post we move away from CMS and cover a different set of concepts:

  • What is a cronjob?
  • Why do we need to use cronjob?
  • How to create a cronjob?

Let’s go.

1) What is a cronjob?

SAP Commerce provides a means to set up regular tasks. With these tasks or cron jobs, you can repeatedly perform complex business logic at particular times and intervals.

2) Why do we need to use cronjob?

You may want to perform an inventory every Sunday at midnight, for example, or notify the web administrator of the peak loads on the servers every hour. You can achieve this through a combination of dedicated classes for the business logic, and the embedded cron job management functionality of SAP Commerce.

3) How to create a cronjob?

Let's do an exercise by following the steps below:

Exercise: Export order data into a CSV file, store it as a media item for download, and run the export automatically once a day.

Step 1: Create a cronjob model to carry the input/output data for the task. If we don't need any input data, we can skip this step and use the out-of-the-box "CronJobModel". In our case, the input is the date for which to export data, and the output is the CSV media file to download.

  • Implementation: Create an item type that extends the "CronJob" item in the hybrislearningcore-items.xml file.
           <itemtype code="HLOrderExportCronJob"
                      extends="CronJob"
                      jaloclass="de.hybris.learning.core.jalo.HLOrderExportCronJob"
                      generate="true" autocreate="true" >
                <attributes>
                    <attribute qualifier="dateExport" type="java.util.Date">
                        <persistence type="property" />
                        <modifiers write="true" read="true" search="true"/>
                    </attribute>
                    <attribute qualifier="exportMedia" type="MediaCollection">
                        <modifiers read="true" write="true" search="true" />
                        <persistence type="property"/>
                    </attribute>
                </attributes>
            </itemtype>

Step 2: Run ant all to generate the cronjob model class.

Step 3: Create DAO and Service to get Order data by date.

We need to create a DAO to query the order data and return the result to the service layer.

There are 2 interfaces: HLAccessDataDAO.java and HLAccessDataService.java.

There are 2 java classes that implement the 2 interfaces above: HLAccessDataDAOImpl.java and HLAccessDataServiceImpl.java.

Below is the code implemented in the HLAccessDataDAOImpl class:

import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Collections;
import java.util.Date;
import java.util.List;

import org.apache.commons.lang.time.DateUtils;
import org.springframework.beans.factory.annotation.Autowired;

import de.hybris.platform.core.model.order.OrderModel;
import de.hybris.platform.servicelayer.search.FlexibleSearchQuery;
import de.hybris.platform.servicelayer.search.FlexibleSearchService;
import de.hybris.platform.servicelayer.search.SearchResult;

public class HLAccessDataDAOImpl implements HLAccessDataDAO {

    public static final String TIME_FORMAT = "yyyy-MM-dd HH:mm:ss";

    @Autowired
    private FlexibleSearchService flexibleSearchService;

    @Override
    public List<OrderModel> getOrderDataByDate(Date date) {
        final String query = "SELECT {pk} FROM {Order} WHERE {modifiedtime} BETWEEN ?startTimeOfDay AND ?endTimeOfDay";
        SimpleDateFormat simpleDateFormat = new SimpleDateFormat(TIME_FORMAT);

        final FlexibleSearchQuery flexibleSearchQuery = new FlexibleSearchQuery(query);
        flexibleSearchQuery.addQueryParameter("startTimeOfDay", simpleDateFormat.format(atStartOfDay(date)));
        flexibleSearchQuery.addQueryParameter("endTimeOfDay", simpleDateFormat.format(atEndOfDay(date)));

        final SearchResult<OrderModel> result = flexibleSearchService.search(flexibleSearchQuery);
        if (result != null && result.getResult() != null) {
            return result.getResult();
        }

        return Collections.emptyList();
    }

    private static Date atEndOfDay(Date date) {
        return DateUtils.addMilliseconds(DateUtils.ceiling(date, Calendar.DATE), -1);
    }

    private static Date atStartOfDay(Date date) {
        return DateUtils.truncate(date, Calendar.DATE);
    }
}
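For reference, the same day-boundary logic can be written with java.time alone, without the commons-lang DateUtils helpers used above (a minimal sketch, assuming the DAO signature still requires legacy java.util.Date):

```java
import java.time.LocalDate;
import java.time.ZoneId;
import java.util.Date;

public class DayBounds {

    // First instant of the given day (00:00:00.000) in the default time zone
    public static Date atStartOfDay(Date date) {
        LocalDate day = date.toInstant().atZone(ZoneId.systemDefault()).toLocalDate();
        return Date.from(day.atStartOfDay(ZoneId.systemDefault()).toInstant());
    }

    // Last instant of the given day, i.e. one millisecond before the next day starts
    public static Date atEndOfDay(Date date) {
        LocalDate day = date.toInstant().atZone(ZoneId.systemDefault()).toLocalDate();
        long nextDayStart = Date.from(day.plusDays(1).atStartOfDay(ZoneId.systemDefault()).toInstant()).getTime();
        return new Date(nextDayStart - 1);
    }
}
```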

Below is the code implemented in the HLAccessDataServiceImpl class:

public class HLAccessDataServiceImpl implements HLAccessDataService {

    @Resource(name = "hLAccessDataDAO")
    private HLAccessDataDAO hLAccessDataDAO;

    @Override
    public List<OrderModel> getOrderDataByDate(Date date) {
        return hLAccessDataDAO.getOrderDataByDate(date);
    }
}

Step 4: Register these 2 implementing classes in the Spring bean context.

In the hybrislearningcore-spring.xml file, add the following XML:

<alias name="defaultHLAccessDataDAO" alias="hLAccessDataDAO" />
<bean id="defaultHLAccessDataDAO" class="de.hybris.learning.core.dao.impl.HLAccessDataDAOImpl" />

<bean id="hLAccessDataService" class="de.hybris.learning.core.services.impl.HLAccessDataServiceImpl" />

Step 5: Create an HLOrderExportJob class that extends abstractJobPerformable. Its perform method gets the order list, writes it into a CSV file, and saves the file to the cronjob's media.

@Override
public PerformResult perform(HLOrderExportCronJobModel hlOrderExportModel) {
    // 1. Validate the input: default to the current day if no date was supplied
    if (null == hlOrderExportModel.getDateExport()) {
        LOGGER.info("Missing field data: DateExport => processing data for the current day");
        hlOrderExportModel.setDateExport(new Date());
    }
    try {
        // 2. Query the order data for the requested date
        List<OrderModel> orders = hLAccessDataService.getOrderDataByDate(hlOrderExportModel.getDateExport());

        // 3. Write the order data into a CSV file
        String csvFileName = "OrderExport_" + new SimpleDateFormat(DATE_FORMAT).format(hlOrderExportModel.getDateExport());
        File file = generateCSV(csvFileName, orders);

        // 4. Attach the file to the cronjob's media collection
        saveData(file, csvFileName, hlOrderExportModel);

    } catch (Exception ex) {
        LOGGER.error("Error when processing export order data", ex);
        return new PerformResult(CronJobResult.ERROR, CronJobStatus.ABORTED);
    }

    return new PerformResult(CronJobResult.SUCCESS, CronJobStatus.FINISHED);
}
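The generateCSV helper is not shown in this post; a minimal stdlib-only sketch could look like the following (the column choice and the OrderRow stand-in for OrderModel are assumptions for illustration only):

```java
import java.io.File;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.List;

public class CsvExport {

    // Hypothetical stand-in for the fields we would read from OrderModel
    public record OrderRow(String code, String date, String totalPrice) {}

    // Writes a header plus one line per order into a temporary CSV file and returns it
    public static File generateCSV(String fileName, List<OrderRow> orders) throws IOException {
        File file = File.createTempFile(fileName, ".csv");
        try (PrintWriter writer = new PrintWriter(file, "UTF-8")) {
            writer.println("code,date,totalPrice");
            for (OrderRow order : orders) {
                writer.println(order.code() + "," + order.date() + "," + order.totalPrice());
            }
        }
        return file;
    }
}
```

In the real job, the returned File would then be turned into a MediaModel and attached to the cronjob's exportMedia collection.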

Step 6: Register the job in the Spring bean context, with abstractJobPerformable as its parent:

<bean id="hLOrderExportJob" class="de.hybris.learning.core.job.HLOrderExportJob" parent="abstractJobPerformable"/>

Step 7: Run ant all, then start the server with hybrisserver.bat.

Step 8: After the server has started, run a system update in the HAC, then verify that the job exists.

Verify that the job's springId is registered as a ServicelayerJob with the FlexibleSearch query below:

SELECT * FROM {servicelayerjob} WHERE {code}='hLOrderExportJob'

If you don't see any results, execute the following ImpEx:

INSERT_UPDATE ServicelayerJob;code[unique=true];springId;
;hLOrderExportJob;hLOrderExportJob;

Step 9: Create an instance of our cronjob and a trigger for it, either via ImpEx or in the Backoffice.

In the Backoffice, we can go to the cronjobs section and create a new cronjob from that console.

Via ImpEx, we execute the following:

# Define the cron job and the job that it wraps
INSERT_UPDATE HLOrderExportCronJob; code[unique=true];job(code);singleExecutable;sessionLanguage(isocode)
;hLOrderExportCronJob;hLOrderExportJob;false;en

# Define the trigger that periodically invokes the cron job
INSERT_UPDATE Trigger;cronjob(code)[unique=true];cronExpression
;hLOrderExportCronJob;0 59 23 ? * * *

With this cron expression, the cronjob runs every day at 23:59.
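To make the schedule concrete: in the Quartz expression 0 59 23 ? * * *, the fields are second, minute, hour, day-of-month, month, day-of-week, and year, so the trigger fires at second 0 of minute 59 of hour 23, every day. The next-run computation can be sketched in plain java.time (an illustration of the schedule, not the Quartz implementation):

```java
import java.time.LocalDateTime;
import java.time.LocalTime;

public class NextRun {

    // Next time a daily 23:59:00 trigger would fire, strictly after the given moment
    public static LocalDateTime nextDailyRun(LocalDateTime now) {
        LocalDateTime todayRun = now.toLocalDate().atTime(LocalTime.of(23, 59));
        return now.isBefore(todayRun) ? todayRun : todayRun.plusDays(1);
    }
}
```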

Step 10: Test the cronjob both with and without an input date, then download the media file from the cronjob.

Log in to the Backoffice, go to the cronjobs section -> find hLOrderExportCronJob -> input the date we want to export data for -> start the cronjob -> download and view the result.

Congratulations, good job! Thank you for your attention. If this post is useful, please share it and give it a heart.

That’s all for today. In the next post, we will learn about the important parts of the items.xml file. Hope to see you in the next article.

Happy coding! <3





