
Efficient Large File Processing with Apache Camel (Part 2)

Explore efficient large file processing with Apache Camel in this insightful blog by Ajanthan Eliyathamby, integration specialist at Yenlo. Learn how to optimize your workflows and streamline your file handling processes.

Ajanthan Eliyathamby Integration Expert Yenlo
Ajanthan Eliyathamby
Integration Expert

A Guide Using Java / XML-DSL and Spring Boot

In the previous blog in this series, Efficient Large File Processing with Apache Camel, we focused on the XML-DSL. In this part we will implement the same integration with the Java DSL and compare the performance of the two.

Java-DSL Implementation

As we did for the XML-DSL implementation, we will first look at how the project is structured for reuse. The image below shows how the sample implementation is organized.

The template project also includes a unit test, built with "camel-test-spring-junit5".

Explanation on the camel-file-templates project

As mentioned in the previous blog, our approach involves first extracting the information from the CSV file, then performing data processing to generate a JSON output containing specified fields. Additionally, we aim to group a defined number of lines from the file simultaneously. This aggregation process is facilitated by the Camel Configuration specified in the application.yml file under the property “noOfLinesToReadAtOnce.” To carry out this aggregation, we will utilize the following class.

package com.integration.camel.file.common.route.aggregate;

import org.apache.camel.AggregationStrategy;
import org.apache.camel.Exchange;
import org.springframework.stereotype.Component;

@Component
public class JsonAggregationStrategy implements AggregationStrategy {
    public Exchange aggregate(Exchange oldExchange, Exchange newExchange) {
        if (oldExchange == null) {
            return newExchange;
        }
        String oldBody = oldExchange.getIn().getBody(String.class);
        String newBody = newExchange.getIn().getBody(String.class);
        String body = null;
        if (!oldBody.startsWith("[")) {
            body = "[ " + oldBody + ", " + newBody + " ]";
        } else {
            body = oldBody.replace("]", "") + ", " + newBody + " ]";
        }
        oldExchange.getIn().setBody(body);
        return oldExchange;
    }
}
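To see what this strategy produces, here is a minimal standalone sketch of the same string handling, stripped of the Camel classes (the class and method names here are ours, for illustration only):

```java
public class JsonAggregationSketch {

    // Mirrors JsonAggregationStrategy's body handling: the first two bodies are
    // wrapped in "[ ... ]", and each later body is appended before the closing bracket.
    static String aggregate(String oldBody, String newBody) {
        if (oldBody == null) {
            return newBody; // first exchange: nothing to aggregate yet
        }
        if (!oldBody.startsWith("[")) {
            return "[ " + oldBody + ", " + newBody + " ]";
        }
        return oldBody.replace("]", "") + ", " + newBody + " ]";
    }

    public static void main(String[] args) {
        String acc = aggregate(null, "{\"id\": 1}");
        acc = aggregate(acc, "{\"id\": 2}");
        acc = aggregate(acc, "{\"id\": 3}");
        System.out.println(acc); // a JSON array containing the three objects
    }
}
```

Note that `replace("]", "")` removes every `]` in the accumulated body; this only works because the individual bodies are JSON objects that contain no square brackets themselves.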

Under utils, we have TimeGap.java, which is used to calculate the processing time so we can compare performance.

package com.integration.camel.file.common.route.utils;

import java.util.Date;
import java.util.concurrent.TimeUnit;
import org.springframework.stereotype.Component;

@Component
public class TimeGap {
    public String calculateTimeDifference(Date startTime, Date endTime) {
        long diffInMillis = endTime.getTime() - startTime.getTime();
        // break the difference down into whole hours, remaining minutes, seconds and millis
        long hours = TimeUnit.MILLISECONDS.toHours(diffInMillis);
        long minutes = TimeUnit.MILLISECONDS.toMinutes(diffInMillis) % 60;
        long seconds = TimeUnit.MILLISECONDS.toSeconds(diffInMillis) % 60;
        long millis = diffInMillis % 1000;
        return String.format("%d hours, %d minutes, %d seconds, %d milliseconds", hours, minutes, seconds, millis);
    }
    }
}

Then under templates, we will have two kinds of templates:
FileToTopicRouteTemplate.java

This route template monitors a file location, fetches the file’s content, processes it as a stream, splits it into groups of CSV lines, and publishes the results to a topic.

package com.integration.camel.file.common.route.templates;

import com.integration.camel.file.common.route.aggregate.JsonAggregationStrategy;
import org.apache.camel.CamelContext;
import org.apache.camel.LoggingLevel;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.model.dataformat.BindyType;
import org.apache.camel.model.dataformat.JsonLibrary;

/*
* This route template was created for the reuse of a typical File-to-Topic route.
*/
public class FileToTopicRouteTemplate extends RouteBuilder {

    private final CamelContext camelContext;

    public FileToTopicRouteTemplate(CamelContext camelContext) {
        this.camelContext = camelContext;
    }

    @Override
    public void configure() throws Exception {

        String routeId = camelContext.resolvePropertyPlaceholders("{{file-to-topic.routeId}}");
        String fileUri = camelContext.resolvePropertyPlaceholders("{{file-to-topic.file.uri}}");
        String token = camelContext.resolvePropertyPlaceholders("{{file-to-topic.file.token}}");
        int noOfLinesToReadAtOnce = Integer.parseInt(camelContext.resolvePropertyPlaceholders("{{file-to-topic.file.noOfLinesToReadAtOnce}}"));
        boolean skipFirstLine = Boolean.parseBoolean(camelContext.resolvePropertyPlaceholders("{{file-to-topic.file.skipFirstLine}}"));
        boolean streaming = Boolean.parseBoolean(camelContext.resolvePropertyPlaceholders("{{file-to-topic.file.streaming}}"));
        String inputMapperClassName = camelContext.resolvePropertyPlaceholders("{{file-to-topic.mapperClass}}");
        String processorClass = camelContext.resolvePropertyPlaceholders("{{file-to-topic.processorClass}}");
        String endpointUri = camelContext.resolvePropertyPlaceholders("{{file-to-topic.endpoint.uri}}");

        routeTemplate("file-to-topic-route-template")
                .templateBean("timeGapBean")
                            .typeClass("com.integration.camel.file.common.route.utils.TimeGap")
                            .end()

                .from(fileUri)
                .routeId(routeId)
                .log(LoggingLevel.INFO, "Starting to process the file: ${header.CamelFileName} (${header.CamelFileLength} bytes)")
                .setProperty("startTime", simple("${date:now}"))
                .split(body().tokenize(token, noOfLinesToReadAtOnce , skipFirstLine)).streaming(streaming)
                            .log(LoggingLevel.DEBUG, "Message in process after the initial split: ${body}")
                                .unmarshal().bindy(BindyType.Csv, Class.forName(inputMapperClassName))
                                    .split(body(), new JsonAggregationStrategy())
                                        .log(LoggingLevel.DEBUG, "Message in process after the individual split: ${body}")
                                        .bean(processorClass)
                                        .marshal().json(JsonLibrary.Jackson)
                                        .log(LoggingLevel.DEBUG, "Message in process at the end of individual split: ${body}")
                                    .end()
                                    .log(LoggingLevel.DEBUG, "Message will be sent after aggregation: ${body}")
                                .to(endpointUri)
                            .end()
                .setProperty("endTime", simple("${date:now}"))
                .log(LoggingLevel.INFO, "Done processing the file: ${header.CamelFileName}")
                .to("bean:{{timeGapBean}}?method=calculateTimeDifference(${exchangeProperty.startTime},${exchangeProperty.endTime})")
                .log(LoggingLevel.INFO,"Time Taken to complete the route " + routeId + " on ${date:now} ${body}");

    }
}
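The key to the streaming behaviour above is `tokenize(token, group, skipFirst)`, which optionally drops the header line and then emits chunks of `group` lines at a time. A rough plain-Java sketch of that grouping (the names are ours; the real Camel tokenizer handles more cases and streams lazily):

```java
import java.util.ArrayList;
import java.util.List;

public class TokenizeGroupingSketch {

    // Approximates tokenize("\n", group, skipFirst): optionally drop the header
    // line, then emit the remaining lines in chunks of `group` lines each.
    static List<String> group(String content, int group, boolean skipFirst) {
        String[] lines = content.split("\n");
        List<String> chunks = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        int count = 0;
        for (int i = skipFirst ? 1 : 0; i < lines.length; i++) {
            if (count > 0) {
                current.append("\n");
            }
            current.append(lines[i]);
            if (++count == group) {
                chunks.add(current.toString());
                current.setLength(0);
                count = 0;
            }
        }
        if (count > 0) {
            chunks.add(current.toString()); // trailing partial chunk
        }
        return chunks;
    }

    public static void main(String[] args) {
        // header + 5 data lines, grouped two at a time -> 3 chunks
        List<String> chunks = group("h\nl1\nl2\nl3\nl4\nl5", 2, true);
        System.out.println(chunks.size()); // 3
    }
}
```

With `noOfLinesToReadAtOnce: 2`, each chunk of two CSV lines becomes one aggregated JSON array sent to the topic, which keeps memory usage bounded regardless of the file size.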

TopicToRestRouteTemplate.java

This route template is designed to retrieve messages from the topic and subsequently forward them to a backend, while also allowing for the possibility of redelivery.

package com.integration.camel.file.common.route.templates;

import org.apache.camel.CamelContext;
import org.apache.camel.LoggingLevel;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.http.base.HttpOperationFailedException;

/*
 * This route template was created for the reuse of a typical Topic-to-Rest route.
 */
public class TopicToRestRouteTemplate extends RouteBuilder {

    private final CamelContext camelContext;

    public TopicToRestRouteTemplate(CamelContext camelContext) {
        this.camelContext = camelContext;
    }

    @Override
    public void configure() {

        String routeId = camelContext.resolvePropertyPlaceholders("{{topic-to-rest.routeId}}");
        String listenerUri = camelContext.resolvePropertyPlaceholders("{{topic-to-rest.listener.uri}}");
        String lockPeriodMilliSeconds = camelContext.resolvePropertyPlaceholders("{{topic-to-rest.receiver.throttle.lockPeriodMilliSeconds}}");
        String requestCount = camelContext.resolvePropertyPlaceholders("{{topic-to-rest.receiver.throttle.requestCount}}");
        String receiverToken = camelContext.resolvePropertyPlaceholders("{{topic-to-rest.receiver.token}}");
        String receiverDelay = camelContext.resolvePropertyPlaceholders("{{topic-to-rest.receiver.reDelivery.delay}}");
        String deadLetterQueue = camelContext.resolvePropertyPlaceholders("{{topic-to-rest.receiver.reDelivery.deadLetterQueue}}");
        String receiverUri = camelContext.resolvePropertyPlaceholders("{{topic-to-rest.receiver.uri}}");
        String reDeliveryAttempts = camelContext.resolvePropertyPlaceholders("{{topic-to-rest.receiver.reDelivery.attempts}}");

        routeTemplate("topic-to-rest-route-template")
                .from(listenerUri)
                .routeId(routeId)
                .throttle(constant(requestCount)).timePeriodMillis(lockPeriodMilliSeconds)
                .onException(HttpOperationFailedException.class)
                        .onWhen(simple("${exception.statusCode} == 422"))
                            .maximumRedeliveries(reDeliveryAttempts)
                            .redeliveryDelay(receiverDelay)
                            .handled(true)
                            .log(LoggingLevel.ERROR, "HTTP error occurred with status ${exception.statusCode}. Response body: ${exception.message}")
                            .to(deadLetterQueue)
                        .end()
                .setHeader("Content-Type", constant("application/json"))
                .setHeader("Authorization", constant(receiverToken))
                .to(receiverUri)
                .choice()
                        .when(simple("${header.CamelHttpResponseCode} == 200"))
                        .log("Message Successfully sent to Rest Endpoint and Received status code: ${header.CamelHttpResponseCode}")
                        .endChoice();

    }
}

application.yml: as this is a template project used as a library, the test/resources folder contains the needed configurations.

camel:
  springboot:
    name: camel-file-templates

Under “test”, we have InputCsvMapper.java and InputCsvProcessor.java; these classes are used for testing purposes.

package com.integration.camel.file.common.route.mapper;

import lombok.Data;
import org.apache.camel.dataformat.bindy.annotation.CsvRecord;
import org.apache.camel.dataformat.bindy.annotation.DataField;

@Data
@CsvRecord(separator = ",")
public class InputCsvMapper {

    @DataField(pos = 1, columnName = "id")
    private int id;

    @DataField(pos = 2, columnName = "firstname")
    private String firstName;

    @DataField(pos = 3, columnName = "lastname")
    private String lastName;

    @DataField(pos = 4, columnName = "email")
    private String email;

    @DataField(pos = 5, columnName = "email2")
    private String email2;

    @DataField(pos = 6, columnName = "profession")
    private String profession;

}

As our goal is to modify the payload before posting it to the backend, we use the InputCsvProcessor for that.

package com.integration.camel.file.common.route.process;

import com.integration.camel.file.common.route.mapper.InputCsvMapper;
import lombok.extern.slf4j.Slf4j;
import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.util.json.JsonObject;
import org.springframework.stereotype.Component;

@Component
public class InputCsvProcessor implements Processor {

    @Override
    public void process(Exchange exchange) throws Exception {

        InputCsvMapper csvRecord = exchange.getIn().getBody(InputCsvMapper.class);

        JsonObject jsonObject = new JsonObject();
        jsonObject.put("updatedId", csvRecord.getId());
        jsonObject.put("updateName", csvRecord.getFirstName());

        exchange.getIn().setBody(jsonObject);
    }
}

Next is FileToTopicRouteTest.java, which contains the unit test implemented for the positive scenario of the FileToTopicRouteTemplate.

package com.integration.camel.file.common.route;

import com.integration.camel.file.common.route.templates.FileToTopicRouteTemplate;
import org.apache.camel.CamelContext;
import org.apache.camel.EndpointInject;
import org.apache.camel.RoutesBuilder;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.mock.MockEndpoint;
import org.apache.camel.test.spring.junit5.CamelSpringBootTest;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.test.context.ActiveProfiles;

@SpringBootTest
@CamelSpringBootTest
@EnableAutoConfiguration
@ActiveProfiles("test")
class FileToTopicRouteTest {

    @EndpointInject("mock:activemq:topic:camel.testtopic")
    MockEndpoint mockEndpoint;

    @Configuration
    static class TestConfig {

        @Bean
        RoutesBuilder route() {
            return new RouteBuilder() {

                @Autowired
                CamelContext camelContext;

                @Override
                public void configure() throws Exception {
                    camelContext.addRoutes(new FileToTopicRouteTemplate(camelContext));
                    templatedRoute("file-to-topic-route-template").routeId("file-to-topic-route-test");
                }
            };
        }
    }

    @Test
    void verifyTheMessageCountReceived_Success() throws InterruptedException {
        mockEndpoint.setExpectedMessageCount(7);
        mockEndpoint.assertIsSatisfied();
    }

}

Then the configuration file for the tests, application-test.yml:

file-to-topic:
  routeId: "file-to-topic-route"
  file:
    uri: "file:src/test/resources?noop=true&delay=20000&antInclude=data_test_*.csv"
    token: "\n"
    noOfLinesToSkip: 1
    noOfLinesToReadAtOnce: 2
    skipFirstLine: true
    streaming: true
  mapperClass: "com.integration.camel.file.common.route.mapper.InputCsvMapper"
  processorClass: "com.integration.camel.file.common.route.process.InputCsvProcessor"
  endpoint:
    uri: "mock:activemq:topic:camel.testtopic"

The test data file, data_test_1.csv, contains the headers below.

id,firstname,lastname,email,email2,profession
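For illustration, a data_test_1.csv matching those headers could look like this (the rows below are invented sample values, not the actual test data):

```
id,firstname,lastname,email,email2,profession
1,Jane,Doe,jane.doe@example.com,j.doe@example.com,developer
2,John,Smith,john.smith@example.com,j.smith@example.com,doctor
```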

Finally, the pom.xml: as mentioned earlier during the XML-DSL implementation, the library needs to be built using the maven-jar-plugin.

Refer to the pom.xml at https://bitbucket.org/yenlo/yenlo_camel/src/master/java-dsl/camel-file-templates/pom.xml

Execute mvn clean install to install the artifacts to the local repository, so they can be used in the next project.

Explanation on the camel-file-integration-sample project

InputCsvMapper.java and InputCsvProcessor.java: InputCsvMapper maps the CSV attributes, and InputCsvProcessor helps modify the payload before sending it to the backend.

package com.integration.camel.sample.mapper;

import lombok.Data;
import org.apache.camel.dataformat.bindy.annotation.CsvRecord;
import org.apache.camel.dataformat.bindy.annotation.DataField;

@Data
@CsvRecord(separator = ",")
public class InputCsvMapper {

    @DataField(pos = 1, columnName = "id")
    private int id;

    @DataField(pos = 2, columnName = "firstname")
    private String firstName;

    @DataField(pos = 3, columnName = "lastname")
    private String lastName;

    @DataField(pos = 4, columnName = "email")
    private String email;

    @DataField(pos = 5, columnName = "email2")
    private String email2;

    @DataField(pos = 6, columnName = "profession")
    private String profession;

}
package com.integration.camel.sample.process;


import com.integration.camel.sample.mapper.InputCsvMapper;
import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.util.json.JsonObject;
import org.springframework.stereotype.Component;

@Component
public class InputCsvProcessor implements Processor {

    @Override
    public void process(Exchange exchange) throws Exception {

        InputCsvMapper csvRecord = exchange.getIn().getBody(InputCsvMapper.class);

        JsonObject jsonObject = new JsonObject();
        jsonObject.put("updatedId", csvRecord.getId());
        jsonObject.put("updateName", csvRecord.getFirstName());

        exchange.getIn().setBody(jsonObject);
    }
}

The route builders are defined under the builders package, where we load the route templates for our actual integration use.

FileToTopicRoute.java

package com.integration.camel.sample.builders;

import com.integration.camel.file.common.route.templates.FileToTopicRouteTemplate;
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class FileToTopicRoute extends RouteBuilder {

    @Autowired
    CamelContext camelContext;

    @Override
    public void configure() throws Exception {

        camelContext.addRoutes(new FileToTopicRouteTemplate(camelContext));
        templatedRoute("file-to-topic-route-template").routeId(camelContext.resolvePropertyPlaceholders("{{file-to-topic.routeId}}"));

    }

}

TopicToRestRoute.java

package com.integration.camel.sample.builders;

import com.integration.camel.file.common.route.templates.TopicToRestRouteTemplate;
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class TopicToRestRoute extends RouteBuilder {

    @Autowired
    CamelContext camelContext;

    @Override
    public void configure() throws Exception {

        camelContext.addRoutes(new TopicToRestRouteTemplate(camelContext));
        templatedRoute("topic-to-rest-route-template").routeId(camelContext.resolvePropertyPlaceholders("{{topic-to-rest.routeId}}"));

    }
}

Configuring the properties in application.yml

camel:
  springboot:
    name: camel-file-integration-sample

logging:
  level:
    org:
      apache:
        camel: DEBUG

spring:
  activemq:
    broker-url: "tcp://XXXXXXXX:61616"
    user: XXXXXX
    password: XXXXXX

file-to-topic:
  routeId: "file-to-topic-route"
  file:
    uri: "file:src/main/resources?noop=true&delay=20000&antInclude=file_*.csv"
    token: "\n"
    noOfLinesToSkip: 1
    noOfLinesToReadAtOnce: 2
    skipFirstLine: true
    streaming: true
  mapperClass: "com.integration.camel.sample.mapper.InputCsvMapper"
  processorClass: "com.integration.camel.sample.process.InputCsvProcessor"
  endpoint:
    uri: "activemq:topic:camel.testtopic"

topic-to-rest:
  routeId: "topic-to-rest-route"
  listener:
    uri: "activemq:topic:camel.testtopic"
  receiver:
    uri: "https://XXXXXXXXXXXXX"
    token: "12323444552211"
    reDelivery:
      attempts: 3
      delay: 5000
      deadLetterQueue: "activemq:queue:dead-letter"
    throttle:
      lockPeriodMilliSeconds: 10000
      requestCount: 1

data_2.csv is the file that will be used for the integration execution and has the headers below:
id,firstname,lastname,email,email2,profession

The pom.xml can be found at https://bitbucket.org/yenlo/yenlo_camel/src/master/java-dsl/camel-file-integration-sample/pom.xml
As mentioned earlier, a test could be implemented under test; since we have already done the same kind of testing in the template project, we will skip it here.

Execute “mvn clean package” and then run the application using “java -jar target/camel-file-integration-sample-1.0.0-SNAPSHOT.jar”.

That’s it… we have implemented the same functionality using the Java-DSL as well.

Performance Comparison between XML-DSL and Java-DSL

Which one to go with is a recurring question. Although we feel that the XML-DSL is easier to understand for a non-developer, the Java-DSL is the better choice when proper packaging and reusable code development are considered.

As I mentioned during the XML-DSL implementation, from the packaging angle the Java-DSL beats the XML-DSL: when packaging with the XML-DSL, we also need additional configuration to load the XML route templates into the Camel context from the dependency projects.
To compare the performance of the two methods for this sample use case, I ran a test with a 10MB file; the results are as follows.

Based on the results, we can conclude that if the team has Java expertise, it is better to go with the Java-DSL implementation, which helps maintain performance while also supporting unit testing, integration testing, and debugging.

Appendix

When using the Idempotent Consumer, there are several options for the idempotent repository. Here we have used the File Idempotent Repository; for the other options, refer to https://camel.apache.org/components/4.0.x/eips/idempotentConsumer-eip.html. In this method, when the route is initiated, it first reads the file and persists a track.txt file in a directory of the pod, which is mounted to an NFS location. At the start of processing, it writes the name of the file being processed to that shared file. When another pod tries to pick up the same file, it is skipped because the record already exists.

With this technique, there is still a possibility that the file is read multiple times. To avoid this, it is better to add a random delay before the idempotent consumer runs; that way we can make sure the file is not processed again.

Local Deployment Diagram

Below is FileToTopicRouteTemplate.java, modified after adding the FileIdempotentRepository.

package com.integration.camel.file.common.route.templates;

import com.integration.camel.file.common.route.aggregate.JsonAggregationStrategy;
import org.apache.camel.CamelContext;
import org.apache.camel.LoggingLevel;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.model.dataformat.BindyType;
import org.apache.camel.model.dataformat.JsonLibrary;
import org.apache.camel.support.processor.idempotent.FileIdempotentRepository;

import java.io.File;

/*
* This route template was created for the reuse of a typical File-to-Topic route.
*/
public class FileToTopicRouteTemplate extends RouteBuilder {

    private final CamelContext camelContext;

    public FileToTopicRouteTemplate(CamelContext camelContext) {
        this.camelContext = camelContext;
    }

    @Override
    public void configure() throws Exception {

        String routeId = camelContext.resolvePropertyPlaceholders("{{file-to-topic.routeId}}");
        String fileUri = camelContext.resolvePropertyPlaceholders("{{file-to-topic.file.uri}}");
        String token = camelContext.resolvePropertyPlaceholders("{{file-to-topic.file.token}}");
        int noOfLinesToReadAtOnce = Integer.parseInt(camelContext.resolvePropertyPlaceholders("{{file-to-topic.file.noOfLinesToReadAtOnce}}"));
        boolean skipFirstLine = Boolean.parseBoolean(camelContext.resolvePropertyPlaceholders("{{file-to-topic.file.skipFirstLine}}"));
        boolean streaming = Boolean.parseBoolean(camelContext.resolvePropertyPlaceholders("{{file-to-topic.file.streaming}}"));
        String inputMapperClassName = camelContext.resolvePropertyPlaceholders("{{file-to-topic.mapperClass}}");
        String processorClass = camelContext.resolvePropertyPlaceholders("{{file-to-topic.processorClass}}");
        String endpointUri = camelContext.resolvePropertyPlaceholders("{{file-to-topic.endpoint.uri}}");
        String trackFile = camelContext.resolvePropertyPlaceholders("{{file-to-topic.trackFile}}");

        routeTemplate("file-to-topic-route-template")
                .templateBean("timeGapBean")
                            .typeClass("com.integration.camel.file.common.route.utils.TimeGap")
                            .end()

                .from(fileUri)
                .delay(simple("${random(1000,5000)}"))
                .idempotentConsumer(header("CamelFileName"),
                        FileIdempotentRepository.fileIdempotentRepository(new File(trackFile)))
                .routeId(routeId)
                .log(LoggingLevel.INFO, "Starting to process the file: ${header.CamelFileName} (${header.CamelFileLength} bytes)")
                .setProperty("startTime", simple("${date:now}"))
                .split(body().tokenize(token, noOfLinesToReadAtOnce , skipFirstLine)).streaming(streaming)
                            .log(LoggingLevel.DEBUG, "Message in process after the initial split: ${body}")
                                .unmarshal().bindy(BindyType.Csv, Class.forName(inputMapperClassName))
                                    .split(body(), new JsonAggregationStrategy())
                                        .log(LoggingLevel.DEBUG, "Message in process after the individual split: ${body}")
                                        .bean(processorClass)
                                        .marshal().json(JsonLibrary.Jackson)
                                        .log(LoggingLevel.DEBUG, "Message in process at the end of individual split: ${body}")
                                    .end()
                                    .log(LoggingLevel.DEBUG, "Message will be sent after aggregation: ${body}")
                                .to(endpointUri)
                            .end()
                .setProperty("endTime", simple("${date:now}"))
                .log(LoggingLevel.INFO, "Done processing the file: ${header.CamelFileName}")
                .to("bean:{{timeGapBean}}?method=calculateTimeDifference(${exchangeProperty.startTime},${exchangeProperty.endTime})")
                .log(LoggingLevel.INFO,"Time Taken to complete the route " + routeId + " on ${date:now} ${body}");

    }
}

That concludes our discussion on Efficient Large File Processing with Apache Camel. In Part 1 of this series we covered the challenges of large file processing and how to implement it with the Apache Camel XML-DSL; here in Part 2 we covered the Java DSL implementation and compared the two.

I trust that this series of blogs has provided you with ample information to begin your journey into file processing with Camel. I look forward to reconnecting in another blog. Stay tuned for more insights and updates.
