spring,war,spring-batch,web-deployment,spring-batch-admin
Spring Batch Admin currently has two ways of creating a WAR deployment: copy the sample application (https://github.com/spring-projects/spring-batch-admin/tree/master/spring-batch-admin-sample), which gives you a fully functional Spring Batch Admin web app that can be deployed to any servlet container, or embed the jar files provided by the framework...
java,jmx,jboss6.x,spring-batch-admin
When Spring Batch Admin deploys, it registers all step executions with JMX. When you have a large number of step executions in the database, registering all of them can take a long time. If you do not need JMX, you can disable it by overriding the...
spring,spring-batch,spring-batch-admin
Adding a value to the step's ExecutionContext makes it available only to that step. To make it accessible to another step, you need to promote that key to the job's ExecutionContext. To do that, take a look at the ExecutionContextPromotionListener. It will promote whatever keys you've configured...
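A minimal Java-config sketch of wiring that listener into a step; the step name, key name, and tasklet bean are placeholders (not from the question), and it assumes a StepBuilderFactory is available:

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.listener.ExecutionContextPromotionListener;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class PromotionConfig {

    // Copies the listed keys from the step's ExecutionContext to the job's
    // ExecutionContext when the step completes, so later steps can read them.
    @Bean
    public ExecutionContextPromotionListener promotionListener() {
        ExecutionContextPromotionListener listener = new ExecutionContextPromotionListener();
        listener.setKeys(new String[] {"someKey"}); // "someKey" is a placeholder key name
        return listener;
    }

    @Bean
    public Step step1(StepBuilderFactory steps, Tasklet someTasklet) {
        return steps.get("step1")
                .tasklet(someTasklet)
                .listener(promotionListener())
                .build();
    }
}
```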
spring-batch,spring-integration,spring-amqp,spring-batch-admin
I am not sure what you mean by "running locally" but you don't have any routing information on the outbound adapters; if Rabbit doesn't know how to route a message, it simply drops it. You need to add routing-key="${import.exchanges.queue}" and routing-key="${import.exchanges.reply.queue}" to the adapters. This will use the default exchange ("")...
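The same idea expressed in the Spring Integration Java DSL might look roughly like this; only the ${import.exchanges.queue} property comes from the question's configuration, while the channel name and RabbitTemplate bean are assumptions:

```java
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.amqp.dsl.Amqp;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
public class OutboundAmqpConfig {

    @Bean
    public IntegrationFlow requestsOut(RabbitTemplate rabbitTemplate,
                                       @Value("${import.exchanges.queue}") String requestQueue) {
        return IntegrationFlows.from("outboundRequests") // hypothetical channel name
                // With the default exchange (""), the routing key must equal the queue
                // name or the broker silently drops the message.
                .handle(Amqp.outboundAdapter(rabbitTemplate).routingKey(requestQueue))
                .get();
    }
}
```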
spring,spring-batch,spring-integration,spring-amqp,spring-batch-admin
That is just a test case; everything runs locally. Look at the JMS test case for a more realistic example. The channel item writer sends all the chunks to JMS using an outbound channel adapter. The remote side (a JMS listener container) receives the chunks, processes them and sends...
java,spring,batch-processing,spring-batch,spring-batch-admin
If you have a large file, I'd recommend storing it to disk unless there is a good reason not to. Saving the file to disk allows you to restart the job without re-downloading the file if an error occurs. With regard to the Tasklet vs...
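As an illustration, the download step could be a simple Tasklet along these lines; the class name, URL source, and target path are hypothetical:

```java
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

// Downloads the remote file once and stores it on disk so a restarted job can
// skip the download. In a real job the URL and path would likely come from job parameters.
public class DownloadFileTasklet implements Tasklet {

    private final String sourceUrl;
    private final Path target;

    public DownloadFileTasklet(String sourceUrl, String targetPath) {
        this.sourceUrl = sourceUrl;
        this.target = Paths.get(targetPath);
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        if (!Files.exists(target)) { // on a restart, reuse the file already on disk
            try (InputStream in = new URL(sourceUrl).openStream()) {
                Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
            }
        }
        return RepeatStatus.FINISHED;
    }
}
```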
java,spring,gradle,spring-batch,spring-batch-admin
Spring Batch Admin is currently in development for our 2.0 release. As part of that release, we will be upgrading the dependencies across the board (including this issue). If you're willing to work off the latest and greatest, the current master has these changes already applied. Otherwise, I'll be releasing...
spring,spring-mvc,spring-boot,spring-batch,spring-batch-admin
Spring Batch Admin configures the StepScope via XML, which uses JDK (interface-based) proxying as its proxy mechanism. @StepScope, however, uses dynamic subclasses. For this to work, instead of using the @StepScope shortcut, use the following: @Scope(value = "step", proxyMode = ScopedProxyMode.INTERFACES) In addition to the above update,...
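For example, a step-scoped reader declared with that annotation might look like the sketch below; the bean method returns the ItemReader interface so an interface-based proxy can be created, and the job parameter name is illustrative:

```java
import java.util.Arrays;

import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;
import org.springframework.context.annotation.ScopedProxyMode;

@Configuration
public class ReaderConfig {

    // Declared against the ItemReader interface so the proxy created with
    // ScopedProxyMode.INTERFACES has an interface to implement.
    @Bean
    @Scope(value = "step", proxyMode = ScopedProxyMode.INTERFACES)
    public ItemReader<String> reader(@Value("#{jobParameters['input']}") String input) {
        // "input" is a placeholder job parameter used only to show late binding.
        return new ListItemReader<>(Arrays.asList(input.split(",")));
    }
}
```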
java,spring,spring-mvc,spring-batch,spring-batch-admin
The short answer is that you won't want to use @EnableBatchProcessing with Spring Batch Admin. SBA provides a number of beans on a global scale that @EnableBatchProcessing also provides. SBA 2.0 (currently in development) will probably fill the gaps between what is currently there and what @EnableBatchProcessing provides (specifically...
spring,spring-boot,spring-batch,spring-batch-admin
A couple of things here: Don't use @EnableBatchProcessing with Spring Batch Admin. SBA provides a number of those components out of the box. If you're willing to use the latest and greatest, it provides everything @EnableBatchProcessing provides without using the annotation. The stack trace you're getting is because @EnableBatchProcessing is registering...
spring,spring-batch,spring-batch-admin
The ability to stop a child job from a parent job isn't currently supported in Spring Batch. The way you'd have to do it is to stop the child first, then the parent. That being said, it doesn't seem like an unreasonable enhancement and pull requests are always welcome (take...
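Programmatically, that could be done with the JobOperator, roughly as sketched below; the class and method names are illustrative and assume you already know both execution ids:

```java
import org.springframework.batch.core.launch.JobOperator;

// Sends a stop signal to the nested (child) execution before the enclosing (parent)
// one; each stop takes effect at the next chunk boundary of the running job.
public class NestedJobStopper {

    private final JobOperator jobOperator;

    public NestedJobStopper(JobOperator jobOperator) {
        this.jobOperator = jobOperator;
    }

    public void stop(long childExecutionId, long parentExecutionId) throws Exception {
        jobOperator.stop(childExecutionId);  // stop the child job first
        jobOperator.stop(parentExecutionId); // then stop the parent job
    }
}
```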
spring,batch-processing,spring-batch,spring-batch-admin
From past discussions, the CSV reader may have serious performance issues. You might be better served by writing a reader around another CSV parser. Depending on your validation data, you might create a job-scoped filter bean that wraps a Map that can be either preloaded very quickly or lazy...
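A custom reader along those lines might start out like the sketch below; the split(",") call is only a stand-in for whichever CSV parser you pick, and the class name is hypothetical:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.springframework.batch.item.ItemReader;

// Streams a CSV file line by line. The naive split(",") does not handle quoted
// fields or embedded commas; swap in a real CSV parser for production use.
public class SimpleCsvItemReader implements ItemReader<String[]> {

    private final BufferedReader reader;

    public SimpleCsvItemReader(String path) throws IOException {
        this.reader = Files.newBufferedReader(Paths.get(path));
    }

    @Override
    public String[] read() throws IOException {
        String line = reader.readLine();
        // Returning null tells Spring Batch the input is exhausted.
        return (line == null) ? null : line.split(",");
    }
}
```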