Looking at the JBeret implementation, there is currently no way to obtain the underlying org.jberet.repository.JobRepository instance. So a first step would be to make this type @Inject-able via BatchBeanProducer.
However, even then, neither JdbcRepository nor MongoRepository overrides the removeJob/removeJobInstance methods, so these artifacts are never actually removed from the underlying repositories.
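To illustrate the idea, here is a sketch using stand-in stub types rather than the real JBeret classes: in JBeret the actual repository type is org.jberet.repository.JobRepository and the CDI producer is org.jberet.creation.BatchBeanProducer, while the lookup helper used here (currentRepository) is invented purely for illustration.

```java
// Sketch only: stand-in types, not the real JBeret API.
interface JobRepository {                      // stand-in for org.jberet.repository.JobRepository
    void removeJob(String jobId);
    void removeJobInstance(long jobInstanceId);
}

final class BatchRuntimeStub {                 // invented lookup, for the sketch only
    private static final JobRepository REPO = new JobRepository() {
        @Override public void removeJob(String jobId) { /* would delete job rows */ }
        @Override public void removeJobInstance(long id) { /* would delete instance rows */ }
    };
    static JobRepository currentRepository() { return REPO; }
}

class BatchBeanProducerSketch {
    // In a real producer this method would carry @Produces, so that
    // application code could simply write: @Inject JobRepository repository;
    JobRepository getJobRepository() {
        return BatchRuntimeStub.currentRepository();
    }
}
```

For this to be useful, JdbcRepository and MongoRepository would also have to override removeJob/removeJobInstance so the calls reach the database, which is exactly what is missing today.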
Furthermore, neither JdbcRepository nor MongoRepository overrides getJob/getJobs. Hence a WildFly restart with the following configuration:
<subsystem xmlns="urn:jboss:domain:batch:1.0">
    <job-repository>
        <jdbc jndi-name="java:jboss/datasources/Batch"/>
    </job-repository>
    <thread-pool>
        <max-threads count="10"/>
        <keepalive-time time="30" unit="seconds"/>
    </thread-pool>
</subsystem>
does not reload previous jobs. Is anyone working on this?
The JobRepository is really meant for internal usage. I've filed a JIRA to implement a purge API: [JBERET-99] Add a way to delete/purge job repository artifacts - JBoss Issue Tracker.
Can you explain a bit what you mean by "not reloading previous jobs"? When I use a JDBC repository and list all jobs, I get back all the jobs that have been run, including ones from before I restarted WildFly.
James R. Perkins
It's jobOperator.getJobNames() that does not reload from the database: getJobNames() only returns the in-memory 'current working set'. Not really a bug according to the spec, but surprising from a developer's point of view.
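To make the distinction concrete, here is a self-contained toy model (the class and field names are invented, not JBeret's): the operator only reports job names seen in the current JVM, while the backing store still holds names from before a "restart".

```java
import java.util.HashSet;
import java.util.Set;

// Toy model (invented names, not the JBeret API): a job operator that tracks
// an in-memory working set of job names, backed by a persistent store.
class ToyJobOperator {
    private final Set<String> persistentStore;              // survives "restarts"
    private final Set<String> workingSet = new HashSet<>(); // lost on restart

    ToyJobOperator(Set<String> persistentStore) {
        this.persistentStore = persistentStore;
    }

    void start(String jobName) {
        workingSet.add(jobName);      // job becomes "known" in this JVM
        persistentStore.add(jobName); // and is persisted to the database
    }

    // Mirrors the observed getJobNames() behavior: only the in-memory set.
    Set<String> getJobNames() {
        return new HashSet<>(workingSet);
    }
}
```

After a simulated restart (a fresh ToyJobOperator over the same store), getJobNames() comes back empty even though the store still contains every previously run job, which matches the behavior described above.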
I've created a small patch to expose JobRepository and to make removeJob/removeJobInstance propagate correctly to the underlying JDBC database: removal of job artifacts by fcorneli · Pull Request #15 · jberet/jsr352 · GitHub