I am trying to use cache event listeners as a sort of topic-based messaging system, to implement a request/reply pattern over an asynchronous messaging system in a multi-instance application.
More precisely, I need to build a synchronous REST endpoint on top of asynchronous messaging. The flow goes like this: when I receive an HTTP request, I send an asynchronous message via AMQP to an external system, which responds on a predefined queue. The same application (but possibly another instance of it on another node) listens on that queue and must then correlate the response back to the correct node and the correct thread. Because the AMQP system is out of my control, I thought I would use Infinispan to "message" all the cluster nodes about the received response via the cache and cache event listeners. That would work, but I still need to correlate the response with the correct thread in order to return the HTTP response. My idea is to register a one-off event listener as a callback during the request, which would receive the response (correlated by some generated ID). However, I am concerned about possible performance issues, and it seems to me that this is not the intended use for listeners. Can anyone shed some light on this?
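To make the correlation step concrete, here is a minimal sketch of the "one-off callback per generated ID" idea using only plain JDK types. All names here (`ReplyCorrelator`, `register`, `complete`) are mine, not from any framework; in the design described above, the `complete` call would be made by whatever observes the response, e.g. an Infinispan cache event listener.

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.TimeUnit;

public class ReplyCorrelator {
    // Pending requests, keyed by the generated correlation ID.
    private final Map<String, CompletableFuture<String>> pending = new ConcurrentHashMap<>();

    // Called by the request thread before sending the AMQP message.
    public CompletableFuture<String> register(String correlationId) {
        CompletableFuture<String> future = new CompletableFuture<>();
        pending.put(correlationId, future);
        // Remove the entry once the future completes (or times out),
        // so the map cannot leak abandoned requests.
        future.whenComplete((result, error) -> pending.remove(correlationId));
        return future;
    }

    // Called by the listener when a response with this ID is observed.
    // Returns false if no request on this node is waiting for the ID.
    public boolean complete(String correlationId, String payload) {
        CompletableFuture<String> future = pending.get(correlationId);
        return future != null && future.complete(payload);
    }

    public static void main(String[] args) throws Exception {
        ReplyCorrelator correlator = new ReplyCorrelator();
        String id = UUID.randomUUID().toString();

        CompletableFuture<String> reply = correlator.register(id);
        // Simulate the asynchronous response arriving on another thread.
        CompletableFuture.runAsync(() -> correlator.complete(id, "pong"));

        // The REST endpoint blocks here, with a timeout, to stay synchronous.
        System.out.println(reply.get(5, TimeUnit.SECONDS));
    }
}
```

The blocking `get` with a timeout is what turns the asynchronous reply back into a synchronous HTTP response; whether the completion signal arrives via a cache listener or something else is orthogonal to this part.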
That doesn't sound like a good use of a distributed cache or data grid. A clustered messaging system, or a reactive toolkit such as Vert.x, might be better suited to this.