-
1. Re: Failed to create session factory - remote client
jbertram Feb 23, 2015 9:57 AM (in response to tuhinbhowmick)
If the code and configuration (aside from the IP address) that worked in DEV and QA is not working in PROD, then my first thought is that the problem is with the environment.
However, looking at your configuration I do see a problem. Your netty-connector and netty-acceptor are both using a socket-binding as well as specifying custom host and port parameters. This is not valid. I believe the socket-binding takes precedence here. Therefore, you should configure your host:port combination through the socket-binding.
-
2. Re: Failed to create session factory - remote client
tuhinbhowmick Feb 23, 2015 1:16 PM (in response to jbertram)
Thanks a lot, Justin.
My feeling was the same, but I am unable to figure out what kind of environment issue could cause this blockage. My first thought was the network, but the DC says they can ping from the Linux box to the Windows box for port 5445. Do you have any leads we could investigate further?
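One thing worth noting: ping only verifies ICMP reachability; it does not prove that TCP port 5445 is actually open end to end (a firewall can pass ICMP but block the port). A quick check from the client machine could confirm it — the host address below is a placeholder:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    public static void main(String[] args) {
        // Placeholder address; replace with the PROD Windows box's IP.
        String host = args.length > 0 ? args[0] : "127.0.0.1";
        int port = 5445; // HornetQ "messaging" socket-binding port
        try (Socket s = new Socket()) {
            // Attempt a plain TCP connect with a 3-second timeout.
            s.connect(new InetSocketAddress(host, port), 3000);
            System.out.println("TCP connect to " + host + ":" + port + " succeeded");
        } catch (IOException e) {
            System.out.println("TCP connect failed: " + e.getMessage());
        }
    }
}
```

If this fails while ping succeeds, the problem is a firewall or routing rule on the port, not basic host reachability.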
For the second part, if I understand you correctly, I need to specify the socket binding in -
<socket-binding-group name="standard-sockets" default-interface="public" port-offset="${jboss.socket.binding.port-offset:0}">
...
<socket-binding name="messaging" port="5445"/>
<socket-binding name="messaging-throughput" port="5455"/>
.....
</socket-binding-group>
I have done that already.
Do I need to do something like below as well ? -
<connectors>
<netty-connector name="netty" socket-binding="messaging"/>
<netty-connector name="netty-throughput" socket-binding="messaging-throughput">
<param key="batch-delay" value="50"/>
</netty-connector>
<in-vm-connector name="in-vm" server-id="0"/>
</connectors>
(I can try it, though my previous configuration already worked in QA and DEV.)
-
3. Re: Failed to create session factory - remote client
jbertram Feb 23, 2015 2:37 PM (in response to tuhinbhowmick)
Do you have any leads we could investigate further?
I would need more details about your environment before I could provide you with any more recommendations.
For the second part, if I understand you correctly, I need to specify the socket binding in -
<socket-binding-group name="standard-sockets" default-interface="public" port-offset="${jboss.socket.binding.port-offset:0}">
...
<socket-binding name="messaging" port="5445"/>
<socket-binding name="messaging-throughput" port="5455"/>
.....
</socket-binding-group>
I have done that already.
I'm not really clear on what you're saying/asking here, so let me clarify what I said in my previous comment. Remove the <param> entries from both your netty-connector and netty-acceptor because they aren't valid. The address:port to which HornetQ will bind its acceptor(s), and to which the connector(s) will connect, is determined by the relevant socket-binding. Please ensure that the relevant socket-binding uses the IP address or hostname that your client expects. It's possible that you simply aren't binding JBoss AS to the proper network interface, which would prevent your clients from connecting.
-
4. Re: Failed to create session factory - remote client
tuhinbhowmick Feb 24, 2015 10:19 AM (in response to jbertram)
Hi Justin,
Thanks again for the help.
We are going to deploy and test in production once again with the following changes. We will need a short downtime window, as production is already running an application, so this process is critical for us.
Below are the changes we made in the standalone XML file. Please let us know if you see any discrepancies -
....
<connectors>
<netty-connector name="netty" socket-binding="messaging"/>
<netty-connector name="netty-throughput" socket-binding="messaging-throughput">
<param key="batch-delay" value="50"/>
</netty-connector>
<in-vm-connector name="in-vm" server-id="0"/>
</connectors>
<acceptors>
<netty-acceptor name="netty" socket-binding="messaging"/>
<netty-acceptor name="netty-throughput" socket-binding="messaging-throughput">
<param key="batch-delay" value="50"/>
<param key="direct-deliver" value="false"/>
</netty-acceptor>
<in-vm-acceptor name="in-vm" server-id="0"/>
</acceptors>
......
<interfaces>
<interface name="management">
<inet-address value="${jboss.bind.address.management:127.0.0.1}"/>
</interface>
<interface name="public">
<inet-address value="${jboss.bind.address:0.0.0.0}"/>
</interface>
<interface name="unsecure">
<inet-address value="${jboss.bind.address.unsecure:127.0.0.1}"/>
</interface>
</interfaces>
.....
<socket-binding-group name="standard-sockets" default-interface="public" port-offset="${jboss.socket.binding.port-offset:0}">
<socket-binding name="management-native" interface="management" port="${jboss.management.native.port:9999}"/>
<socket-binding name="management-http" interface="management" port="${jboss.management.http.port:9990}"/>
<socket-binding name="management-https" interface="management" port="${jboss.management.https.port:9443}"/>
<socket-binding name="ajp" port="8009"/>
<socket-binding name="http" port="8080"/>
<socket-binding name="https" port="8443"/>
<socket-binding name="jacorb" interface="unsecure" port="3528"/>
<socket-binding name="jacorb-ssl" interface="unsecure" port="3529"/>
<socket-binding name="messaging" port="5445"/>
<socket-binding name="messaging-throughput" port="5455"/>
<socket-binding name="osgi-http" interface="management" port="8090"/>
<socket-binding name="remoting" port="4447"/>
<socket-binding name="txn-recovery-environment" port="4712"/>
<socket-binding name="txn-status-manager" port="4713"/>
<outbound-socket-binding name="mail-smtp">
<remote-destination host="localhost" port="25"/>
</outbound-socket-binding>
</socket-binding-group>
-
5. Re: Failed to create session factory - remote client
jbertram Feb 24, 2015 11:11 AM (in response to tuhinbhowmick)
The only issue I'd point out with your configuration is that by default the "public" interface binds to 0.0.0.0, which I wouldn't recommend. This makes your connectors point to 0.0.0.0:5445, which is meaningless to a remote client. Is it absolutely necessary to bind to 0.0.0.0? If not, you should bind to a particular IP address or hostname that is meaningful to remote clients.
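For example, a remote client ultimately needs a concrete, reachable address in its connector parameters; a minimal sketch (the class name is illustrative, and the host would be the PROD box's real IP):

```java
import java.util.HashMap;
import java.util.Map;
import javax.jms.ConnectionFactory;
import org.hornetq.api.core.TransportConfiguration;
import org.hornetq.api.jms.HornetQJMSClient;
import org.hornetq.api.jms.JMSFactoryType;
import org.hornetq.core.remoting.impl.netty.NettyConnectorFactory;
import org.hornetq.core.remoting.impl.netty.TransportConstants;

public class RemoteClientFactory {
    public static ConnectionFactory create(String host) {
        // The client must target a concrete address; 0.0.0.0 is only a
        // server-side "bind to all interfaces" wildcard and is not routable.
        Map<String, Object> params = new HashMap<String, Object>();
        params.put(TransportConstants.HOST_PROP_NAME, host);
        params.put(TransportConstants.PORT_PROP_NAME, 5445);
        TransportConfiguration tc = new TransportConfiguration(
                NettyConnectorFactory.class.getName(), params);
        return (ConnectionFactory) HornetQJMSClient
                .createConnectionFactoryWithoutHA(JMSFactoryType.CF, tc);
    }
}
```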
-
6. Re: Failed to create session factory - remote client
tuhinbhowmick Feb 24, 2015 11:43 AM (in response to jbertram)
In fact, we have several other MDBs accessing the queues internally and a few externally. When we tried to bind to the specific IP in DEV, the internal clients (localhost) did not work. Then I changed it to 0.0.0.0, after which all the external and internal clients could connect and post.
Will binding the public interface to 0.0.0.0 create any issue?
-
7. Re: Failed to create session factory - remote client
jbertram Feb 24, 2015 12:01 PM (in response to tuhinbhowmick)
When we tried to bind to the specific IP in DEV, the internal clients (localhost) did not work.
So you have clients running within the application server that are doing operations on "localhost"? If so, why? That would essentially be a remote operation (albeit through a loop-back address) for local clients. Local clients would typically use local resources for simplicity and performance reasons. For example, a local JNDI lookup wouldn't specify "localhost" in the properties for its InitialContext, it would simply create a blank InitialContext so that the defaults (i.e. local resources) would be used. By using "localhost" for local clients you're forcing the server to bind to (at least) the localhost interface.
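For example, a blank InitialContext inside the server looks like this (the commented-out lookups use illustrative JNDI names that depend on your deployment):

```java
import javax.naming.Context;
import javax.naming.InitialContext;
import javax.naming.NamingException;

public class LocalLookup {
    public static void main(String[] args) throws NamingException {
        // Inside the application server, a no-arg InitialContext uses the
        // server's own naming environment, so no host or port is involved.
        Context ctx = new InitialContext();
        // Illustrative lookups against local resources:
        // ConnectionFactory cf = (ConnectionFactory) ctx.lookup("java:/ConnectionFactory");
        // Queue queue = (Queue) ctx.lookup("queue/MyQueue");
        System.out.println("created context: " + (ctx != null));
    }
}
```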
Will binding the public interface to 0.0.0.0 create any issue?
Yes. That is what I explained in my previous comment.
-
8. Re: Failed to create session factory - remote client
tuhinbhowmick Feb 24, 2015 12:25 PM (in response to jbertram)
For local clients, we are using something like this -
TransportConfiguration transportConfiguration =
        new TransportConfiguration(NettyConnectorFactory.class.getName());
ConnectionFactory factory = (ConnectionFactory) HornetQJMSClient
        .createConnectionFactoryWithoutHA(JMSFactoryType.CF, transportConfiguration);
Queue queue = HornetQJMSClient.createQueue(DxRConstants.LOCAL_QUEUENAME);
// create the connection
connection = factory.createConnection();
System.out.println("create the connection");
// create the session
session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
System.out.println("create the session");
// create the producer and send the message
MessageProducer sender = session.createProducer(queue);
ObjectMessage message = session.createObjectMessage();
message.setObject(o);
sender.send(message);
-
9. Re: Failed to create session factory - remote client
tuhinbhowmick Feb 24, 2015 12:36 PM (in response to tuhinbhowmick)
So it's not a localhost operation; it's using local resources, as you suggested. But when I bind the public interface to the specific IP, the local clients fail.
-
10. Re: Failed to create session factory - remote client
jbertram Feb 24, 2015 1:10 PM (in response to tuhinbhowmick)
Here's your problem:
TransportConfiguration transportConfiguration = new TransportConfiguration(NettyConnectorFactory.class.getName());
You shouldn't be using org.hornetq.core.remoting.impl.netty.NettyConnectorFactory here. You should be using org.hornetq.core.remoting.impl.invm.InVMConnectorFactory since your server is in the same JVM as your client. A Netty connector should only be used for a remote client.
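For example, only the connector factory in your snippet needs to change; a minimal sketch:

```java
import javax.jms.ConnectionFactory;
import org.hornetq.api.core.TransportConfiguration;
import org.hornetq.api.jms.HornetQJMSClient;
import org.hornetq.api.jms.JMSFactoryType;
import org.hornetq.core.remoting.impl.invm.InVMConnectorFactory;

public class InVmFactory {
    public static ConnectionFactory create() {
        // In-VM connector: no host/port at all; it talks to the broker
        // running in the same JVM, so it is unaffected by interface bindings.
        TransportConfiguration tc =
                new TransportConfiguration(InVMConnectorFactory.class.getName());
        return (ConnectionFactory) HornetQJMSClient
                .createConnectionFactoryWithoutHA(JMSFactoryType.CF, tc);
    }
}
```

With this in place, the server's public interface can be bound to a specific IP for remote clients without breaking the local ones.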
-
11. Re: Failed to create session factory - remote client
tuhinbhowmick Feb 24, 2015 1:22 PM (in response to jbertram)
I will definitely try that, and also bind to the specific IP. Thanks.
-
12. Re: Failed to create session factory - remote client
tuhinbhowmick Feb 26, 2015 12:36 PM (in response to tuhinbhowmick)
Hi Justin,
We were finally able to resolve the issue with your help.
Thanks a lot for your comments and suggestions.
Cheers
Tuhin
-
13. Re: Failed to create session factory - remote client
jbertram Feb 26, 2015 12:57 PM (in response to tuhinbhowmick)
Glad you got it all sorted out.