0 Replies Latest reply on Jan 4, 2005 3:42 PM by alal

    WARN  [HandlerRequest] Error registering request when using

    alal

      Configuration:
JBoss 3.2.4 / Tomcat 5.0
      Apache 2.0.8
      OS: FreeBSD
      mod_jk from the FreeBSD port mod_jk-apache2

I am using mod_jk mainly so that users can access the web app without the 8080 port number in the URL.
      I followed the steps on the wiki page
      http://www.jboss.org/wiki/Wiki.jsp?page=UsingMod_jk1.2WithJBoss
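
      For completeness, the Apache side I ended up with looks roughly like this. This is a sketch based on the wiki page, not a verbatim copy of my config: the JkWorkersFile path is just where the FreeBSD port put the file on my machine, and the log path may differ on yours.

      # In httpd.conf (the port's LoadModule line already loads mod_jk)
      JkWorkersFile /usr/local/etc/apache2/workers.properties
      JkLogFile     /var/log/httpd/mod_jk.log
      JkLogLevel    info
      # Forward all requests under /MyApp to the load-balancer worker
      JkMount /MyApp/* loadbalancer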

The file JBHOME/server/all/deploy/jbossweb-tomcatxx.sar/META-INF/jboss-service.xml does not contain a UseJK attribute (strange?).
      Adding one caused errors when starting JBoss (it complained that no such attribute exists),
      but things appear to work without it.
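
      For anyone comparing notes, what I tried adding was the usual JBoss MBean-attribute form shown on the wiki, inside the Tomcat service MBean in jboss-service.xml (reproduced from the wiki page; as noted, my 3.2.4 install rejects it at startup):

      <attribute name="UseJK">true</attribute>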

I always get a warning whenever I access my app as
      http://localhost/MyApp

      WARN [HandlerRequest] Error registering request

      but I don't get it when I use the port number (http://localhost:8080/MyApp).

Neither the JBoss server log nor the Apache logs give any further detail.

Searching for this warning on the forum and on the web, it looks like people who hit it often ran into OutOfMemory errors next. I would really like to head those off right away.

I have two questions:
      1. Am I doing the right thing to achieve my primary goal of eliminating the port number from the URL?

      2. What is the source of this warning? The app I am using to test this has nothing that should cause memory leaks: it uses JavaServer Faces, and all I have is a login page.

      Thanks a lot,

      Anagh Lal.

      Here is my workers.properties file

      # Define Node1
      worker.node1.port=8009
      worker.node1.host=demo.xxx.edu
      worker.node1.type=ajp13
      worker.node1.lbfactor=1
#worker.node1.local_worker=1
      worker.node1.cachesize=10
      # Define Node2
      worker.node2.port=8009
      worker.node2.host=localhost
      worker.node2.type=ajp13
      worker.node2.lbfactor=1
#worker.node2.local_worker=1
      worker.node2.cachesize=10

      # Load-balancing behaviour
      worker.loadbalancer.type=lb
      worker.loadbalancer.balanced_workers=node1, node2
      worker.loadbalancer.sticky_session=1
      worker.loadbalancer.local_worker_only=1
      worker.list=loadbalancer
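
      One related bit from the same wiki page, in case it matters for sticky_session=1: each Tomcat instance is supposed to carry a jvmRoute on its Engine element in server.xml, matching the worker names above (node1, node2). On my JBoss 3.2.4 install the Engine element looks like this, so for node1 I believe it would be:

      <Engine name="jboss.web" defaultHost="localhost" jvmRoute="node1">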