2 Replies Latest reply on Sep 27, 2004 2:27 PM by john doe

    Dealing with real-life issues: improving statistics and robot crawls

    john doe Newbie

      I'm working on the following issues; any comments, suggestions, or laughs are welcome:

      Allowing robots to index Nukes sites. The usual URL-rewriting methods seem difficult to set up (IMHO; JBoss gurus, can you hear me?)

      There is another, simpler way to achieve the same result:
      create some "dummy pages" presenting a flat view of the database content, organised with "clean" (friendly) URL links.
      Robots can easily follow those links and index the full content.

      Querying a "dummy page" should not result in a "File ID not found" error; instead, the request should be redirected to the home page, or better, "de-friendly'ed" to view the real page under its normal URL.
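      The "de-friendly'ing" step could be a simple pattern match. A minimal sketch, assuming a hypothetical friendly URL shape like /article-42-my-title.html mapping back to a Nukes-style query URL (the class name, URL format, and module parameters here are all illustrative, not an existing Nukes API):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class DeFriendly {
    // Hypothetical friendly format: /article-<id>-<slug>.html
    private static final Pattern FRIENDLY = Pattern.compile("/article-(\\d+)-[\\w-]+\\.html");

    // Map a friendly path back to the "real" URL; fall back to the home
    // page instead of showing a "File ID not found" error.
    public static String toRealUrl(String friendlyPath) {
        Matcher m = FRIENDLY.matcher(friendlyPath);
        if (m.matches()) {
            return "/index.html?module=articles&op=view&id=" + m.group(1);
        }
        return "/index.html";
    }
}
```

      A servlet filter could apply this mapping before the request reaches the normal page handler.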

      A batch process (a module, sorry) could compute those pages each night, for example...
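      The nightly batch could build the friendly names and the flat index page like this. A sketch under the same assumed URL format as above; the slug rules and HTML layout are just one possible choice, not the actual module:

```java
import java.util.Map;

public class DummyPages {
    // Build a "clean" file name for one content item,
    // e.g. (42, "My Article Title!") -> article-42-my-article-title.html
    public static String friendlyName(int id, String title) {
        String slug = title.toLowerCase()
                .replaceAll("[^a-z0-9]+", "-")  // collapse punctuation/spaces to dashes
                .replaceAll("(^-|-$)", "");     // trim leading/trailing dashes
        return "article-" + id + "-" + slug + ".html";
    }

    // Render a flat index page whose plain links robots can follow.
    public static String indexPage(Map<Integer, String> items) {
        StringBuilder sb = new StringBuilder("<html><body>\n");
        for (Map.Entry<Integer, String> e : items.entrySet()) {
            sb.append("<a href=\"/").append(friendlyName(e.getKey(), e.getValue()))
              .append("\">").append(e.getValue()).append("</a>\n");
        }
        sb.append("</body></html>\n");
        return sb.toString();
    }
}
```

      The batch would iterate over the database content, write one such page per section, and the "de-friendly'ing" filter above would serve the real content when a robot-discovered link is later requested.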

      Another way: detect bot crawls and present custom pages to them. These pages would contain only friendly URLs.
      At the same time, we could log the crawl for future analysis...
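      Detecting a bot crawl could start with a simple User-Agent check. A minimal sketch; the marker list is illustrative (a few crawler names from that era), not a complete or authoritative database:

```java
public class BotDetector {
    // Illustrative substrings seen in crawler User-Agent headers.
    private static final String[] BOT_MARKERS = { "googlebot", "msnbot", "slurp", "crawler", "spider" };

    public static boolean isBot(String userAgent) {
        if (userAgent == null) return false;
        String ua = userAgent.toLowerCase();
        for (String marker : BOT_MARKERS) {
            if (ua.contains(marker)) return true;
        }
        return false;
    }
}
```

      A filter could call this on each request, serve the friendly-URL page when it returns true, and append the hit to a crawl log for the analysis mentioned above.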

      Secondly, I would like Nukes to log each visitor and each page (independently) and present nice graphs and reports. IP addresses would be looked up in DNS in the background (ideally during low CPU activity...)
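      The background DNS lookup could be a low-priority worker thread draining a queue, so the request path never blocks on DNS. A sketch with a pluggable resolver (the class and method names are hypothetical; a real resolver would be something like ip -> java.net.InetAddress.getByName(ip).getCanonicalHostName(), wrapped in a try/catch):

```java
import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Function;

public class DnsLogger {
    private final BlockingQueue<String> pending = new LinkedBlockingQueue<>();
    private final Map<String, String> resolved = new ConcurrentHashMap<>();

    public DnsLogger(Function<String, String> resolver) {
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    String ip = pending.take();
                    // Reverse lookup happens here, off the request path.
                    resolved.put(ip, resolver.apply(ip));
                }
            } catch (InterruptedException ignored) { }
        });
        worker.setPriority(Thread.MIN_PRIORITY); // run during low CPU activity
        worker.setDaemon(true);
        worker.start();
    }

    // Called on every page hit: cheap, non-blocking.
    public void logHit(String ip) { pending.offer(ip); }

    // Consulted later by the reporting module.
    public String hostnameFor(String ip) { return resolved.get(ip); }
}
```

      The per-visitor and per-page counters for the graphs would be separate tables fed by the same logHit call.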

      And last, bot activity would be detected by IP criteria and reported.
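      The IP criterion could be as simple as matching against known crawler network prefixes. A sketch; the prefix values are illustrative placeholders, and a real module would load them from configuration rather than hard-coding them:

```java
public class IpBotCheck {
    // Illustrative crawler network prefixes (placeholders, not a maintained list).
    private static final String[] CRAWLER_PREFIXES = { "66.249.", "72.30." };

    public static boolean looksLikeBot(String ip) {
        if (ip == null) return false;
        for (String prefix : CRAWLER_PREFIXES) {
            if (ip.startsWith(prefix)) return true;
        }
        return false;
    }
}
```

      Combined with the DNS lookups above, hits matching these criteria could be split out into the bot-activity report.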

      I'm already working on those future modules.

      Thanks for your attention