A Scale for Crawler Effectiveness on the Client-Side Hidden Web
Computer Science and Information Systems, Volume 9 (2012), no. 2.

See the article record at the source Computer Science and Information Systems website

The main goal of this study is to present a scale that classifies crawling systems according to their effectiveness in traversing the "client-side" Hidden Web. First, we perform a thorough analysis of the different client-side technologies and the main features of web pages in order to determine the basic steps of the scale. Then, we define the scale by grouping basic scenarios according to several common features, and we propose methods to evaluate the effectiveness of crawlers at each level of the scale. Finally, we present a testing web site and report the results of applying these methods to several open-source and commercial crawlers that tried to traverse its pages. Only a few crawlers achieve good results when dealing with client-side technologies. Among standalone crawlers, we highlight the open-source crawlers Heritrix and Nutch and the commercial crawler WebCopierPro, which is able to process very complex scenarios. With regard to the crawlers of the main search engines, only Google processes most of the scenarios we have proposed, while Yahoo! and Bing handle only the basic ones. Few previous studies assess the capacity of crawlers to deal with client-side technologies, and those studies consider fewer technologies, fewer crawlers and fewer combinations. Furthermore, to the best of our knowledge, this article provides the first scale for classifying crawlers from the point of view of the most important client-side technologies.
Keywords: Web Search, Crawlers, Hidden Web, Web Spam, JavaScript
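The scale described in the abstract is built from scenarios of increasing client-side complexity. As a purely illustrative sketch (not the authors' actual test site; every file name and scenario below is a hypothetical stand-in), the following Python script generates three such pages: one with a plain HTML link, one whose link only exists after JavaScript executes, and one whose target URL is assembled at runtime. A crawler's ability to reach the corresponding target pages gives a rough indication of its level on such a scale.

import os

# (test page, hidden target it links to, page body)
SCENARIOS = [
    # Basic level: a plain anchor, visible in the raw HTML; any crawler can follow it.
    ("static_link.html", "target_static.html",
     '<a href="target_static.html">next</a>'),

    # Intermediate level: the anchor is produced by document.write(), so it only
    # exists for crawlers that actually execute JavaScript.
    ("js_write.html", "target_js.html",
     '<script>document.write(\'<a href="target_js.html">next</a>\');</script>'),

    # Advanced level: the target URL is assembled from string fragments at runtime,
    # defeating crawlers that merely scan script text for literal URLs.
    ("js_concat.html", "target_concat.html",
     '<script>var u = "target_" + "concat.html"; '
     'document.write(\'<a href="\' + u + \'">next</a>\');</script>'),
]

def write_test_site(out_dir="crawler_scale_demo"):
    """Materialise one page per scenario plus the target page it points to."""
    os.makedirs(out_dir, exist_ok=True)
    for page, target, body in SCENARIOS:
        with open(os.path.join(out_dir, page), "w", encoding="utf-8") as fh:
            fh.write("<html><body>" + body + "</body></html>")
        with open(os.path.join(out_dir, target), "w", encoding="utf-8") as fh:
            fh.write("<html><body>reached via " + page + "</body></html>")

if __name__ == "__main__":
    write_test_site()
    # Point a crawler at the generated pages; the target_*.html files that appear
    # in its crawl log suggest which levels of the scale it can handle.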
@article{CSIS_2012_9_2_a5,
     author = {V{\'\i}ctor M. Prieto and Manuel {\'A}lvarez and Rafael L{\'o}pez-Garc{\'\i}a and Fidel Cacheda},
     title = {A {Scale} for {Crawler} {Effectiveness} on the {Client-Side} {Hidden} {Web}},
     journal = {Computer Science and Information Systems},
     publisher = {mathdoc},
     volume = {9},
     number = {2},
     year = {2012},
     url = {http://geodesic.mathdoc.fr/item/CSIS_2012_9_2_a5/}
}
TY  - JOUR
AU  - Víctor M. Prieto
AU  - Manuel Álvarez
AU  - Rafael López-García
AU  - Fidel Cacheda
TI  - A Scale for Crawler Effectiveness on the Client-Side Hidden Web
JO  - Computer Science and Information Systems
PY  - 2012
VL  - 9
IS  - 2
PB  - mathdoc
UR  - http://geodesic.mathdoc.fr/item/CSIS_2012_9_2_a5/
ID  - CSIS_2012_9_2_a5
ER  - 
%0 Journal Article
%A Víctor M. Prieto
%A Manuel Álvarez
%A Rafael López-García
%A Fidel Cacheda
%T A Scale for Crawler Effectiveness on the Client-Side Hidden Web
%J Computer Science and Information Systems
%D 2012
%V 9
%N 2
%I mathdoc
%U http://geodesic.mathdoc.fr/item/CSIS_2012_9_2_a5/
%F CSIS_2012_9_2_a5
Víctor M. Prieto; Manuel Álvarez; Rafael López-García; Fidel Cacheda. A Scale for Crawler Effectiveness on the Client-Side Hidden Web. Computer Science and Information Systems, Volume 9 (2012), no. 2. http://geodesic.mathdoc.fr/item/CSIS_2012_9_2_a5/