Implementation and Performance Analysis of Web Robot for URL Analysis 


Vol. 27,  No. 3, pp. 226-233, Mar.  2002


  Abstract

This paper proposes a multi-agent web robot in which the work of collecting web pages is divided among agents so that their mutual dependencies are minimized. Through a performance analysis of the implemented system, the paper lays a foundation for producing reliable statistics on domestic web pages, such as the composition ratio of text and multimedia files. A web robot that collects web pages sequentially in the same resource environment soon reaches a limit in collection performance. In particular, web pages contain "dead-link" URLs caused by temporary host failures and network instability; when many such URLs are present, the web robot spends excessive time collecting HTML. The approach proposed in this paper maximizes collection performance by delegating the handling of dead-link URLs to an Inactive URL Scanner Agent.



  Cite this article

[IEEE Style]

W. Kim, H. Kim, Y. O. Chin, "Implementation and Performance Analysis of Web Robot for URL Analysis," The Journal of Korean Institute of Communications and Information Sciences, vol. 27, no. 3, pp. 226-233, 2002.

[ACM Style]

Weon Kim, Hiecheol Kim, and Yong Ohk Chin. 2002. Implementation and Performance Analysis of Web Robot for URL Analysis. The Journal of Korean Institute of Communications and Information Sciences, 27, 3, (2002), 226-233.

[KICS Style]

Weon Kim, Hiecheol Kim, Yong Ohk Chin, "Implementation and Performance Analysis of Web Robot for URL Analysis," The Journal of Korean Institute of Communications and Information Sciences, vol. 27, no. 3, pp. 226-233, Mar. 2002.