Intelligent web crawler for file safety inspection

Citation

Ling, Cong Xiang and Ooi, Shih Yin and Pang, Ying Han (2015) Intelligent web crawler for file safety inspection. In: 2015 IEEE International Conference on Signal and Image Processing Applications (ICSIPA). IEEE, pp. 309-314. ISBN 978-1-4799-8996-6

Abstract

The Internet is constantly growing with content and information added by different types of users. Without proper storage and indexing, this content can easily be lost in the sea of information housed on the Internet. Hence, an automated program known as a web crawler is used to index the content added to the Internet. With proper configuration and settings, a web crawler can also serve purposes beyond web indexing, including downloading files from the web. Millions, if not billions, of files are uploaded to the Internet, and for most of the sites hosting these files there is no direct indication of whether a file is safe and free of malicious code. Therefore, this paper presents the construction of a web crawler that crawls all the pages in a given website domain and downloads every downloadable file linked to those pages, for the purpose of file safety inspection.
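The paper itself does not include source code here; the following is a minimal Python sketch of the idea described in the abstract, assuming a breadth-first crawl restricted to the starting domain and using the third-party requests and beautifulsoup4 libraries. The function names, the file-extension list, and the output directory are illustrative assumptions, not the authors' implementation.

# Minimal sketch: crawl pages within one domain and download linked files
# so they can later be inspected for malicious code. All names and the
# extension list below are assumptions for illustration only.
import os
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

DOWNLOADABLE_EXTENSIONS = {".pdf", ".doc", ".docx", ".xls", ".zip", ".exe"}
OUTPUT_DIR = "downloaded_files"


def download_file(url):
    """Save a linked file locally for later safety inspection."""
    filename = os.path.basename(urlparse(url).path) or "unnamed"
    try:
        response = requests.get(url, timeout=30)
        response.raise_for_status()
    except requests.RequestException:
        return  # skip files that cannot be fetched
    with open(os.path.join(OUTPUT_DIR, filename), "wb") as f:
        f.write(response.content)


def crawl_domain(start_url, max_pages=100):
    """Breadth-first crawl of pages within the start URL's domain,
    downloading any linked files with a known downloadable extension."""
    domain = urlparse(start_url).netloc
    queue = deque([start_url])
    visited = set()
    os.makedirs(OUTPUT_DIR, exist_ok=True)

    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)

        try:
            response = requests.get(url, timeout=10)
            response.raise_for_status()
        except requests.RequestException:
            continue  # skip unreachable pages

        soup = BeautifulSoup(response.text, "html.parser")
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])
            parsed = urlparse(link)
            if parsed.netloc != domain:
                continue  # stay within the given website domain
            if os.path.splitext(parsed.path)[1].lower() in DOWNLOADABLE_EXTENSIONS:
                download_file(link)  # candidate file for inspection
            elif link not in visited:
                queue.append(link)


if __name__ == "__main__":
    crawl_domain("https://example.com")

In this sketch the crawler only follows links whose host matches the starting domain, which mirrors the paper's stated scope of crawling "all the pages in a given website domain"; the downloaded files would then be passed to a separate safety-inspection step.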

Item Type: Book Section
Uncontrolled Keywords: crawler, inspection
Subjects: Q Science > QA Mathematics > QA71-90 Instruments and machines > QA75.5-76.95 Electronic computers. Computer science
Divisions: Faculty of Information Science and Technology (FIST)
Depositing User: Ms Rosnani Abd Wahab
Date Deposited: 04 Dec 2017 15:18
Last Modified: 04 Dec 2017 15:18
URI: http://shdl.mmu.edu.my/id/eprint/6542
