Protecting Content And Handling Robots
Would you want a competitor to copy your entire web site in a day? Have you ever had a server stall, or a bandwidth limit exceeded, because an unknown robot was crawling many thousands of pages per hour? Do visitors complain because you blocked their ISP's IP range and they lost access to your site? These problems may not seem critical until you run a site with a hundred thousand pages or more.
This tutorial provides a solution for big sites to:
- Protect their content against mass download software
- Handle search robots
- Prevent unauthorized user activity
- Save bandwidth
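One common building block for all four goals is per-IP request rate limiting: a client that requests pages far faster than a human reader is likely a downloader or a misbehaving robot. The sketch below is illustrative only and is not the article's own code (which is downloadable at the URL below); the function name and thresholds are assumptions you would tune for your site.

```python
# Minimal sliding-window rate limiter sketch (hypothetical thresholds).
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # length of the sliding window (assumption)
MAX_REQUESTS = 120    # requests allowed per IP per window (assumption)

_hits = defaultdict(deque)  # ip -> timestamps of recent requests


def allow_request(ip, now=None):
    """Return True if this IP is under the limit, False if it should be throttled."""
    now = time.time() if now is None else now
    q = _hits[ip]
    # Discard timestamps that have fallen out of the window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= MAX_REQUESTS:
        return False
    q.append(now)
    return True
```

In practice you would call a check like this at the top of every page request and serve a block page (or a delay) to clients that exceed the limit, while whitelisting the IP ranges of search engines you want to keep crawling.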
A working example of the code produced in this article can be found at http://www.alxg.net/sim/.
You can download the code produced in this article at http://www.alxg.net/sim/sim.zip.