I want to make an HTTP connection to a website, grab the data, and store it in a local database. That part is already done. Since I am doing this programmatically, the program acts as a "robot" rather than a normal user, because the requests are generated by software, not by a human. In a single day there can be multiple requests to a particular page. This can look alarming to the site admin, and there is a chance the admin will ban my requests.
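For context, the fetch-and-store step I already have might look roughly like the following sketch (assuming Python with the requests library and a local SQLite database; the URL, database file name, and table name are placeholders, not my actual setup):

```python
import sqlite3
import requests

DB_PATH = "pages.db"                         # hypothetical local database file
URL = "https://example.com/some/page"        # placeholder target page

def fetch_and_store(url: str, db_path: str) -> None:
    """Download a page and save its body into a local SQLite table."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()

    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS pages ("
        "url TEXT, body TEXT, fetched_at TEXT DEFAULT CURRENT_TIMESTAMP)"
    )
    conn.execute("INSERT INTO pages (url, body) VALUES (?, ?)", (url, response.text))
    conn.commit()
    conn.close()

if __name__ == "__main__":
    fetch_and_store(URL, DB_PATH)
```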
So I want to make the requests in a way that the site admin cannot identify them as coming from my program.
Possible approaches:
1) Keep a pool of 5 IP addresses and pick a random one for each request. (Feasibility unknown.)
2) Use a proxy server that changes the IP and forwards the request to the site. (Feasibility unknown; see the sketch just after this list.)
3) Somehow overload the HTTP protocol and use that. (How to do this is unknown.)
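For approaches 1 and 2, a rough sketch of routing each request through a randomly chosen proxy from a small pool could look like this (again assuming Python with the requests library; the proxy addresses below are placeholders and would have to be proxies I actually control or rent):

```python
import random
import requests

# Placeholder pool of proxy endpoints; each request goes out through one of these.
PROXY_POOL = [
    "http://203.0.113.10:8080",
    "http://203.0.113.11:8080",
    "http://203.0.113.12:8080",
    "http://203.0.113.13:8080",
    "http://203.0.113.14:8080",
]

def fetch_via_random_proxy(url: str) -> str:
    """Send the request through one randomly selected proxy from the pool."""
    proxy = random.choice(PROXY_POOL)
    proxies = {"http": proxy, "https": proxy}
    response = requests.get(url, proxies=proxies, timeout=30)
    response.raise_for_status()
    return response.text

if __name__ == "__main__":
    body = fetch_via_random_proxy("https://example.com/some/page")
    print(len(body))
```

I do not know whether this is the right way to do it, or whether it is enough to keep the site admin from spotting the pattern.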
Any other ideas? Any help with any of the above approaches would be really helpful.