These are some of the questions website owners ask us daily on the “SEO Ask an Expert” page.
Question from Mike: How can we block spider bots from stealing our email aliases?
Martin: It's usually a catch-22: you want spider bots to visit your pages, but not to see certain information.
There are clear rules that “friendly” spiders like Googlebot obey, but I’m not sure whether you’re asking about them or about the “dogs”.
Mike: Thank you so much for your email reply.
We are an 11-year-old virtual tour vendor in the Atlanta, GA area, and we are having a growing problem with our email system. Spider bots are apparently reading our email program somehow, stealing our email alias names, and using them to send out blasts of spam. An example:
[hidden]
The “links” address is just one of our aliases. Here is another one I received today…
[hidden]
I am receiving well over 300 per day addressed to one of our aliases. We run our own mail server on a Windows-based machine, using a product called “IMAIL”. Our version is probably over five years old, but it has worked fine for our needs except for this problem. Some of the spam even shows one of our aliases as the “From” address, although we did not send it. We believe spammers are harvesting our email names and sending mass amounts of spam to other people across the nation. In some recent cases, this has gotten our site blacklisted … by Bellsouth.net and Comcast.net.
We intend to change the IP address of the mail server, or to move its domain name to …net. But before we do that, we need to stop the spider bots from getting in. On that machine, we really do not want ANY spiders to gain access at all.
Any advice would be greatly appreciated.
Thank you, Mike
Martin: Hi Mike,
Sorry for the late response. I was tied up in meetings of the search engine committee I’m a member of.
I solved this kind of problem for BBB.org, which wanted real visitors only and no spiders at all, since they count each member’s hits.
I can’t dig deeper into your particular problem, but the free advice I can offer is to use a robots.txt file like this one (you can copy its contents and save the file in your web root):
http://www.subsume.com/robots.txt
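For reference, here is a minimal sketch of what such a robots.txt can look like if you want to ask every spider to stay off the whole site (it may not match the linked file exactly):

User-agent: *
Disallow: /

Keep in mind that only well-behaved spiders like Googlebot honor robots.txt; the “dogs” that harvest email addresses usually ignore it.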
If there is anything else you need from us, please let me know!