Ok, I think I did it correctly. I created and uploaded a sitemap.xml file, then changed my robots.txt file to read as follows…
User-agent: *
Disallow: /log
Disallow: /xxx
Sitemap: http://www.evansville-weather.com/sitemap.xml
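For reference, the uploaded sitemap.xml would look something like this — a minimal sketch against the sitemaps.org 0.9 schema; the single entry and the changefreq value are illustrative, not the actual file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Illustrative entry; the real file lists each page on the site. -->
  <url>
    <loc>http://www.evansville-weather.com/</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```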
<meta name="description" content="See live weather from the Outer Hebrides in Scotland. Includes time lapse movies, forecasts and actual weather conditions updated every few seconds.">
This is Google’s cache of http://www.snoqualmieweather.com/contact. It is a snapshot of the page as it appeared on Aug 8, 2008 04:35:32 GMT. The current page could have changed in the meantime.
You just have to wait for Google to purge it naturally, or you can add a mod_rewrite rule that 301-redirects the old URL to your new URL.
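That kind of 301 redirect can be sketched in an .htaccess file like this — assuming Apache with mod_rewrite enabled; the "old-contact" and "contact" paths are examples, not the actual URLs:

```apache
# Hypothetical .htaccess sketch — assumes Apache with mod_rewrite enabled.
RewriteEngine On
# Permanently (301) redirect the old page to its new location.
# "old-contact" and "contact" are example paths, not the real ones.
RewriteRule ^old-contact$ /contact [R=301,L]
```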
The redirect helps if anybody clicks on the Google link, but they would probably contact you through your site anyway.
Great info from you both… I’ll wait and see if Google fixes it themselves. Does it help at all if I resubmit my sitemap file in their webmaster tools, to maybe speed things up?
GSiteCrawler can ping Google when it uploads your sitemap, but Google will usually come and fetch it whenever it wants. I wouldn’t worry about it.
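For anyone curious what that ping amounts to: it is just an HTTP GET against Google’s sitemap ping endpoint, with your sitemap URL percent-encoded into the query string. A minimal sketch — the endpoint shown is the one tools like GSiteCrawler used in this era; Google has since deprecated it:

```python
# Minimal sketch of a sitemap "ping": an HTTP GET with the sitemap
# URL percent-encoded into the query string.
import urllib.parse
import urllib.request

PING_ENDPOINT = "http://www.google.com/ping?sitemap="

def build_ping_url(sitemap_url):
    """Build the full ping URL for a given sitemap location."""
    return PING_ENDPOINT + urllib.parse.quote(sitemap_url, safe="")

def ping_google(sitemap_url):
    """Send the ping; returns the HTTP status code (200 means accepted)."""
    with urllib.request.urlopen(build_ping_url(sitemap_url)) as resp:
        return resp.status
```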
It may take several days or even several weeks for Google to purge your old pages; you just have to wait.
I’m confused… 8O Can somebody guide me step by step through what I should do?
Should I download one of these two (photo)? Should I create a file called robots.txt, enter the text, and then upload it to the website?
I got mine running. The bots were all over my WUHistory and radar files, so those are now blocked from being crawled. This was pretty simple, and I was quite impressed with what Google already had about my site. Thanks for the info on getting this up and running.
If I understand this correctly, Google will look at the sitemap.xml and then index the site nicely, as it has done with Carterlake’s or TNET’s site? And this could take from 6 days to 6 months? So far Google says my sitemap is acceptable; just wondering what happens next!