click the “URL list” tab, then click the “Refresh Table” button
highlight all of your http://www.cavecountryweather.com/wx13.php URLs (to highlight many at once, click the first one, scroll down to the last one, then hold down the Shift key and click the last one; they all turn yellow)
click the “Delete” button on the GSiteCrawler screen
now click “(Re)Crawl - This Project”, wait until the crawlers are idle again, and then click the “Refresh Table” button
Make sure all is well.
If you have the FTP settings set up in GSiteCrawler, you can now click “Generate - Google Sitemap-File” and let it save (overwrite), then upload via FTP and submit to Google, all automatically.
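If you want to sanity-check the generated sitemap before uploading it, a short Python sketch like this can confirm the deleted URLs are really gone (the sample XML below is made up to mirror the URLs in this thread; point the parser at your real sitemap.xml in practice):

```python
# Sanity-check a sitemap before uploading: list its <loc> URLs and flag
# any leftover wx13.php entries that should have been deleted.
# SAMPLE is inline test data, not a real sitemap.
import xml.etree.ElementTree as ET

SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.cavecountryweather.com/</loc></url>
  <url><loc>http://www.cavecountryweather.com/wx13.php</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs in a sitemap document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall("sm:url/sm:loc", ns)]

urls = sitemap_urls(SAMPLE)
leftovers = [u for u in urls if "wx13.php" in u]
print(urls)
print(leftovers)
```

If `leftovers` is non-empty after a recrawl, the delete step did not stick and you should repeat it before uploading.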
Ok, I think I did it correctly. I created and uploaded a sitemap.xml file, then changed my robots.txt file to read as follows…
User-agent: *
Disallow: /log
Disallow: /xxx

Sitemap: http://www.evansville-weather.com/sitemap.xml
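You can check how a crawler will read those rules with Python's standard-library robots.txt parser; this sketch feeds it the rules from the post (the /log and /contact test URLs are just examples):

```python
# Verify the robots.txt rules above with Python's stdlib parser.
from urllib.robotparser import RobotFileParser

RULES = """User-agent: *
Disallow: /log
Disallow: /xxx
"""

rp = RobotFileParser()
rp.parse(RULES.splitlines())

# /log is disallowed, /contact is not mentioned so it is allowed.
print(rp.can_fetch("*", "http://www.evansville-weather.com/log"))
print(rp.can_fetch("*", "http://www.evansville-weather.com/contact"))
```

Note the `Sitemap:` line belongs on its own line (usually after a blank line), not appended to a `Disallow:` rule, or crawlers may misread both directives.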
<meta name="description" content="See live weather from the Outer Hebrides in Scotland. Includes time lapse movies, forecasts and actual weather conditions updated every few seconds.">
This is Google’s cache of http://www.snoqualmieweather.com/contact. It is a snapshot of the page as it appeared on Aug 8, 2008 04:35:32 GMT. The current page could have changed in the meantime.
You just have to wait for Google to purge it out naturally, or you can add a mod_rewrite rule to 301-redirect that old URL to your new URL.
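On Apache, that 301 can go in an .htaccess file either as a simple Redirect directive or as a mod_rewrite rule; both forms below are sketches, and the target URL (contact.php) is an assumed placeholder for wherever your page actually lives now:

```apache
# Option 1: simple 301 with mod_alias (target path is a placeholder)
Redirect 301 /contact http://www.snoqualmieweather.com/contact.php

# Option 2: the same redirect with mod_rewrite
RewriteEngine On
RewriteRule ^contact$ http://www.snoqualmieweather.com/contact.php [R=301,L]
```

Either way, Google sees the 301 the next time it crawls the old URL and eventually drops the stale cached copy.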
The redirect helps if anybody clicks on the Google link, but they would probably contact you from your site anyway.
Great info from you both… I’ll wait and see if Google fixes it themselves. Does it help at all if I resubmit my sitemap file in their Webmaster Tools, maybe to speed things up?
GSiteCrawler can ping Google when it uploads your sitemap, but Google will usually come get it whenever it wants. I would not worry about it.
It may take several days or even several weeks for Google to purge your old pages; you just have to wait.
I’m confused… 8O Can somebody guide me step by step through what I should do?
Should I download one of these two (photo)? Should I create a file called robots.txt, enter in the text, and then upload it to the website?