sitemap.xml

Does E-rice have a problem with sitemaps? Ever since changing to them, I keep getting errors on my sitemap through Google. Here's the error I'm getting…

URL timeout: robots.txt timeout
We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit.

I've tried removing the robots.txt file and putting it back. But I keep getting the same error message. I've also tried with a capital S but that didn't do anything either.

http://www.snoqualmieweather.com/sitemap.xml

Any ideas on this would be great…Thanks, Mark.

Mark,

I would open up a ticket with e-rice and see if they can see an issue with it. They are usually pretty good about getting back to you in a timely manner.

Chuck

OK…Will do…Thanks.

The longest I waited was about 12 hours, but that was on a weekend. Usually 2-4 hours is the norm. Granted, if you do it at night it takes a while.

Chuck

I opened a ticket when I first switched about some FTP problems and it took 2 days for a response. Hopefully I'll get a faster response this time around.

Where do you see the error?

In the Google webmaster tools. Google won't validate my sitemap since switching to e-rice. Not sure if the timing is a coincidence or not.

Just as a data point. I use E-Rice and I don’t have a problem with the sitemap.

Mike

Edit: Well, I should have checked before posting this. I went to the webmaster tools
and sure enough I have the same problem. It was working, so something changed
either on Google's end or E-rice's.

How does a robots.txt file time out? That's what Google is saying. I never changed anything with my sitemap or robots.txt file when I made the change to e-rice.

URL timeout: robots.txt timeout

Yeah, I have not changed anything either and I seem to be getting the
same error. I would suspect a problem at Google. Be interesting to
see what Alan at E-Rice says about it.

Mike

Another reference point: Carterlake, who started the thread, is on e-rice too.

Is your sitemap validating, or are you just getting a robots error?

Mark, just a side comment: have you disallowed the log and xxx folders in your robots.txt (the one I can see doesn't show that)?

Disallow: /log
Disallow: /xxx

No…I didn't do that. Will add that now.

I resubmitted my sitemap to Google after fixing the robots.txt file, and I just got the error again saying the robots file is timing out. I'm starting to think it's on Google's end, at least I hope so.
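One thing you can check from your own machine is how long the server takes to hand back robots.txt. This is just a sketch, assuming the complaint really is about a slow response; nobody in the thread knows Googlebot's actual timeout threshold:

```python
import time
import urllib.request

def time_fetch(url, timeout=10.0):
    """Fetch a URL, returning (elapsed_seconds, bytes_received).

    A timeout or connection error raises an exception, which would
    be a hint that a crawler could be timing out on the same request.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return time.monotonic() - start, len(body)

if __name__ == "__main__":
    elapsed, size = time_fetch("http://www.snoqualmieweather.com/robots.txt")
    print(f"{size} bytes in {elapsed:.2f}s")
```

If this comes back in well under a second from your machine but Google still reports a timeout, that points more toward Google's crawler or something between it and the host.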

Hmmm http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=35164 … do you have to tell them the URL of robots.txt in the setup?

In the webmaster tools section, you can have Google analyze your robots file.

URL results:

URL                                           Googlebot   Googlebot-Mobile
http://www.snoqualmieweather.com/robots.txt   Allowed     Allowed
http://www.snoqualmieweather.com              Allowed     Allowed

So the timing out thing is odd, since their results show they can access it.
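You can reproduce the same Allowed/Disallowed check locally with Python's standard urllib.robotparser. A sketch, using the Disallow lines suggested earlier in the thread:

```python
from urllib.robotparser import RobotFileParser

def check_robots(robots_text, agent, url):
    """Parse robots.txt text and report whether `agent` may fetch `url`."""
    rp = RobotFileParser()
    rp.parse(robots_text.splitlines())
    return rp.can_fetch(agent, url)

robots = """User-agent: *
Disallow: /log
Disallow: /xxx
"""

# Googlebot may fetch the home page but not the disallowed folders.
print(check_robots(robots, "Googlebot", "http://www.snoqualmieweather.com/"))      # True
print(check_robots(robots, "Googlebot", "http://www.snoqualmieweather.com/log/"))  # False
```

If the parser accepts your file and both tools say Allowed, the "timeout" error is unlikely to be about the file's contents at all.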

That is strange. You would think that if it was a global problem with Google, there would be a lot of people posting on the internet, but Google :roll: doesn't find that :?

Have you looked in your error log on e-rice?

Nothing really stands out in the log to me.

this is interesting…

Line 0 : http://www.snoqualmieweather.com/robots.txt robots.txt file does not appear to be valid

There is no Line 0…lol
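A "Line 0" complaint from a validator often points at something sitting before the first real line, such as a UTF-8 byte-order mark that some editors silently save at the top of a file. That's a guess on my part, not something the error message confirms, but it's quick to rule out by looking at the raw bytes:

```python
BOM = b"\xef\xbb\xbf"  # UTF-8 byte-order mark

def has_bom(data: bytes) -> bool:
    """Return True if the file content starts with a UTF-8 BOM."""
    return data.startswith(BOM)

# Example: a robots.txt saved with a BOM vs. a clean one.
print(has_bom(BOM + b"User-agent: *"))  # True
print(has_bom(b"User-agent: *"))        # False

# To check the live file (network call, run manually):
# import urllib.request
# with urllib.request.urlopen("http://www.snoqualmieweather.com/robots.txt") as r:
#     print(has_bom(r.read()))
```

If it comes back True, re-saving the file as plain ASCII / UTF-8 without BOM and re-uploading should clear that particular complaint.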