I just found this link… http://www.wrh.noaa.gov/firewx/?latitude=&longitude=&wfo=sew&interface=fwzones&click.x=4&click.y=181
I wonder if this could be used?
Update: Doesn’t work.
Jachym’s script appears to work for me, https://realweatherstation.com/vim/fire_weather.php but I don’t know what it’s supposed to do :dontknow:
Strange… But I noticed your page is secured. I wonder if that has anything to do with it.
AFAIK that only affects incoming requests, so I don’t think the other server would know that :?
hmmm… So I wonder why it’s working for you but not for me. You think it’s a server thing blocking it?
The NWS has been known to block IPs…
Both my site and my parents’ site are on 1and1. I wonder if anyone else on 1and1 is blocked.
Or your server doesn’t have the cURL extension installed.
We could check this.
Just create a blank document.
Inside it put:
<?php
phpinfo(); // prints the full configuration itself – no echo needed
?>
Save it as something.php and upload it anywhere on your server, then post a link to the file here. It will show the exact PHP configuration and installed extensions. The script I sent you above works fine for me as well.
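If you’d rather not scan the phpinfo() output by eye, PHP can also report this directly – a quick sketch using the standard extension_loaded() function (nothing here is from fire_weather.php itself):

```php
<?php
// Quick check for the cURL extension without reading through phpinfo():
// extension_loaded() returns true if the named extension is available.
if (extension_loaded('curl')) {
    echo "cURL is installed\n";
} else {
    echo "cURL is NOT installed\n";
}
```

Upload that instead and it answers the cURL question in one line.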
Any ideas?
I would contact them and ask if they are blocking your server IP.
I’ve had issues with 1and1 in the past blocking my IP, but normally when that happens I can’t access my site. You think they’re blocking my site from accessing the files I need for this script? Maybe for security reasons?
Well, I think it would be a good idea to ask 1and1 to check it out but I was really thinking of the weather.gov folks.
Ok… I’ll send weather.gov an email to see what they say. I will email 1and1 also in case they have blocked the weather.gov domain for some reason.
The problem was that the script doesn’t include a User-agent: header in the request. Change the code
fputs($socketConnection, "GET $resourcePath HTTP/1.0\r\nHost: $domain\r\n\r\n");
to
fputs($socketConnection, "GET $resourcePath HTTP/1.0\r\nHost: $domain\r\nUser-agent: Mozilla 5.0 (fire_weather.php)\r\n\r\n");
and it works fine.
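For context, here’s roughly how that fputs() line sits inside a complete raw-socket fetch – a sketch with an illustrative function name and variables, not the actual fire_weather.php code:

```php
<?php
// Sketch of a raw-socket HTTP GET with the User-agent: fix in place.
// http_get_raw() and its variables are illustrative stand-ins only.
function http_get_raw($domain, $resourcePath) {
    $socketConnection = fsockopen($domain, 80, $errno, $errstr, 10);
    if ($socketConnection === false) {
        return false; // connection failed
    }

    // The User-agent: header is what keeps the NWS from answering 403-Forbidden.
    fputs($socketConnection, "GET $resourcePath HTTP/1.0\r\n"
        . "Host: $domain\r\n"
        . "User-agent: Mozilla 5.0 (fire_weather.php)\r\n\r\n");

    $response = '';
    while (!feof($socketConnection)) {
        $response .= fgets($socketConnection, 1024);
    }
    fclose($socketConnection);
    return $response;
}

// Usage (uncomment to try):
// echo http_get_raw('forecast.weather.gov', '/');
```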
That 403-Forbidden response from forecast.weather.gov is caused by the lack of a User-agent: header in the request. If they ultimately switch to https-only access, then the script will have to use cURL (or switch to file_get_contents() with the User-agent: header included) to keep working.
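For reference, the file_get_contents() route looks roughly like this – a minimal sketch where the function name is just illustrative, not part of fire_weather.php. A stream context lets you attach the User-agent: header, and it works for https:// URLs too:

```php
<?php
// Sketch: fetch a URL with file_get_contents() plus a User-agent: header.
// fetch_with_user_agent() is an illustrative name, not the script's code.
function fetch_with_user_agent($url) {
    $context = stream_context_create([
        'http' => [
            'method' => 'GET',
            'header' => "User-agent: Mozilla 5.0 (fire_weather.php)\r\n",
        ],
    ]);
    // Returns the page body, or false on failure (e.g. a 403).
    return file_get_contents($url, false, $context);
}

// Usage (uncomment to try):
// $html = fetch_with_user_agent('https://forecast.weather.gov/');
```

This needs allow_url_fopen enabled on the server, which is worth checking in that phpinfo() output as well.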
Best regards,
Ken
@Ken: The curl version doesn’t work for Mark either, see reply #8
His original script worked with the code fix I tried… it failed with the same 403 in the cache before the fix.
Jachym’s curl-based script worked fine too (since it had a User-agent: in the request).
The current version of fire_weather.php?sce=view on his site is the one without a User-agent: in the request, so it’s bound to continue not working…
Hmmm, that’s confusing :roll:
Requests to forecast.weather.gov without a User-agent: header will only return a 403-Forbidden response from the NWS website – they started enforcing that in March 2015 (as I remember).
The current script on his site has no User-agent: string in the request, so is doomed to fail until updated.
I don’t see the URLs for the fixed script (Jachym’s curl version) in the postings, so there’s no way to check whether the failure persists with a fixed version installed.
Also… the script has a built-in cache lifetime of 1800 seconds, so if a new script is installed, it has to be run with a ?cache=refresh to have the script try the reload from the URL instead of just doing a reload from the cache file.
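The cache check in such scripts typically looks something like this – a sketch with illustrative names (only the 1800-second lifetime and the ?cache=refresh parameter come from the actual script):

```php
<?php
// Sketch of a typical cache-lifetime check; variable names are illustrative.
$cacheFile     = 'fire_weather.txt';
$cacheLifetime = 1800; // seconds, as in the real script

// ?cache=refresh forces a reload from the URL instead of the cache file.
$forceRefresh = isset($_GET['cache']) && $_GET['cache'] === 'refresh';
$cacheFresh   = file_exists($cacheFile)
             && (time() - filemtime($cacheFile)) < $cacheLifetime;

if ($cacheFresh && !$forceRefresh) {
    // Serve the cached copy.
    $data = file_get_contents($cacheFile);
} else {
    // ... re-fetch from forecast.weather.gov and rewrite the cache file ...
    $data = '';
}
```

So a stale cached 403 page can mask a working fix for up to half an hour unless you force the refresh.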
I was just going off Mark’s post saying that Jachym’s script didn’t work for him :dontknow: Sorry to have caused confusion :oops: