Scripting Best Practices
Often, clients need to gather weather data with automated scripts, which typically means high request volumes in a short time frame. The weather API is ready for your automated queries, but the tips outlined below can help optimize your requests.
Most observation stations report hourly, though some report as often as every minute. If a script fetches observations every 5 minutes, there's a good chance it is retrieving the same data over and over until the next hourly report arrives. Additionally, observation stations are not synced and can individually report at different rates and times throughout the day.
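One way to avoid reprocessing the same hourly report is to track the timestamp of the last observation handled and skip any poll that returns it again. This is a minimal sketch; the `dateTimeISO` key mirrors the field name used later in this article, but the surrounding response handling is assumed for illustration.

```python
# Sketch: skip work when an observation hasn't changed since the last poll.
last_seen = None  # dateTimeISO of the most recent observation processed

def process_if_new(ob: dict) -> bool:
    """Return True if the observation is new and was processed."""
    global last_seen
    ts = ob.get("dateTimeISO")
    if ts == last_seen:
        return False          # same report as the last poll; nothing to do
    last_seen = ts
    # ... handle the new observation here ...
    return True
```

Even if the script still polls every 5 minutes, downstream processing only runs when the station has actually reported something new.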
Forecast data is updated as often as every hour within the US and every 3 to 6 hours for the rest of the globe. Even the hourly updates of US forecasts normally do not show significant changes, so depending on your use case, fetching forecasts every 3 to 6 hours may be more than sufficient and will lower your overall daily API accesses. Additionally, due to variable forecast model processing times, we cannot guarantee a time at which a specific location's forecast will be updated.
A few specific optimizations come to mind, but most of it boils down to speed. The techniques outlined below focus on removing data that is unnecessary for your use case and avoiding popular request times during the day. With these target areas in mind, you can significantly decrease your latency and, in turn, complete your scripts faster.
Scripts are often set to run at the top of the hour. However, if everyone else is doing the same, you may notice increased latencies. While our infrastructure is ready to handle a sudden increase in load, moving script start times to just outside the top or bottom of the hour may benefit your application. Rather than starting at the :00 minute, try sending requests at :13 or :22 minutes after the hour, or even 10 minutes before it.
The weather API has loads of information, especially the forecasts endpoint. Most users do not need every single data attribute included in the default API response. We recommend using the
fields= parameter to decrease the size of your JSON document, which means less bandwidth, a smaller memory footprint, and quicker load times for your application.
For example, the typical response from the observations endpoint is 1600 bytes. If the application only needs a timestamp and the temperature in Fahrenheit, adding the
&fields=ob.dateTimeISO,ob.tempF parameter will decrease your file size to roughly 930 bytes. While a few hundred bytes likely won't make or break an application, every little bit should help improve performance.
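Building the trimmed request can be as simple as adding the `fields` parameter to the query string. In this sketch the base URL and location are placeholders (the real endpoint also requires your credentials); only `fields=ob.dateTimeISO,ob.tempF` comes from the example above.

```python
from urllib.parse import urlencode

# Placeholder endpoint and location -- substitute your actual request URL.
BASE = "https://api.example.com/observations/minneapolis,mn"

params = {
    "fields": "ob.dateTimeISO,ob.tempF",  # only timestamp and temperature
}
# safe="," keeps the comma-separated field list readable in the URL
url = f"{BASE}?{urlencode(params, safe=',')}"
```

The resulting URL requests only the two attributes, so the response shrinks from roughly 1600 bytes to about 930.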
The savings are more substantial with the forecasts endpoint. For example, a 15-day hourly forecast will return a JSON document that is approximately 600KB in size. If you only require the date, temperature in Fahrenheit, and weather icon name, then utilizing
&fields=periods.dateTimeISO,periods.tempF,periods.icon will lower the returned JSON file size to approximately 25KB. That's a savings of more than 95%!
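As a quick sanity check on the figures quoted above, the reduction from roughly 600KB to roughly 25KB works out to about a 95.8% savings:

```python
# Approximate sizes quoted above for the 15-day hourly forecast response.
full_kb = 600      # full default response
trimmed_kb = 25    # response with the three-field filter applied
savings = (1 - trimmed_kb / full_kb) * 100   # percent reduction, ~95.8
```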
Additionally, limiting the results will often make the JSON document easier to read for us humans during development and debugging.
The forecasts endpoint provides several days of forecast data. This can be daily, day/night, or even hourly. Many applications will only use a subset of the intervals provided so our recommendation is to limit the number of intervals in your response. For example, if an application only needs an hourly forecast for 24 hours, use the
plimit= parameter so the API only returns 24 intervals. This saves time, as the API does not have to calculate all 372 hours that are available. Similarly, if you only need a 3-day forecast, limit the response the same way with plimit=3.
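A sketch of a limited 3-day request follows. The base URL and location are placeholders, and `filter=day` is an assumed value for selecting daily intervals; only `plimit` is the parameter discussed above.

```python
from urllib.parse import urlencode

# Placeholder endpoint and location -- substitute your actual request URL.
BASE = "https://api.example.com/forecasts/minneapolis,mn"

params = {
    "filter": "day",  # daily intervals (assumed interval value)
    "plimit": 3,      # only 3 periods instead of the full forecast range
}
url = f"{BASE}?{urlencode(params)}"
```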
Our support staff will gladly help and can provide best-practice recommendations for your specific application. We want to help you grow and optimize your applications. Feel free to reach out through support whenever you have a question, and our support team will get back to you as soon as possible.