Caching JavaScript API Calls - javascript

I'm querying the GitHub API from the client using JavaScript (on this page).
There are 14 API calls each time the page loads, which means I will end up hitting GitHub's API rate limit of 5000 calls per hour pretty fast.
Most caching strategies I've seen assume that you have access to a server, but in my case I'm running a purely static Middleman site.
So my question is this: how can I cache API requests from the client? Are there third-party apps that provide this service?
(Note that my use case is many different clients hitting the page (e.g. it has been linked from Hacker News), not a single client refreshing, so local caching wouldn't really help much.)

I agree with the Firebase (or other separate data store) alternative from #David: since you don't have access to the server where the application sits, it gives you a persistent cache mechanism. It's basically another data store, and you can update your logic in Middleman to either make a fresh call to the GitHub API or pull from the data saved in Firebase, based on some checks you do when a person visits that Translation page. Check out the logic here

You can cache a single client's page by using local storage or a cookie. This way if the user refreshes, you can have logic to see if you want to query the API again. This would be fine if your user base was small.
This type of caching is typically done on the server, since with 14 calls per page load you are limiting yourself to ~357 page loads per hour at best (5000 ÷ 14 ≈ 357).
To cache on the client side, store the data in local storage and log the time of the query. Then decide on an interval (let's say 5 minutes). Prior to any refresh or page load, look at the user's local storage and see if the query was made within the last 5 minutes. If it was, read from local storage; if not, query the API again. This only applies per user, but by querying at most every 5 minutes it would support roughly 30 simultaneous users per hour (12 refresh windows × 14 calls = 168 calls per user per hour; 5000 ÷ 168 ≈ 30).
http://diveintohtml5.info/storage.html
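A minimal sketch of the timestamp-based local storage cache described above. All names here (`cachedFetch`, `fetchFresh`) are illustrative, not part of any API; `storage` is anything with `getItem`/`setItem`, i.e. `window.localStorage` in the browser.

```javascript
const FIVE_MINUTES = 5 * 60 * 1000;

// Return a cached value if the last query was within maxAge, otherwise
// call fetchFresh(), cache the result with a timestamp, and return it.
async function cachedFetch(key, fetchFresh, storage, maxAge = FIVE_MINUTES) {
  const raw = storage.getItem(key);
  if (raw) {
    const { storedAt, value } = JSON.parse(raw);
    if (Date.now() - storedAt < maxAge) return value; // served from cache
  }
  const value = await fetchFresh(); // e.g. fetch(url).then(r => r.json())
  storage.setItem(key, JSON.stringify({ storedAt: Date.now(), value }));
  return value;
}
```

In the browser you would call it as `cachedFetch('gh:repos', () => fetch(apiUrl).then(r => r.json()), window.localStorage)`.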

No server, eh? You could use something like Parse. Make a Parse object, set the key to the particular GitHub API URI, and set the value to something like this:
{
stored: <Date>,
value: <stringified JSON returned from GitHub API call>
}
Then when someone hits your client, first call Parse to see if you already have a cached version for that particular API call. If you don't, make the call to GitHub's API and then store the results on Parse (with stored set to the current DateTime so you can check for staleness later).
If Parse does have a cached version stored, check the stored value to see how old it is - if it is stale, make a fresh call to GitHub, and store the results back into Parse. Otherwise, just parse the JSON string from value and you're good to go.
This is assuming that you want individual caching control over the 14 GitHub API calls. If you don't, then just store the compiled calls into one object on Parse under a key like cache.
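The stale-or-fresh flow above could look something like this sketch. The store is kept generic (async `get`/`set` by key); with Parse you would back it with a query on an object keyed by the GitHub API URI. The names and the 10-minute threshold are assumptions, not anything Parse prescribes.

```javascript
const MAX_AGE_MS = 10 * 60 * 1000; // staleness threshold: pick what suits you

function isStale(storedAt, maxAge = MAX_AGE_MS) {
  return Date.now() - storedAt > maxAge;
}

// Read-through cache: return the cached copy if fresh, otherwise hit
// GitHub and write the result back with a new `stored` timestamp.
async function readThroughCache(uri, store, fetchGitHub) {
  const entry = await store.get(uri); // { stored, value } or null
  if (entry && !isStale(entry.stored)) {
    return JSON.parse(entry.value); // cached copy is still fresh
  }
  const data = await fetchGitHub(uri); // real call to GitHub's API
  await store.set(uri, { stored: Date.now(), value: JSON.stringify(data) });
  return data;
}
```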

Related

Newb question: How to maintain a data list for all users in my webapp?

Disclaimer: I'm finding my way and not sure how to ask the question. This is what I want to do:
I am getting data from twitter API in my app (this is working).
I then want to store that data (as an array), and serve it to whichever user is accessing the app (so that I don't need to query the API every time, just poll every 10 mins).
What do I need to be able to do this? (external database? or can I just save to a file on the server in someway? or something else)
For ref I'm building with sveltekit, and deploying with vercel.
If you are using Twitter's API directly within your own app, every user of your app needs to query the API at least once to get some data. You cannot serve the results that are returned to one user to other users without having your own back-end server and handling this accordingly. However, you can save a copy of the data returned to each user to that user's localStorage so that specific user does not have to query the API every time.
You can save the data on the client's localStorage and save an expiry timestamp that allows you to query the API again after the timestamp has passed.
Here is a tutorial on how to use localStorage with SvelteKit
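The expiry-timestamp idea from the answer above could be sketched like this (names are illustrative; `storage` is `window.localStorage` in the browser, and in SvelteKit you would guard the calls so they only run client-side, e.g. with the `browser` flag from `$app/environment`):

```javascript
// Store the data together with an absolute expiry timestamp.
function saveWithExpiry(storage, key, data, ttlMs) {
  storage.setItem(key, JSON.stringify({ expiresAt: Date.now() + ttlMs, data }));
}

// Return the cached data, or null if it is missing or the expiry has passed
// (null meaning: query the Twitter API again and re-save).
function loadUnlessExpired(storage, key) {
  const raw = storage.getItem(key);
  if (!raw) return null;
  const { expiresAt, data } = JSON.parse(raw);
  return Date.now() < expiresAt ? data : null;
}
```

With a 10-minute TTL this matches the polling interval from the question: `saveWithExpiry(localStorage, 'tweets', tweets, 10 * 60 * 1000)`.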

Api call request limit of 1 call each hour

I use an API in Switzerland that only allows me to make one request per hour in production.
I don't need more than one request each week, since it's event data, but I don't know what I have to do so that this API can serve 200+ users each day.
Do I have to save the data somewhere like Firebase, or are there services for this? I'm very new to this field. Could you give me a hint?
Building on top of what Dr. cool said, you'll most likely want to use cron jobs: http://code.tutsplus.com/tutorials/scheduling-tasks-with-cron-jobs--net-8800
Also keep in mind that some APIs do not allow you to store the data they provide on your own server. Make sure you read the API provider's terms of use before doing so.
It's better to have a program on the server that can run once a week and load data from the API. This data should be saved in a database. Then when one of your users needs the data, it's ready to load from your database without hitting the API limit.
Yes, Firebase is a great option. Or you can use MySQL or other server-side databases.

Building my own database of Facebook posts by calling Graph API via Server?

I want to build my own database of Facebook posts by periodically calling the Facebook Graph API and saving the results. User would then communicate with my own database instead of directly with Facebook.
I know that the API calls require an Access token that is generated from your Facebook login. From what I understand, this means the user logging in on the clientside would be using their own access token to make the calls. However, I want to make the calls from the server, which means using my own access token.
To illustrate the process flow:
*SERVER*
myFBAccessToken ---(API call every 15 mins)---> Facebook ---(returns)---> Fb posts ---(save to)---> myDatabase
*CLIENT*
viewFbPosts ---(db call)---> myDatabase
My questions are:
----------------------
1. Is it possible to use a single access token to regularly call the API from server? (Every 15 mins)
2. Will doing so violate any usage limitations on how frequently you can call the API?
3. Does Facebook allow for storing of their content on external databases?
Alternatively, if this is not recommended, does anyone know of a way to get more than the latest 25 posts from the facebook /feed?
I am using MEAN stack (mongodb, expressjs, angularjs, nodejs) with asynchronous functions.
Yes, you can use the same token for the same user multiple times. However, once it expires, you will have to have your user log in again to get a new access token.
There is no official limit on the number of queries you can send to the Graph API. However, having been involved in this sphere for a long time, I have found that 1 query per second is workable for a single user. If you try to exceed that, you will most probably get a JSON error response back.
You do not have to notify Facebook that you are going to store its data in an external database. You simply get the permitted information using the Graph API, and afterwards it is entirely up to you what you do with the data. Facebook is responsible for the flow of data from its servers and for making sure that you are a person with a legal right to that information.
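The 15-minute server-side polling loop from the question could be sketched as below. The fetch/save functions are injected so the example stays generic: `fetchFeed` would wrap a Graph API call with your server-side access token, and `savePosts` would insert into your MongoDB collection. All names here are illustrative, not Facebook's or Mongo's API.

```javascript
const POLL_INTERVAL_MS = 15 * 60 * 1000;

// One poll cycle: fetch the latest posts and persist them; returns the count.
async function pollOnce(fetchFeed, savePosts, accessToken) {
  const posts = await fetchFeed(accessToken); // e.g. GET /me/feed via the Graph API
  await savePosts(posts);                     // e.g. bulk upsert into MongoDB
  return posts.length;
}

// Run immediately, then on a fixed 15-minute interval; returns the timer
// handle so the caller can clearInterval() on shutdown.
function startPolling(fetchFeed, savePosts, accessToken) {
  pollOnce(fetchFeed, savePosts, accessToken);
  return setInterval(() => pollOnce(fetchFeed, savePosts, accessToken), POLL_INTERVAL_MS);
}
```

Clients then query your database only, which matches the flow diagram in the question.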

Is this model secure for offline storage?

I'm making an interface based on a 3rd-party API. My app works as a data analyzer, and to do that I need to download lots of data, and this data keeps updating.
But the API has limits, and I can't have the user download the entire data set in each session. That would not scale.
So I'm thinking the simplest way is to let the user sync when they wish to, and save the data in local storage.
However, I don't want just any user on that computer to have access to all the data.
Therefore, I've come up with this scheme to keep the data safe:
1. Send the JavaScript via HTTPS.
2. Using JS, do OAuth and retrieve the data from the API servers.
3. Get an open-source library like CryptoJS, also via HTTPS.
4. Encrypt the data using the user's (secret key + salt) and then save both in storage, so even if someone sees it they can't understand it.
5. At every new session, take the stored salt and secret key to match the hash(es) and get the data back. Or simply start a new auth process from step 2.
6. If the stored data is older than 30 days, delete it (which could mean the user no longer uses the computer or has forgotten the password; either way the data would be outdated and will need to be re-synced by downloading it all again).

When should I use PHP Session vs Browser Local Storage vs JavaScript Object Parameters?

When is it appropriate to use the many different ways that modern day AJAX based applications are storing data? I'm hoping for some specific guidelines that I can give developers. Here's what I'm seeing so far, and it's getting messy.
PHP Server Side Session: PHP Session data is probably the oldest way to store session based information. I'm often passing in parameters through various AJAX calls from JavaScript/jQuery objects - to store in PHP Session. I'm also returning objects of data (some session information) back through as a response/result to JavaScript/jQuery methods.
Browser-based Local Storage: This is often used to store data that needs to persist on the front end, yet I'm uncertain at times when to use it. One good use was to store geolocation information from navigator.geolocation. I've been storing a lot of information here, but I am not certain that is wise. It never seems to expire, though it can be deleted through the browser's developer tools (the Resources panel).
JavaScript object with config parameter(s): I've been building JavaScript objects with an init method that sets up a 'settings' parameter. This is very useful, as I usually build it from data passed in from PHP. With jQuery Mobile this data can even persist from page to page and change with AJAX request responses.
So, what guidelines would you give on usage of each?
PHP session data is NOT permanent data storage: when you destroy the browser session, you lose the data. This is useful if you don't want to store data permanently.
Browser local storage is permanent unless you delete the data yourself or clear the browser's cache. Some users clear the cache from time to time, so this can be a problem.
Any other method, such as JavaScript objects, is not permanent data storage.
Other browser-based permanent storage options are cookies (if you don't expire them at session close) and IndexedDB (check here for current browser support: http://caniuse.com/#feat=indexeddb).
So depending on your website or app, you need to decide which data needs to be stored for a short time, for a long time, or forever (until you delete it manually).
As an example, you would use localStorage if you were storing bookmarks, and if you were storing geolocation points you would use cookies and expire them when the browser or app closes.
If you are logging in to an account using PHP, then best practice is to create a PHP session, and even change the session timeout when the user clicks "Remember me".
These are just a couple of examples from thousands of possible needs.
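To make the cookie-expiry distinction concrete: unlike localStorage, cookies expire on their own, and whether they survive the session depends on whether you set an expiry. A small sketch of building the strings you would assign to document.cookie (names and format choices are illustrative):

```javascript
// No Expires/Max-Age attribute => a session cookie: the browser
// drops it when the session ends.
function sessionCookie(name, value) {
  return `${encodeURIComponent(name)}=${encodeURIComponent(value)}; path=/`;
}

// An explicit Expires date => a persistent cookie that survives
// until that date (or until the user clears it).
function persistentCookie(name, value, days) {
  const expires = new Date(Date.now() + days * 24 * 60 * 60 * 1000).toUTCString();
  return `${encodeURIComponent(name)}=${encodeURIComponent(value)}; path=/; expires=${expires}`;
}
```

In a page you would write `document.cookie = sessionCookie('geo', coords)` for the geolocation example above.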
