Say I have a zoo app that shows all the zoos in each city. Each city is a page with a list of zoos.
In my current solution, each page makes an AJAX call to the server that pulls the list of zoos for that particular city.
Performance is extremely important to me, and my thought was to remove the AJAX call and replace it with a JSON object that lives in the app. That way I save a call to the server, and I believe the data will arrive faster.
Does this solution make sense? There are around 40 cities with ~50 zoos each.
Assume the data is static and will never change.
Since ~900 records is not much**, you can fetch all the records at once during the initial load and filter the full array by city. That way the user experience will be much smoother, since client-side JS processing beats network latency by a wide margin.
** Note: strictly considering the data set size of ~900 records.
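A minimal sketch of that approach, assuming a hypothetical /api/zoos endpoint that returns the full array:

    // Fetch the whole data set once on initial load (hypothetical endpoint).
    var allZoos = [];

    fetch('/api/zoos')
        .then(function (res) { return res.json(); })
        .then(function (zoos) { allZoos = zoos; });

    // Later, filter in memory instead of hitting the server per city.
    function zoosForCity(city) {
        return allZoos.filter(function (zoo) { return zoo.city === city; });
    }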
Another solution: cache the data in session scope, and whenever there is a request for a specific city, first check whether it's available in the session; if it's not there, make a network call.
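A sketch of that pattern on the client, using sessionStorage as the session scope (the endpoint name is an assumption):

    // Return cached zoos for a city, falling back to a network call.
    async function getZoos(city) {
        var cached = sessionStorage.getItem('zoos:' + city);
        if (cached !== null) {
            return JSON.parse(cached); // cache hit: no network round trip
        }
        var res = await fetch('/api/zoos?city=' + encodeURIComponent(city)); // hypothetical endpoint
        var zoos = await res.json();
        sessionStorage.setItem('zoos:' + city, JSON.stringify(zoos)); // cached for this tab's session
        return zoos;
    }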
I think the correct question is: what are my performance requirements?
You can embed all your data in a JSON object and do everything on the client side without any AJAX call, but in that case every client visiting your page downloads all the data up front, and that is another question mark.
I use Axios to get a response from an endpoint and then store that response in a variable as an object. The response I get from the server has more than 10,000 records.
My questions are:
How do I process the data so it doesn't eat up memory and can still be inserted into a table?
Is there a request handler that can process the response better?
Or is there a method to partially download the response so that it can be partially consumed by the user on the frontend?
There are many things to look at here:
Every site has a memory limit (assigned by the browser), and the page crashes if its memory use exceeds that limit.
The short and ugly fix would be to take the full response and keep only part of it.
The right fix is pagination: limit the number of records per request. That way you'll have manageable data and a good user experience with the help of tabs/pages and limited scrolling, and most importantly, the site's memory use stays bounded; see the sketch below.
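A minimal sketch of paginated requests with Axios (the URL and the page/limit query parameters are assumptions; adjust them to whatever your endpoint actually supports):

    // Assumes axios is already loaded on the page or required/imported.
    // Fetch one page at a time instead of all 10,000 rows at once.
    async function fetchPage(page, limit) {
        var res = await axios.get('https://example.com/api/items', {
            params: { page: page, limit: limit } // hypothetical parameters
        });
        return res.data; // only `limit` rows are held in memory per request
    }

    // Render page 1 into the table, then request page 2 when the
    // user pages forward, and so on.
    fetchPage(1, 50).then(function (rows) { /* render rows */ });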
I think I have a tough one for you guys, or at least it's been tough for me. I've been searching for the best way to do this on Stack Overflow, and everyone who has asked has been given a different response.
I have this code that is accessing an API and calling a maintenance list of all the vehicles in a fleet.
function getMaintenanceList() {
    var settings = {
        "url": "API URL HERE",
        "method": "GET",
        "timeout": 0,
        "headers": {
            "Authorization": "Bearer token here"
        }
    };

    $.ajax(settings).done(function (response) {
        // The response the API sends is a JSON array.
        var jsonMaintenance = response;
        var parsedJson = JSON.stringify(jsonMaintenance);

        // Leftover code from when I was trying to pass the data
        // directly into the other page (I was unable to do so):
        //return jsonMaintenance;

        // Leftover code from when this was in a PHP file and I was
        // posting the stringified response to the page for testing
        // (I had to disable CORS in Google Chrome to test the response):
        //console.log(jsonMaintenance);
        //document.getElementById("main").innerHTML = parsedJson;
    });
}
The code above works well. What I was attempting to do here was write the stringified response to a file, save that file on the server, call it from another page using JavaScript, save it as an object, parse it using JSON.parse(), and then pull the required information.
Here's why I'm trying to do it this way: when I call the maintenance list from the API, I get the entire list, but I only need to display parts of it.
On one page, call it vehicle-list.php, I have a list of all the vehicles in our fleet, each with a unit number assigned to it. When I click a unit number on this page, it takes me to another page with more information on the vehicle, such as the VIN, license plate, etc.; call this page vehicle-info.php. We use this one page for all the vehicles' information; in other words, clicking different unit numbers on vehicle-list.php always takes us to vehicle-info.php, and we only update the DOM when we get there.
I only want to include the information specific to each vehicle unit in the page, along with the other info in the DOM. And I only want to call the info from the API once, as I am limited to a certain number of calls for that API. This is why I am attempting to do it this way.
I will say that what I originally wanted to do was get this JSON response once every 24 hours using a function in vehicle-list.php, save the response as a variable as seen above (var jsonMaintenance = response;), and then just access certain parts of the array every time a unit number is clicked. However, I have been unable to access the variable on any other page. I've written many test files attempting to call jsonMaintenance without success, so I've been trying to just save it as a text file on the server, and I haven't been able to figure that out either.
After explaining all of the above, my questions are these:
How do I best manipulate this data to accomplish what I want to accomplish? What would be the best standard? Is the above code even the right way to call the data for what I'm trying to do?
There doesn't seem to be a set standard on accomplishing any of this when I search on Stack Overflow. I'd like to be as efficient as possible.
Thank you for your time.
There are a lot of ways to pass your data through your website after getting it from an API call. The best approach is to store the information in a database and read it back whichever way you want; since you are using PHP, you can store it in SQL or Access. If you don't want to store the information in a database like SQL or Access, then the best way is to store it in localStorage and read it back whenever you want.
I will show you briefly how you can do that; if you want a better explanation, post an example of your returned data.
To store an item in localStorage, use:
localStorage.setItem('key', 'value');
To read an item back from localStorage, use:
var somevar = localStorage.getItem('key');
To remove a specific item from localStorage, use:
localStorage.removeItem('key');
To clear all items saved to localStorage, use:
localStorage.clear();
Be aware that data stored in localStorage is only available in the browser on the machine where you saved it.
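One detail the snippets above gloss over: localStorage only stores strings, so a JSON response has to be stringified on the way in and parsed on the way out. A minimal sketch, assuming both pages are served from the same origin:

    // On the page that receives the API response:
    localStorage.setItem('maintenanceList', JSON.stringify(response));

    // Later, on another page of the same site:
    var jsonMaintenance = JSON.parse(localStorage.getItem('maintenanceList'));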
I would do it something like this.
Call the maintenance list from the API with the server-side language of your choice, which seems to be PHP in your case. Let's say the script is called get-list.php. It can be triggered by a cron job running get-list.php in intervals that stay within the number of calls you are allowed to make to that API. Or, if you are not able to create cron jobs, trigger the same get-list.php with an AJAX call (e.g. jQuery.get('sld.tld/get-list.php')); in that case get-list.php has to figure out by itself whether it's the right time to call the API or not.
Now that you have the data, you can prepare it as you want and store it as a JSON string in a text file or a database of your choice. If I get you right, you have a specific dataset for each vehicle, identified by an id (you named it "unit number"), so your JSON would look something like: {"unit1": { property1: "val1", property2: "val2" }, "unit2": { property1: "valXYZ", property2: "valABC" }} or alike.
Now when you link from vehicle-list.php to vehicle-info.php, you do it with an anchor that carries the unit number, e.g. vehicle-info.php?id=unit1, or similar. Of course you can also grab the data with AJAX; it's just important to hand vehicle-info.php the corresponding unit number (or id, better said) and you are good to go.
vehicle-info.php now has everything it needs to render the page: the complete dataset stored in the text file or database, and the id (unit number) telling it which part of the whole dataset to extract.
I wanted to give you this different approach because in my experience it should work out just so much better. If you are working server-side (e.g. in PHP) you have write permissions, which is not the case for client-side JavaScript. Performance is also not much of an issue: for instance, it doesn't matter if you do heavy manipulation of the dataset at the get-list.php level. It can run for minutes, and once it's done it stores the ready-to-use data, making it statically available without any further impact on performance.
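To illustrate the client side of this setup, a rough sketch (the cached file name cached-list.json and the id query parameter are assumptions):

    // On vehicle-info.php: read the unit id from the URL and pull
    // that unit's record out of the pre-built JSON cache.
    var unitId = new URLSearchParams(window.location.search).get('id');

    $.getJSON('cached-list.json', function (data) {
        var vehicle = data[unitId]; // e.g. data["unit1"]
        if (vehicle) {
            // render VIN, license plate, etc. into the DOM here
        }
    });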
Hope it helps!
If I ran into a similar problem, I would just store the data in a database of my own and call it from there. Considering you are only willing/able/allowed to request the data from the API very rarely, but need to operate on the data quite frequently (whenever someone clicks on a specific vehicle in your application), this seems like the best course of action.
So rather than querying the data on the client side, I'd call it from the server, store it on the server, and have the client operate on that data.
I want to select the entire table from MySQL and validate those values using JavaScript. I know I can do this using AJAX, which sends an HTTP request to the server, but I feel that sending too many HTTP requests is bad, as the page will load much more slowly. Alternatively, I could store the values in a client cookie as the page loads. I think that would be much faster, but it might be too much data to keep in a cookie, especially if the MySQL table is big.
What do you guys think is the best approach here? Is it a good idea to store MySQL data in client cookies?
UPDATE
Say I have 100 items in MySQL, and I need them displayed on a page dynamically; the next time the page loads there might be 101. That means I need to determine how many items there are and place them on the page, and I'm not sure how to do that, or where: client side or server side.
You could do an AJAX call for the information, do the validating of what data you need/don't need on the server, then send it back as an array. When you get the array back, you can loop through it, creating a div for each index/object in the array. This way, if you have 5 items that need to be shown, you can show them, and later if you have 7, you can show them as well; see the sketch below. Hope this helps!
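A minimal sketch of that, assuming a hypothetical items.php endpoint that returns the validated rows as a JSON array:

    $.getJSON('items.php', function (items) {
        var container = document.getElementById('items'); // assumed container element
        items.forEach(function (item) {
            var div = document.createElement('div');
            div.textContent = item.name; // works for 100 rows today, 101 tomorrow
            container.appendChild(div);
        });
    });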
This may be a "stupid" question to ask, but I am working with a "a lot" of data for the first time.
What I want to do: Querying the World Bank API
Problem: The API is very inflexible when it comes to searching/filtering... I could query every country/indicator by itself, but that would generate a lot of calls. So I wanted to download all the information about a country or indicator at once and then sort it on the machine.
My question: Where/how should I store the data? Can I simply put it into an array, or do I have to worry about its size? Should I write it to a temporary JSON file? Or do you have another idea?
Thanks for your time!
Example:
20 countries, 15 indicators.
If I queried every country by itself I would generate 20*15 = 300 API calls; if I called ALL countries for one indicator at a time, it would be only 15 API calls, but I would get a lot of "junk" data :/
You can keep the data in RAM in an appropriate data structure (array or object) if the following are true:
The data is only needed temporarily (during one particular operation) or can easily be retrieved again if your server restarts.
If you have enough available RAM for your node.js process to store the data in RAM. In a typical server environment, there might be more than a GB of RAM available. I wouldn't recommend using all of that, but you could easily use 100MB of that for data storage.
Keeping it in RAM will likely make it faster and easier to interact with than storing it on disk. The data will, obviously, not be persistent across server restarts if it is in RAM.
If the data is needed long term and you only want to fetch it once and then have access to it over and over again even if your server restarts, or if the data is more than hundreds of MBs, or your server environment does not have a lot of RAM, then you will want to write the data to an appropriate database where it will persist and can be queried as needed.
If you don't know how large your data will be, you can write code to temporarily put it in an array/object and observe the memory usage of your node.js process after the data has been loaded.
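A small sketch of that last point (assuming Node 18+ for the built-in fetch; the URL is a placeholder):

    // Fetch once, keep in RAM (lost on restart), and observe memory usage.
    let cache = null;

    async function loadIndicators(url) {
        const res = await fetch(url);
        cache = await res.json();
        // Check what the loaded data actually costs in memory.
        console.log('heapUsed MB:', (process.memoryUsage().heapUsed / 1e6).toFixed(1));
        return cache;
    }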
I would suggest storing it in a NoSQL database, since you'll be working with JSON, and querying it from there.
MongoDB is very 'Node-friendly': there's the native driver (https://github.com/mongodb/node-mongodb-native) or Mongoose.
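A hedged sketch with the native driver (the database/collection names and query field are made up; assumes a local MongoDB instance and driver v4+):

    const { MongoClient } = require('mongodb');

    async function storeAndQuery(docs) {
        const client = new MongoClient('mongodb://localhost:27017');
        await client.connect();
        const coll = client.db('worldbank').collection('indicators'); // hypothetical names
        await coll.insertMany(docs);                                  // persist the API response
        const slice = await coll.find({ country: 'DE' }).toArray();   // query a subset back
        await client.close();
        return slice;
    }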
Storing data from an external source you don't control brings with it the complexity of keeping the data in sync if the data happens to change. Without knowing your use case or the API it's hard to make recommendations. For example, are you sure you need the entire data set? Is there a way to filter down the data based on information you already have (user input, etc)?
I've developed a web application following the concept of a Single Page Application, but with none of the modern tech and frameworks.
So I have a jQuery page that dynamically requests data from localhost: a Laravel instance that compiles the entries in the DB (within a given time interval).
When the client wants to see all the entries for last week, the app works fine. But if he wants to see the results for the whole last month... well, there are so many that PHP's default execution time (30 seconds) isn't enough to process all the data. I can easily override this, of course, but then the jQuery client will loop through these arrays of objects and do stuff with them (sort, find, sum...), so I'm not even sure jQuery can handle this much data.
So my question can be broken in two:
Can Laravel's ->paginate() be used so that the jQuery AJAX request can also receive the data in chunks? How does this work (hopefully in a manner that doesn't force me to rewrite all the code)?
How can I store large amounts of information on the client? It's only temporary, but users will hang around on my webpage for a considerable amount of time, and I don't want them to wait 5 minutes every time they press a button.
Thanks.
If you want to provide an interface to a large amount of data stored in a backend, you should paginate the data. This is a standard approach, so I'm sure your client will be OK with that.
Using pagination is pretty simple - see the docs for Laravel 5.0 here: http://laravel.com/docs/5.0/pagination
In order to paginate results in the backend, you need to call paginate($perPage) on your query instead of get() in your controller, like this:
$users = User::whereIsActive(true)->paginate(15);
This will return a paginated result with 15 records per page. The page number is taken from the page parameter of the request. In order to get the 3rd page of users, your frontend jQuery app needs to send a request to a URL like:
/users?page=3
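On the jQuery side, a minimal sketch of consuming that endpoint (the response shape assumes Laravel's default paginator JSON, which wraps the records in a data array alongside paging metadata):

    $.getJSON('/users', { page: 3 }, function (result) {
        result.data.forEach(function (user) {
            // render one row per user here
        });
        console.log('page ' + result.current_page + ' of ' + result.last_page);
    });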
I don't recommend caching the data in the frontend application. The data can be changed by some other user and you won't even know about it. And with pagination, each request should be lightweight enough that you can stop worrying about sending one per page of results.
Not sure if you're subscribed to Laracasts, but Jeffrey Way is amazing at explaining features of Laravel, and I highly recommend his videos.
In short, you can paginate the results, and then in the view, when you loop over your items with foreach, you can array_chunk() the results to display them however you need to. But the paginated results are going to be fetched using a query string in the URL, and I'm not sure that's what you want if you're already using a lot of jQuery to keep everything on the same page.
https://laracasts.com/lessons/crazy-simple-pagination
But assuming you're already paginating the results with whatever jQuery you've written for the JSON data...
You could also use query scopes to filter the data down to the time span you need, creating a simple API to use with AJAX. I think that's probably what you're looking for.
So here's what I would do, assuming you're already doing some pagination manually with your JavaScript:
Create a few query scopes to filter the data for different lengths of time
Create simple routes to fetch results from URIs using the query scopes
Get the JSON data by performing AJAX requests to the URIs created (see the sketch below)
More information on Query Scopes: http://laravel.com/docs/5.1/eloquent#query-scopes
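A sketch of that last step, assuming hypothetical routes like /entries/last-week and /entries/last-month that apply the query scopes server side and return Laravel's paginator JSON:

    // Ask the server for a pre-scoped, paginated slice instead of
    // shipping a whole month of rows to the browser at once.
    function loadEntries(range, page) {
        $.getJSON('/entries/' + range, { page: page }, function (result) {
            result.data.forEach(function (entry) {
                // sort/sum/render a page-sized chunk only
            });
        });
    }

    loadEntries('last-week', 1);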