Ember JS : How to reuse ember data while calling? - javascript

I have two models to query. In the first call I include some related data like this.
this.store.query('comment',{
include : 'person,address'
});
And in the second call I include the same details that are already stored in the store.
this.store.query('post',{
include : 'person,address'
});
So the API calls take a lot of time to resolve. Is there any way I can reuse the included data from the first API call in the second one to create the relationship between those two models (person, address)?
This would save a lot of time for me.
Note: The examples are for testing purposes only.

You are using the query() method of Ember Data's store. It expects two arguments: the model name as the first argument and the query as the second. The latter is passed directly to your backend as part of the request. The responsible code is quite simple: https://github.com/emberjs/data/blob/v3.10.0/addon/adapters/rest.js#L535-L560
If you are using the default JSONAPIAdapter, the requests executed by your method calls look like this:
this.store.query('comment', { include: 'person,address' });
=> GET /comments?include=person,address
this.store.query('post', { include: 'person,address' });
=> GET /posts?include=person,address
The API does not know from that request that the client already has some of the person and address records cached locally. Ember Data does not include that information by default. You could customize your Adapter to do it, but I wouldn't recommend it - especially because that may blow up the request size and reduce the cache hit rate by a fair amount. Also, you may want to reload the locally cached records anyway.
If you expect to have most of the related records already cached locally, you may simply not ask the server to include them. In that case it might be cheaper to load the missing ones afterwards in a coalesced request.
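A rough sketch of that second approach, assuming comment and post define relationships to person and address and that you use the default JSONAPIAdapter (the names here are illustrative, not from the original question):
// app/adapters/application.js
import DS from 'ember-data';

export default DS.JSONAPIAdapter.extend({
  // group the lazy relationship loads that happen in one runloop
  // into a single request per type
  coalesceFindRequests: true
});

// elsewhere: query without include and let Ember Data resolve person/address
// from the local cache, fetching only the missing records in one coalesced
// request (e.g. GET /people?filter[id]=...)
this.store.query('comment', {});
this.store.query('post', {});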

Related

What is the best way to manipulate an API AJAX JSON response and pass it to another page?

I think I have a tough one for you guys, or at least it's been tough for me. I've been searching for the best way to do this on Stack Overflow, and everyone who has asked has been given a different response.
I have this code that is accessing an API and calling a maintenance list of all the vehicles in a fleet.
function getMaintenanceList() {
  var settings = {
    "url": "API URL HERE",
    "method": "GET",
    "timeout": 0,
    "headers": {
      "Authorization": "Bearer token here"
    },
  };
  $.ajax(settings).done(function (response) {
    // The response the API sends is a JSON object.
    // It is an array.
    var jsonMaintenance = response;
    var parsedJson = JSON.stringify(jsonMaintenance);
    // Left over code from when I was trying to
    // pass the data directly into the other page
    // I was unable to do so
    //return jsonMaintenance;
    // Left over code from when this was in a PHP file
    // and I was posting the stringified response to the page
    // for testing purpose
    // I had to disable CORS in Google Chrome to test the response out
    //console.log(jsonMaintenance);
    //document.getElementById("main").innerHTML = parsedJson;
  });
}
The code above works well. What I was attempting to do here was write the stringified response to a file, save that file on the server, call it from another page using JavaScript, save it as an object in JavaScript, parse it using JSON.parse(), and then pull the required information.
Here's an explanation as to why I'm trying to do it this way. When I call the maintenance list from the API, I'm getting the entire maintenance list from the API, but I need to be able to display only parts of the information from the list.
On one page, we'll call it vehicle-list.php, I have a list of all the vehicles in our fleet. They all have unit numbers assigned to them. When I click on a unit number on this page, it takes me to another page which has more information on the vehicle, such as the VIN number, license plate, etc.; we'll call this page vehicle-info.php. We're using this one page for all the vehicles' information; in other words, when we click on different unit numbers on vehicle-list.php it always takes us to vehicle-info.php. We only update the DOM when we go to the page.
I only want to include the information specific to each vehicle unit in the page along with the other info in the DOM. And I only want to call the info from the API once as I am limited to a certain amount of calls for that API. This is why I am attempting to do it this way.
I will say that what I originally wanted to do was get this JSON response once every 24 hours by using a function in vehicle-list.php, save the response as a variable as seen above (var jsonMaintenance = response;), and then just access certain parts of the array every time a unit number is clicked. However, I have been unable to access the variable on any other page. I've written many test files attempting to call jsonMaintenance without success, so I've been trying to just save it as a text file to the server, and I haven't been able to figure that out either.
After explaining all of the above. My questions are these:
How do I best manipulate this data to accomplish what I want to accomplish? What would be the best standard? Is the above code even the right way to call the data for what I'm trying to do?
There doesn't seem to be a set standard on accomplishing any of this when I search on Stack Overflow. I'd like to be as efficient as possible.
Thank you for your time.
There are a lot of ways to pass your data around your website after getting it from an API call. The best approach is to store the information in a database and read it back whichever way you want; since you are using PHP, you can store it in SQL or Access. If you don't want to store the information in a database like SQL or Access, then the next best option is to store it in localStorage and read it back whenever you need it.
I will show you briefly how you can do that; if you want a better explanation, post an example of your returned data.
To store an item in localStorage, use:
localStorage.setItem('key', 'value');
To read an item back from localStorage, use:
var somevar = localStorage.getItem('key');
To remove a specific item from localStorage, use:
localStorage.removeItem('key');
To clear all items saved to localStorage, use:
localStorage.clear();
Be aware that data stored in localStorage is only available in the browser on the machine you are using.
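For a JSON response like your maintenance list, a minimal sketch could look like this; note localStorage only stores strings, so serialize with JSON.stringify first (the key name and the unitNumber field are placeholders, not from your API):
// after the API call succeeds, cache the whole maintenance list
$.ajax(settings).done(function (response) {
  localStorage.setItem('maintenanceList', JSON.stringify(response));
});

// later, e.g. on vehicle-info.php, read it back and pick one unit
var list = JSON.parse(localStorage.getItem('maintenanceList') || '[]');
var vehicle = list.find(function (item) {
  return item.unitNumber === '123'; // hypothetical field name and unit number
});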
I would do it roughly like this.
Call the maintenance list from the API with the server-side language of your choice, which seems to be PHP in your case. Let's say the script is called get-list.php. It can be triggered by a cron job running get-list.php at intervals limited to the number of calls that you are allowed to make to that API. Or, if you are not able to create cron jobs, trigger the same get-list.php with an AJAX call (e.g. jQuery.get('sld.tld/get-list.php')) - in this case get-list.php has to figure out whether it's the right time to call the API or not.
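A tiny sketch of that AJAX fallback, assuming get-list.php itself checks whether enough time has passed since the last API call (the interval and path are placeholders):
// ask get-list.php to refresh the cached JSON; it only hits the external
// API when it is actually due, so calling it often is harmless
setInterval(function () {
  jQuery.get('get-list.php');
}, 60 * 60 * 1000); // e.g. once an hour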
Now that you have the data, you can prepare it as you want and store it as a JSON string in a text file or database of your choice. If I understand you correctly, you have a specific dataset for each vehicle, which has to be identified by an id (you named it "unit number"), so your JSON would look something like: {"unit1": { property1: "val1", property2: "val2" }, "unit2": { property1: "valXYZ", property2: "valABC" }} or alike.
Now when you link from vehicle-list.php to vehicle-info.php, you do it with an anchor (or similar) that carries the unit number, for example vehicle-info.php?id=unit1. Of course you can also grab the data with AJAX; it's just important to deliver the corresponding unit number (or id, better to say) to vehicle-info.php and you are good to go.
vehicle-info.php now has all it needs to render the page: the complete dataset stored in the text file or database, and the id (unit number) telling it which part of the whole dataset to extract.
I wanted to give you this different approach because in my experience it should work out so much better. If you are working server-side (e.g. PHP) you have write permissions, which is not the case for client-side JavaScript. Performance is also not so much of an issue; for instance, it doesn't matter if you do heavy manipulation of the dataset at the get-list.php level. It can run for minutes, and once it's done it stores the ready-to-use data, making it statically available without any further impact on performance.
Hope it helps!
If I ran into a similar problem I would just store the data in a database of my own and call it from there. Considering you are only willing/able/allowed to request the data from the API very rarely but need to operate on the data quite frequently (whenever someone clicks on a specific vehicle in your application), this seems like the best course of action.
So rather than querying the data on the client side, I'd call it from the server, store it on the server, and have the client operate on that data.

in Ember.js 2.3, how do I compile a hasMany async call into one call in ember instead of several?

I'm upgrading to ember-cli and Ember 2.3. Say I have a model called User and a model called Post, and a user ...
posts: DS.hasMany('post', {async:true})
Now, this works the way I expect it to, lazily loading data and not loading posts unless they are required in either the .js or the template. So when I do
{{#each user.posts as |post|}}
{{post.title}}
{{/each}}
I get each post to render its title without a problem. However, in my server logs, I see this:
GET /posts/2
GET /posts/7
GET /posts/13
where the numbers are the post ids. This is to be expected, as when I return a user instance from the server, I return a list of the ids as the parameter 'posts'. So the user instance has:
...
'posts': '2,7,13'
...
in its data.
Now my question is this: way back when, when I used ember-data 1.0 (pre ember-cli and pre ember 1.13), I remember this call being made to the database instead for the same use case:
GET /posts?ids=2&7&13
or something along that line. I can't remember the exact format, but then, I could access the list of ids on the server side using this line of code:
var ids = req.query.ids.toString();
which gave me a comma separated list of ids (in string format). I would then convert this into the sql statement
SELECT * from posts where id in (2,7,13)
This SQL call was interpreted as a manyArray, I think, on the Ember Side and easily behaved as you would expect an Ember Array would.
How can I get this to happen again? I am quite confident that I am missing something and that I don't have to 'hack' ember-data; I would very much like to compress these calls into one instead of having an individual call to the database for each 'post'.
I should also mention that I am not looking to make {async:false} for these calls.
I think the thing you are looking for is coalesceFindRequests. This is a setting on your Adapter that tells Ember Data to bunch multiple requests that happen in the same runloop into one GET request, as you had in the past.
You can see more detail here, but essentially all you need to do is add the following to either your ApplicationAdapter (to enable it for all requests for all types) or to your post adapter (so that it only affects the post requests).
Here is an example if you are using pod structure for your files (which I recommend)
// app/application/adapter.js
import DS from 'ember-data';

export default DS.JSONAPIAdapter.extend({
  coalesceFindRequests: true
});
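If you only want it for posts and use the pod structure, a per-model adapter would look roughly the same (a sketch, assuming the pod file layout):
// app/post/adapter.js
import DS from 'ember-data';

export default DS.JSONAPIAdapter.extend({
  coalesceFindRequests: true
});
With coalescing enabled, the three lazy loads from your log are grouped into a single request, e.g. GET /posts?filter[id]=2,7,13 with the JSONAPIAdapter (the RESTAdapter sends something like GET /posts?ids[]=2&ids[]=7&ids[]=13 instead), so on the server you can read the ids from the query string again, much like your old req.query.ids code.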

Dojo dstore: both server-side queries and client-side filtering

I'm a little confused about how to support both server-side queries and client-side filtering with dstore, and am hoping for some guidance. My scenario:
I am communicating with an archive server, so I only have get and query requests, nothing that updates the data.
I want to perform both server-side queries and client-side filtering.
I'd like to cache the results so I'm not accessing the server for every fetch().
If I use a Request, filter() will pass its query parameters to the server, but the data isn't cached and I can't tell how to filter on the client side.
If I use a RequestMemory, filter() is applied to the local cache, and I can't tell how to specify parameters for the server.
All the pieces seem to be there with dstore, I just haven't figured out how to put them all together yet. Thanks for any help.
Looks like I figured it out. There were a couple issues with how I was using RequestMemory. The first was that I didn't realize RequestMemory invoked fetch() automatically. The second issue was that I used an object as the queryParam when it should have been an array.
To meet my requirements I created a new store that extended from Request and Cache, just like RequestMemory, but I did not call fetch() in the postscript() function. Then I could pass parameters to the server:
store.fetch({queryParams: ['key=value']}).then(function(data) {
  console.log("fetch", data);
});
I could then 'freeze' the store by setting store.isValidFetchCache = true and subsequently perform client-side filters:
store.filter({type: 'xyz'}).fetch().then(function(data) {
  console.log("filter", data);
});
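For reference, the custom store itself might look roughly like this; a sketch assuming dstore's Request and Cache mixins (the same ones RequestMemory combines) and a placeholder target URL:
// myapp/ArchiveStore.js (hypothetical module name)
define([
  'dojo/_base/declare',
  'dstore/Request',
  'dstore/Cache'
], function (declare, Request, Cache) {
  // like RequestMemory, but without the automatic fetch() in postscript()
  return declare([Request, Cache], {
    target: '/archive/items' // hypothetical endpoint
  });
});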

How can I chunk results (lazy load?) without rewriting my whole application (Laravel + jQuery, SPA style)

I've developed a web application following the Single Page Application concept, but without any of the modern techs and frameworks.
So I have a jQuery page that dynamically requests data from localhost - a Laravel instance that compiles the entries in the DB (within a given time interval).
So when the client wants to see all the entries for last week, the app works fine. But if he wants to see the results for the whole last month... well, there are so many that the default PHP execution time (30 seconds) isn't enough to process all the data. I can easily override this, of course, but then the jQuery client will loop through these arrays of objects and do stuff with them (sort, find, sum...). So I'm not even sure jQuery can handle this much data.
So my question can be broken in two:
Can Laravel's ->paginate() be used so the jQuery AJAX request can also chunk the data? How does this work (hopefully in a manner that doesn't force me to rewrite all the code)?
How could I store large amounts of information on the client? It's only temporary, but the users will hang around on my webpage for a considerable amount of time, and I don't want them to wait 5 minutes every time they press a button.
Thanks.
If you want to provide an interface to a large amount of data stored in a backend, you should paginate the data. This is a standard approach, so I'm sure your client will be OK with that.
Using pagination is pretty simple - see the docs for Laravel 5.0 here: http://laravel.com/docs/5.0/pagination
In order to paginate results in the backend, you need to call paginate($perPage) on your query instead of get() in your controller, like this:
$users = User::whereIsActive(true)->paginate(15);
This will return a paginated result with 15 records per page. The page number will be taken from the page parameter of the request. In order to get the 3rd page of users, your frontend jQuery app needs to send a request to a URL like:
/users?page=3
I don't recommend caching data in the frontend application. The data can be changed by some other user and you won't even know about it. And with pagination, your requests should be lightweight enough to stop worrying about a request sent to fetch every page of results.
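A hedged jQuery sketch of fetching one page at a time; the /users URL follows the example above, and the response fields (data, current_page, last_page) follow Laravel's default paginator JSON:
function loadPage(page) {
  $.getJSON('/users', { page: page }).done(function (result) {
    result.data.forEach(function (user) {
      // render each record however the existing SPA code does it
    });
    if (result.current_page < result.last_page) {
      // enable a "next page" button, or prefetch the next chunk here
    }
  });
}

loadPage(1);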
Not sure if you're subscribed to Laracasts, but Jeffrey Way is amazing at explaining features of Laravel and I highly recommend his videos.
In short, you can paginate the results, then in the view, when you call the foreach on your items, you can array_chunk() the results to display them how you need to. But the paginated results are going to be fetched using a query in the URL, and I'm not sure that is what you want if you're already using a lot of jQuery to keep everything on the same page.
https://laracasts.com/lessons/crazy-simple-pagination
But assuming you're already paginating the results with whatever jQuery you've already written for the json data...
You could also use a query scope to fetch only the data for the time range you need, and build a simple API to use with AJAX. I think that's probably what you're looking for.
So here's what I would do assuming you're already doing some pagination manually with your javascript.
Create a few query scopes to filter the data for different lengths of time
Create simple routes to fetch results from a URI using the query scopes
Get the JSON data from the routes by performing AJAX requests to the URIs created (see the sketch below)
More information on Query Scopes: http://laravel.com/docs/5.1/eloquent#query-scopes
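As a rough illustration of step 3, assuming routes such as /entries/last-week and /entries/last-month were set up in step 2 (the URIs are only examples):
// fetch just the scoped slice of data and hand it to the existing
// sort/find/sum code instead of loading the whole month at once
$.getJSON('/entries/last-week').done(function (entries) {
  console.log(entries.length + ' entries loaded');
});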

CouchDB run a list function on _changes

I need to create a feed based on changed documents, and I figured the _changes API would be perfect in this case, i.e. simply store the last sequence id client-side, so we can use it to limit the results in the next call to _changes.
Currently, the application performs the following (sketched below):
calls _changes with since/filter parameters
calls a show function for each id in the _changes feed
renders all changes into the customer feed
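For reference, a rough sketch of that flow, assuming a filter named feed/by_customer and a show function feed/render (both names are illustrative):
var lastSeq = 0; // persisted client-side between calls

fetch('/db/_changes?filter=feed/by_customer&since=' + lastSeq)
  .then(function (res) { return res.json(); })
  .then(function (changes) {
    lastSeq = changes.last_seq;
    // one extra request per changed document
    return Promise.all(changes.results.map(function (row) {
      return fetch('/db/_design/feed/_show/render/' + row.id)
        .then(function (res) { return res.text(); });
    }));
  })
  .then(function (fragments) {
    // render all fragments into the customer feed
  });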
What I would like, is to be able to call a list function on the entire _changes result in a single request. This would remove the need to explicitly parse the _changes result on the client, and move that functionality to the server.
The question: is this even remotely possible?
I have been trying to implement a view, doing "almost" the same thing as _changes, but without any real luck.
This is not possible at this time, and the _changes API is different enough from regular views that it's not entirely straightforward to implement.
There is a ticket open in the CouchDB issue tracker, but it hasn't been updated in quite a while.
