Angular $resource, save and re-render - javascript

I'm using $resource to interact with my backend.
I have a page that shows the list of items coming from $resource.query().
gcResourceClubs.query({
    page: ($scope.currentPage - 1) * $scope.itemsPerPage,
    size: $scope.itemsPerPage
})
.$promise.then(function (res) {
    $scope.clubs = res;
});
(I use a promise because the values are used to set the pagination in the html)
Now, in another directive (the one of the form) I perform this operation.
gcResourceClubs.save(club); which saves the club to the server.
(I tried with scope.clubs.$save(club), but it POSTs the data as url-encoded rather than as a JSON object, so it does not work with my backend. I thought that in this way the list of objects would be automatically updated; is that true?)
Now, when the resource is created, how can I easily re-render the list?
I thought of chaining a promise on the save and doing the same logic as in the query's promise, or of redirecting the app to the page that displays the list.
What is the "correct" way to do this? Isn't there anything automated for this? (I used Backbone for a while, and there I had events, but in Angular with directives and controllers I'm not sure that's the correct way of doing it.)
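A minimal sketch of the promise-over-save approach, assuming the query from above and that the save and the list share a scope:

gcResourceClubs.save(club).$promise.then(function () {
    // after a successful save, re-run the same query that backs the list
    return gcResourceClubs.query({
        page: ($scope.currentPage - 1) * $scope.itemsPerPage,
        size: $scope.itemsPerPage
    }).$promise;
}).then(function (res) {
    $scope.clubs = res; // data binding re-renders the list and the pagination
});

If the form directive and the list controller don't share a scope, a common alternative is to $rootScope.$broadcast a custom event (e.g. 'club:saved') from the form and re-run the query inside a $scope.$on listener in the list controller; that is roughly the Angular counterpart of the Backbone events mentioned in the question.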

Related

Cache API with MVC Views

I have a basic MVC form and I've been trying to use the JavaScript Cache API to cache all my CSS, JS, and HTML files so that when users (people in the field) do not have reliable access, they can still use my web form. Obviously I'm using IndexedDB and service workers as well to check for a connection and save locally when a connection is not available, syncing when it is available.
I've gone through some tutorials and everything seems straightforward when dealing with caching actual, physical files (CSS, HTML, JS). MVC is weird, though, since you're routing. I created the basic Index, Create, Edit, and Details views. When I create an array of URLs to cache such as
var urlsToCache = [
    '/App/Details',
    '/App/Edit',
    '/App/Create',
    '/App/Index',
    '/App/Content/bootstrap.css',
    '/App/Content/site.css',
    '/App/Scripts/jquery-1.10.2.js',
    '/App/Scripts/jquery.form.js',
    '/App/sw.js',
    '/App/Scripts/bootstrap.js'
];
...everything caches except for Details and Edit. Index and Create cache fine. I'm actually surprised the latter two cache at all, since they aren't physical files. I'm assuming Details and Edit don't cache because they don't work without querystring parameters.
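For reference, a minimal install handler that consumes an array like urlsToCache might look roughly like this (the cache name is arbitrary):

// sw.js -- minimal install handler; 'app-cache-v1' is an arbitrary cache name
self.addEventListener('install', function (event) {
    event.waitUntil(
        caches.open('app-cache-v1').then(function (cache) {
            // cache.addAll() rejects if any single request fails,
            // e.g. a view that returns 400 when no ID is supplied
            return cache.addAll(urlsToCache);
        })
    );
});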
Is it POSSIBLE to cache these two views at all? Or does anyone know of anything on NuGet that addresses this situation?
I changed the GET method for my Edit action to return an empty model if there is no ID:
if (id == null)
{
    // return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
    return View();
}
This allowed me to load the Edit page without a querystring variable and not get an error message. The page loads with no data but it allows me to cache it. At this point I suppose I would have to tell my service worker to check if the page is online. If it is, route the request normally, else query the local storage and manually plug the values into the fields.
So let this be a lesson to anyone creating offline-enabled apps with MVC and using the Cache API. Get rid of the lines that return bad-request errors in your CRUD views when ID numbers aren't passed; just pass back a blank model to the view (return View()). This allows you to cache your pages. You'll obviously need to handle the offline retrieval and presentation in code that executes when the page loads, but it will still allow you to utilize the MVC/Razor features when online.
One thing to note: "/App/Edit" will cache. If you load "/App/Edit/2", it won't match a URL in your cache, so you'll get an offline message. However, you can easily modify your Index page to send the ID via POST. Just have a form on the page that goes to the Edit action and change the link to an underlined span with an onclick that sets the value of a hidden field to the ID. You'll have to pass another hidden field to let it know that it needs to retrieve instead of update (since the controller has different GET and POST actions for Edit). The GET action is useless, but keep it for caching. The retrieval that you normally would do in the GET is now done in the POST, with an if statement that checks your hidden field flag.
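A rough sketch of the "check if the page is online" idea as a network-first fetch handler (the offline data retrieval itself would still live in page scripts):

// sw.js -- network-first fetch handler (sketch)
self.addEventListener('fetch', function (event) {
    event.respondWith(
        fetch(event.request)
            .then(function (response) {
                return response; // online: let MVC/Razor render normally
            })
            .catch(function () {
                // offline: fall back to the cached shell (e.g. '/App/Edit');
                // page scripts then fill the fields from IndexedDB
                return caches.match(event.request);
            })
    );
});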

What's the correct way to handle complex objects with Restangular?

In the app on which I'm working I use a REST API with Angular 1 on the frontend. I use the extendModel function of Restangular to initialize the data that I receive. During the initialization phase I create a lot of cross-references and the object becomes cyclic. Here comes my problem: in order to PUT or POST my data back to the server, I have to either copy the object, picking just the fields that I need, or work on the same object, deleting all the references that I created and restoring them when I get a response from the server. I feel like these two options kind of go against the intended usage of Restangular. Is there a better way to do this?
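One sketch of the first option (copying just the fields you need) that leaves the cyclic object untouched; the field names and the 'items' route are hypothetical, and only plain JavaScript plus Restangular's post() is used:

// Build a flat payload instead of sending the cross-referenced graph.
// `model.parent` and `model.children` stand in for the cyclic references.
function toPayload(model) {
    return {
        id: model.id,
        name: model.name,
        parentId: model.parent ? model.parent.id : null,
        childIds: (model.children || []).map(function (c) { return c.id; })
    };
}

Restangular.all('items').post(toPayload(item));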

Retain data on browser refresh in AngularJS

I am working on an AngularJS application with a Spring REST backend. I am pretty new to AngularJS.
I have a UI page where I display a list of objects in a table. There is an edit button against every record. When I click edit, another page opens which sets the data accordingly.
The issue is that, being on the edit page, if I refresh the browser, I lose my data. One way I can think of is to make another REST call, but I want to avoid making any REST call.
Is there any way to retain the data on the page on refresh, or is making a REST call the better solution?
I would think the cleanest way to do it is to make a REST call, as the data could have changed on the server. However, if you want to avoid the call anyway, you may use localStorage. Just store the data from your table in localStorage as key-value pairs (assign some unique key to every row). You could use this plugin: Angular LocalStorage
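A minimal sketch of the localStorage fallback, assuming the record has an id field; 'editRecord' is an arbitrary key and openEdit() is a hypothetical helper:

// Before navigating to the edit page: persist the selected row
function openEdit(record) {
    localStorage.setItem('editRecord', JSON.stringify(record));
    $location.path('/edit/' + record.id);
}

// In the edit controller: restore the row after a browser refresh
var saved = localStorage.getItem('editRecord');
$scope.record = saved ? JSON.parse(saved) : null;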

How can I chunk results (lazy load?) without rewriting my whole application (Laravel + jQuery, SPA style)

I've developed a web application with the concept of Single Page Application but none of the modern techs and frameworks.
So I have a jQuery page that dynamically requests data from localhost - a Laravel instance that compiles the entries in the DB (within a given time interval).
When the client wants to see all the entries for last week, the app works fine. But if he wants to see the results for the whole last month... well, there are so many that PHP's default execution time (30 seconds) isn't enough to process all the data. I can easily override this, of course, but then the jQuery client will loop through these arrays of objects and do stuff with them (sort, find, sum...), so I'm not even sure jQuery can handle that much data.
So my question can be broken in two:
Can Laravel's ->paginate() be used so that the jQuery AJAX request can also chunk the data? How does this work (hopefully in a manner that doesn't force me to rewrite all the code)?
How could I store large amounts of information on the client? It's only temporary, but the users will hang around for a considerable amount of time on my webpage, and I don't want them to wait 5 minutes every time they press a button.
Thanks.
If you want to provide an interface to a large amount of data stored in a backend, you should paginate the data. This is a standard approach, so I'm sure your client will be ok with that.
Using pagination is pretty simple - see the docs for Laravel 5.0 here: http://laravel.com/docs/5.0/pagination
In order to paginate results in the backend, you need to call paginate($perPage) on your query instead of get() in your controller, like this:
$users = User::whereIsActive(true)->paginate(15);
This will return a paginated result with 15 records per page. The page number is taken from the page parameter of the request. In order to get the 3rd page of users, you'll need your frontend jQuery app to send a request to a URL like:
/users?page=3
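On the jQuery side, the request stays lightweight if you fetch one page at a time. In the sketch below, renderRows() and renderPager() are hypothetical helpers, and the response shape follows Laravel's default paginator JSON (data, current_page, last_page, ...):

// Fetch one page of users from the paginated endpoint
function loadUsers(page) {
    $.getJSON('/users', { page: page }, function (res) {
        renderRows(res.data);                      // only 15 records at a time
        renderPager(res.current_page, res.last_page);
    });
}

loadUsers(3); // requests /users?page=3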
I don't recommend caching data in the frontend application. The data can be changed by some other user and you won't even know about it. And with pagination, your requests should be lightweight enough to stop worrying about a request sent to fetch every page of results.
Not sure if you're subscribed to Laracasts, but Jeffrey Way is amazing at explaining the features of Laravel and I highly recommend his videos.
In short, you can paginate the results, then in the view, when you call the foreach on your items, you can array_chunk() the results to display them how you need to. But the paginated results are going to be fetched using a query string in the URL, and I'm not sure that is what you want if you're already using a lot of jQuery to keep everything on the same page.
https://laracasts.com/lessons/crazy-simple-pagination
But assuming you're already paginating the results with whatever jQuery you've already written for the JSON data...
You could also use query scopes to get the data for the time range you need and build a simple API to use with AJAX. I think that's probably what you're looking for.
So here's what I would do, assuming you're already doing some pagination manually with your JavaScript:
Create a few query scopes to filter the data for different lengths of time.
Create simple routes to fetch results from a URI using the query scopes.
Get the JSON data from those routes by performing AJAX requests to the URIs you created.
More information on Query Scopes: http://laravel.com/docs/5.1/eloquent#query-scopes

Staging area for JSON filled views

I have a SPA webapp that calls a webservice to gather 'X' number of JSON objects (~1-30+). I then use this data for multiple changing slides (all of the data is not displayed in the first slide).
I am using Node/Express/Angular/Jade.
How should I stage these slides when I gather the original data from the webservice (I can only call the service once because of $ constraints)? I would like the back button/URLs to work as well. So, should I completely render out the data and use client-side JS to hide/show the DOM elements based on button clicks (incorporating messy hash bangs and JS methods to track location/flow)? Or is there a sexier, more efficient way? Should I store my data in the cache and pull from it based on the slide (using my AngularJS ng-view; note: changing my ng-view will be a pain... it would be ideal to have an ng-view within an ng-view in this particular situation, even though that doesn't exist)? Or is there another way?
Thank you for your help, let me know if you need further explanation.
For mapping URLs to your AngularJS pages, I would suggest using ui-router. You may or may not need ui-router for this particular problem, but generally it will help tremendously in organizing the structure of your site.
For the other questions:
I would store your results (which was retrieved from the service) in a $rootScope variable. The page index of your slides will be a parameter in your URL. Based on the value of this parameter, your controller can decide which page content it will display.
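A small sketch of that idea with ui-router; the state name, template and slideData variable are illustrative:

// One state per slide index; the data itself is fetched once and kept on $rootScope
$stateProvider.state('slides', {
    url: '/slides/:index',
    templateUrl: 'slide.html',
    controller: function ($scope, $rootScope, $stateParams) {
        var i = parseInt($stateParams.index, 10) || 0;
        $scope.slide = ($rootScope.slideData || [])[i];
    }
});

Because each slide then has its own URL, the back button and deep links work without any hash-bang bookkeeping of your own.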
