I have been working on an AngularJS project recently and came across an interesting problem while trying to create a filter that uses data loaded via an AJAX request.
First, the problem:
An AngularJS filter is a synchronous piece of code (a function) that returns a string which is inserted into your DOM. In most cases this works perfectly fine, e.g. the following filter that capitalizes the first letter:
angular.module('myApp.filters', []).filter('capitalize', function() {
    return function (word) {
        return word.charAt(0).toUpperCase() + word.substr(1);
    };
});
And this works great. But what if I can't return the desired result right away? Say I need to load some data via an AJAX request to produce the result. If I fire the request inside the filter, my return statement will run and return an empty result before the data arrives. So the real question is: how do I notify the filter to update itself once my data is loaded?
Solution:
It turned out that the solution was right there in front of me, but it took me some time to figure out how the magic happens. Say I need a filter that retrieves an artist's biography based on their name (yes, a slightly crazy example, but it proves the point):
angular.module('myApp.filters', []).filter('biography', function($q, $http) {
    var pending = {};
    return function (artist) {
        if ( !(artist in pending) ) {
            pending[artist] = null;
            $http.get('http://developer.echonest.com/api/v4/artist/biographies?api_key=FILDTEOIK2HBORODV&name='
                + artist + '&format=json&results=1&start=0&license=cc-by-sa')
                .then(function (response) {
                    pending[artist] = response.data.response.biographies[0].text;
                });
        }
        return pending[artist] || '';
    };
});
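The filter can then be used like any other filter in a view; for example (artistName is a made-up model name for illustration):
<p>{{ artistName | biography }}</p>  <!-- renders an empty string first, then re-renders once the biography has loaded -->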
It works, but how? I made a GET request and got my result, but how does that force the filter to update itself? The key here is Angular's $q (a promise/deferred implementation inspired by Kris Kowal's Q).
From the Angular documentation:
$q is integrated with the $rootScope.Scope Scope model observation mechanism in angular, which means faster propagation of resolution or rejection into your models and avoiding unnecessary browser repaints, which would result in flickering UI.
This means that whenever the promise is resolved, it triggers an update. In fact, here is the relevant code from Angular:
function done(status, response, headersString, statusText) {
    if (cache) {
        if (isSuccess(status)) {
            cache.put(url, [status, response, parseHeaders(headersString), statusText]);
        } else {
            // remove promise from the cache
            cache.remove(url);
        }
    }

    resolvePromise(response, status, headersString, statusText);
    if (!$rootScope.$$phase) $rootScope.$apply();
}
Hope this was helpful. Here is the example; keep in mind that it makes cross-domain AJAX calls, so you will need to disable your browser's cross-domain policy to run it:
http://jsfiddle.net/pJuZ9/8/
IMPORTANT: Take care not to overwhelm the filter with the same AJAX request over and over (this is what the pending cache above prevents), otherwise you might end up with this:
Error: [$rootScope:infdig] 10 $digest() iterations reached. Aborting!
Watchers fired in the last 5 iterations: []
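To make that warning concrete, here is a rough sketch of the anti-pattern (the URL is a placeholder, not the Echo Nest call above): without a pending cache, every digest re-evaluates the filter, each evaluation fires a new request, and each response schedules yet another digest until Angular gives up.
// DON'T do this: fires a fresh request on every filter evaluation
angular.module('myApp.filters', []).filter('biographyNaive', function ($http) {
    var latest = '';
    return function (artist) {
        $http.get('/biography?name=' + encodeURIComponent(artist)) // placeholder endpoint
            .then(function (response) {
                latest = response.data;   // resolving schedules another digest...
            });
        return latest;                    // ...which re-runs the filter and fires yet another request
    };
});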
Thank you for that post; at least I knew I wasn't completely wrong. I have a very similar scenario:
1) a simple translate filter
module.filter('translate', ['Localization', function (localization) {
    var translateFilter = function (key) {
        return localization.get(key) || "[" + key + "]";
    };
    translateFilter.$stateful = true;
    return translateFilter;
}]);
2) a localization service which stores a JS dictionary that is updated via XHR. That means there are multiple possible states of the dictionary (a rough sketch of such a service follows this list):
no localization before init => empty
default localization after init => non-empty
user-defined localization after default is loaded => non-empty
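For reference, here is a minimal sketch of what such a Localization service might look like; the file name, dictionary shape and load method are my assumptions, not the poster's actual code:
module.factory('Localization', ['$http', function ($http) {
    var dictionary = {};                          // state 1: empty before init

    // state 2: default localization loaded via XHR (assumed path)
    $http.get('i18n/default.json').then(function (response) {
        angular.extend(dictionary, response.data);
    });

    return {
        get: function (key) {
            return dictionary[key];               // undefined until a dictionary has arrived
        },
        // state 3: user-defined localization merged on top of the default (assumed API)
        load: function (url) {
            return $http.get(url).then(function (response) {
                angular.extend(dictionary, response.data);
            });
        }
    };
}]);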
My problem was that the filter result wasn't updated (re-rendered) after the XHR call ended, even though a digest was performed. I had to add
translateFilter.$stateful = true;
to make it work. Even changing the filter's dependency from a service to a value did not help. Someone might find this helpful, or even better, tell me what I was doing wrong :-)
Related
I'm working on a project to develop a datasource plugin for Grafana. This means that I'm stuck with what appears to be reasonably old versions of some of the AngularJS libraries/modules. The Grafana project also seems to have pulled out the use of $q, and as a result I'm trying to work out how to use native Promise objects where possible (Promises are also something I'm new to).
I've got a bs-typeahead form input which correctly calls the following method, whose promise returns results:
getOptions(query) {
    console.log('Getting options');
    return this.datasource.metricFindQuery(query || '').then(a => {
        console.log(a);
        this.scope.$digest();
        return a;
    });
}
However, the bs-typeahead dropdown doesn't appear with the results, even though an array containing the expected results is logged to the console.
With this.scope.$digest(); in the function, I get an error of $digest already in progress, and so now I'm stuck with where/how I should be calling $scope.$digest(), or if that's the best approach. If I remove that line I don't get an error, but no results appear.
I've taken a look at a few different suggestions to try and get this to work, but haven't had any success thus far.
If I swap out the getOptions return for a plain array (e.g. ['a','b','c']) the typeahead works without any issue, so I'm confident the issue is with the Promise.
It seems like $scope.$apply could be an option, but again I'm not sure where it should sit in the context of the codebase.
What should I be doing to get the promise to resolve appropriately in light of the bs-typeahead?
The whole Javascript file that the above function resides in is available here.
Beyond assistance with my immediate question, an explanation of how the Promise(s) resolve in my particular context would be a great help in making sure I'm understanding the concept correctly.
It seems that your typeahead is not able to handle a Promise.
You will have to manually provide an array and update it once the Promise is resolved.
getOptions(query) {
    let result = [];
    console.log('Getting options');
    this.datasource.metricFindQuery(query || '').then(a => {
        console.log(a);
        a.forEach(item => result.push(item));
    });
    return result;
}
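If the dropdown still doesn't refresh once the items are pushed in, one possible variation (an assumption on my part, not something confirmed for bs-typeahead) is to queue a digest with $scope.$evalAsync, which, unlike a direct $digest(), won't collide with a digest that is already in progress:
getOptions(query) {
    let result = [];
    this.datasource.metricFindQuery(query || '').then(a => {
        a.forEach(item => result.push(item));
        // $evalAsync schedules a digest only if one is not already running,
        // so it avoids the "$digest already in progress" error
        this.scope.$evalAsync();
    });
    return result;
}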
My question may come across as a bit weird, but the problem I'm having is actually very strange as well.
For creating custom search filters I'm using JavaScript's .map and .reduce functions on my scope data. Take a look at the following code:
$http.get('config/get/getOrders.php', {cache: true}).then(function (response) {
    $scope.orders = response.data.orders.order;
    $scope.orders.map(function addPlace(item) {
        item.firstname = $scope.customers.reduce(function (a, customers) {
            return item.id_customer === customers.id ? customers.firstname : a;
        }, '');
        return item;
    });
});
In the $http.get() I'm requesting a JSON file with data. This data consists of some orders, and each order has an id_customer value that links the order to the customer who placed it.
I wanted to widen my search function with that customer's name; for more information about how and why, take a look at this question I asked yesterday.
And it worked: the filter could also search on customer_firstname. But then I wanted to use the same type of function in another controller. The goal stays the same: connecting data from multiple $scope collections. Strangely, the result this time is TypeError: Cannot read property 'reduce' of undefined. Yes, I've placed this in the same app.js, just in another controller. I've checked that the data of $scope.products, $scope.stock_availables and $scope.productCombinations exists, and it does; I've checked the providers etc. and everything looks fine to me, so I have no clue why this is happening. The function in this case is:
$http.get('config/get/getProducts.php', {cache: true}).then(function (response) {
    $scope.products = response.data.products.product;
    $scope.products.map(function addPlace(item) {
        item.eanCombination = $scope.productCombinations.reduce(function (a, productCombinations, stock_availables) {
            return item.id === stock_availables.id + stock_availables.id === stock_availables.id_product_attribute + stock_availables.id_product_attribute === productCombinations.id ? productCombinations.ean13 : a;
        }, '');
        return item;
    });
});
Short summary: a function works in one controller, but the same function doesn't work in another controller.
If you have any questions please ask them in the comments.
As always, Thanks in advance!
It turned out I made a mistake in the order in which the data is loaded: $scope.productCombinations was filled only after the products request had already run. So, a tip for everyone: always check the order in which your asynchronous data actually arrives.
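For what it's worth, here is a minimal sketch of one way to enforce that order by chaining the requests; the getProductCombinations.php endpoint, the response shape and the id matching are assumptions for illustration, not my actual code:
// Load the combinations first, then map the products once they are available.
$http.get('config/get/getProductCombinations.php', {cache: true})
    .then(function (response) {
        $scope.productCombinations = response.data.combinations.combination; // assumed shape
        return $http.get('config/get/getProducts.php', {cache: true});
    })
    .then(function (response) {
        $scope.products = response.data.products.product;
        $scope.products.map(function addEan(item) {
            item.eanCombination = $scope.productCombinations.reduce(function (a, combination) {
                return item.id === combination.id_product ? combination.ean13 : a; // assumed matching field
            }, '');
            return item;
        });
    });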
I have a few controllers that call the method getData() from a service.
In order not to make extra HTTP calls for the same JSON file, I'm using something like this:
QuizApp.service('quizService', ['$http', function ($http) {
    var quizService = {},
        quizData;

    quizService.getData = function () {
        return quizData ? quizData : quizData = $http.get('quiz.json');
    };

    return quizService;
}]);
...but things don't work properly if I do it like that (the data is used to populate a slider and a thumbnail gallery with angular-slick, and some problems arise; for now that maybe doesn't matter, I just want to know whether the code above makes sense).
On the other hand, if I write getData() like this:
QuizApp.service('quizService', ['$http', function ($http) {
    var quizService = {},
        quizData;

    quizService.getData = function () {
        quizData = $http.get('quiz.json');
        return quizData;
    };

    return quizService;
}]);
... which will make several HTTP requests for the same JSON file (which doesn't look like good practice to me), everything works fine and the angular-slick gallery works properly. Though not 100% of the time: somewhat randomly, things go wrong there too (same symptoms; I could describe them, but again, I don't think that's the point here).
So, in general, regardless of the context, which of those versions of getData() looks right, which doesn't, and why?
UPDATE
As Mark pointed out, Angular has a built-in cache, but it's disabled by default. Here is a post and here is the documentation.
If I cache the result of the HTTP request, though, I get the same problem (not described here) that I was getting with my second option, so it apparently has nothing to do with that.
Actually, it seems that if I repeat the HTTP request (as in my second snippet of code) things happen to work by chance (90% of the time?).
So, by caching the result, I at least get a consistent result, which in this case means the slick-angular thing never works properly and I have to look for a solution somewhere else.
Angular's $http has a built-in cache which you could make use of here. You can cache all $http requests, which is probably a bad idea, or specific ones.
On a very simple level, this should work for you:
quizService.getData = function () {
    return $http.get('quiz.json', {cache: true}).then(quizData => {
        return quizData;
    });
};
You can find out more in the Angular docs for $http.
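For completeness, a rough sketch of how a controller might consume this; QuizCtrl and the quizzes property are made-up names for illustration:
QuizApp.controller('QuizCtrl', ['$scope', 'quizService', function ($scope, quizService) {
    // Every controller calls getData(); with {cache: true} the GET is only issued once,
    // and each caller receives the response via the promise's then callback.
    quizService.getData().then(function (response) {
        $scope.quizzes = response.data;
    });
}]);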
I'm using Parse cloud code to update some counters on a user when after_delete is called on certain classes. A user has counters for subscriptions, followers and following that are incremented in the before_save for subscriptions and follows and decremented in the before_delete for the same classes.
The issue I'm running into is when a user is deleted. The after_delete function destroys all related subscriptions/follows, but this triggers an update to the (deleted) user via before_delete for subscriptions/follows. This always causes the before_delete to error out.
Perhaps I'm conceptually mixed up on the best way to accomplish this, but I can't figure out how to properly set up the following code in follow before_delete:
var fromUserPointer = follow.get("fromUser");
var toUserPointer = follow.get("toUser");

fromUserPointer.fetch().then(function (fromUser) {
    // update following counter
    // if fromUser is already deleted, none of the rest of the promise chain is executed
}).then(function (fromUser) {
    return toUserPointer.fetch();
}).then(function (toUser) {
    // update followers count
});
Is there a way to determine if the fromUserPointer and toUserPointer point to a valid object short of actually performing the fetch?
It's not an error not to find the user, but by not handling the missing-object case on the fetch, it is implicitly being treated as an error.
So...
fromUserPointer.fetch().then(function (result) {
    // good stuff
}).then(function (result) {
    // good stuff
}).then(function (result) {
    // good stuff
}, function (error) {
    // this is good stuff too: if there's no mode of failure
    // above that would cause you to want NOT to delete, then...
    response.success();
});
I'm trying to reference one item in an array, and I have no idea why this is not working:
console.log($scope.Times);
console.log($scope.Times[0]);
These two lines of code come EXACTLY after each other, but the output I get from the console is the following:
Output from the console
Any ideas why this is not working? The commands are exactly after each other, as I mentioned, and in the same function; the variable is global in my controller.
I can add more code if you think it would help, but I don't really see how.
Some more code:
$scope.Times = [];

$scope.getStatus = function (timer) {
    $http.post('getStatus.php', {timer: timer})
        .success(function (response) {
            $scope.response = response;
            if ($scope.response.Running === "0") {
                $scope.model = { ItemNumber: $scope.response.Part };
                $scope.loadTiming($scope.response.Part);
                console.log($scope.Times);
                console.log($scope.Times[0]);
            }
        });
};

$scope.loadTiming = function (itemNumber) {
    $http.post('getPartTimings.php', {itemNumber: itemNumber})
        .success(function (response) {
            $scope.selectedTiming = response;
            $scope.Times.splice(0);
            var i = 0;
            angular.forEach($scope.selectedTiming, function (value) {
                if (value !== 0)
                    $scope.Times.push({
                        "Process": $scope.procedures[i],
                        "Duration": value * 60
                    });
                i++;
            });
        });
};
<?php
$postData = file_get_contents("php://input");
$request = json_decode($postData);

require "conf/config.php";

mysqli_report(MYSQLI_REPORT_STRICT);

try {
    $con = mysqli_connect(DBSERVER, DBUSER, DBPASS, DBNAME);
} catch (Exception $exp) {
    echo "<label style='font-weight:bold; color:red'>MySQL Server Connection Failed. </label>";
    exit;
}

$query = 'SELECT *,
    TIME_TO_SEC(TIMEDIFF(NOW(),Timestamp))
    FROM live_Timers
    WHERE Timer=' . $request->timer;

$result = mysqli_query($con, $query);
$data = mysqli_fetch_assoc($result);
echo JSON_ENCODE($data);
Thanks for your help.
OK, so more code does help. It looks like you have asynchronous logic happening here. loadTiming is fired, which does a POST and then a splice on the Times array. One console.log could be firing before this POST and the other after. There's no easy way to tell.
One possible fix would be to only log these once the loadTiming async process has completed. Return a promise from the loadTiming function and then log your array in the promise's then callback.
$scope.getStatus = function (timer) {
    $http.post('getStatus.php', {timer: timer})
        .success(function (response) {
            $scope.response = response;
            if ($scope.response.Running === "0") {
                $scope.model = { ItemNumber: $scope.response.Part };
                $scope.loadTiming($scope.response.Part).then(function () {
                    console.log($scope.Times);
                    console.log($scope.Times[0]);
                });
            }
        });
};

$scope.loadTiming = function (itemNumber) {
    return $http.post('getPartTimings.php', {itemNumber: itemNumber})
        .success(function (response) {
            $scope.selectedTiming = response;
            $scope.Times.splice(0);
            var i = 0;
            angular.forEach($scope.selectedTiming, function (value) {
                if (value !== 0)
                    $scope.Times.push({
                        "Process": $scope.procedures[i],
                        "Duration": value * 60
                    });
                i++;
            });
        });
};
I think your issue is a $scope reference issue.
I would try this:
$scope.vm = {};
$scope.vm.Times = [];
Adding the "." is Angular best practice when attaching to $scope. This is best described here Understanding Scopes
I experienced a similar situation a while ago, related to this issue.
Since then, I've encountered related issues a bunch of times (AngularJS, due to its cyclic nature, seems prone to producing this behaviour).
In your case, using JSON.stringify($scope.Times) might "fix" this.
Context
Usually this happens in this context:
An async call or an expensive DOM manipulation is made.
You make 2 (or more) calls to console.log in between.
The state of the DOM or object is changed
The output shows inconsistent (and strange) results
How
Take this example:
console.log(someObject);
console.log(someObject.property);
After digging a lot (and talking to Webkit developers) this is what I've found:
The second call to console.log is "resolved" first.
Why?
In your case, this has to do with how the Console handles objects and "expressions" differently:
An "expression" is resolved at the time of the call, while for objects, a reference to the object is stored instead.
Note that expression is used loosely here. You can observe this behaviour in this fiddle
More in-depth analysis
Regarding display discrepancies, the behaviour posted above is not the only gotcha with the Console. In fact, it is related to how the Console works.
Console is an external tool
First you must realize that Console is an external tool and not part of the ECMAScript spec. Implementations differ between browsers and it shouldn't be used in production. It certainly won't work the same for every user.
Console is a non-standard external tool and is not on a standards track.
Console is dynamic
The Console is a very dynamic tool. With the Console you can make assertions (test), time and profile your code, group log entries, connect remotely to your server and debug server-side code. You can even change code itself, at runtime. So...
The Console is not just a static log display... its dynamic nature is one of its strongest features.
Console has a slight delay
Being an external dynamic tool, the Console works as a watcher process attached to the JavaScript engine.
This is useful for debugging and, among other things, prevents the Console from inadvertently blocking execution of the script. A simple and crude way of thinking about this is to picture console.log as a kind of non-blocking async call. This means that:
With the Console, there's a slight delay between 1) call, 2) processing and 3) output.
However, calling the Console is not "instant" per se. In fact, it can, by itself, delay script execution. If you mix this with complex DOM manipulation and events, it can cause weird behaviour.
I've encountered an issue with Chrome when using MutationObserver and console.log. It happened because DOM painting was delaying the update of the DOM object, but the event triggered by that DOM change was fired nevertheless. This meant the event callback was executed and finished before the DOM object was fully updated, resulting in an invalid reference to the DOM object.
Using console.log in the observer caused a brief delay in the callback execution that, most of the time, was enough to let the DOM object update first. This shows that console.log delays code execution.
But even when an invalid-reference error occurred, console.log ALWAYS showed a valid object. Since the object couldn't have been changed by the code itself, this shows there is a delay between the call to console.log and the processing.
Console log order matches the code path
The order of Console log entries is unaffected by whether an entry has since been updated. In other words,
The order of the log entries reflect the order in which they are called, not their "freshness"
So, if an object is updated, it does not move to the end of the log. (makes sense to me)
Counterintuitive behaviour
This can lead to a number of possible counterintuitive behaviours because one might expect a console.log to be some kind of snapshot of the object, not a reference to it.
For instance, in your case, the object is changed between the call to console.log and the end of the script.
At the time of calling, $scope.Times is empty, so $scope.Times[0] is undefined.
However, the $scope.Times array is updated afterwards.
When the Console report is displayed, it shows an updated version of the object.
Fix
In your case, transforming the object into an "expression" can solve the "issue". For instance, you can use JSON.stringify($scope.Times).
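For example (angular.toJson is just Angular's wrapper around JSON.stringify):
console.log(JSON.stringify($scope.Times));        // logs a string snapshot taken at call time
console.log(angular.toJson($scope.Times, true));  // same idea, pretty-printed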
Debate
It is debatable whether the way the Console handles objects is a bug or a feature. Some propose that, when called with an object, console.log should clone that object, making a kind of snapshot. Others argue that storing a reference to the object is preferable, since you can easily create a snapshot yourself if you wish to do so.