I'm trying to use React with the Flux architecture and have stumbled on a restriction I can't work around.
The problem is as follows:
There's a store that listens to an event. The event carries an object id. We need to fetch the object if necessary and mark it as selected.
If the store doesn't have an object with this id, the object is queried, and in the callback we dispatch another event to the store responsible for selection.
If the store already has the object, I'd like to dispatch the selection event right away, but I can't, because a dispatch is already in progress.
The best solution I've come up with so far is wrapping the inner dispatch in setTimeout(f, 0), but it looks scary.
The problem is actually quite general: how should I organize a chain of dispatches without nesting them (and without violating the current Flux restrictions) when each new dispatch depends on the result of handling the previous one?
Does anybody have a good approach to this kind of problem?
function selectItem(item) {
  AppDispatcher.dispatch({
    actionType: AppConstants.ITEM_SELECT,
    item: item
  });
}
// Item must be requested and selected.
// If it's in store - select it.
// Otherwise fetch and then select it.
SomeStore.dispatchToken = AppDispatcher.register((action) => {
  switch (action.actionType) {
    case AppConstants.ITEM_REQUESTED:
      var item = SomeStore.getItem(action.itemId);
      if (item) {
        // Won't work because we can't dispatch in the middle of a dispatch
        selectItem(item);
      } else {
        // Will work
        $.getJSON(`some/${action.itemId}`, (item) => selectItem(item));
      }
  }
});
Are you writing your own dispatcher? setTimeout(f, 0) is a fine trick; I do the same thing in my minimal flux here. Nothing scary there. JavaScript's concurrency model is pretty simple.
More robust flux dispatcher implementations should handle that for you.
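For illustration, deferring the nested dispatch until the current one has unwound might look like this (deferDispatch is just a made-up helper name, not part of the Flux dispatcher API):
function deferDispatch(payload) {
  // queue the dispatch so it runs after the current dispatch finishes
  setTimeout(function() {
    AppDispatcher.dispatch(payload);
  }, 0);
}

// inside the ITEM_REQUESTED handler:
if (item) {
  deferDispatch({
    actionType: AppConstants.ITEM_SELECT,
    item: item
  });
}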
If ITEM_SELECT is an event that another Store is going to handle:
You are looking for dispatcher.waitFor(array<string> ids): void, which lets you use the SomeStore.dispatchToken that register() returns to enforce the order in which Stores handle an event.
The store that would handle the ITEM_SELECT event, say we call it OtherStore, should instead handle the ITEM_REQUESTED event, but call dispatcher.waitFor( [ SomeStore.dispatchToken ] ) first, and then get whatever result is interesting from SomeStore via a public method, like SomeStore.getItem().
But from your example, it seems like SomeStore doesn't do anything to its internal state with ITEM_REQUESTED, so you just need to move the following lines into OtherStore with a few minor changes:
// OtherStore.js
case AppConstants.ITEM_REQUESTED:
  // skip the waitFor() entirely if SomeStore isn't doing anything with ITEM_REQUESTED
  dispatcher.waitFor([SomeStore.dispatchToken]);
  var item = SomeStore.getItem(action.itemId);
  if (item) {
    // Don't dispatch an event; let other stores handle this same event if necessary
    OtherStore.doSomethingWith(item);
  } else {
    // Will work
    $.getJSON(`some/${action.itemId}`, (item) => OtherStore.doSomethingWith(item));
  }
And again, if another store needs to handle the result of OtherStore.doSomethingWith(item), they can also handle ITEM_REQUESTED, but call dispatcher.waitFor( [ OtherStore.dispatchToken ] ) before proceeding.
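Sketched out, a third store waiting on OtherStore could look like this (ThirdStore and getSomething() are illustrative names, not part of your code):
// ThirdStore.js
ThirdStore.dispatchToken = dispatcher.register(function(action) {
  switch (action.actionType) {
    case AppConstants.ITEM_REQUESTED:
      // let OtherStore finish handling this same action first
      dispatcher.waitFor([OtherStore.dispatchToken]);
      // then read whatever OtherStore exposes publicly
      var item = OtherStore.getSomething();
      // ...update ThirdStore's own state with it
      break;
  }
});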
So, in looking at your code, are you setting a "selected" property on the item so it will be checked/selected in your UI/Component? If so, just make that part of the function you are already in.
if (item) {
  item.selected = true;
  // we're done now, no need to create another Action at this point,
  // we have changed the state of our data, now alert the components
  // via emitChange()
  emitChange();
}
If you're wanting to track the currently selected item in the Store, just have an ID or an object as a private var up there, and set it similarly.
var Store = (function(){
  var _currentItem = {};
  var _currentItemID = 1;

  function selectItem(item) {
    _currentItem = item;
    _currentItemID = item.id;
    emitChange();
  }

  (function() {
    Dispatcher.register(function(action) {
      switch (action.actionType) {
        case AppConstants.ITEM_REQUESTED:
          var item = SomeStore.getItem(action.itemId);
          if (item) {
            selectItem(item);
          } else {
            $.getJSON(`some/${action.itemId}`, (item) => selectItem(item));
          }
      }
    });
  })();

  return {
    getCurrentlySelectedItem: function() {
      return _currentItem;
    },
    getCurrentlySelectedItemID: function() {
      return _currentItemID;
    }
  };
})();
Ultimately, you don't have to create Actions for everything. Whatever the item is that you're operating on, it should be some domain entity, and it is your Store's job to manage the state of that specific entity. Having other internal functions is often a necessity, so just make selectItem(item) an internal function of your Store so you don't have to create a new Action to access it or use it.
Now, if you have cross-store concerns, and another Store cares about some specific change to some data in your initial Store, this is where the waitFor(ids) function will come in. It effectively blocks execution until the first Store is updated, then the other can continue executing, assured that the other Store's data is in a valid state.
I hope this makes sense and solves your problem, if not, let me know, and hopefully I can zero in better.
Related
I have 2 event listeners that operate on the same shared data/state. For instance:
let sharedState = {
  username: 'Bob',
  isOnline: false,
};

emitter.on('friendStatus', (status) => {
  sharedState.isOnline = status.isOnline;
});

emitter.on('friendData', (friend) => {
  if (sharedState.isOnline) {
    sharedState.username = friend.username;
  }
});
My problem is that these events can be emitted in any order. The friendData event might come in before friendStatus, but friendData does something with the data returned from friendStatus. In other words, I need the handler for friendData to execute after friendStatus, but I don't have that assurance from the event emitter's perspective, so I need to implement it in my code somehow.
Now of course I could simply remove the if (sharedState.isOnline) { check from the friendData listener and let it run its course. Then I'd have a function run after both handlers have finished and reconcile the shared-state dependencies:
emitter.on('friendStatus', (status) => {
  sharedState.isOnline = status.isOnline;
  reconcileStateBetweenUsernameAndIsOnline();
});

emitter.on('friendData', (friend) => {
  sharedState.username = friend.username;
  reconcileStateBetweenUsernameAndIsOnline();
});
The problem is that this reconciliation function knows about this specific data-dependency use case and hence cannot be very generic. With large, interconnected data dependencies this seems a lot harder to achieve; I am already dealing with other subscriptions and other data dependencies, and my reconciliation function is becoming quite large and complicated.
My question is: is there a better way to model this? For instance if I had the assurance that the handlers would run in a specific order I wouldn't have this issue.
EDIT: expected behavior is to use the sharedState and render a UI where I want the username to show ONLY if the status isOnline is true.
From #Bergi's answer in the comments, the solution I was hinting at seems the most appropriate for this case. Simply let the event handlers set their own independent state, then observe the values as they change and write the appropriate logic based on what you need to do. For instance, I need to show a username; this function shouldn't care about ordering or have any knowledge of time: it should simply check whether the isOnline status is true and whether there's a username. The observer pattern can then be used to call this function whenever any of its dependencies change. In this case the function depends on status.isOnline and friend.username, so it will re-execute whenever those values change.
function showUsername() {
  if (status.isOnline && friend.username != '') return true;
}
This function must observe the properties it depends on (status.isOnline and friend.username). You can have a look at RxJS or other libraries for achieving this in a more "standard" way.
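A rough sketch of that idea with RxJS (isOnline$ and username$ are just placeholder names for however you expose the two pieces of state):
// Each handler only updates its own subject; no ordering assumptions are needed.
const isOnline$ = new Rx.BehaviorSubject(false);
const username$ = new Rx.BehaviorSubject('');

emitter.on('friendStatus', status => isOnline$.next(status.isOnline));
emitter.on('friendData', friend => username$.next(friend.username));

// The UI concern just combines the latest values, in whatever order they arrived.
const showUsername$ = Rx.Observable.combineLatest(
  isOnline$,
  username$,
  (isOnline, username) => isOnline && username !== ''
);

showUsername$.subscribe(show => {
  // render or hide the username here
});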
I am using Firebase and Node with Redux. I am loading all objects from a key as follows.
firebaseDb.child('invites').on('child_added', snapshot => {
})
The idea behind this method is that we get a payload from the database and use only one action to update my local data stores via the reducers.
Next, I need to listen for any NEW or UPDATED children of the invites key.
The problem, however, is that the child_added event fires for all existing keys as well as newly added ones. I do not want this behaviour; I only need new keys, as I have already retrieved the existing data.
I am aware that child_added is typically used for this type of operation; however, I wish to reduce the number of actions fired and the renders triggered as a result.
What would be the best pattern to achieve this goal?
Thanks,
Although the limit method is pretty good and efficient, you still need to add a check in the child_added callback for the last item that gets grabbed. Also, I don't know if it's still the case, but you might get "old" events from previously deleted items, so you might need to watch out for this too.
Other solutions would be to either:
Use a boolean flag that prevents the initially loaded objects from triggering the callback
let newItems = false

firebaseDb.child('invites').on('child_added', snapshot => {
  if (!newItems) { return }
  // do
})

firebaseDb.child('invites').once('value', () => {
  newItems = true
})
The disadvantage of this method is that you still receive events that do nothing, which might be problematic if your initial list is big.
Or if you have a timestamp on your invites, do something like
firebaseDb.child('invites')
  .orderByChild('timestamp')
  .startAt(Date.now())
  .on('child_added', snapshot => {
    // do
  })
I have solved the problem using the following method.
firebaseDb.child('invites').limitToLast(1).on('child_added', cb)
firebaseDb.child('invites').on('child_changed', cb)
limitToLast(1) gets the last child object of invites, and then listens for any new ones, passing a snapshot object to the cb callback.
child_changed listens for any child update to invites, passing a snapshot to the cb
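If the callback feeds Redux, cb could be a thin wrapper that dispatches an action (the INVITE_RECEIVED action type and the store reference are assumptions for illustration, not something Firebase provides):
const cb = snapshot => {
  store.dispatch({
    type: 'INVITE_RECEIVED',
    key: snapshot.key,
    invite: snapshot.val()
  });
};

firebaseDb.child('invites').limitToLast(1).on('child_added', cb);
firebaseDb.child('invites').on('child_changed', cb);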
I solved this by ignoring child_added altogether and using just child_changed. The way I did this was to perform an update() on any items I needed to handle, after pushing them to the database. This solution will depend on your needs, but one example is to update a timestamp key whenever you want the event triggered. For example:
var newObj = { ... }
// push the new item with no events
var itemRef = fb.push(newObj)
// update a timestamp key on the item to trigger child_changed
itemRef.update({ updated: yourTimeStamp })
There is also another solution: get the number of existing children and skip that many child_added events. It works:
var ref = firebaseDb.child('invites')

ref.once('value').then((dataSnapshot) => {
  return dataSnapshot.numChildren()
}).then((count) => {
  ref.on('child_added', (child) => {
    if (count > 0) {
      count--
      return
    }
    console.log("child really added")
  });
});
If your document keys are time-based (unix epoch, ISO8601 or the Firebase "push" keys), this approach, similar to the second approach #balthazar proposed, worked well for us:
const maxDataPoints = 100;
const ref = firebase.database().ref("someKey").orderByKey();

// load the initial data, up to whatever max rows we want
const initialData = await ref.limitToLast(maxDataPoints).once("value");

// collect the (time-based) keys of the data we retrieved
// (the original snippet omitted this step; it is assumed to look roughly like this)
const initialDataTimebasedKeys = Object.keys(initialData.val() || {});

// get the last key of the data we retrieved
const lastDataPoint = initialDataTimebasedKeys.length > 0
  ? initialDataTimebasedKeys[initialDataTimebasedKeys.length - 1].toString()
  : "0";

// start listening for additions past this point...
// this works because we're fetching ordered by key
// and the key is time-based
const subscriptionRef = ref.startAt(lastDataPoint + "0");
const listener = subscriptionRef.on("child_added", async (snapshot) => {
  // do something here
});
I'm trying to create an observable flow that fulfills the following requirements:
1. Loads data from storage at subscribe time
2. If the data has not yet expired, return an observable of the stored value
3. If the data has expired, return an HTTP request observable that uses the refresh token to get a new value and store it
4. If this code is reached again before the request has completed, return the same request observable
5. If this code is reached after the previous request completed or with a different refresh token, start a new request
I'm aware that there are many different answers on how to perform step (3), but as I'm trying to perform these steps together I am looking for guidance on whether the solution I've come up with is the most succinct it can be (which I doubt!).
Here's a sample demonstrating my current approach:
var cachedRequestToken;
var cachedRequest;

function getOrUpdateValue() {
  return loadFromStorage()
    .flatMap(data => {
      // data doesn't exist, shortcut out
      if (!data || !data.refreshtoken)
        return Rx.Observable.empty();

      // data still valid, return the existing value
      if (data.expires > new Date().getTime())
        return Rx.Observable.return(data.value);

      // if the refresh token is different or the previous request is
      // complete, start a new request, otherwise return the cached request
      if (!cachedRequest || cachedRequestToken !== data.refreshtoken) {
        cachedRequestToken = data.refreshtoken;
        var pretendHttpBody = {
          value: Math.random(),
          refreshToken: Math.random(),
          expires: new Date().getTime() + (10 * 60 * 1000) // set by server, expires in ten minutes
        };

        cachedRequest = Rx.Observable.create(ob => {
          // this would really be an http request that exchanges
          // the one-use refreshtoken for new data, then saves it
          // to storage for later use before passing on the value
          window.setTimeout(() => { // emulate slow response
            saveToStorage(pretendHttpBody);
            ob.next(pretendHttpBody.value);
            ob.complete();
            cachedRequest = null; // clear the request now we're complete
          }, 2500);
        });
      }

      return cachedRequest;
    });
}
function loadFromStorage() {
  return Rx.Observable.create(ob => {
    var storedData = { // loading from storage goes here
      value: 15, // wrapped in observable to delay loading until subscribed
      refreshtoken: 63, // other process may have updated this between requests
      expires: new Date().getTime() - (60 * 1000) // pretend to have already expired
    };
    ob.next(storedData);
    ob.complete();
  });
}
function saveToStorage(data) {
// save goes here
}
// first request
getOrUpdateValue().subscribe(function(v) { console.log('sub1: ' + v); });
// second request, can occur before or after first request finishes
window.setTimeout(
() => getOrUpdateValue().subscribe(function(v) { console.log('sub2: ' + v); }),
1500);
First, have a look at a working jsbin example.
The solution is a tad different from your initial code, and I'd like to explain why. The need to keep returning to local storage, saving it, and saving flags (cache and token) didn't fit with a reactive, functional approach for me. The heart of the solution I gave is:
var data$ = new Rx.BehaviorSubject(storageMock);
var request$ = new Rx.Subject();
request$.flatMapFirst(loadFromServer).share().startWith(storageMock).subscribe(data$);
data$.subscribe(saveToStorage);
function getOrUpdateValue() {
  return data$.take(1)
    .filter(data => (data && data.refreshtoken))
    .switchMap(data => (data.expires > new Date().getTime()
      ? data$.take(1)
      : (console.log('expired ...'), request$.onNext(true), data$.skip(1).take(1))));
}
The key is that data$ holds your latest data and is always up to date; it is easily accessible by doing a data$.take(1). The take(1) is important to make sure your subscription gets a single value and terminates (because you attempt to work in a procedural, as opposed to functional, manner). Without the take(1) your subscription would stay active and you would have multiple handlers out there; that is, you'd handle future updates as well in code that was meant only for the current update.
In addition, I hold a request$ subject which is your way to start fetching new data from the server. The function works like so:
The filter ensures that if your data is empty or has no token, nothing passes through, similar to the return Rx.Observable.empty() you had.
If the data is up to date, it returns data$.take(1) which is a single element sequence you can subscribe to.
If not, it needs a refresh. To do so, it triggers request$.onNext(true) and returns data$.skip(1).take(1). The skip(1) is to avoid the current, outdated value.
For brevity I used (console.log('expired ...'), request$.onNext(true) ,data$.skip(1).take(1))). This might look a bit cryptic. It uses the js comma separated syntax which is common in minifiers/uglifiers. It executes all statements and returns the result of the last statement. If you want a more readable code, you could rewrite it like so:
.switchMap(data => {
  if (data.expires > new Date().getTime()) {
    return data$.take(1);
  } else {
    console.log('expired ...');
    request$.onNext(true);
    return data$.skip(1).take(1);
  }
});
The last part is the usage of flatMapFirst. This ensures that once a request is in progress, all following requests are dropped. You can see it works in the console printout. The 'load from server' is printed several times, yet the actual sequence is invoked only once and you get a single 'loading from server done' printout. This is a more reactive oriented solution to your original refreshtoken flag checking.
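To see the flatMapFirst behaviour in isolation, here's a tiny sketch (loadFromServer stands in for the actual request observable):
var refresh$ = new Rx.Subject();

refresh$
  .flatMapFirst(() => loadFromServer())
  .subscribe(data => console.log('loaded', data));

refresh$.onNext(true); // starts a request
refresh$.onNext(true); // dropped: the previous request is still in flight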
Though I didn't need the saved data, it is saved because you mentioned that you might want to read it on future sessions.
A few tips on rxjs:
Instead of using the setTimeout, which can cause many problems, you can simply do Rx.Observable.timer(time_out_value).subscribe(...).
Creating an observable is cumbersome (you even had to call next(...) and complete()). You have a much cleaner way to do this using Rx.Subject. Note that there are specializations of this class, BehaviorSubject and ReplaySubject. These classes are worth knowing and can help a lot.
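For instance, the fake request from the question could be sketched with timer instead of a hand-rolled Observable.create (assuming the same pretendHttpBody and saveToStorage as above):
var fakeRequest$ = Rx.Observable.timer(2500) // emulate the slow response
  .map(() => pretendHttpBody)
  .do(saveToStorage)                         // side effect: persist the response
  .map(body => body.value);

fakeRequest$.subscribe(value => console.log('got value', value));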
One last note. This was quite a challenge :-) I'm not familiar with your server-side code and design considerations, yet the need to suppress calls felt uncomfortable to me. Unless there is a very good reason related to your backend, my natural approach would be to use flatMap and let the last request "win", i.e. drop previous unterminated calls and set the value.
The code is rxjs 4 based (so it can run in jsbin), if you're using angular2 (hence rxjs 5), you'll need to adapt it. Have a look at the migration guide.
================ answers to Steve's other questions (in comments below) =======
There is one article I can recommend. Its title says it all :-)
As for the procedural vs. functional approach, I'd add another variable to the service:
let token$ = data$.pluck('refreshtoken');
and then consume it when needed.
My general approach is to first map my data flows and relations and then like a good "keyboard plumber" (like we all are), build the piping. My top level draft for a service would be (skipping the angular2 formalities and provider for brevity):
class UserService {
  data$: <as above>;
  token$ = data$.pluck('refreshtoken');
  private request$: <as above>;

  refresh() {
    this.request$.onNext(true);
  }
}
You might need to do some checking so the pluck does not fail.
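One way to guard against that, as a small sketch:
// only pluck the token once the data actually has one
let token$ = data$
  .filter(data => data && data.refreshtoken)
  .pluck('refreshtoken');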
Then, each component that needs the data or the token can access it directly.
Now let's suppose you have a service that needs to act on a change to the data or the token:
class SomeService {
  constructor(private userSvc: UserService) {
    this.userSvc.token$.subscribe(() => this.doMyUpdates());
  }
}
If you need to synthesize data, meaning use the data/token together with some local data:
Rx.Observable.combineLatest(this.userSvc.data$, this.myRelevantData$)
  .subscribe(([data, myData]) => this.doMyUpdates(data.someField, myData.someField));
Again, the philosophy is that you build the data flow and pipes, wire them up and then all you have to do is trigger stuff.
The "mini pattern" I've come up with is to pass my trigger sequence to a service once and subscribe to the result. Let's take autocomplete as an example:
class ACService {
  fetch(text: string): Observable<Array<string>> {
    return http.get(text).map(response => response.json().data);
  }
}
Then you have to call it every time your text changes and assign the result to your component:
<div class="suggestions" *ngFor="let suggestion of suggestions | async">
  <div>{{suggestion}}</div>
</div>
and in your component:
onTextChange(text) {
  this.suggestions = this.acSVC.fetch(text);
}
but this could be done like this as well:
class ACService {
  createFetcher(textStream: Observable<string>): Observable<Array<string>> {
    return textStream.flatMap(text => http.get(text))
      .map(response => response.json().data);
  }
}
And then in your component:
textStream: Subject<string> = new Subject<string>();
suggestions: Observable<Array<string>>;

constructor(private acSVC: ACService) {
  this.suggestions = acSVC.createFetcher(this.textStream);
}

onTextChange(text) {
  this.textStream.next(text);
}
The template code stays the same.
It seems like a small thing here, but once the app grows bigger and the data flow more complicated, this works much better. You have a sequence that holds your data, you can use it around the component wherever you need it, and you can even transform it further. For example, let's say you need to know the number of suggestions. In the first method, once you get the result, you need to query it further, thus:
onTextChange(text) {
  this.suggestions = this.acSVC.fetch(text);
  this.suggestionsCount = this.suggestions.pluck('length'); // in a sequence
  // or
  this.suggestions.subscribe(suggestions => this.suggestionsCount = suggestions.length); // in a numeric variable
}
Now in the second method, you just define:
constructor(private acSVC: ACService) {
  this.suggestions = acSVC.createFetcher(this.textStream);
  this.suggestionsCount = this.suggestions.pluck('length');
}
Hope this helps :-)
While writing, I tried to reflect on the path I took to get to using reactive code like this. Needless to say, ongoing experimentation, numerous jsbins and strange failures are a big part of it. Another thing that I think helped shape my approach (though I'm not currently using it) is learning redux and reading/trying a bit of ngrx (Angular's redux port). The philosophy and the approach don't even let you think procedurally, so you have to tune in to a functional, data-, relations- and flow-based mindset.
(Note: My question was not clearly written, and I was thinking about some things wrong. The current version of the question is just an attempt to write something that could make the accepted answer useful to as many people as possible.)
I want to have an action that adds an item to a store and registers it with an external dependency.
I could use the thunk middleware and write
export function addItem(item) {
  return dispatch => {
    dispatch(_addItemWithoutRegisteringIt(item));
    externalDependency.register(item);
  };
}
But the subscribers would be notified before the item was registered, and they might depend on it being registered.
I could reverse the order and write
export function addItem(item) {
  return dispatch => {
    externalDependency.register(item);
    dispatch(_addItemWithoutRegisteringIt(item));
  };
}
But I track the item in the external dependency by a unique id, and it is natural to assign that id only in the reducer.
I could register the item in the reducer, but I am given to understand that it is very bad form to do side effects in a reducer and might lead to problems down the line.
So what is the best approach?
(My conclusion is: there are a number of approaches that would work, but probably the best one for my use case is to store a handle into the external dependency in Redux rather than a handle into Redux in the external dependency.)
If you use Redux Thunk middleware, you can encapsulate it in an action creator:
function addItem(id) {
  return { type: 'ADD_ITEM', id };
}

function showNotification(text) {
  return { type: 'SHOW_NOTIFICATION', text };
}

export function addItemWithNotification(id) {
  return dispatch => {
    dispatch(addItem(id));
    doSomeSideEffect();
    dispatch(showNotification('Item was added.'));
  };
}
Elaborating, based on the comments to this answer:
Then maybe this is the wrong pattern for my case. I don't want subscribers invoked between dispatch(addItem(id)) and doSomeSideEffect().
In 95% of cases you shouldn't worry about whether the subscribers were invoked. Bindings like React Redux won't re-render if the data hasn't changed.
Would putting doSomeSideEffect() in the reducer be an acceptable approach or does it have hidden pitfalls?
No, putting side effects into the reducer is never acceptable. This goes against the central premise of Redux and breaks pretty much any tool in its ecosystem: Redux DevTools, Redux Undo, any record/replay solution, tests, etc. Never do this.
If you really need to perform a side effect together with an action, and you also really care about subscribers only being notified once, just dispatch one action and use Redux Thunk to "attach" a side effect to it:
function addItem(id, item) {
  return { type: 'ADD_ITEM', id, item };
}

export function addItemWithSomeSideEffect(id) {
  return dispatch => {
    let item = doSomeSideEffect(); // note: you can use the return value
    dispatch(addItem(id, item));
  };
}
In this case you'd need to handle ADD_ITEM from different reducers. There is no way to dispatch two actions without notifying the subscribers twice.
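For illustration, two reducers (hypothetical names) handling the same ADD_ITEM action might look like this:
// Both reducers see the single ADD_ITEM action; subscribers are notified once.
function itemsReducer(state = [], action) {
  switch (action.type) {
    case 'ADD_ITEM':
      return [...state, { id: action.id, item: action.item }];
    default:
      return state;
  }
}

function registeredIdsReducer(state = [], action) {
  switch (action.type) {
    case 'ADD_ITEM':
      return [...state, action.id];
    default:
      return state;
  }
}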
Here is the one point I still definitely don't understand. Dan suggested that the thunk middleware couldn't defer subscriber notification because that would break a common use case with async requests. I still don't understand this.
Consider this:
export function doSomethingAsync() {
  return dispatch => {
    dispatch({ type: 'A' });
    dispatch({ type: 'B' });
    setTimeout(() => {
      dispatch({ type: 'C' });
      dispatch({ type: 'D' });
    }, 1000);
  };
}
When would you want the subscriptions to be notified? Definitely, if we notify the subscribers only when the thunk exits, we won't notify them at all for C and D.
Either way, this is impossible with the current middleware architecture. Middleware isn't meant to prevent subscribers from firing.
However, what you described can be accomplished with a store enhancer like redux-batched-subscribe. It is unrelated to Redux Thunk, but it causes any group of actions dispatched synchronously to be debounced. This way you'd get one notification for A and B, and another notification for C and D. That said, writing code that relies on this behavior would be fragile in my opinion.
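For reference, wiring up redux-batched-subscribe looks roughly like this (pairing it with lodash's debounce; rootReducer and initialState stand in for your own setup):
import { createStore } from 'redux';
import { batchedSubscribe } from 'redux-batched-subscribe';
import debounce from 'lodash/debounce';

// notify subscribers at most once per synchronous burst of dispatches
const debounceNotify = debounce(notify => notify());

const store = createStore(
  rootReducer,
  initialState,
  batchedSubscribe(debounceNotify)
);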
I'm still in the process of learning Redux; however, my gut instinct says this could be a potential candidate for some custom middleware.
Within my Flux architected React application I am retrieving data from a store, and would like to create an action to request that information if it does not exist. However I am running into an error where the dispatcher is already dispatching.
My desired code is something like:
getAll: function(options) {
  options = options || {};
  var key = JSON.stringify(options);
  var ratings = _data.ratings[key];
  if (!ratings) {
    RatingActions.fetchAll(options);
  }
  return ratings || [];
}
However, this intermittently fails when the dispatcher is already dispatching an action, with the message Invariant Violation: Dispatch.dispatch(...): Cannot dispatch in the middle of a dispatch. I am often making requests in response to a change in application state (e.g. date range). My component, which makes the request in response to a change event from the AppStore, has the following:
getStateFromStores: function() {
  var dateOptions = {
    startDate: AppStore.getStartISOString(),
    endDate: AppStore.getEndISOString()
  };
  return {
    ratings: RatingStore.getAll(dateOptions),
  };
},
I am aware that event chaining is a Flux antipattern, but I am unsure what architecture is better for retrieving data when it does not yet exist. Currently I am using this terrible hack:
getAll: function(options) {
  options = options || {};
  var key = JSON.stringify(options);
  var ratings = _data.ratings[key];
  if (!ratings) {
    setTimeout(function() {
      if (!RatingActions.dispatcher.isDispatching()) {
        RatingActions.fetchAll(options);
      }
    }, 0);
  }
  return ratings || [];
},
What would be a better architecture, that avoids event chaining or the dispatcher error? Is this really event chaining? I just want to change the data based on the parameters the application has set.
Thanks!
You can use Flux's waitFor() function instead of a setTimeout.
For example, if you have two stores registered with the same dispatcher, you can have one store waitFor the other to process the action first; the waiting store can then update afterwards and emit its change event. See the Flux docs example.
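A minimal sketch of that arrangement (the store names and the RATINGS_UPDATED action type are illustrative):
var StoreA = { dispatchToken: null };
var StoreB = { dispatchToken: null };

// StoreA processes the action first...
StoreA.dispatchToken = AppDispatcher.register(function(action) {
  if (action.actionType === 'RATINGS_UPDATED') {
    // ...update StoreA's internal data here
  }
});

// ...and StoreB waits for it before reading from it and emitting its own change event.
StoreB.dispatchToken = AppDispatcher.register(function(action) {
  if (action.actionType === 'RATINGS_UPDATED') {
    AppDispatcher.waitFor([StoreA.dispatchToken]);
    // StoreA is now up to date; update StoreB and emit change here
  }
});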
My particular error was occurring because my stores emitted their change event during the action dispatch, while it was still cycling through the listeners. This meant any listeners (ie components) that then triggered an action due to a data change in the store would interrupt the dispatch. I fixed it by emitting the change event after the dispatch had completed.
So this:
this.emit(CHANGE_EVENT);
Became
var self = this;
setTimeout(function() { // Run after dispatcher has finished
self.emit(CHANGE_EVENT);
}, 0);
Still a little hacky (I'll probably rewrite it so it doesn't require a setTimeout). I'm open to solutions that address the architectural problem rather than this implementation detail.
The reason you get a dispatch in the middle of a previous dispatch is that your store dispatches an action (invokes an action creator) synchronously in the handler for another action. The dispatcher is technically dispatching until all its registered callbacks have been executed, so if you dispatch a new action from any of the registered callbacks, you'll get that error.
However, if you do some async work, e.g. make an ajax request, you can still dispatch an action from the ajax callbacks, or from an async callback generally. This works because, as soon as the async function has been invoked, execution by definition continues immediately and the callback is put on the event queue.
As pointed out by Amida and in the comments of that answer, it's a matter of choice whether to make ajax requests from the action creators or from the store in response to an action. The key is that a store should only mutate its state in response to an action, not in an ajax/async callback.
In your particular case, this would be exemplified by something like this for your store's registered callback, if you prefer to make the ajax calls from the store:
onGetAll: function(options) {
  // ...do some work
  request(ajaxOptions) // example for some promise-based ajax lib
    .then(function(data) {
      getAllSuccessAction(data); // runs after the dispatch
    })
    .error(function(data) {
      getAllFailedAction(data); // runs after the dispatch
    });
  // this will be run immediately, during the getAllAction dispatch
  return this.state[options];
},

onGetAllSuccess: function(data) {
  // update state or something and then trigger change event, or whatever
},

onGetAllFailed: function(data) {
  // handle failure somehow
}
Or you can just put the ajax call in your action creator and dispatch the "success/failed" actions from there.
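Sketched out, that variant might look roughly like this (the dispatcher, the action type names, and the promise-based request() helper are assumptions mirroring the example above):
var RatingActions = {
  fetchAll: function(options) {
    // the store only reacts to the success/failure actions dispatched from the callbacks
    AppDispatcher.dispatch({ actionType: 'RATINGS_FETCH', options: options });
    request(options)
      .then(function(data) {
        // runs after the original dispatch has finished
        AppDispatcher.dispatch({ actionType: 'RATINGS_FETCH_SUCCESS', data: data });
      })
      .catch(function(error) {
        AppDispatcher.dispatch({ actionType: 'RATINGS_FETCH_FAILED', error: error });
      });
  }
};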
You can use the "defer" option on the dispatcher.
In your case it would be like:
RatingActions.fetchAll.defer(options);
In my case, I fetch data through the actions/action creators. The store is only a dumb place that receives the payload of an action.
This means that I would "fetchAll" in an action and then pass the result to the store, which does whatever it needs with it and then emits a change event.
Some people use stores the way I do; others think like you.
Some people at Facebook uses "my" approach:
https://github.com/facebook/flux/blob/19a24975462234ddc583ad740354e115c20b881d/examples/flux-chat/js/utils/ChatWebAPIUtils.js#L51
I think treating your stores like this would probably avoid the dispatch problem, but I may be wrong.
An interesting discussion is this one: https://groups.google.com/forum/#!topic/reactjs/jBPHH4Q-8Sc
where Jing Chen (Facebook engineer) explains what she thinks about how to use stores.