I'm playing with Reactive Programming, using RxJS, and stumbled upon something I'm not sure how to solve.
Let's say we implement a vending machine. You insert a coin, select an item, and the machine dispenses an item and returns change. We'll assume that price is always 1 cent, so inserting a quarter (25 cents) should return 24 cents back, and so on.
The "tricky" part is that I'd like to be able to handle cases like user inserting 2 coins and then selecting an item. Or selecting an item without inserting a coin.
It seems natural to implement inserted coins and selected items as streams. We can then introduce some sort of dependency between these 2 actions — merging or zipping or combining latest.
However, I quickly ran into an issue where I'd like coins to be accumulated up until an item is dispensed but not further. AFAIU, this means I can't use sum or scan since there's no way to "reset" previous accumulation at some point.
Here's an example diagram:
coins:    ---25---5---------10----------|->
acc:      ---25---30--------40----------|->
items:    ------------foo--------bar----|->
combined: ------------30,foo-----40,bar-|->
change:   ------------29---------39-----|->
And the corresponding code:
this.getCoinsStream()
  .scan(function (sum, current) { return sum + current; })
  // combineLatest takes a result selector; subscribe only ever receives one value
  .combineLatest(this.getSelectedItemsStream(),
    function (cents, item) { return { cents: cents, item: item }; })
  .subscribe(function (pair) {
    dispenseItem(pair.item);
    dispenseChange(pair.cents - 1);
  });
25 and 5 cents were inserted and then "foo" item was selected. Accumulating coins and then combining latest would lead to "foo" being combined with "30" (which is correct) and then "bar" with "40" (which is incorrect; should be "bar" and "10").
I looked through all of the methods for grouping and filtering and don't see anything that I can use.
An alternative solution I could use is to accumulate coins separately. But this introduces state outside of a stream and I'd really like to avoid that:
var centsDeposited = 0;

this.getCoinsStream().subscribe(function (cents) {
  centsDeposited += cents;
});

this.getSelectedItemsStream().subscribe(function (item) {
  dispenseItem(item);
  dispenseChange(centsDeposited - 1);
  centsDeposited = 0;
});
Moreover, this doesn't allow for making the streams dependent on each other, such as waiting for a coin to be inserted before a selection can dispense an item.
Am I missing an existing method? What's the best way to achieve something like this: accumulating values up until the moment they need to be merged with another stream, while also waiting for at least one value in the first stream before merging it with one from the second?
You could use your scan/combineLatest approach and then finish the stream with a first followed by a repeat, so that the stream "starts over" without your observers ever seeing it.
var coinStream = Rx.Observable.merge(
  Rx.Observable.fromEvent($('#add5'), 'click').map(5),
  Rx.Observable.fromEvent($('#add10'), 'click').map(10),
  Rx.Observable.fromEvent($('#add25'), 'click').map(25)
);

var selectedStream = Rx.Observable.merge(
  Rx.Observable.fromEvent($('#coke'), 'click').map('Coke'),
  Rx.Observable.fromEvent($('#sprite'), 'click').map('Sprite')
);

var $selection = $('#selection');
var $change = $('#change');

function dispense(selection) {
  $selection.text('Dispensed: ' + selection);
  console.log("Dispensing Drink: " + selection);
}

function dispenseChange(change) {
  $change.text('Dispensed change: ' + change);
  console.log("Dispensing Change: " + change);
}
var dispenser = coinStream.scan(function(acc, delta) { return acc + delta; }, 0)
  .combineLatest(selectedStream,
    function(coins, selection) {
      return {coins: coins, selection: selection};
    })
  //Combine latest won't emit until both Observables have a value
  //so you can safely get the first which will be the point that
  //both Observables have emitted.
  .first()
  //First will complete the stream above so use repeat
  //to resubscribe to the stream transparently
  //You could also do this conditionally with while or doWhile
  .repeat()
  //If you only will subscribe once, then you won't need this but
  //here I am showing how to do it with two subscribers
  .publish();

//Dole out the change
dispenser.pluck('coins')
  .map(function(c) { return c - 1; })
  .subscribe(dispenseChange);

//Get the selection for dispensation
dispenser.pluck('selection').subscribe(dispense);

//Wire it up
dispenser.connect();
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/4.0.6/rx.all.js"></script>
<button id="coke">Coke</button>
<button id="sprite">Sprite</button>
<button id="add5">5</button>
<button id="add10">10</button>
<button id="add25">25</button>
<div id="change"></div>
<div id="selection"></div>
Generally speaking you have the following set of equations:
inserted_coins :: independent source
items :: independent source
accumulated_coins :: sum(inserted_coins)
accumulated_paid :: sum(price(items))
change :: accumulated_coins - accumulated_paid
coins_in_machine :: 0 on each emission of items; sum(inserted_coins) since the last emission of items otherwise
The hard part is coins_in_machine. You need to switch the source observable based on some emissions from two sources.
function emits(who) {
  return function (x) { console.log([who, ": "].join(" ") + x); };
}

function sum(a, b) { return a + b; }

var inserted_coins = Rx.Observable.fromEvent(document.getElementById("insert"), 'click')
  .map(function (x) { return 15; });
var items = Rx.Observable.fromEvent(document.getElementById("item"), 'click')
  .map(function (x) { return "snickers"; });

console.log("running");

var accumulated_coins = inserted_coins.scan(sum);

var coins_in_machine = Rx.Observable.merge(
    items.tap(emits("items")).map(function (x) { return { value: x, flag: 1 }; }),
    inserted_coins.tap(emits("coins inserted")).map(function (x) { return { value: x, flag: 0 }; }))
  .distinctUntilChanged(function (x) { return x.flag; })
  .flatMapLatest(function (x) {
    switch (x.flag) {
      case 1:
        return Rx.Observable.just(0);
      case 0:
        return inserted_coins.scan(sum, x.value).startWith(x.value);
    }
  })
  .startWith(0);

coins_in_machine.subscribe(emits("coins in machine"));
jsbin : http://jsbin.com/mejoneteyo/edit?html,js,console,output
[UPDATE]
Explanations:
We merge the insert_coins stream with the items stream, attaching a flag to each value so we know which of the two streams emitted it when it arrives in the merged stream.
When it is the items stream emitting, we want to put 0 in coins_in_machine. When it is insert_coins, we want to sum the incoming values, as that sum represents the new amount of coins in the machine. That means the definition of coins_in_machine switches from one stream to the other under the logic just described. That logic is what is implemented in the flatMapLatest.
I use flatMapLatest and not flatMap, as otherwise coins_in_machine would keep receiving emissions from formerly switched-to streams, i.e. duplicated emissions, since in the end there are only ever two streams to and from which we switch. If I may, I would say it is a "close and switch" that we need.
flatMapLatest has to return a stream, so we jump through hoops to make a stream that emits 0 and stays quiet afterwards (without blocking the computer, as using the repeat operator would in that case).
We jump through some extra hoops to make inserted_coins emit the values we want. My first implementation was inserted_coins.scan(sum, 0), and that never worked. The key, and I found this quite tricky, is that by the time we get to that point in the flow, inserted_coins has already emitted one of the values that is part of the sum. That value is the one passed as a parameter to flatMapLatest, but it is not in the source anymore, so calling scan after the fact won't see it; it is necessary to take that value from the flatMapLatest parameter and reconstitute the correct behaviour (hence the scan seed plus startWith).
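To make the switch behaviour concrete, here is a small standalone sketch (RxJS 4, with made-up interval streams) contrasting flatMap with flatMapLatest:
var outer = Rx.Observable.interval(1000).take(2);

// flatMap keeps every inner stream alive, so 0:* and 1:* values interleave.
outer.flatMap(function (i) {
  return Rx.Observable.interval(400).map(function (j) { return i + ':' + j; });
}).take(6).subscribe(function (x) { console.log('flatMap       ' + x); });

// flatMapLatest unsubscribes from the previous inner stream as soon as a new
// outer value arrives, so the 0:* values stop once 1 is emitted.
outer.flatMapLatest(function (i) {
  return Rx.Observable.interval(400).map(function (j) { return i + ':' + j; });
}).take(6).subscribe(function (x) { console.log('flatMapLatest ' + x); });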
You can also use window to group together multiple coin events, using item selection as the window boundary.
Next we can use zip to pair each window's sum with the item value.
Notice we instantly try to give out the item, so the user has to insert coins before deciding on one.
Notice I decided to publish both selectedStream and dispenser for safety reasons: we don't want a race condition where events fire while we're still building up the query and zip becomes unbalanced. That would be a very rare condition, but note that if our sources had been cold observables they would pretty much start generating as soon as we subscribed, so we must use publish to safeguard ourselves.
(Example code shamelessly stolen from paulpdaniels.)
var coinStream = Rx.Observable.merge(
  Rx.Observable.fromEvent($('#add5'), 'click').map(5),
  Rx.Observable.fromEvent($('#add10'), 'click').map(10),
  Rx.Observable.fromEvent($('#add25'), 'click').map(25)
);

var selectedStream = Rx.Observable.merge(
  Rx.Observable.fromEvent($('#coke'), 'click').map('Coke'),
  Rx.Observable.fromEvent($('#sprite'), 'click').map('Sprite')
).publish();

var $selection = $('#selection');
var $change = $('#change');

function dispense(selection) {
  $selection.text('Dispensed: ' + selection);
  console.log("Dispensing Drink: " + selection);
}

function dispenseChange(change) {
  $change.text('Dispensed change: ' + change);
  console.log("Dispensing Change: " + change);
}

// Build the query.
var dispenser = Rx.Observable.zip(
  coinStream
    .window(selectedStream)
    .flatMap(ob => ob.reduce((acc, cur) => acc + cur, 0)),
  selectedStream,
  (coins, selection) => ({coins: coins, selection: selection})
).filter(pay => pay.coins != 0) // Do not give out items if there are no coins.
 .publish();

var dispose = new Rx.CompositeDisposable(
  //Dole out the change
  dispenser
    .pluck('coins')
    .map(function(c) { return c - 1; })
    .subscribe(dispenseChange),
  //Get the selection for dispensation
  dispenser
    .pluck('selection')
    .subscribe(dispense),
  //Wire it up
  dispenser.connect(),
  selectedStream.connect()
);
<script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.1/jquery.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/4.0.6/rx.all.js"></script>
<button id="coke">Coke</button>
<button id="sprite">Sprite</button>
<button id="add5">5</button>
<button id="add10">10</button>
<button id="add25">25</button>
<div id="change"></div>
<div id="selection"></div>
I'm doing a Konami Code exercise in JavaScript and while I got it to work on my own, the answer makes no sense to me. Would someone care to explain?
My solution:
const pressed = [];
var secretCode = 'wesbos';

window.addEventListener('keyup', e => {
  //My code
  if (pressed.length < 6) {
    pressed.push(e.key)
  } else if (pressed.length === 6) {
    pressed.shift()
    pressed.push(e.key)
    console.log(pressed)
  }
  //End my code
  if (pressed.join('').toLowerCase() === secretCode) {
    console.log("SECRET COMMAND ACTION CODE TRIGGERED! COMMENCE KAMEHAMEHA");
    $.getScript('http://www.cornify.com/js/cornify.js', function() {
      cornify_add();
      $(document).keydown(cornify_add);
    });
  }
})
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js"></script>
An answer from https://medium.com/@nikkoortega/key-sequence-detection-f90773e3aa60 which I don't understand:
const pressed = [];
const secretCode = 'wesbos';

window.addEventListener('keyup', e => {
  //Answer code
  pressed.push(e.key)
  pressed.splice(-secretCode.length - 1, pressed.length - secretCode.length)
  //End answer code
  if (pressed.join('').toLowerCase() === secretCode) {
    console.log("SECRET COMMAND ACTION CODE TRIGGERED! COMMENCE KAMEHAMEHA");
  }
})
The point of the code is to create a queue, a FIFO structure.
Array#splice is a confusing in-place function: its first parameter is the index at which to start removing (negative indices count back from the end) and its second is the number of elements to remove. Optional further parameters insert new elements, but that isn't used here.
In the solution, -secretCode.length - 1 is basically a constant, -7 if the length of the secret code is 6. Since pressed never grows past 7 elements, that always clamps to index 0, so it can just be replaced with 0: they're really trying to remove the first element, which is what should be dequeued.
The second parameter is pressed.length - secretCode.length, the difference between the number of keys collected so far and the length of the secret code. This is <= 0 up until the pressed queue exceeds the size of the secret code, at which point it's 1, meaning the first element is dequeued because the splice call looks like splice(0, 1). When splice is called with a non-positive count, as in splice(0, -1) or splice(0, 0), it has no effect.
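To see that arithmetic in action, here's a quick hypothetical trace assuming a six-letter secret code:
// Seven keys collected, one more than the secret code's length:
const q = ['x', 'w', 'e', 's', 'b', 'o', 's'];
q.splice(0, q.length - 6); // splice(0, 1): dequeues 'x'
console.log(q.join(''));   // "wesbos"

// With fewer keys than the code's length the count is negative, so nothing happens:
const short = ['w', 'e'];
short.splice(0, short.length - 6); // splice(0, -4): removes nothing
console.log(short.join(''));       // "we"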
Here's a simplified and annotated version:
const pressed = [];
var secretCode = 'wesbos';

window.addEventListener('keyup', e => {
  pressed.push(e.key);
  console.log(
    "after push, before splice",
    pressed + "",
    pressed.length - secretCode.length
  );
  pressed.splice(0, pressed.length - secretCode.length);
  console.log("after splice", pressed + "");
  console.log("______________");
  if (pressed.join('').toLowerCase() === secretCode) {
    console.log("SECRET COMMAND ACTION CODE TRIGGERED! COMMENCE KAMEHAMEHA");
  }
})
<p>type: "wesbos"</p>
My opinion is that splice should usually be avoided, especially when mixing negative indices and inserting elements. It's linear and clever but hard to understand, and you're usually using the wrong data structure if you have to pull elements out of the middle of an array.
I prefer your approach but I'd write it like:
const pressed = [];
var secretCode = 'wesbos';

window.addEventListener('keyup', e => {
  pressed.push(e.key);
  while (pressed.length > secretCode.length) {
    pressed.shift();
  }
  if (pressed.join('').toLowerCase() === secretCode) {
    console.log("SECRET COMMAND ACTION CODE TRIGGERED! COMMENCE KAMEHAMEHA");
  }
})
<p>type: "wesbos"</p>
The while could be an if, since we know we're only ever adding one element at a time, but it doesn't really hurt to keep it as a while either: the point is that it enforces dequeues until the queue is the same size as the target word.
One of the annoying things about JS is that it doesn't have a good builtin queue structure, so we have to shift() an array. This is still linear, but at least it communicates the intent of implementing a queue more clearly than a splice that, behind its negative-index obfuscation, always operates at index 0 and never removes more than one element.
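If the linear shift() ever mattered, a queue with amortized O(1) dequeue can be sketched with a moving head index; this class is purely illustrative and not part of the exercise:
class Queue {
  constructor() { this.items = []; this.head = 0; }
  enqueue(x) { this.items.push(x); }
  dequeue() {
    const x = this.items[this.head++];
    // Compact once at least half the backing array is dead space.
    if (this.head * 2 >= this.items.length) {
      this.items = this.items.slice(this.head);
      this.head = 0;
    }
    return x;
  }
  get length() { return this.items.length - this.head; }
}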
I need some help from RxJS professionals :)
I'm trying to recursively load data from a REST API via HTTP requests.
The recursive calls are working fine; however, when I subscribe to the final Observable (returned by GetTemperatures), no data arrives in subscribe.
It seems like no data is passed back up the call chain.
What's going wrong here?
GetTemperatures().subscribe((data: MeasureData) => {
  // add data to a chart, etc...
})

GetTemperatures(): Observable<MeasureData> {
  const l_startDate = new Date(2019, 0, 1);
  var l_httpParams = new HttpParams()
    .set('device_id', this._deviceId)
    .set('module_id', this._moduleId)
    .set('scale', '1hour')
    .set('type', 'Temperature')
    .set('date_begin', Math.floor(l_startDate.getTime() / 1000).toString())
    .set('real_time', 'true')
    .set('optimize', 'true');

  return this._http.post<MeasureDataInternal>(this._getMeasureUrl, l_httpParams)
    .pipe(
      map((data: MeasureDataInternal): MeasureData => this.transformMeasureData(data)),
      flatMap((data: MeasureData) => {
        return this.recursiveLoadData(data);
      })
    );
}

recursiveLoadData(data: MeasureData): Observable<MeasureData> {
  // search until now minus 1.5 hours
  const endDate = new Date(Date.now() - (1.5 * 60 * 60 * 1000));
  console.error('RECURSIVE begin: ' + data.value[0].date + ' end: ' + data.value[data.value.length - 1].date);
  // check if complete
  if (data.value[data.value.length - 1].date.getTime() >= endDate.getTime()) {
    console.error('recursive ENDs here');
    return EMPTY;
  }
  var l_httpParams = new HttpParams()
    .set('device_id', this._deviceId)
    .set('module_id', this._moduleId)
    .set('scale', '1hour')
    .set('type', 'Temperature')
    .set('date_begin', Math.floor(data.value[data.value.length - 1].date.getTime() / 1000).toString())
    .set('real_time', 'true')
    .set('optimize', 'true');

  return this._http.post<MeasureDataInternal>(this._getMeasureUrl, l_httpParams)
    .pipe(
      map((data2: MeasureDataInternal): MeasureData => this.transformMeasureData(data2)),
      flatMap((data2: MeasureData) => {
        return this.recursiveLoadData(data2);
      })
    );
}
I have no idea what you're really trying to accomplish, but each new step in your recursion doesn't do anything other than bring you to the next step, so you'll want to include whatever work you're hoping each step performs.
This isn't specific to streams; it's also true of general recursion.
General Recursion
This really isn't any different from how a regular recursive function works. Say you're recursively adding up the numbers in an array: you need to add the first value to the sum of the tail. If you just keep recursing on a smaller array without adding the numbers you've peeled off, you get the base-case value back.
This returns the last value of the array (the last value is the base case); note slice(1), since shift() would return the removed element rather than the rest of the array:
function recursiveAdd(array) {
  if (array.length === 1) return array[0];
  return recursiveAdd(array.slice(1));
}
This sums the array:
function recursiveAdd(array) {
  if (array.length === 1) return array[0];
  return array[0] + recursiveAdd(array.slice(1));
}
In this simple case, the + operator is doing the work at each step of the recursion. Without it, the array isn't summed. And, of course, it could be anything: subtract the numbers from 1000, average them, build an object from the values. Anything.
Before you make a recursive call, you have to do something with the current value, unless all you're after is the value of the base case (in your case, an empty stream).
Recursion with Streams
When you mergeMap a value into a stream, you don't also pass forward that value.
from([69, 70, 71]).pipe(
  mergeMap(val => from([
    String.fromCharCode(val),
    String.fromCharCode(val),
    String.fromCharCode(val)
  ]))
).subscribe(console.log);
output
E E E F F F G G G
Notice how the output doesn't include any numbers? When you mergeMap, you map values into streams. If you want the values you're mapping to be part of the stream, you must include them somehow. This is the same as with general recursion.
So, here are two examples that both include your data in the returned stream. They're very basic, but hopefully, you can take some understanding from them and apply that.
This transforms the returned stream to include your data as its first value (recursively, of course):
return this._http.post<MeasureDataInternal>(this._getMeasureUrl, l_httpParams)
  .pipe(
    map((data: MeasureDataInternal): MeasureData =>
      this.transformMeasureData(data)
    ),
    mergeMap((data: MeasureData) =>
      this.recursiveLoadData(data).pipe(
        startWith(data)
      )
    )
  );
This creates a stream of your data, a stream of your recursive call, and merges the two streams together.
return this._http.post<MeasureDataInternal>(this._getMeasureUrl, l_httpParams)
  .pipe(
    map((data: MeasureDataInternal): MeasureData =>
      this.transformMeasureData(data)
    ),
    mergeMap((data: MeasureData) =>
      merge(
        of(data),
        this.recursiveLoadData(data)
      )
    )
  );
I have an application that continuously populates an array until it is stopped, and it does two things:
If you click the Stop button, it writes the values to the DB.
Every second (a 1000 ms interval) it checks the size of the array, and if it is > 2000 it writes the values to the DB.
Now I have a problem: I use the first element of the array as the basis for some calculations before writing the data to the DB.
So when the array exceeds a size of 2000, it performs a splice and passes the array to another page, taking the first element as the basis for the calculation performed on that page.
At that point, if the user clicks the Stop button, the last element of the previously passed array must be used as the basis for the operations.
For example:
array = [0, 20, 40, ......, 2000,..]
array.length > 2000
arrayBase = 0 // I use it for operations.
// Do a splice
array = [2020, 2040, ...... ]
array.length < 2000
//User click stop Button
//I should pass as arrayBase the last value of array (2000)
I hope the example at least explains what I mean.
This is my code:
//this function populates the array until I click stop
populateArray() {
  this.arrayTimestamp.push(`${buf.readInt16LE(0)}`);
  this.firstElementTimestamp = this.arrayTimestamp[0];
  //.....
}

//This function checks the size and writes to the DB if > 2000
checkSize() {
  that.timeout = setInterval(function() {
    if (that.arrayTimestamp.length > 2000) {
      that.arrayTimestampCopy = that.arrayTimestamp.splice(0, 2000);
      //this is a function in another page where I do some operations
      scrittura.write({
        counterTimestamp: that.firstElementTimestamp,
        //...
      })
      .then(response => {
        //...
      });
      // I have tried something like this:
      that.firstElementTimestamp = that.arrayTimestamp[2000]; //obviously it is undefined, as the splice emptied the array
    }
  }, 1000);
}

//this is the function called when the Stop button is clicked.
stopConnection() {
  Actions.Activity({
    counterTimestamp: this.firstElementTimestamp,
    //...
  });
}
So my goal is to find a way to always use the same base for the calculations, without it being updated along the way.
How can I do that?
I think you should use an array reduce or Promise.all, depending on whether you need the writes to run sequentially or in parallel.
arr.reduce((prom, item) => {
  return prom.then(() => {
    return scrittura.write(item).then((result) => ... );
  });
}, Promise.resolve()).then(function() {
  // all done here
}).catch(function(err) {
  // error here
});
or use Promise.all for the parallel case.
You can see another example here
Synchronous loop in Promise all
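For the parallel case, a minimal sketch with Promise.all, assuming scrittura.write returns a promise (all writes start at once and completion order is not guaranteed):
Promise.all(arr.map(item => scrittura.write(item)))
  .then(results => {
    // all done here; results are in the same order as arr
  })
  .catch(err => {
    // the first rejection lands here
  });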
I'm creating my batch and inserting it into the collection using the commands specified below:
batch = []
time = 1.day.ago
(1..2000).each do |i|
  batch.push({
    :name => 'invbatch2k' + i.to_s,
    :user_id => BSON::ObjectId.from_string('533956cd4d616323cf000000'),
    :out_id => 'out',
    :created_at => time,
    :updated_at => time,
    :random => '0.5'
  })
end
Invitation.collection.insert batch
As stated above, every single invitation record has its user_id field set to '533956cd4d616323cf000000'.
After inserting my batch with created_at: 1.day.ago, I get:
2.1.1 :102 > Invitation.lte(created_at: 1.week.ago).count
=> 48
2.1.1 :103 > Invitation.lte(created_at: Date.today).count
=> 2048
also:
2.1.1 :104 > Invitation.lte(created_at: 1.week.ago).where(user_id: '533956cd4d616323cf000000').count
=> 14
2.1.1 :105 > Invitation.where(user_id: '533956cd4d616323cf000000').count
=> 2014
Also, I've got a mapReduce which counts the invitations sent by each unique user (both in total and to unique out_ids):
class Invitation
  [...]
  def self.get_user_invites_count
    map = %q{
      function() {
        var user_id = this.user_id;
        emit(user_id, {user_id: this.user_id, out_id: this.out_id, count: 1, countUnique: 1})
      }
    }
    reduce = %q{
      function(key, values) {
        var result = {
          user_id: key,
          count: 0,
          countUnique: 0
        };
        var values_arr = [];
        values.forEach(function(value) {
          values_arr.push(value.out_id);
          result.count += 1
        });
        var unique = values_arr.filter(function(item, i, ar){ return ar.indexOf(item) === i; });
        result.countUnique = unique.length;
        return result;
      }
    }
    map_reduce(map, reduce).out(inline: true).to_a.map{|d| d['value']} rescue []
  end
end
The issue is:
Invitation.lte(created_at: Date.today.end_of_day).get_user_invites_count
returns
[{"user_id"=>BSON::ObjectId('533956cd4d616323cf000000'), "count"=>49.0, "countUnique"=>2.0} ...]
instead of "count" => 2014, "countUnique" => 6.0 while:
Invitation.lte(created_at: 1.week.ago).get_user_invites_count returns:
[{"user_id"=>BSON::ObjectId('533956cd4d616323cf000000'), "count"=>14.0, "countUnique"=>6.0} ...]
The data provided by the query is accurate before inserting the batch.
I can't wrap my head around what's going on here. Am I missing something?
The part that you seem to have missed in the documentation appears to be the problem here:
MongoDB can invoke the reduce function more than once for the same key. In this case, the previous output from the reduce function for that key will become one of the input values to the next reduce function invocation for that key.
And also later:
the type of the return object must be identical to the type of the value emitted by the map function to ensure that the following operations is true:
So what you see is that your reduce function returns a signature different from the input it receives from the mapper. This is important since the reducer may not get all of the values for a given key in a single pass. Instead it gets some of them, "reduces" the result, and that reduced output may be combined with other values for the key (possibly also reduced) in a further pass through the reduce function.
As a result of your fields not matching, subsequent reduce passes do not see those values and do not count towards your totals. So you need to align the signatures of the values:
def self.get_user_invites_count
  map = %q{
    function() {
      var user_id = this.user_id;
      emit(user_id, {out_id: this.out_id, count: 1, countUnique: 0})
    }
  }
  reduce = %q{
    function(key, values) {
      var result = {
        out_id: null,
        count: 0,
        countUnique: 0
      };
      var values_arr = [];
      values.forEach(function(value) {
        if (value.out_id != null)
          values_arr.push(value.out_id);
        result.count += value.count;
        result.countUnique += value.countUnique;
      });
      var unique = values_arr.filter(function(item, i, ar){ return ar.indexOf(item) === i; });
      result.countUnique += unique.length;
      return result;
    }
  }
  map_reduce(map, reduce).out(inline: true).to_a.map{|d| d['value']} rescue []
end
You also do not need user_id in the values emitted or kept, as it is already the "key" of the mapReduce. The remaining alterations account for the fact that both "count" and "countUnique" can carry an existing value that needs to be considered, whereas before you were simply resetting them to 0 on each pass.
And if the "input" has already been through a "reduce" pass, you no longer need its "out_id" filtered for uniqueness, as that count is already included. That is why null out_id values are not added to the array of things to count, and why the unique count is "added" to the running total rather than replacing it.
So the reducer does get called several times. With only about 20 values per key the input will likely not be split, which is why your smaller sample works. For pretty much anything more than that, the "groups" of same-key values will be split up; that is how mapReduce optimizes for large data processing. Since the "reduced" output is sent back into the reducer, you need to be mindful that you are considering the values you already sent to output in the previous pass.
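As a plain-JavaScript sketch of that last point, treating the corrected reduce above as an ordinary function and using a made-up key, the output of one pass feeds back in as an input value on the next, which is why the fields must line up and the counters must accumulate rather than reset:
// First pass reduces part of the key's documents...
var partial = reduce('someUser', [
  { out_id: 'a', count: 1, countUnique: 0 },
  { out_id: 'a', count: 1, countUnique: 0 }
]);
// -> { out_id: null, count: 2, countUnique: 1 }

// ...then that partial result is mixed back in with the remaining documents.
var total = reduce('someUser', [partial, { out_id: 'b', count: 1, countUnique: 0 }]);
// -> { out_id: null, count: 3, countUnique: 2 }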
I have a stream holding an array, each element of which has an id. I need to split this into a stream per id, which will complete when the source stream no longer carries the id.
E.g. input stream sequence with these three values
[{a:1}, {b:1}] [{a:2}, {b:2}, {c:1}] [{b:3}, {c:2}]
should return three streams
a -> 1 2 |
b -> 1 2 3
c -> 1 2
Where a has completed on the 3rd value, since its id is gone, and c has been created on the 2nd value, since its id has appeared.
I'm trying groupByUntil, a bit like
var input = foo.share();
var output = input.selectMany(function (s) {
  return rx.Observable.fromArray(s);
}).groupByUntil(
  function (s) { return s.keys()[0]; },
  null,
  function (g) {
    return input.filter(function (s) { return !findkey(s, g.key); });
  }
);
So, group by the id, and dispose of the group when the input stream no longer has the id. This seems to work, but the two uses of input look odd to me, like there could be a weird order dependency when using a single stream to control both the input of the groupByUntil and the disposal of the groups.
Is there a better way?
update
There is, indeed, a weird timing problem here. fromArray by default uses the currentThread scheduler, which will result in events from that array being interleaved with events from input. The dispose conditions on the group are then evaluated at the wrong time (before the groups from the previous input have been processed).
A possible workaround is to do fromArray(.., rx.Scheduler.immediate), which will keep the grouped events in sync with input.
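Applied to the query above, only the fromArray call changes (a sketch, reusing the same findkey helper):
var output = input.selectMany(function (s) {
  // immediate keeps the array's events in lock-step with `input`
  return rx.Observable.fromArray(s, rx.Scheduler.immediate);
}).groupByUntil(
  function (s) { return s.keys()[0]; },
  null,
  function (g) {
    return input.filter(function (s) { return !findkey(s, g.key); });
  }
);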
Yeah, the only alternative I can think of is to manage the state yourself. I don't know that it's better, though.
var d = Object.create(null);

var output = input
  .flatMap(function (s) {
    // end completed groups
    Object
      .keys(d)
      .filter(function (k) { return !findKey(s, k); })
      .forEach(function (k) {
        d[k].onNext(1);
        d[k].onCompleted();
        delete d[k];
      });
    return Rx.Observable.fromArray(s);
  })
  .groupByUntil(
    function (s) { return s.keys()[0]; },
    null,
    function (g) { return d[g.key] = new Rx.AsyncSubject(); });