Implementing Stream in JavaScript

I want to implement a stream object that can do this:
// a -------1------2----3
// map -----\------\----\
// b --------2------4----6
const a = new Stream();
const b = a.map(value => value * 2);
b.subscribe(console.log);
a.push(1);
// 2
a.push(2);
// 4
a.push(3);
// 6
The idea here is that the object b can subscribe new callbacks to stream a. The map function should listen for push calls and apply the mapping function before invoking the originally subscribed callback. This is the implementation I have so far:
class Stream {
  constructor(queue = []) {
    this.queue = queue;
  }
  subscribe(action) {
    if (typeof action === 'function') {
      this.queue.push(action);
    }
  }
  map(callback) {
    this.queue = this.queue.map(
      actionFn => arg => actionFn(callback(arg))
    );
    return this;
  }
  push(value) {
    this.queue.forEach(actionFn => {
      actionFn.call(this, value);
    });
  }
}
The problem with the current implementation is that the queue in class Stream is initially empty, so it doesn't go through it. I would appreciate any suggestions or help. I would like to not use any library for this.

Your map needs to create a new Transform stream and return it. Instead of subscribe you could simply use the standard on('data') event or, better, use the read method.
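To make that idea concrete, here is a minimal, library-free sketch (names are illustrative, not from scramjet or the question): map returns a new child stream that subscribes to its parent, so values pushed into the parent propagate down the chain.
class SimpleStream {
  constructor() {
    this.listeners = [];
  }
  subscribe(listener) {
    this.listeners.push(listener);
  }
  map(fn) {
    const child = new SimpleStream();
    // forward every value this stream receives to the child, mapped
    this.subscribe(value => child.push(fn(value)));
    return child;
  }
  push(value) {
    this.listeners.forEach(listener => listener(value));
  }
}
const a = new SimpleStream();
const b = a.map(value => value * 2);
b.subscribe(console.log);
a.push(1); // 2
a.push(2); // 4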
Lastly, you could simply use my work and have your map method already efficiently implemented by using scramjet, which does exactly what you've shown above, and moreover it supports async functions. :)
Here's how you'd use it (in some getStream function):
const {DataStream} = require('scramjet');
const stream = new DataStream();
stream.write(1); // you can also use await stream.whenWrote(1);
stream.write(2);
stream.write(3);
return stream.map(x => x * 2);
and then read it somewhere else:
stream.on('data', x => console.log(`x: ${x}`));
// x: 2
// x: 4
// x: 6
Take a look at the scramjet docs here

A long time after asking this question I was able to go back to the problem and come up with a simple solution. Since one stream should listen to the one it is subscribed to, we return the original instance in order to preserve the values from the previous stream. Here is the code that I found to work well:
class Stream {
  constructor() {
    this.subscriptions = [];
    this.mappedActions = [];
  }
  subscribe(callback) {
    this.subscriptions.push(callback);
  }
  map(actionFunc) {
    this.mappedActions.push(actionFunc);
    return this;
  }
  push(opValue) {
    this.subscriptions.forEach(cb => {
      if (this.mappedActions.length) {
        this.mappedActions.forEach(action => {
          cb(action.call(this, opValue));
        });
      } else {
        cb(opValue);
      }
    });
  }
}
const a = new Stream();
const b = a.map(value => value * 1 / 2);
const c = b.map(value => value * 3);
c.subscribe(console.log);
c.push(1);
c.push(2);
c.push(3);
// expected output in the console:
// 0.5
// 3
// 1
// 6
// 1.5
// 9
Hope anyone who stumbles upon this interesting problem will find my solution useful. If there are any changes you would like to make, feel free to do so or ping me!

Related

How to use web worker inside a for loop in javascript?

Following is the code to create a 2d matrix in javascript:
function Create2DArray(rows) {
  var arr = [];
  for (var i = 0; i < rows; i++) {
    arr[i] = [];
  }
  return arr;
}
now I have a couple of 2d matrices inside an array:
const matrices = [];
for (let i = 1; i < 10000; i++) {
  matrices.push(new Create2DArray(i * 100));
}
// I'm just mocking it here. In reality we have data available in matrix form.
I want to do operations on each matrix like this:
for (let i = 0; i < matrices.length; i++) {
  // ...doAnythingWithEachMatrix()
}
And since it will be a computationally expensive process, I would like to do it via a web worker so that the main thread is not blocked.
I'm using paralleljs for this purpose since it provides a nice API for multithreading. (Or should I use a native Web Worker? Please suggest.)
update() {
  for (let i = 0; i < matrices.length; i++) {
    var p = new Parallel(matrices[i]);
    p.spawn(function (matrix) {
      return doanythingOnMatrix(matrix);
      // can be anything like transpose, scaling, translate etc...
    }).then(function (matrix) {
      // return back so that I can use those values to update the DOM, or directly update the DOM here.
      // suggest a best way so that I can prevent crashes and improve performance.
    });
  }
  requestAnimationFrame(update);
}
So my question is what is the best way of doing this?
Is it ok to use a new Webworker or Parallel instance inside a for loop?
Would it cause memory issues?
Or is it ok to create a global instance of Parallel or Webworker and use it for manipulating each matrix?
Or suggest a better approach.
I'm using Parallel.js as an alternative to Web Workers.
Is it ok to use parallel.js for multithreading? (Or do I need to use the native Webworker?)
In reality, the matrices would contain position data, and this data is processed by the Web Worker or Parallel.js instance behind the scenes, which returns the processed result back to the main app, where it is then used to draw items / update the canvas.
UPDATE NOTE
Actually, this is an animation. So it will have to be updated for each matrix during each tick.
Currently, I'm creating a new instance of Parallel inside the for loop. I fear that this is an unconventional approach, or that it could cause memory leaks. I need the best way of doing this. Please suggest.
UPDATE
This is my example:
Following our discussion in the comments, here is an attempt at using chunks. The data is processed by groups of 10 (a chunk), so that you can receive their results regularly, and we only start the animation after receiving 200 of them (buffer) to get a head start (think of it like a video stream). But these values may need to be adjusted depending on how long each matrix takes to process.
That being said, you added details afterwards about the lag you get. I'm not sure if this will solve it, or if the problem lies in your canvas update function. That's just a path to explore:
/*
 * A helper function to process data in chunks
 */
async function processInChunks({ items, processingFunc, chunkSize, bufferSize, onData, onComplete }) {
  const results = [];
  // For each group of {chunkSize} items
  for (let i = 0; i < items.length; i += chunkSize) {
    // Process this group in parallel
    const p = new Parallel(items.slice(i, i + chunkSize));
    // p.map is not a real Promise, so we create one
    // to be able to await it
    const chunkResults = await new Promise(resolve => {
      return p.map(processingFunc).then(resolve);
    });
    // Add to the results
    results.push(...chunkResults);
    // Pass the results to a callback if we're above the {bufferSize}
    if (i >= bufferSize && typeof onData === 'function') {
      // Flush the results
      onData(results.splice(0, results.length));
    }
  }
  // In case there was less data than the wanted {bufferSize},
  // pass the results anyway
  if (results.length) {
    onData(results.splice(0, results.length));
  }
  if (typeof onComplete === 'function') {
    onComplete();
  }
}
/*
 * Usage
 */
// For the demo, a fake matrix Array
const matrices = new Array(3000).fill(null).map((_, i) => i + 1);
const results = [];
let animationRunning = false;
// For the demo, a function which takes time to complete
function doAnythingWithMatrix(matrix) {
  const start = new Date().getTime();
  while (new Date().getTime() - start < 30) { /* sleep */ }
  return matrix;
}
processInChunks({
  items: matrices,
  processingFunc: doAnythingWithMatrix,
  chunkSize: 10, // Receive results after each group of 10
  bufferSize: 200, // But wait for at least 200 before starting to receive them
  onData: (chunkResults) => {
    results.push(...chunkResults);
    if (!animationRunning) { runAnimation(); }
  },
  onComplete: () => {
    console.log('All the matrices were processed');
  }
});
function runAnimation() {
  animationRunning = results.length > 0;
  if (animationRunning) {
    updateCanvas(results.shift());
    requestAnimationFrame(runAnimation);
  }
}
function updateCanvas(currentMatrixResult) {
  // Just for the demo, we're not really using a canvas
  canvas.innerHTML = `Frame ${currentMatrixResult} out of ${matrices.length}`;
  info.innerHTML = results.length;
}
<script src="https://unpkg.com/paralleljs#1.0/lib/parallel.js"></script>
<h1 id="canvas">Buffering...</h1>
<h3>(we've got a headstart of <span id="info">0</span> matrix results)</h3>

Tone.js Tone.BufferSource: buffer is either not set or not loaded

Tone.BufferSource: buffer is either not set or not loaded. This error occurs inside a try/catch block. It only happens when I trigger the update function constantly, or sometimes randomly.
When this error occurs my audio just turns off for a brief moment.
The logic behind my code: when the program starts, the create function is invoked in the constructor, creating a Tone.Sequence. Later on, when I change/update track parameters, I call the update function,
which calls loopProcessor with the new/updated tracks. But when I trigger update, which triggers the loopProcessor function, it runs into "Tone.BufferSource: buffer is either not set or not loaded". How can I work around this problem?
My code:
import Tone from "tone";
export function create(tracks, beatNotifier) {
  const loop = new Tone.Sequence(
    loopProcessor(tracks, beatNotifier),
    [...new Array(16)].map((_, i) => i),
    "16n"
  );
  Tone.Transport.bpm.value = 120;
  Tone.Transport.start();
  return loop;
}
export function update(loop, tracks, beatNotifier) {
  loop.callback = loopProcessor(tracks, beatNotifier);
  return loop;
}
function loopProcessor(tracks, beatNotifier) {
  const urls = tracks.reduce((acc, {name}) => {
    return {...acc, [name]: `http://localhost:3000/src/sounds/${name}.[wav|wav]`};
  }, {});
  const keys = new Tone.Players(urls, {
    fadeOut: "64n"
  }).toMaster();
  return (time, index) => {
    beatNotifier(index);
    tracks.forEach(({name, vol, muted, note, beats}) => {
      if (beats[index]) {
        try {
          var vel = Math.random() * 0.5 + 0.5;
          keys
            .get(name)
            .start(time, 0, note, 0, vel);
          keys
            .get(name).volume.value = muted
              ? -Infinity
              : vol;
        } catch (e) {
          console.log("error", e);
        }
      }
    });
  };
}
I had this problem recently and found a solution that worked for my case.
Tone.js doesn't like it when you initialise an audio buffer inside a function (which is what you're doing when you call new Tone.Players inside loopProcessor).
To get around this, at the top of your code declare a global variable buffer1 = new Tone.Buffer(url1) for each URL that you need: https://tonejs.github.io/docs/r13/Buffer
Then inside loopProcessor just replace the urls with each buffer and a name tag, and you shouldn't have any problems: new Tone.Players({"name1": buffer1, "name2": buffer2, ...})
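Roughly, that suggestion looks like this (a sketch only; the track names and URLs are placeholders, not from the original project, and it follows this answer's advice of passing preloaded buffers to Tone.Players):
import Tone from "tone";
// Create the buffers once, at module scope, so they load up front
// and are reused on every update instead of being re-created in loopProcessor.
const kickBuffer = new Tone.Buffer("http://localhost:3000/src/sounds/kick.wav");
const snareBuffer = new Tone.Buffer("http://localhost:3000/src/sounds/snare.wav");
// One Players instance, also created once, fed with the preloaded buffers
const keys = new Tone.Players(
  { kick: kickBuffer, snare: snareBuffer },
  { fadeOut: "64n" }
).toMaster();
function loopProcessor(tracks, beatNotifier) {
  return (time, index) => {
    beatNotifier(index);
    tracks.forEach(({ name, vol, muted, note, beats }) => {
      if (beats[index]) {
        keys.get(name).start(time, 0, note);
        keys.get(name).volume.value = muted ? -Infinity : vol;
      }
    });
  };
}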

MobX action schedule / execution order isn't preserved due to race condition

Here's a running example of what I've got so far:
https://codesandbox.io/s/github/BruceL33t/mobx-action-synchronous-execution-order/tree/master/
store.js:
import { observable, action } from "mobx";
import Sensor from "../models/Sensor";
export default class RootStore {
  @observable sensors = new Map();
  constructor() {
    let self = this;
    const sensorIds = [
      "sensor1",
      "sensor2",
      "sensor3",
      "sensor4",
      "sensor5",
      "sensor6",
      "sensor7",
      "sensor8",
      "sensor9",
      "sensor10"
    ];
    for (let sensor of sensorIds) {
      self.sensors.set(sensor, new Sensor(5));
    }
    // setInterval simulates some incoming data (originally from SignalR, and roughly each second)
    setInterval(function() {
      let out = {};
      const x = +new Date(); // unix timestamp
      for (let sensor of sensorIds) {
        const y = Math.floor(Math.random() * 10000) + 1;
        const m = { x: x, y: y };
        out[sensor] = m;
      }
      self.addMeasurement(out); // the problem starts here.
    }, 1000);
  }
  // the problem!
  @action
  addMeasurement(sensorMeasurementMap) {
    let self = this;
    // this timeout is to try and simulate a race condition
    // since each measurement is incoming each second,
    // here some of them will take as long as 6 seconds to add,
    // due to the timeout.
    // the point is that they should always be added,
    // in the order they were called in.
    // so if the first measurement takes 20 seconds to be added,
    // the next measurements that were received on 2, 3, 4, 5..., 19th second etc,
    // should all "wait" for the prev measurement, so they're added
    // in the right order (order can be checked by timestamp, x)
    setTimeout(() => {
      const keys = self.sensors.keys();
      if (keys.length === 0) {
        // never really gonna happen, since we already set them above
      } else {
        for (const key in sensorMeasurementMap) {
          if (self.sensors.keys().indexOf(key) > -1) {
            self.sensors.get(key).add(sensorMeasurementMap[key]);
          } else {
            // also not gonna happen in this example
          }
        }
      }
    }, Math.floor(Math.random() * 20 + 1) * 1000);
  }
}
Sensor.js:
import Queue from './Queue';
import { observable, action } from 'mobx';
export default class Sensor {
  @observable queue;
  constructor(n) {
    this.n = n;
    this.queue = new Queue(this.n);
  }
  @action add(measurement) {
    this.queue.add(measurement);
  }
}
Queue.js:
import { observable, action } from 'mobx';
export default class Queue {
  @observable data;
  constructor(maxSize) {
    this.maxSize = maxSize;
    this.size = 0;
    this.data = [];
  }
  @action add(measurement) {
    let removedItem = undefined;
    if (this.size >= this.maxSize) {
      let temp = this.data[0];
      removedItem = temp && temp.y ? temp.y + '' : undefined;
      this.data.shift();
    }
    this.data.push(measurement);
    if (removedItem === undefined && this.size < this.maxSize) {
      this.size++;
    }
    return removedItem;
  }
}
There are a few comments in the code, but you absolutely need to see the output https://codesandbox.io/s/github/BruceL33t/mobx-action-synchronous-execution-order/tree/master/ to understand it.
Let me also try to explain it here, what this is all about.
This is basically an overly simplified version of a part of a real application, where setInterval is used to simulate a SignalR event handler indicating incoming data each second. The incoming data is what we create inside the setInterval function above the addMeasurement action.
So given that some data arrives each second, we want to add it to the observable map sensors on the store. Since this data is used for drawing charts in the real application, we need to make sure it is added in the order in which the actions are invoked, no matter how long each action takes to complete.
In the real application I saw some inconsistency in the order in which the data were pushed to the MobX state, so I isolated it, extracted the relevant parts into this example, and exaggerated it a bit by using the setTimeout call inside the addMeasurement action.
Since data arrives each second, but some measurements could take up to 20 seconds to add (not realistic, but it clearly shows the race condition), as the code is right now, it often happens that we end up with something like:
[
{"x":1519637083193,"y":4411},
{"x":1519637080192,"y":7562},
{"x":1519637084193,"y":1269},
{"x":1519637085192,"y":8916},
{"x":1519637081192,"y":7365}
]
Which really should never happen, since 1519637083193 is greater/later than 1519637080192.
This is a real problem when drawing charts from this data, and ordering it afterwards is way too expensive, so I'm looking for a way to improve this code so we can trust that each addMeasurement only fires once the previous action has completely finished, or at least a way to update the MobX state in the right order.
Hope it makes sense.
should all "wait" for the prev measurement, so they're added in the right order (order can be checked by timestamp, x).
Could you elaborate on that? How could one ever know that no timestamp larger than the current one will be received in the future, and hence end up waiting indefinitely? Isn't what you are looking for just a sorted insertion into the array of measurements (instead of waiting)?
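A rough sketch of that sorted-insertion idea (illustrative only, adapting the Queue from the question so each sensor's measurements stay ordered by their x timestamp regardless of arrival order):
function insertSorted(data, measurement, maxSize) {
  // find the first element with a later timestamp and insert before it
  let insertAt = data.findIndex(m => m.x > measurement.x);
  if (insertAt === -1) insertAt = data.length;
  data.splice(insertAt, 0, measurement);
  // respect the queue's maximum size by dropping the oldest entry
  if (data.length > maxSize) {
    data.shift();
  }
  return data;
}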
If sorted insertion doesn't solve the problem, I would probably do the following (untested):
lastAddition = Promise.resolve() // start with an already finished addition

addMeasurement(sensorMeasurementMap) {
  const self = this;
  this.lastAddition = this.lastAddition.then(() => {
    return new Promise((resolve, reject) => {
      setTimeout(action(() => {
        const keys = self.sensors.keys();
        if (keys.length === 0) {
          // never really gonna happen, since we already set them above
        } else {
          for (const key in sensorMeasurementMap) {
            if (self.sensors.keys().indexOf(key) > -1) {
              self.sensors.get(key).add(sensorMeasurementMap[key]);
            } else {
              // also not gonna happen in this example
            }
          }
        }
        resolve();
      }), Math.floor(Math.random() * 20 + 1) * 1000);
    });
  });
}
N.B.: I moved action inside, as you need it at the place where you are actually modifying the state, not where the scheduling happens.

Merge events from a changing list of Observables

I'm using rxjs.
I have a Browser that's responsible for a number of Page objects. Each page has an Observable<Event> that yields a stream of events.
Page objects are closed and opened at various times. I want to create one observable, called TheOneObservable that will merge all the events from all the currently active Page objects, and also merge in custom events from the Browser object itself.
Closing a Page means that the subscription to it should be closed so it doesn't prevent it from being GC'd.
My problem is that Pages can be closed at any time, which means that the number of Observables being merged is always changing. I've thought of using an Observable of Pages and using mergeMap, but there are problems with this. For example, a subscriber will only receive events of Pages that are opened after it subscribes.
Note that this question has been answered here for .NET, but using an ObservableCollection that isn't available in rxjs.
Here is some code to illustrate the problem:
class Page {
  private _events = new Subject<Event>();
  get events(): Observable<Event> {
    return this._events.asObservable();
  }
}
class Browser {
  pages = [] as Page[];
  private _ownEvents = new Subject<Event>();
  addPage(page: Page) {
    this.pages.push(page);
  }
  removePage(page: Page) {
    let ixPage = this.pages.indexOf(page);
    if (ixPage < 0) return;
    this.pages.splice(ixPage, 1);
  }
  get oneObservable() {
    // this won't work for aforementioned reasons
    return Observable.from(this.pages).mergeMap(x => x.events).merge(this._ownEvents);
  }
}
It's in TypeScript, but it should be understandable.
You can switchMap() on a Subject() linked to array changes, replacing oneObservable with a fresh one when the array changes.
pagesChanged = new Rx.Subject();

addPage(page: Page) {
  this.pages.push(page);
  this.pagesChanged.next();
}
removePage(page: Page) {
  let ixPage = this.pages.indexOf(page);
  if (ixPage < 0) return;
  this.pages.splice(ixPage, 1);
  this.pagesChanged.next();
}
get oneObservable() {
  return this.pagesChanged
    .switchMap(changeEvent =>
      Observable.from(this.pages).mergeMap(x => x.events).merge(this._ownEvents)
    );
}
Testing,
const page1 = { events: Rx.Observable.of('page1Event') };
const page2 = { events: Rx.Observable.of('page2Event') };
let pages = [];
const pagesChanged = new Rx.Subject();
const addPage = (page) => {
  pages.push(page);
  pagesChanged.next();
};
const removePage = (page) => {
  let ixPage = pages.indexOf(page);
  if (ixPage < 0) return;
  pages.splice(ixPage, 1);
  pagesChanged.next();
};
const _ownEvents = Rx.Observable.of('ownEvent');
const oneObservable =
  pagesChanged
    .switchMap(pp =>
      Rx.Observable.from(pages)
        .mergeMap(x => x.events)
        .merge(_ownEvents)
    );
oneObservable.subscribe(x => console.log('subscribe', x));
console.log('adding 1');
addPage(page1);
console.log('adding 2');
addPage(page2);
console.log('removing 1');
removePage(page1);
.as-console-wrapper { max-height: 100% !important; top: 0; }
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/5.5.6/Rx.js"></script>
You will need to manage the subscriptions to the pages yourself and feed their events into the resulting subject yourself:
const theOneObservable$ = new Subject<Event>();

function openPage(page: Page): Subscription {
  return page.events$.subscribe(val => theOneObservable$.next(val));
}
Closing the page, i.e. calling unsubscribe on the returned subscription, will already do everything it has to do.
Note that theOneObservable$ is a hot observable here.
You can, of course, take this a bit further by writing your own observable type which encapsulates all of this API. In particular, this would allow you to unsubscribe all inner observables when it is being closed.
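For instance, a rough sketch of such a wrapper (illustrative only, assuming the Page class from the question exposes its events observable as page.events):
class PageEventHub {
  constructor() {
    this.subject = new Subject();
    this.subscriptions = new Map();
    this.events$ = this.subject.asObservable();
  }
  addPage(page) {
    // forward the page's events into the shared subject
    this.subscriptions.set(page, page.events.subscribe(e => this.subject.next(e)));
  }
  removePage(page) {
    const sub = this.subscriptions.get(page);
    if (sub) {
      sub.unsubscribe();
      this.subscriptions.delete(page);
    }
  }
  close() {
    // unsubscribe every inner observable and complete the outgoing stream
    this.subscriptions.forEach(sub => sub.unsubscribe());
    this.subscriptions.clear();
    this.subject.complete();
  }
}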
A slightly different approach is this:
const observables$ = new Subject<Observable<Event>>();
const theOneObservable$ = observables$.mergeMap(obs$ => obs$);
// Add a page's events; note that takeUntil takes care of the
// unsubscription process here.
observables$.next(page.events$.takeUntil(page.closed$));
This approach is superior in the sense that it will unsubscribe the inner observables automatically when the observable is unsubscribed.
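Wired into the Browser class from the question, that second approach might look roughly like this (a sketch; it assumes each Page exposes a closed$ notification, which is not part of the original class):
class Browser {
  constructor() {
    this._ownEvents = new Subject();
    this._pageEvents = new Subject();
    // flatten every page observable that is pushed through _pageEvents
    this.oneObservable = this._pageEvents
      .mergeMap(pageEvents$ => pageEvents$)
      .merge(this._ownEvents);
  }
  addPage(page) {
    // the page's stream removes itself once the page signals it has closed
    this._pageEvents.next(page.events.takeUntil(page.closed$));
  }
}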

RXJS Observable stretch

I have a Rx.Observable.webSocket Subject. My server endpoint cannot handle messages arriving at the same time (<25ms apart). Now I need a way to stretch the next() calls of my websocket subject.
I have created another Subject, requestSubject, and subscribed to it, calling next on the websocket subject inside the subscription:
requestSubject.delay(1000).subscribe((request) => {
  console.log(`SENDING: ${JSON.stringify(request)}`);
  socketServer.next(JSON.stringify(request));
});
Using delay shifts every next call by the same delay, so all calls are emitted the same amount of time later ... that's not what I want.
I tried delay, throttle and debounce, but none of them fit.
The following should illustrate my problem
Stream 1 | ---1-------2-3-4-5---------6----
after some operation ...
Stream 2 | ---1-------2----3----4----5----6-
Had to tinker a bit, it's not as easy as it looks:
// example source stream
const source = Rx.Observable.from([100, 500, 1500, 1501, 1502, 1503])
  .mergeMap(i => Rx.Observable.of(i).delay(i))
  .share();

stretchEmissions(source, 1000)
  .subscribe(val => console.log(val));

function stretchEmissions(source, spacingDelayMs) {
  return source
    .timestamp()
    .scan((acc, curr) => {
      // calculate delay needed to offset next emission
      let delay = 0;
      if (acc !== null) {
        const timeDelta = curr.timestamp - acc.timestamp;
        delay = timeDelta > spacingDelayMs ? 0 : (spacingDelayMs - timeDelta);
      }
      return {
        timestamp: curr.timestamp,
        delay: delay,
        value: curr.value
      };
    }, null)
    .mergeMap(i => Rx.Observable.of(i.value).delay(i.delay), undefined, 1);
}
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/5.4.2/Rx.js"></script>
Basically we need to calculate the needed delay between emissions so we can space them. We do this using timestamp() of original emissions and the mergeMap overload with a concurrency of 1 to only subscribe to the next delayed value when the previous is emitted. This is a pure Rx solution without further side effects.
Here are two solutions, one using a custom stream and one using only RxJS operators. Since the operator-only version looks quite complicated, I would not advise using it; use the custom stream instead (see the 1st example below):
Custom stream (MUCH easier to read and maintain, probably with better performance as well):
const click$ = Rx.Observable
  .fromEvent(document.getElementById("btn"), "click")
  .map((click, i) => i);

const spreadDelay = 1000;
let prevEmitTime = 0;

click$
  .concatMap(i => { // in this case you could also use "flatMap" or "mergeMap" instead of "concatMap"
    const now = Date.now();
    if (now - prevEmitTime > spreadDelay) {
      prevEmitTime = now;
      return Rx.Observable.of(i); // emit immediately
    } else {
      prevEmitTime += spreadDelay;
      return Rx.Observable.of(i).delay(prevEmitTime - now); // emit somewhere in the future
    }
  })
  .subscribe((request) => {
    console.log(`SENDING: ${request}`);
  });
<script src="https://unpkg.com/rxjs/bundles/Rx.min.js"></script>
<button id="btn">Click me!</button>
Using only RxJS Operators (contains issues, probably shouldn't use):
const click$ = Rx.Observable
  .fromEvent(document.getElementById("btn"), "click")
  .map((click, i) => i);

click$
  // window will create a new substream whenever no click happened for 1001ms (with the spread out delay)
  .window(click$
    .concatMap(i => Rx.Observable.of(i).delay(1000))
    .debounceTime(1001)
  )
  .mergeMap(win$ => Rx.Observable.merge(
    win$.take(1).merge(), // emitting the "first" click immediately
    win$.skip(1)
      .merge()
      .concatMap(i => Rx.Observable.of(i).delay(1000)) // each emission after the "first" one will be spread out to 1 second
  ))
  .subscribe((request) => {
    console.log(`SENDING: ${request}`);
  });
<script src="https://unpkg.com/rxjs/bundles/Rx.min.js"></script>
<button id="btn">Click me!</button>
Mark van Straten's solution didn't work completely accurately for me. I found a much simpler and more accurate solution, based on the approach from here: concatMap subscribes to each inner observable in sequence, and concat(of(e), EMPTY.pipe(delay(delayMs))) emits the value immediately but only completes after delayMs, so consecutive values end up spaced at least delayMs apart.
import { from, of, concat, EMPTY } from 'rxjs';
import { mergeMap, delay, concatMap } from 'rxjs/operators';

const source = from([100, 500, 1500, 1501, 1502, 1503]).pipe(
  mergeMap(i => of(i).pipe(delay(i)))
);

const delayMs = 500;

const stretchedSource = source.pipe(
  concatMap(e => concat(of(e), EMPTY.pipe(delay(delayMs))))
);
