I have a Svelte component named [symbol].svelte in which I want to open a connection to a streaming service and receive server-sent events. I haven't found a way to do this successfully.
Since EventSource only runs in the browser, I initialized it in the onMount function like so:
<script>
  import { onMount, onDestroy } from 'svelte';

  export let quote;
  let sse = {};

  onMount(async () => {
    sse = new EventSource(`https://myurl.com?symbol=${quote.symbol}`);
    sse.onmessage = (event) => {
      let response = JSON.parse(event.data);
      if (!response.length) return;
      quote = response[0];
    };
  });

  onDestroy(() => {
    if (sse.readyState && sse.readyState === 1) {
      sse.close();
    }
  });
</script>
<div>{quote.symbol}</div>
This works fine, except when I navigate to another route that uses the same component: since the component doesn't unmount and remount, onMount() doesn't fire again and thus doesn't create a new SSE connection. I don't know of any way to easily force the component to remount, which would be the simplest fix (relevant github issue here).
Another try was using a reactive statement like so:
<script>
  export let quote;
  let sse = {};

  $: {
    if (process.browser === true) { // again, this stuff won't run on the server
      if (sse.readyState && sse.readyState === 1) {
        sse.close();
      }
      sse = new EventSource(`https://myurl.com?symbol=${quote.symbol}`);
    }
  }

  sse.onmessage = (event) => {
    let response = JSON.parse(event.data);
    quote = response[0];
    console.log(quote);
  };
</script>
<div>{quote.symbol}</div>
When changing routes, the quote variable changed, triggering the reactive statement to kill the existing SSE connection and create a new one. Except the onmessage handler wouldn't fire, probably because the handler gets attached to the initial placeholder object before the EventSource is created.
My last attempt was to move the onmessage handler into the reactive statement, like so:
<script>
  export let quote;
  let sse = {};

  $: {
    if (process.browser === true) { // again, this stuff won't run on the server
      if (sse.readyState && sse.readyState === 1) {
        sse.close();
      }
      sse = new EventSource(`https://myurl.com?symbol=${quote.symbol}`);
      sse.onmessage = (event) => {
        let response = JSON.parse(event.data);
        quote = response[0];
        console.log(quote);
      };
    }
  }
</script>
<div>{quote.symbol}</div>
The problem here is that since quote gets reassigned inside the onmessage handler, the reactive statement keeps firing in a loop: each message tears down the connection and creates a new one.
At this point I'm at a loss; any input would be appreciated!
It sounds like you want to use {#key ...}, which causes its contents to be torn down and recreated when the value changes, including components:
{#key quote}
  <!-- destroyed and recreated whenever `quote` changes -->
  <Quote {quote}/>
{/key}
Docs here: https://svelte.dev/docs#key
Incidentally, using onDestroy is unnecessary if it's only used to clean up work that happens in onMount:
onMount(() => {
  const sse = new EventSource(`https://myurl.com?symbol=${quote.symbol}`);
  sse.onmessage = (event) => {
    let response = JSON.parse(event.data);
    if (!response.length) return;
    quote = response[0];
  };

  return () => {
    if (sse.readyState === 1) {
      sse.close();
    }
  };
});
This is better because you don't need the top-level sse variable, and because the returned cleanup function only runs in the browser, you don't need the placeholder sse = {} assignment or the sse.readyState truthiness guard before comparing it to 1.
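Putting the two together, the SSE logic can live in a child component (Quote.svelte here; the name and path are assumptions) and the [symbol].svelte route can key it on the symbol, so the child is destroyed and recreated, re-running onMount, whenever the symbol changes. A minimal sketch:
<script>
  import Quote from './Quote.svelte'; // component name/path is an assumption
  export let quote;
</script>

{#key quote.symbol}
  <Quote {quote}/>
{/key}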
Related
I'm creating an app with React Native and face the problem that I create multiple Firebase listeners throughout the app: listeners on different screens, to be precise, some listening to the Firebase Realtime Database and others listening to Firestore.
What I want to accomplish is to kill all those listeners with one call, or if necessary with multiple lines but as compact as possible, and also from an entirely different screen where the listeners aren't even running; this is important.
I know there is the possibility to use Firebase.goOffline(), but this only disconnects me from Firebase; it doesn't stop the listeners. As soon as I goOnline() again, the listeners are all back.
I haven't found any solution for this problem on Google etc., which is why I'm asking here. I would appreciate any idea or approach for handling this type of behavior.
The following code samples show the listeners I use in my app. They are located in the same screen, but I have nearly identical ones in other screens.
Database listener:
const statusListener = () => {
  var partnerRef = firebase.database().ref(`users/${partnerId}/onlineState`);
  partnerRef.on('value', function(snapshot) {
    setPartnerState(snapshot.val());
  });
};
Firestore listener: (this one is very long, but that's only because I filter the documents I retrieve from the listener)
const loadnewmessages = () => {
  firebase.firestore()
    .collection("chatrooms").doc(`${chatId}`)
    .collection(`${chatId}`)
    .orderBy("timestamp").limit(50)
    .onSnapshot((snapshot) => {
      var newmessages = [];
      var deletedmesssages = [];
      snapshot.docChanges().forEach((change) => {
        if (change.type === "added") {
          newmessages.push({
            counter: change.doc.data().counter,
            sender: change.doc.data().sender,
            timestamp: change.doc.data().timestamp.toString(),
            value: change.doc.data().value,
            displayedTime: new Date(change.doc.data().displayedTime)
          });
        }
        if (change.type === "removed") {
          deletedmesssages.push({
            counter: change.doc.data().counter,
            sender: change.doc.data().sender,
            timestamp: change.doc.data().timestamp.toString(),
            value: change.doc.data().value,
            displayedTime: new Date(change.doc.data().displayedTime)
          });
        }
      });
      if (newmessages.length > 0) {
        setChatMessages(chatmessages => {
          return chatmessages.concat(newmessages);
        });
      }
      if (deletedmesssages.length > 0) {
        setChatMessages(chatmessages => {
          var modifythisarray = chatmessages;
          let index = chatmessages.map(e => e.timestamp).indexOf(`${deletedmesssages[0].timestamp}`);
          let pasttime = Date.now() - parseInt(modifythisarray[index].timestamp);
          modifythisarray.splice(index, 1);
          if (pasttime > 300000) {
            return chatmessages;
          } else {
            return modifythisarray;
          }
        });
        setRefreshFlatList(refreshFlatlist => {
          // console.log("Current status of refresher: ", refreshFlatlist);
          return !refreshFlatlist;
        });
      }
      newmessages = [];
      deletedmesssages = [];
    });
};
Both of those listeners are called within a useEffect hook, like so (the empty dependency array at the end makes sure the listeners are attached only once, not on every render):
useEffect(() => {
  loadnewmessages();
  statusListener();
}, []);
Firestore's onSnapshot returns an unsubscribe function, and for the Realtime Database you can wrap ref.off() in one yourself, so you can collect all of those cleanup functions in an array:
const unSubscriptions = [];

// ... where you subscribe
const unSub = document.onSnapshot(listener);
unSubscriptions.push(unSub);

// ... where you unsubscribe all
function unSubAll() {
  unSubscriptions.forEach((unSub) => unSub());
  // Clear the array
  unSubscriptions.length = 0;
}
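Applied to the listeners in the question, a minimal sketch of the same idea (assuming useEffect is imported from 'react', that the same firebase setup, partnerId, chatId, and state setters are in scope, and that handleSnapshot stands in for the long snapshot callback above):
const statusListener = () => {
  const partnerRef = firebase.database().ref(`users/${partnerId}/onlineState`);
  const callback = (snapshot) => setPartnerState(snapshot.val());
  partnerRef.on('value', callback);
  // Realtime Database: detach this specific listener with off().
  return () => partnerRef.off('value', callback);
};

const loadnewmessages = () => {
  // Firestore: onSnapshot already returns an unsubscribe function.
  return firebase.firestore()
    .collection("chatrooms").doc(`${chatId}`)
    .collection(`${chatId}`)
    .orderBy("timestamp").limit(50)
    .onSnapshot(handleSnapshot); // handleSnapshot: the callback from the question
};

useEffect(() => {
  const unsubscribers = [statusListener(), loadnewmessages()];
  // Cleanup runs when the screen unmounts.
  return () => unsubscribers.forEach((unSub) => unSub());
}, []);
If the listeners must be killed from a different screen, push the same cleanup functions into a shared module-level array (as in unSubAll above) instead of keeping them local to the effect.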
I have a readable store in Svelte that looks like this:
const state = {};

export const channels = readable(state, set => {
  let st = state;
  let socket = new WebSocket("ws://127.0.0.1:5999");
  socket.onmessage = function (event) {
    var datastr = event.data.split(':');
    st[datastr[0]].value = datastr[1];
    st[datastr[0]].timestamp = Date.now();
    set(st);
  };
  return () => {
    socket.close();
  };
});
When I import it into my Svelte app, it works. But if I use that App.svelte as my index.svelte running on Sapper, it doesn't work at first: I get a 500 error saying WebSocket is not defined. Once I reload the page in the browser, it starts to work...
I have tried passing a function that creates the store instead:
export const getChannel = () => {
  // here my store
  return { ...store };
};
and then creating the store inside onMount() like this:
onMount(() => {
  const channel = getChannel();
});
But that doesn't seem to do the trick... What am I missing?
Note: if I just replace the store with a simple writable and create the WebSocket in onMount(), it works without any problem. I only wanted to keep all the communication inside the store as a readable...
In Sapper, code in components (or imported into components) is executed in Node during server-side rendering unless it's put inside onMount (which doesn't run on the server, because there's no 'mounting' happening) or an if (process.browser) {...} block, or something equivalent.
That includes things like references to $channels causing channels.subscribe(...) to be called during initialisation.
Since there's no WebSocket global in Node, creating that subscription will fail. The simplest workaround is probably a simple feature check:
const state = {};

export const channels = readable(state, (set) => {
  if (typeof WebSocket === 'undefined') return;

  let st = state;
  let socket = new WebSocket("ws://127.0.0.1:5999");
  socket.onmessage = function (event) {
    var datastr = event.data.split(":");
    st[datastr[0]].value = datastr[1];
    st[datastr[0]].timestamp = Date.now();
    set(st);
  };
  return () => {
    socket.close();
  };
});
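With the guard in place the store can be consumed as usual; on the server the start function just bails out, so subscribers only ever see the initial state there. A small usage sketch (the './stores.js' path is an assumption):
// Subscribing manually instead of via the $channels auto-subscription.
import { channels } from './stores.js'; // path is an assumption

const unsubscribe = channels.subscribe((state) => {
  console.log('latest channel state:', state);
});

// The WebSocket is closed when the last subscriber unsubscribes.
unsubscribe();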
I'm using rxjs.
I have a Browser that's responsible for a number of Page objects. Each page has an Observable<Event> that yields a stream of events.
Page objects are closed and opened at various times. I want to create one observable, called TheOneObservable that will merge all the events from all the currently active Page objects, and also merge in custom events from the Browser object itself.
Closing a Page means that the subscription to it should be closed so that it doesn't prevent the Page from being GC'd.
My problem is that Pages can be closed at any time, which means that the number of Observables being merged is always changing. I've thought of using an Observable of Pages and mergeMap, but there are problems with that approach. For example, a subscriber would only receive events from Pages that are opened after it subscribes.
Note that this question has been answered here for .NET, but using an ObservableCollection that isn't available in rxjs.
Here is some code to illustrate the problem:
class Page {
  private _events = new Subject<Event>();
  get events(): Observable<Event> {
    return this._events.asObservable();
  }
}

class Browser {
  pages = [] as Page[];
  private _ownEvents = new Subject<Event>();

  addPage(page: Page) {
    this.pages.push(page);
  }

  removePage(page: Page) {
    let ixPage = this.pages.indexOf(page);
    if (ixPage < 0) return;
    this.pages.splice(ixPage, 1);
  }

  get oneObservable() {
    // this won't work for the aforementioned reasons
    return Observable.from(this.pages).mergeMap(x => x.events).merge(this._ownEvents);
  }
}
It's in TypeScript, but it should be understandable.
You can switchMap() on a Subject() linked to array changes, replacing oneObservable with a fresh one when the array changes.
pagesChanged = new Rx.Subject();

addPage(page: Page) {
  this.pages.push(page);
  this.pagesChanged.next();
}

removePage(page: Page) {
  let ixPage = this.pages.indexOf(page);
  if (ixPage < 0) return;
  this.pages.splice(ixPage, 1);
  this.pagesChanged.next();
}

get oneObservable() {
  return this.pagesChanged
    .switchMap(changeEvent =>
      Observable.from(this.pages).mergeMap(x => x.events).merge(this._ownEvents)
    );
}
Testing:
const page1 = { events: Rx.Observable.of('page1Event') };
const page2 = { events: Rx.Observable.of('page2Event') };

let pages = [];
const pagesChanged = new Rx.Subject();

const addPage = (page) => {
  pages.push(page);
  pagesChanged.next();
};

const removePage = (page) => {
  let ixPage = pages.indexOf(page);
  if (ixPage < 0) return;
  pages.splice(ixPage, 1);
  pagesChanged.next();
};

const _ownEvents = Rx.Observable.of('ownEvent');

const oneObservable =
  pagesChanged
    .switchMap(pp =>
      Rx.Observable.from(pages)
        .mergeMap(x => x.events)
        .merge(_ownEvents)
    );

oneObservable.subscribe(x => console.log('subscribe', x));

console.log('adding 1');
addPage(page1);
console.log('adding 2');
addPage(page2);
console.log('removing 1');
removePage(page1);
<script src="https://cdnjs.cloudflare.com/ajax/libs/rxjs/5.5.6/Rx.js"></script>
You will need to manage the subscriptions to the pages yourself and feed their events into the resulting subject yourself:
const theOneObservable$ = new Subject<Event>();

function openPage(page: Page): Subscription {
  return page.events$.subscribe(val => theOneObservable$.next(val));
}
Closing the page, i.e. calling unsubscribe on the returned subscription, will already do everything it has to do.
Note that theOneObservable$ is a hot observable here.
You can, of course, take this a bit further by writing your own observable type which encapsulates all of this API. In particular, this would allow you to unsubscribe all inner observables when it is being closed.
A slightly different approach is this:
const observables$ = new Subject<Observable<Event>>();
const theOneObservable$ = observables$.mergeMap(obs$ => obs$);

// Add a page's events; note that takeUntil takes care of the
// unsubscription process here.
observables$.next(page.events$.takeUntil(page.closed$));
This approach is superior in the sense that it will unsubscribe the inner observables automatically when the observable is unsubscribed.
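A sketch of how this second approach could be wired into the Browser class from the question, keeping the RxJS 5 operator style used above (page.closed$ is assumed to be a notifier that fires when the page is closed):
class Browser {
  constructor() {
    this.pages = [];
    this._ownEvents = new Rx.Subject();
    // A stream of per-page event streams.
    this._pageEvents$ = new Rx.Subject();

    // Flatten every inner page stream and merge in the browser's own events.
    this.oneObservable = this._pageEvents$
      .mergeMap(events$ => events$)
      .merge(this._ownEvents);
  }

  addPage(page) {
    this.pages.push(page);
    // takeUntil drops the inner subscription once page.closed$ fires.
    this._pageEvents$.next(page.events.takeUntil(page.closed$));
  }

  removePage(page) {
    const ixPage = this.pages.indexOf(page);
    if (ixPage < 0) return;
    this.pages.splice(ixPage, 1);
    // The page itself is expected to emit on closed$ when it closes.
  }
}
As with the snippet above, subscribers only receive events from pages added after they subscribe; if late subscribers need earlier pages too, the per-page streams would have to be replayed.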
I'm new to the React/Flux architecture, and I'm missing something... I think. I have two stores, SubjectsStore.js and WorkDoneStore.js, and an AppActions module that does the dispatch (code snippets below). I'm under the impression that any store that registers with the AppDispatcher will get notice of the event, and that it is incumbent on each store to handle the proper action types; there doesn't seem to be any other way of controlling which store gets called. In my case, I've gotten as far as getting the SubjectsStore's registered callback to be called, but my WorkDoneStore is not getting called. What am I overlooking / doing wrong?
AppActions.js
import AppDispatcher from './AppDispatcher.js';
import WorkDoneConstants from '../constants/WorkDoneConstants.js';
import SubjectConstants from '../constants/SubjectConstants.js';

var AppActions = {
  addWorkDoneItem: function(item) {
    console.log("In app actions addWorkDone");
    console.log(WorkDoneConstants.WORKDONE_INSERT);
    AppDispatcher.dispatch({
      actionType: WorkDoneConstants.WORKDONE_INSERT,
      item: item
    });
  }
};

module.exports = AppActions;
SubjectsStore.js
var AppDispatcher = require('../dispatcher/AppDispatcher');
var SubjectConstants = require('../constants/SubjectConstants');
var EventEmitter = require('events').EventEmitter;
...
AppDispatcher.register(function(action) {
  var text;
  console.log("why am I in the subjectStore?");
  console.log(action.actionType);
  console.log(action.item);
  switch(action.actionType) {
    case SubjectConstants.SUBJECT_CREATE:
      text = action.text.trim();
      ...
WorkDoneStore.js
...
AppDispatcher.register(function(action) {
  var text;
  console.log("In WorkDoneStore");
  console.log(action);
  switch(action.actionType) {
    case WorkDoneConstants.WORKDONE_INSERT:
      item = action.item;
      if (item.subject !== '') {
        create(item);
        WorkDoneStore.emitChange();
      }
      break;
    ...
My component
...
handleSubmit: function(e) {
  e.preventDefault();
  var item = {
    subject: this.state.subject,
    workDone: this.state.workDone,
    minutes: this.state.totalMinutes,
    startStop: this.state.startStop,
  };
  console.log("before AppActions.");
  AppActions.addWorkDoneItem(item);
},
...
Looking through my webpack output, I noticed that WorkDoneStore.js wasn't getting included in the bundle. By forcing it to be included via a reference to it, it's now working.
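The reason this matters is that a Flux store registers with the dispatcher as a side effect of its module being evaluated, so webpack has to actually pull the module into the bundle. A minimal sketch (the path and the getAll() getter are assumptions):
// Requiring the store module is what executes its AppDispatcher.register(...)
// call, so it has to be required from code that actually runs.
var WorkDoneStore = require('../stores/WorkDoneStore.js'); // path is an assumption

// Any real reference keeps the module included; reading state is a natural one.
console.log('initial work done items:', WorkDoneStore.getAll()); // getAll() is hypothetical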
I have a websocket connection that generates internal message events with a ReplaySubject. I process these events and add a delay to certain messages. Internally I use publish().refCount() twice, once on the internal ReplaySubject and again on the published output stream.
Should the internal subject have both 'publish' and 'refCount' called on it? I use 'publish' because I have multiple subscribers but I'm not entirely sure when to use 'refCount'.
Is it okay to just dispose of the internal subject? Will that clean up everything else?
Whoever subscribes to 'eventStream' should get the latest revision, but the connection shouldn't wait for any subscribers.
Example code:
function Connection(...) {
  var messageSubject = new Rx.ReplaySubject(1);
  var messageStream = messageSubject.publish().refCount();

  // please ignore that we're not using rxdom's websocket.
  var ws = new WebSocket(...);
  ws.onmessage = function(messageEvent) {
    var message = JSON.parse(messageEvent.data);
    messageSubject.onNext(message);
  };
  ws.onclose = function(closeEvent) {
    messageSubject.dispose(); // is this all I need to dispose?
  };

  var immediateRevisions = messageStream
    .filter((e) => e[0] === "immediate")
    .map((e) => ["revision", e[1]]);

  var delayedRevisions = messageStream
    .filter((e) => e[0] === "delayed")
    .map((e) => ["revision", e[1]])
    .delay(1000);

  var eventStream = Rx.Observable.merge(immediateRevisions, delayedRevisions).publish().refCount();

  Object.defineProperties(this, {
    "eventStream": { get: function() { return eventStream; } },
  });
}
// using the eventStream
var cxn = new Connection(...);
cxn.eventStream.subscribe((e) => {
  if (e[0] === "revision") {
    // ...
  }
});
publish plus refCount is basically what shareReplay does in RxJS 4. Honestly, though, you should just let your observable be "warm" and then use a ReplaySubject as a subscriber if you really want to guarantee that the last message gets pushed to new subscribers even if the subscription count falls below one, e.g.:
const wsStream = Observable.create(observer => {
  ws.onmessage = message => observer.next(message);
  ws.onclose = () => observer.complete();
});

const latestWsMessages = new ReplaySubject(1);
wsStream.subscribe(latestWsMessages);
Make sure you review how Observables work: normally each subscriber triggers its own execution of the subscribe function (a cold observable), but in this case you probably want a hot observable so that multiple subscribers share a single subscription. See Andre's video here and the RxJS docs on creating observables for more info.
Also, as useful as classes can be, it looks like in this case you just want a function along the lines of makeWebsocketObservable(WebsocketConfig): Observable<WebsocketEvent>.
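A minimal sketch of what such a factory could look like, following the RxJS 5 style of the snippet above (the config shape, the JSON parsing, and the example URL are assumptions):
function makeWebsocketObservable(config) {
  return Observable.create(observer => {
    const ws = new WebSocket(config.url);

    ws.onmessage = event => observer.next(JSON.parse(event.data));
    ws.onerror = err => observer.error(err);
    ws.onclose = () => observer.complete();

    // Teardown runs when the subscriber unsubscribes.
    return () => ws.close();
  });
}

// Usage: replay the latest message to late subscribers, as above.
const messages$ = makeWebsocketObservable({ url: 'ws://localhost:1234' }); // example URL
const latestWsMessages = new ReplaySubject(1);
messages$.subscribe(latestWsMessages);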