How to share a single instance of an object among multiple actions? - javascript

I have a React/Redux application that talks a lot to an API and deals with a lot of rarely changing data from a DB. In order to reduce traffic and improve UX, I now want to create a caching mechanism that stores data on the client by automatically using the best technology that is available (falling back from IndexedDB to LocalStorage etc.).
I created a cache object that does an initial check to determine the storage mechanism (the result gets saved to an engine property, so the check only needs to run once). It also has some basic methods, save(key, value) and load(key), which then call the appropriate functions for the initially determined mechanism.
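For illustration, a stripped-down sketch of such a cache object (the real feature check and the IndexedDB branch are omitted here):
function createCache() {
  const engine = typeof indexedDB !== 'undefined'
    ? 'indexeddb'
    : typeof localStorage !== 'undefined'
      ? 'localstorage'
      : null // no client-side caching available, callers fall back to the API

  return {
    engine,
    save(key, value) {
      if (engine === 'localstorage') localStorage.setItem(key, JSON.stringify(value))
      // ... IndexedDB branch omitted
    },
    load(key) {
      if (engine === 'localstorage') {
        const raw = localStorage.getItem(key)
        return raw === null ? undefined : JSON.parse(raw)
      }
      // ... IndexedDB branch omitted
      return undefined
    }
  }
}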
The cache object and its methods do work, but I wonder how to create the cache in my main index.js only once when the application loads, and then use this very object in my actions without recreating another cache object every time.
BTW: It feels wrong to make the cache part of my application state, as it does not really contain substantial data needed to run the application (if there is no caching available, it just falls back to calling the API).
Do I need to inject the cache into my actions somehow? Or do I need to create a global/static cache object in the main window object?
Thanks for clarification and thoughts on this issue.

The redux-thunk middleware offers a custom argument injection feature (withExtraArgument) that you could use.
When creating the store:
import { createStore, applyMiddleware } from 'redux'
import thunk from 'redux-thunk'

const cache = createCache()
const store = createStore(
  reducer,
  applyMiddleware(thunk.withExtraArgument(cache))
)
Then in your action creator:
function getValue(id) {
  return (dispatch, getState, cache) => {
    // use cache
  }
}
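For example, a fleshed-out thunk using the save(key, value)/load(key) methods you describe might look like this (the action type and API endpoint are just placeholders, and I assume a synchronous load/save API):
function getValue(id) {
  return async (dispatch, getState, cache) => {
    const cached = cache.load(id)
    if (cached !== undefined) {
      dispatch({ type: 'VALUE_LOADED', id, value: cached })
      return
    }
    const response = await fetch(`/api/values/${id}`) // placeholder endpoint
    const value = await response.json()
    cache.save(id, value)
    dispatch({ type: 'VALUE_LOADED', id, value })
  }
}
The cache is created once alongside the store and every thunk receives that same instance, so nothing gets recreated per action.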

Related

How to share data between components in angular

I use an observable to send a value from one component to another (the data is sent to the subscriber after clicking in this other component, i.e. via a Subject). I subscribe in the other component and everything works fine until I refresh the page; after refreshing, the component is recreated and the subscriber has no data because it did not go through the first component. How can I solve this problem?
I tried using RxJS operators such as shareReplay, but that didn't work.
As your Angular app is destroyed and rebuilt when the page is refreshed, unfortunately you will lose all user state that is not saved somewhere. This is a common problem when building UIs, so there are a number of tools available to combat it.
Strategy:
Store your user state when an important change is made. This is called persisting state
Fetch and reapply your saved state on reload. This is called hydrating state
Options:
Persist to local storage and check for local storage values on reload to hydrate with
Persist within the users URL (simple values only), e.g. modifying the URL in some way which can be checked on reload. Assuming you are dealing with a single page, query parameters or fragments may be the way to go
Persist to a database via a POST/PATCH call and perform a GET request on reload to check for values to hydrate with
None of these methods are inbuilt into an RxJS operator (as far as I know) but we can easily leverage RxJS to achieve any of the above strategies with little effort. The tap operator is often used specifically to handle side effects, i.e. operations which should happen as a coincidence of an RxJS emission. That is precisely what we want here, in simple terms:
"If the subject emits a value, also trigger an operation which
persists the user state"
"On page load, check for any available saved user state and emit via the
relevant subject, hydrating the observables which the components will consume"
See example implementation below
tab.service.ts
import { Injectable } from '@angular/core'
import { BehaviorSubject, Observable } from 'rxjs'
import { distinctUntilChanged, tap } from 'rxjs/operators'

type TabType = 'first' | 'second'

@Injectable({
  providedIn: 'root'
})
export class TabService {
  tabSelectedSubject: BehaviorSubject<TabType> = new BehaviorSubject<TabType>('first')

  tabSelected$: Observable<TabType> =
    this.tabSelectedSubject
      .pipe(
        tap((tab: TabType) => {
          // ... your persist code here
          this.saveTab(tab)
        }),
        distinctUntilChanged()
      )

  constructor() {
    // ... your hydrate code here
    this.fetchAndApplyTab()
  }

  saveTab(tab: TabType): void {
    localStorage.setItem('tab', tab)
  }

  fetchAndApplyTab(): void {
    const savedTab = localStorage.getItem('tab') as TabType | null
    if (savedTab) {
      this.tabSelectedSubject.next(savedTab)
    }
  }
}
In this case, we are exploiting the fact that our service is:
A singleton, so only loaded once per app (i.e. provided in the app root)
Instantiated in the first component that loads and injects it
This allows us to put our fetchAndApplyTab() logic in tab.service.ts's constructor and keep the code self-contained. However, depending on your use case, you may instead want to run fetchAndApplyTab() manually from your component.
This is happening because everything is in memory, and on page refresh it is all lost, since the Angular app re-initializes. You need to persist the state, for example by writing it to local storage; for this you could use the tap operator from RxJS. On loading, you can read the data from localStorage and emit it; for this you could use the APP_INITIALIZER hook.
There are two main ways to pass data between components:
If the components are interconnected, i.e. have a parent/child relationship, you can pass data with the @Input/@Output decorators.
You can use a common service to share data between the two components.
In an SPA, if you refresh the browser, all in-memory objects and observables are gone; you need to go back to the screen where they are initialized.

Redux State Resets On Window Reload (Client Side)

I have very large and complicated objects like userInfo, chatInfo, etc., i.e. objects and arrays with a lot of deeply nested information. The thing is, in my React app, every time I refresh the page the Redux state gets reset and I have to call all those APIs again.
I did some research on this topic. I checked Dan Abramov's egghead tutorial on Redux. What he does is maintain the Redux state in the browser's localStorage and update the localStorage every 100 or 500 ms. I feel as if this is a code smell.
Wouldn't continuously watching the state and updating localStorage affect the performance of the browser? I mean, wasn't this one of the reasons Angular 1 struggled: it continuously kept watching state variables, and if the site was kept open in the browser for a while, it just slowed down because the script kept checking the state of those variables. I feel as if we are doing the same thing here.
If maintaining the Redux state in localStorage is the right approach, can someone tell me why? And if not, is there a better approach?
This is not a duplicate of "How can I persist redux state tree on refresh?" because I am asking for advice on whether persisting state in localStorage is a code smell or not.
I think using localStorage is your best option here, since it seems the data you are storing there is needed on the client. If the data is not changing, you shouldn't need to repeatedly query, or watch, the localStorage.
Another thing you can do is wrap a closure around your localStorage, so that you are not always hitting disk when retrieving your "large" data. Every browser implements localStorage differently, so there are no guarantees on consistent behaviour or I/O performance.
This also adds a simple layer of abstraction which hides the implementation, and controls everything related to your user data in one place.
Here is a simple example of a user profile data closure:
// UserProfile.js
var UserProfile = (function() {
  var userData = null;

  var getUserData = function() {
    if (!userData) {
      userData = JSON.parse(localStorage.getItem("userData"));
    }
    return userData;
  };

  var setUserData = function(newUserData) {
    localStorage.setItem("userData", JSON.stringify(newUserData));
    userData = newUserData;
  };

  return {
    getUserData: getUserData,
    setUserData: setUserData
  };
})();

export default UserProfile;
Set the user data object: This will overwrite the localStorage, and set the local variable inside the closure.
import UserProfile from '/UserProfile';
UserProfile.setUserData(newUserData);
Get the user data object: This will get the data from the local variable inside the closure, or else go get it from localStorage if it is not set.
import UserProfile from '/UserProfile';
var userData = UserProfile.getUserData();
The idea here is to load data into memory from localStorage the first time your app loads, or on the first API call. Only when the user profile data changes (e.g. the user updates their profile) would you query the API again and update the data via the UserProfile.setUserData(..) call.
The question is at what point you need to achieve persistence. I feel that the answer in your case is: on a page reload. So if you are worried about performance, I'd say:
* Update the localStorage only when the state changes, i.e. inside your reducer when you update the state.
* Read from localStorage when you boot the app.
(This way you write only when the state changes and you read only once.)
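A minimal sketch of that pattern (the storage key is arbitrary, and I use a store subscription here instead of writing inside the reducer, to keep the reducer pure):
// persistence.js
const STORAGE_KEY = 'appState'

export function loadState() {
  try {
    const serialized = localStorage.getItem(STORAGE_KEY)
    return serialized ? JSON.parse(serialized) : undefined // undefined lets reducers supply their defaults
  } catch (e) {
    return undefined
  }
}

export function saveState(state) {
  try {
    localStorage.setItem(STORAGE_KEY, JSON.stringify(state))
  } catch (e) {
    // storage may be full or unavailable; keep running with in-memory state only
  }
}
On boot you would call createStore(rootReducer, loadState()), and store.subscribe(() => saveState(store.getState())) to write only when the state actually changes (throttle the writes if they become too frequent).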
P.S.
I'd recommend the https://github.com/rt2zz/redux-persist package for achieving persistence in Redux apps.
I would only do this as a weak cache and would not rely on it. Local storage is limited (e.g. 5 MB in Chrome) and may not be available. You'd have to carefully verify that your data was actually written.
As others have said, you wouldn't be watching localStorage you'd be periodically flushing the store. But I would agree that it seems like a rather coarse hack to blindly assume that all of your state was appropriate to persist in local storage. As with all caching solutions, you need to carefully consider the implications (freshness, expiration, etc.). It sounds like you might want to do this incrementally - pick off a few low-hanging pieces of fruit that are appropriate for caching and consider the implications of caching that state in local storage.
Try redux-persist. You can do optimistic persistence, agnostic of the underlying platform (web/mobile).
If you still find performance is a bottleneck, you can do either of the following, in steps.
Cache Middleware
Create a middleware that listens in on changes and records them, but only flushes them out every 5 seconds.
Attach an event handler to window.beforeunload to detect the user navigating away or closing the browser, and flush the changes then.
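For illustration, a sketch of such a middleware (createCacheMiddleware and the persist callback are names I made up, and the flush interval is arbitrary):
const FLUSH_INTERVAL_MS = 5000

export function createCacheMiddleware(persist) {
  return store => {
    let dirty = false

    // Flush at most once per interval...
    setInterval(() => {
      if (dirty) {
        persist(store.getState())
        dirty = false
      }
    }, FLUSH_INTERVAL_MS)

    // ...and once more when the user navigates away or closes the tab.
    window.addEventListener('beforeunload', () => {
      if (dirty) persist(store.getState())
    })

    return next => action => {
      const result = next(action)
      dirty = true // record that state changed; the actual write happens on the next flush
      return result
    }
  }
}
You would then wire it up with something like applyMiddleware(createCacheMiddleware(state => localStorage.setItem('appState', JSON.stringify(state)))).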
Persistence Strategy
To persist the data you can use either of the two strategies below.
Store it in localStorage if there is no performance bottleneck.
Send a JSON blob to the server, like a file upload. Fetch the last JSON state when the app loads and persist it locally.
I'd suggest you start off with redux-persist. If there is still a performance bottleneck, then use a cache middleware along with one of the two persistence strategies.
I think that in most cases you want to persist your state in localStorage after certain action(s) happens. At least that's always been the case in the projects where I've needed to persist it.
So, if you are using redux-saga, redux-observable or redux-cycles for orchestrating your side effects, you can easily make that side effect (persisting the state into localStorage) happen whenever one of those actions takes place. I think that's a much better approach than "randomly" persisting the data based on a time interval.
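For example, with redux-saga it could look roughly like this (the action types and storage key are placeholders for whichever actions should trigger persistence):
import { takeEvery, select } from 'redux-saga/effects'

function* persistState() {
  const state = yield select()
  localStorage.setItem('appState', JSON.stringify(state))
}

export default function* persistenceSaga() {
  // Persist only after the actions you care about, not on a timer
  yield takeEvery(['USER_INFO_LOADED', 'CHAT_INFO_UPDATED'], persistState)
}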
It seems that your understanding of watching state is that you have some kind of interval which keeps checking your state and updating it in localStorage. However, you can achieve the same thing by updating your localStorage in the React lifecycle method componentDidUpdate. This method fires every time your component updates, so you can take advantage of it and update your localStorage on each change without incurring any noticeable performance hit.
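A rough sketch of that idea, assuming a connected component and a userInfo slice as in your example (the component and key names are illustrative):
import React from 'react'
import { connect } from 'react-redux'

class UserInfoPersister extends React.Component {
  componentDidUpdate(prevProps) {
    // Write only when the relevant slice actually changed
    if (prevProps.userInfo !== this.props.userInfo) {
      localStorage.setItem('userInfo', JSON.stringify(this.props.userInfo))
    }
  }
  render() {
    return null // purely a side-effect component
  }
}

export default connect(state => ({ userInfo: state.userInfo }))(UserInfoPersister)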
One option is to load the data in INITIAL_STATE
window.__INITIAL_STATE__ = { ...state... }
and load the reducer:
window.__APP_STATE__.__REDUCERS__.push(reducer)
It is completely ok to persist state in large redux+react applications.
Regarding the Angular 1 comparison (watchers and the digest cycle running on every state change): even if you rehydrate state, not all components are re-rendered, thanks to Redux's connect API and React's virtual DOM.
You can check:
https://github.com/redux-offline/redux-offline
https://github.com/rt2zz/redux-persist
Performance should not be your primary concern. If it comes to that, normalise your state and only persist important information like drafts etc. (less info, fewer rehydrations).
The bigger problem with this kind of setup is usually background sync or socket updates in the app. Having multiple browser tabs open causes asynchronous writes to the local DB, overwriting newer state with previous states.
Here is a thin wrapper on top of it for cross tab sync redux-persist-crosstab
You can check some of these implementations in mattermost webapp and how they use it for realtime app.
They go through some hoops for extra stability and performance - link to store file in mattermost webapp
I think you can use the npm module called redux-storage. It provides a persistence layer for Redux with flexible backends; you can combine redux-storage with any redux-storage-engine you want. In my view you could combine it with redux-storage-engine-sessionstorage, assuming you only want to save and share information while the browser is open. No need to bloat the browser with localStorage, though you will need a bit of extra JavaScript to wire this up cleanly.
redux-storage will trigger load and save actions after every state change, which gives you more flexibility about what to do on load and save. Also, if you don't want to save some state changes, you can define an array of state keys that you want to filter out.
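A rough setup sketch, assuming the usual redux-storage wiring (the storage key and placeholder reducer are illustrative):
import { createStore, applyMiddleware, combineReducers } from 'redux'
import * as storage from 'redux-storage'
import createEngine from 'redux-storage-engine-sessionstorage'

const engine = createEngine('my-app-state')

// Wrap the root reducer so redux-storage can apply its LOAD action
const rootReducer = storage.reducer(combineReducers({
  todos: (state = [], action) => state, // placeholder reducer
}))

const store = createStore(rootReducer, applyMiddleware(storage.createMiddleware(engine)))

// Load any previously saved state once at boot
storage.createLoader(engine)(store)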

How can I access Web Workers in a Service Worker?

I'm working with service workers quite a bit lately and often run into a situation where I would like to process some raw fetched data before storing it in the service worker cache - an example use case would be processing a large raw text file to remove unnecessary whitespace. This way the cached response to my HTTP request would already be "optimized".
I was thinking, why not do this in a web worker? But alas, after much searching I have not found any idea of how a web worker could be made accessible inside a service worker. It's not like I can pass in the web worker context using postMessage.
Question:
How can I access Web Workers in a Service Worker?
It's currently not possible to access a web worker from within a service worker. This might change in the future, and the relevant standards issue is https://github.com/whatwg/html/issues/411
Note that it's possible to use the Cache Storage API from within a web worker that's spawned by a normal web page, so you could theoretically do what you suggest outside the context of a service worker.
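For illustration, a dedicated worker spawned by a page could do the fetch-process-cache step itself; this sketch assumes the whitespace-trimming use case from the question and an arbitrary cache name:
// worker.js - runs in a dedicated worker created by the page, not in the service worker
self.onmessage = async event => {
  const url = event.data
  const response = await fetch(url)
  const text = await response.text()
  const optimized = text.replace(/\s+/g, ' ') // e.g. collapse unnecessary whitespace

  // The Cache Storage API is also available in worker scope
  const cache = await caches.open('optimized-responses')
  await cache.put(url, new Response(optimized, { headers: { 'Content-Type': 'text/plain' } }))

  self.postMessage({ url, cached: true })
}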
This is a matter of personal preference rather than a strict guideline, but I don't like the pattern of modifying the data you get back from the network and then using the Cache Storage API to persist it in a synthetic Response object. I prefer using the Cache Storage API for keeping exact copies of what you get back from the network, so that things look the same to your controlled page regardless of whether the request is fulfilled from the network or from the cache.
A pattern that I've used before, and which has the added benefit of using web workers in the way you suggest, is to use IndexedDB in a similar manner. If the response is already in IndexedDB, then you just use it, and if it's not, then you kick off a web worker to handle the network request and processing, and then store the result in IndexedDB for future use.
Here's an example of some code to do this, making use of a lot of ES2015+ features, along with the promise-worker and idb-keyval libraries for the asynchronous code.
import PromiseWorker from 'promise-worker';
import idbKeyValue from 'idb-keyval';

export default async (url, Worker) => {
  let value = await idbKeyValue.get(url);
  if (!value) {
    const promiseWorker = new PromiseWorker(new Worker());
    value = await promiseWorker.postMessage(url);
    // Don't await here, so that we can return right away.
    idbKeyValue.set(url, value);
  }
  return value;
};
And then the worker could look something like this (which converts Markdown to HTML):
import 'whatwg-fetch';
import MarkdownIt from 'markdown-it';
import registerPromiseWorker from 'promise-worker/register';

const markdown = new MarkdownIt();

registerPromiseWorker(async url => {
  const response = await fetch(url);
  const text = await response.text();
  return markdown.render(text);
});
This approach would start making less sense if you're dealing with large amounts of data, because of the serialization overhead and the lack of streaming support, compared to what would be possible by just using the Cache Storage API directly.

Storing websocket (channels) connection objects in Redux

I want to use websockets in my Redux app and have problems with storing the connection objects (Phoenix channels).
I have a dynamic collection where items can be added and removed. When the user adds an item, the app should create a new Phoenix channel based on the connection, subscribe to it, and store it, because I have to do some work with it later (for example, call leave() on the channel when the user removes the item). Unfortunately, the Redux store is supposed to hold only immutable data, so there is no good way to keep the channel there. Any help would be appreciated.
Definitely don't put it in the store. Per the Redux FAQ, only serializable data should go into the store. The standard place to put things like persistent connection objects is inside middleware. And, in fact, there are literally dozens of existing middlewares that demonstrate that approach, most of them listed over at redux-ecosystem-links. You should be able to use some of those as examples.
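For illustration, a sketch of a middleware that owns the channel objects outside the store (the ADD_ITEM/REMOVE_ITEM action types and the topic naming are placeholders):
import { Socket } from 'phoenix'

export function createChannelMiddleware(socketUrl) {
  const socket = new Socket(socketUrl)
  socket.connect()
  const channels = new Map() // itemId -> channel, kept outside the Redux store

  return store => next => action => {
    switch (action.type) {
      case 'ADD_ITEM': {
        const channel = socket.channel(`item:${action.id}`)
        channel.on('update', payload =>
          store.dispatch({ type: 'ITEM_UPDATED', id: action.id, payload })
        )
        channel.join()
        channels.set(action.id, channel)
        break
      }
      case 'REMOVE_ITEM': {
        const channel = channels.get(action.id)
        if (channel) {
          channel.leave()
          channels.delete(action.id)
        }
        break
      }
    }
    return next(action)
  }
}
The serializable item data still lives in the store; only the non-serializable channel objects live in the middleware's closure.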

Single record persistence with ember-data

In Ember.js with ember-data (using the 1.0pre versions), all changes to the data are saved into a defaultTransaction on the store. When the store is committed with store.commit(), ALL changes to the data are saved back to the API (using the RESTAdapter).
I would like more control over which objects are persisted. So for now, I have been getting instances of the store and adapter, then calling something like adapter.createRecord(store, type, record) or updateRecord, where type is the App.Person model and record is an instance of that model.
This uses internal bits of the DS.RESTAdapter that I don't think are meant to be used directly. While it works, I'm hoping there is a better way to gain more control over persistence than store.commit(). The business logic and UX of my application require finer control.
transaction = router.get('store').transaction();
person = transaction.createRecord(App.Person);
person.set('name', 'Thanatos');
transaction.commit();
Watch Yehuda's presentation regarding this:
http://www.cloudee.com/preview/collection/4fdfec8517ee3d671800001d
