RxJS scan can only access last pushed partial data - javascript

I have a BehaviorSubject that I want to update by passing a Partial&lt;Type&gt;, so I have this code. The scan then handles the merge between the new and the old data.
personUpdates$ = new BehaviorSubject<Partial<Person>>({});

person$ = this.personUpdates$.pipe(
  scan((acc, curr) => ({ ...acc, ...curr }), {
    name: '',
    firstName: '',
    lastName: '',
  } as Person)
);

updatePerson = (person: Partial<Person>) => {
  this.personUpdates$.next(person);
};
But the problem I have is accessing the data in certain places. For example, if I subscribe to person$ and log it in my constructor, I see the whole object with no issues. But if I try to access it in other places, I only receive the last updated value.
constructor() {
  // Always have the full object
  this.person$.subscribe((x) => console.log(x));
}

checkValues = () => {
  // Only have the last Partial values
  this.person$.pipe(first()).subscribe((x) => console.log(x));
};
How can I make sure I always get the whole object?
I have reproduced the issue in a StackBlitz Sandbox

person$ needs to cache its accumulated result for late subscribers; otherwise every new subscription re-runs scan from the seed and only sees the BehaviorSubject's latest partial. Use shareReplay(1):
personUpdates$ = new BehaviorSubject<Partial<Person>>({});

person$ = this.personUpdates$.pipe(
  scan((acc, curr) => ({ ...acc, ...curr }), {
    name: '',
    firstName: '',
    lastName: '',
  } as Person),
  shareReplay(1)
);


subscribe to array of nested Observables

I have an array containing objects; each object has two props, one of which is an observable value.
let myArray = [{def: 'name1', value: EventEmitter_}, {def: 'name2', value: EventEmitter_}]
What I'm trying to do is subscribe to the Observables and return the root object where the change occurred. So far I only get the specific value:
myArray.forEach(e => {
  e.value.subscribe(e => console.log(e))
})
I tried using merge:
merge(myArray).subscribe((v) => {
  console.log(v)
})
but that does not work if the Observable is nested
You could pipe the value stream.
myArray.forEach(e => {
  e.value.pipe(
    map(v => [e.def, v]),
  ).subscribe(([d, v]) => console.log(`Root ${d} returned ${v}`));
})
obs = merge(
  ...this.array.map((x: any) =>
    x.value.pipe(map((res) => ({ def: x.def, response: res })))
  )
);
You merge the observables, each one transformed so its emissions carry a "def" property and a "response" property. That way you only need one subscription.
See stackblitz
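The tagging idea can be sketched without RxJS, with plain arrays standing in for the observables and made-up values; the point is that each value is wrapped with its root object's `def` before the streams are combined.

```javascript
// RxJS-free sketch: wrap each emitted value with its source's `def`
// so the combined stream identifies the root object of every change.
// Arrays stand in for the observables; the data is hypothetical.
const myArray = [
  { def: 'name1', values: ['a', 'b'] },
  { def: 'name2', values: ['c'] },
];

const merged = myArray.flatMap((x) =>
  x.values.map((res) => ({ def: x.def, response: res }))
);

console.log(merged);
// [ { def: 'name1', response: 'a' },
//   { def: 'name1', response: 'b' },
//   { def: 'name2', response: 'c' } ]
```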

exclude already existing items from array in React

I am facing a challenge. I have 2 tables in Firebase that I need to merge into 1 array, and items that are the same have to be removed: if an item already exists in the array, it should not be added again. At the moment I see the items twice. Is there a way to prevent this? My function looks like this:
fetchItems(){
  this.setState({
    personalItems: [],
    inactiveItems: [],
    sprintItems: [],
  })

  let ref = Firebase.database().ref('/sprints/1/items/');
  ref.on('value', snapshot => {
    snapshot.forEach((childSnap) => {
      let state = childSnap.val();
      console.log('firebase output:' + state)
      var newelement = {title: state.title, author: state.author, user: state.user, public: state.public, x: state.x, y: state.y, key: state.key, match: state.match, notes: state.notes, status: state.status};
      this.setState(prevState => ({
        personalItems: [...prevState.personalItems, newelement],
      }));
      console.log('firebase' + state.name);
    });
  })

  let refInactive = Firebase.database().ref('/users/' + this.state.user + '/items/');
  refInactive.on('value', snapshot => {
    snapshot.forEach((childSnap) => {
      let state = childSnap.val();
      var newelement = {author: state.author, title: state.postit, status: state.status, key: state.key, user: state.user, match: state.match, x: state.x, y: state.y, public: state.public, notes: state.notes};
      this.setState(prevState => ({
        personalItems: [...prevState.personalItems, newelement],
      }));
    });
  })
}
My database looks like this:
So you see that these items have the same key and are identical to each other. If one has already been added, that should be sufficient, but right now both are added.
Here is how you can merge one array into another without duplicates.
for(const arr1Element of array1){
  if(!array2.some(arr2Element => arr2Element.id === arr1Element.id)){
    array2.push(arr1Element)
  }
}
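To see the loop in action with hypothetical data (item ids invented for illustration), where id 2 exists in both arrays and must only appear once:

```javascript
// Item with id 2 exists in both arrays; only items missing from
// array2 get pushed, so the merged result has no duplicates.
const array1 = [{ id: 1 }, { id: 2 }];
const array2 = [{ id: 2 }, { id: 3 }];

for (const arr1Element of array1) {
  if (!array2.some((arr2Element) => arr2Element.id === arr1Element.id)) {
    array2.push(arr1Element);
  }
}

console.log(array2.map((e) => e.id));
// [ 2, 3, 1 ]
```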
You could use a Set, which doesn't let you add duplicate values.
So the easiest you could achieve is something like:
const yourFinalArray = [...new Set([...arr1, ...arr2])];
Note that a Set compares objects by reference, so this only removes duplicates that are the exact same object instance (or equal primitives), not distinct objects with equal contents.
Hope this helps.

add row without push in es6 for react state

I'm not sure I'm doing the right thing: I mutate a variable outside of setState. Is that fine, or is there a more elegant way to do it?
state = {
  persons: [{
    name: 'jay',
    age: 10
  }]
}

addRow = () => {
  const temp = this.state
  temp.persons.push({
    name: '',
    age: ''
  })
  this.setState({
    ...temp
  })
}
App demo https://codesandbox.io/s/ppqw4wjqzq
In JavaScript, object assignment works by reference, so even if you mutate the variable outside of setState, it still points at the same state object as long as you do not clone it. If you clone it, however, a new instance is created and the original is not affected:
addRow = () => {
  const persons = [...this.state.persons] // Clone it one level deep using spread
  persons.push({
    name: '',
    age: ''
  })
  this.setState({
    persons
  })
}
The same can be done more simply with spread syntax and functional setState:
addRow = () => {
  this.setState(prevState => ({
    persons: [...prevState.persons, { name: '', age: ''}]
  }))
}
Although in your example there seems to be no difference between the two approaches, there is a major flaw in your initial implementation. To see the difference between cloning-then-pushing and pushing through the original reference, see the codesandbox demo.
Basically, if you pass the persons state as props to a child component and mutate the array at its original reference, then in componentWillReceiveProps the current props and the next props are the same object. Any check in the child component that takes action when the persons prop changes will therefore fail. Hence it is extremely important not to mutate the value at its own reference.
Without push and spread syntax, you can still avoid the mutation issue by using concat, which creates a new copy of the original array:
addRow = () => {
  this.setState(prevState => ({
    persons: prevState.persons.concat([{ name: '', age: ''}])
  }))
}
In my opinion, the more elegant way is to use functional setState:
const newPerson = { name: '', age: -1 };
this.setState(prevState => ({ persons: [...prevState.persons, newPerson] }))
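The reference point above can be sketched in plain JavaScript (sample data invented): spreading builds a new array, so a shallow comparison can detect the change, while mutating in place keeps the old reference.

```javascript
// Spreading produces a new array reference, which React's shallow
// prop comparisons can detect; the contained objects are shared
// because the copy is shallow.
const prevPersons = [{ name: 'jay', age: 10 }];
const nextPersons = [...prevPersons, { name: '', age: '' }];

console.log(nextPersons !== prevPersons);       // true: new array reference
console.log(nextPersons[0] === prevPersons[0]); // true: elements are shared
console.log(prevPersons.length);                // 1: the original is untouched
```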

Destructuring and/or mass assigning in ES6

I have 2 sources of data. One of the sources is the "template" for what is acceptable in the data. The second source may have a large amount of data that I don't care about (100+ properties in the JSON). Here are the schemas:
// Only store the data we care about. Only a small subset of
// data that I need for this particular dataset.
state = {
  isDirty: false,
  data: {
    name: '',
    address: '',
    city: '',
    state: ''
  }
}
The second source will have the 4 attributes in the data schema above (plus many many more I don't care about). Currently, I am assigning them like this:
let updatedData = {};
for(const key in this.state.data) {
  updatedData[key] = someDataSource[key];
}
this.state.data = updatedData;
Using ES6, and perhaps destructuring, is there a better way to mass assign variables like this?
Thanks again!
EDIT
Added for clarification the assignment after the loop.
Lodash's pick can be used to pick specific keys, or a helper function can be used for the same purpose:
const pick = (obj, keys) => Object.keys(obj)
  .filter((key) => keys.indexOf(key) >= 0)
  .reduce(
    (newObj, key) => Object.assign(newObj, { [key]: obj[key] }),
    {}
  );
This is already suggested in many related questions. The thing that is specific to this question is:
this.state.data = pick(someDataSource, Object.keys(this.state.data));
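A quick check of the helper (redefined here so the snippet is self-contained, with a hypothetical someDataSource standing in for the second data feed):

```javascript
// pick keeps only the keys named in `keys`, dropping everything else.
const pick = (obj, keys) => Object.keys(obj)
  .filter((key) => keys.indexOf(key) >= 0)
  .reduce(
    (newObj, key) => Object.assign(newObj, { [key]: obj[key] }),
    {}
  );

// Hypothetical source with extra properties we don't care about.
const someDataSource = {
  name: 'Ada', address: '1 Main St', city: 'London', state: 'UK',
  phone: '555-0100', favoriteColor: 'blue',
};
const data = { name: '', address: '', city: '', state: '' };

const updatedData = pick(someDataSource, Object.keys(data));
console.log(updatedData);
// { name: 'Ada', address: '1 Main St', city: 'London', state: 'UK' }
```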
Properties can be excluded and modified in the JSON.parse reviver:
var o = JSON.parse('{"a":1, "b":2}', (k, v) => k === 'a' ? void 0 : k === 'b' ? 3 : v)
console.log(o)
A trick you can use (a trick because it requires swallowing an error) is a non-extensible object: lock the set of keys with Object.preventExtensions, then fill it with data via Object.assign inside a try/catch block.
// Only store the data we care about. Only a small subset of
// data that I need for this particular dataset.
state = {
  isDirty: false,
  data: {
    name: '',
    address: '',
    city: '',
    state: ''
  }
}

const newData = {
  name: 'name',
  address: 'address',
  city: 'city',
  state: 'state',
  phone: 'phone',
  zip: 'zip'
}

const updatedData = Object.preventExtensions({...state.data});
try {
  Object.assign(updatedData, newData);
} catch (throwaway) {}
console.log(updatedData);
And as a function for reuse:
function schemaMerge(schema, data) {
  const mergedData = Object.preventExtensions({...schema});
  try {
    Object.assign(mergedData, data);
  } catch (throwaway) {}
  return {...mergedData}; // copy into a fresh object so the result is extensible again
}
// Only store the data we care about. Only a small subset of
// data that I need for this particular dataset.
state = {
  isDirty: false,
  data: {
    name: '',
    address: '',
    city: '',
    state: ''
  }
}

const newData = {
  name: 'name',
  address: 'address',
  city: 'city',
  state: 'state',
  phone: 'phone',
  zip: 'zip'
}

const updatedData = schemaMerge(state.data, newData);
state.data = updatedData;
console.log(state.data);
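One caveat worth knowing about this trick: Object.assign copies keys in order and stops at the first rejected one, so an unknown key that sorts before a schema key would drop the later schema keys too. The example data below (invented) happens to list schema keys first, which is the assumption the trick leans on:

```javascript
// schemaMerge repeated so this snippet runs standalone.
function schemaMerge(schema, data) {
  const mergedData = Object.preventExtensions({ ...schema });
  try {
    // Throws TypeError at the first key not present in the schema,
    // since new properties cannot be added to a non-extensible object.
    Object.assign(mergedData, data);
  } catch (throwaway) {}
  return { ...mergedData }; // fresh, extensible copy
}

const merged = schemaMerge(
  { name: '', city: '' },
  { name: 'Ada', city: 'London', zip: 'E1' } // zip is rejected
);

console.log(merged); // { name: 'Ada', city: 'London' }
console.log(Object.isExtensible(merged)); // true
```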

Correct way to transform Object from Firebase list?

What is the correct way to transform an object from a Firebase list observable?
The following code has side effects and creates duplicated results in my templates. The first time, it loads correctly, but after visiting the page a second time, duplicated results show up.
loadAccountUsers() {
  return this.af.database.list(`accounts/${this.authService.accountUID}/accountUsers`)
    .flatMap(list => list)
    .map((data: any) => {
      let role: string = (data.accountLevel === 10 ? 'Administrator' : 'User');
      return {
        firstName: data.firstName,
        lastName: data.lastName,
        emailAddress: data.emailAddress,
        role: role
      };
    })
    .scan((arr, val) => arr.concat([val]), [])
}
When I don't transform the objects, everything is fine:
loadAccountUsers() {
  return this.af.database.list(`accounts/${this.authService.accountUID}/accountUsers`)
}
flatMap emits each element of the list into the observable stream. When the database changes and another list is emitted, its elements are emitted, too. However, the scan operation is combining the emitted elements into a single array - hence the duplicates.
You could instead use the RxJS map operator and the Array.prototype.map method to solve the problem:
loadAccountUsers() {
  return this.af.database.list(`accounts/${this.authService.accountUID}/accountUsers`)
    .map((list) => list.map((data) => {
      let role: string = (data.accountLevel === 10 ? 'Administrator' : 'User');
      return {
        firstName: data.firstName,
        lastName: data.lastName,
        emailAddress: data.emailAddress,
        role: role
      };
    }));
}
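The inner Array.prototype.map can be exercised on its own with made-up account data; since each list emission produces a fresh array, nothing accumulates across emissions the way the flatMap + scan pipeline did.

```javascript
// Sample list emission (hypothetical accounts). Each emission maps to
// a brand-new array of view models, so re-emissions replace rather
// than append, which is why the duplicates disappear.
const list = [
  { firstName: 'Ada', lastName: 'Lovelace', emailAddress: 'ada@example.com', accountLevel: 10 },
  { firstName: 'Alan', lastName: 'Turing', emailAddress: 'alan@example.com', accountLevel: 1 },
];

const users = list.map((data) => ({
  firstName: data.firstName,
  lastName: data.lastName,
  emailAddress: data.emailAddress,
  role: data.accountLevel === 10 ? 'Administrator' : 'User',
}));

console.log(users.map((u) => u.role));
// [ 'Administrator', 'User' ]
```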
