I have an app where I spawn several BrowserWindows containing HTML forms, and I'd like to collect all their data at the press of a button (in order to save it, so I can respawn the windows in the same state after a restart).
At the moment, the only solution I found is to have each BrowserWindow do ipcRenderer.send every single time any variable changes (not too hard to do with Vue.js 'watchers'), but this seems demanding and inefficient.
I also thought of calling 'executeJavaScript' on each window, but that does not allow capturing the return value AFAIK.
I'd just like to be able to send a message from the main process when a request for saving is made, and wait for the windows to respond before saving everything.
EDIT
I found a slightly better way; it looks like this:
app.js
const { ipcMain: ipc, BrowserWindow } = require('electron');

let updates = {};

// wait for update responses
ipc.on('update-response', (evt, args) => {
  updates[evt.sender.id] = args;
  if (Object.keys(updates).length === BrowserWindow.getAllWindows().length) {
    // here I do what I need to save my settings, using what is stored in 'updates'
    // ...
    // and now reset updates for next time
    updates = {};
  }
});

// now send the requests for updates
BrowserWindow.getAllWindows().forEach(w => w.webContents.send('update'));
renderer.js
const { ipcRenderer } = require('electron');

ipcRenderer.on('update', () => {
  // collect the data
  // var data = ...
  ipcRenderer.send('update-response', data);
});
and obviously on the renderer side I am listening to these 'update' messages and sending data with 'update-response'.
But it seems a bit complicated and so I am sure there is a simpler way to achieve this using the framework.
EDIT 2
I realized that the above does not always work, because for some reason the evt.sender.id does not match the ids obtained from BrowserWindow.getAllWindows(). I worked around that by sending an id in the request and having the responder include it. But this is all so much fuss for so very little...
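For reference, a rough sketch of that workaround looks something like this (the mismatch is apparently because evt.sender is a WebContents, whose id is not the BrowserWindow id; the windowId field and the collectFormData helper are just illustrative names, not from my actual code):
app.js
const { ipcMain, BrowserWindow } = require('electron');

let updates = {};

ipcMain.on('update-response', (evt, { windowId, data }) => {
  updates[windowId] = data;
  if (Object.keys(updates).length === BrowserWindow.getAllWindows().length) {
    // save everything in 'updates' here, keyed by BrowserWindow id
    updates = {};
  }
});

// include each window's own id in the request so the renderer can echo it back
BrowserWindow.getAllWindows().forEach(w => w.webContents.send('update', { windowId: w.id }));
renderer.js
const { ipcRenderer } = require('electron');

ipcRenderer.on('update', (evt, { windowId }) => {
  const data = collectFormData(); // hypothetical helper that gathers the form values
  ipcRenderer.send('update-response', { windowId, data });
});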
I am having some issues trying to connect to a Matrix server using the matrix-js-sdk in a React app.
I have provided a simple code example below, and made sure that the credentials are valid (login works) and that the environment variable containing the URL for the matrix client is set. I have signed into Element in a browser and created two rooms for testing purposes, and was expecting these two rooms to be returned from matrixClient.getRooms(). However, this simply returns an empty array. With some further testing, it seems like only the asynchronous functions provided for fetching room, member and group IDs work as expected.
According to https://matrix.org/docs/guides/usage-of-the-matrix-js-sd these should be valid steps for setting up the matrix-js-sdk; however, the sync never executes either.
const matrixClient = sdk.createClient(
  process.env.REACT_APP_MATRIX_CLIENT_URL!
);

await matrixClient.login("m.login.password", credentials);

matrixClient.once('sync', () => {
  debugger; // Never hit
});

for (const room of matrixClient.getRooms()) {
  debugger; // Never hit
}
I did manage to use the roomIds returned from await matrixClient.roomInitialSync(roomId, limit, callback), however this led me to another issue where I can't figure out how to decrypt messages, as the events containing the messages sent in the room seem to be of type 'm.room.encrypted' instead of 'm.room.message'.
Does anyone have any good examples of working implementations for the matrix-js-sdk, or any other good resources for properly understanding how to put this all together? I need to be able to load rooms, persons, messages etc. and display these respectively in a ReactJS application.
It turns out I simply forgot to run startClient on the matrix client, resulting in it not fetching any data.
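A minimal sketch of what the working setup looks like, under the same assumptions as the snippet above (waiting for the 'PREPARED' sync state before reading rooms is an addition of mine, not something the original code did):

const matrixClient = sdk.createClient(
  process.env.REACT_APP_MATRIX_CLIENT_URL!
);

await matrixClient.login("m.login.password", credentials);

// rooms are only populated after the first sync completes
matrixClient.once('sync', (state) => {
  if (state === 'PREPARED') {
    for (const room of matrixClient.getRooms()) {
      console.log(room.roomId, room.name);
    }
  }
});

// the missing piece: without startClient, the SDK never fetches any data
await matrixClient.startClient();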
I'm making an application where, when a user submits a form, I want the DOM to be updated for everyone on that page, in realtime, without refreshing the page.
I have tried doing that with Socket.IO, and it kinda works, but the problem is that it only works for someone who is already on that page, and that's not the functionality I need: when a user SUBMITS a form, the view should be updated not only for existing connections, but also for someone who loads the page for the first time after the requests were already made.
So I decided to create a database and poll it for changes, and it works as expected.
Basically, the workflow of the app is this:
user submits form => fetch function that checks database for changes is fired => it finds new database entry => updates React state => change is sent to the view
But the problem is that if I do the updating of the DOM this way, I'm afraid I may be overloading the server unnecessarily. I checked, and every new open instance of
"http://localhost:3000/seek" checks to see if the database has changed, so if I had 1000 users on my web app, that would be 1000 requests every second :o
Maybe I should combine both Socket.IO and the database and use that approach for updating the DOM in realtime?
Seek.js (Server Side)
router.post('/', (req, res) => {
  // Processes form
  // Saving to database
  // Sending response
});

router.get('/:fetch', (req, res, next) => {
  if (req.params.fetch === 'fetch') {
    Seek.find(function(err, games) {
      if (err) console.log(err);
      console.log('FETCHED');
      res.status(200).send({ games });
    });
  } else {
    next();
  }
});
seekDiv.jsx
class MyComponent extends React.Component {
  constructor(props) {
    super(props);
    this.state = { games: [] };
  }
  componentWillMount() {
    this.fetchGames();
  }
  fetchGames() {
    fetch('http://localhost:3000/seek/fetch')
      .then(res => res.json())
      .then(data => { this.setState({ games: data.games }); })
      .catch(err => console.log(err));
  }
  componentDidMount() {
    setInterval(() => this.fetchGames(), 1000);
  }
  render() {
    var games = this.state.games;
    let zero;
    if (games.length === 0) {
      zero = '';
    } else {
      zero = games.map(x => <div>{x.userAlias}</div>);
    }
    return (
      <div>
        {zero}
      </div>
    );
  }
}
I'm hoping I presented my problem clearly enough, but in case I didn't, this is the functionality I want:
user submits form => the DOM is updated for EVERY user without a refresh, containing that form data, and it stays there until it's manually removed.
Any help on how to proceed is greatly appreciated.
But the problem is that if I do the updating of the DOM this way, I'm afraid I may be overloading the server unnecessarily. I checked, and every new open instance of "http://localhost:3000/seek" checks to see if the database has changed, so if I had 1000 users on my web app, that would be 1000 requests every second :o
Yeah - that is a problem. Conceptually, you need to make this a "push" not "pull" system.
Rather than having every client constantly ask the server if there are updates, you simply leave a socket connection open to every page (very low resource use), and on your server, after receiving a new form/post, you push the update to every connected client.
The socket.io docs have a good example of how to do this in the "broadcast" section. It's for chat messages, but it works the same way for your forms.
You'll want to minimize the data you send to every client to the bare minimum needed. So if you record any additional data (say, a timestamp of when the new post was added), you wouldn't want to send it to all of the listening clients unless you are displaying or using that data on the front end.
You'll want your front end to be monitoring for incoming updates, and when one arrives, use React to update the DOM accordingly.
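A rough sketch of that push flow, assuming the Express routes and React component from the question, a Socket.IO server instance named io, and socket.io-client on the front end (the 'newGame' event name is just an illustration, not a definitive implementation):

// server: after saving the submitted form, broadcast it to every connected client
router.post('/', (req, res) => {
  const game = new Seek(req.body);
  game.save((err, saved) => {
    if (err) return res.status(500).send(err);
    io.emit('newGame', saved); // push to everyone currently on the page
    res.status(200).send(saved);
  });
});

// client (inside the React component): listen for pushes instead of polling
componentDidMount() {
  this.fetchGames(); // one initial fetch covers users who load the page after earlier submits
  this.socket = io('http://localhost:3000');
  this.socket.on('newGame', game => {
    this.setState({ games: [...this.state.games, game] });
  });
}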
I have this AngularJS frontend, and I use Express, Node and Mongo on the backend.
My situation looks like this:
// my data to push to the server
$scope.things = [
  { title: "title" /* other properties */ },
  { title: "title" /* other properties */ },
  { title: "title" /* other properties */ }
];

$scope.update = function() {
  $scope.things.forEach(function(t) {
    Thing.create({
      title: t.title,
      // other values here
    }, function() {
      console.log('Thing added');
    });
  });
};
// where Thing.create is just an $http.post factory
The HTML part looks like:
//html part
<button ng-click="update()">Update Thing</button>
Then on the same page the user has the ability to change $scope.things, and my problem is that when I call update() again, all the things are posted twice, because that's literally what I'm doing.
Can someone explain to me how to check whether a 'thing' has already been posted to the server, so I can just update its values ($http.put), and if it hasn't been posted, $http.post it?
Or maybe there's another way to do this?
I see a few decisions to be made:
1) Should you send the request after the user clicks the "Update" button (like you're currently doing)? Or should you send the request when the user changes the Thing (using ngChange)?
2) If going with the button approach for (1), should you send a request for each Thing (like you're currently doing), or should you first check to see if the Thing has been updated/newly created on the front end?
3) How can you deal with the fact that some Thing's are newly created and others are simply updated? Multiple routes? If so, then how do you know which route to send the request to? Same route? How?
1
To me, the upside of using the "Update" button seems to be that it's clear to the user how it works. By clicking "Update" (and maybe seeing a flash message afterwards), the user knows (and gets visual feedback) that the Thing's have been updated.
The cost to using the "Update" button is that there might be unnecessary requests being made. Network communication is slow, so if you have a lot of Thing's, having a request being made for each Thing could be notably slow.
Ultimately, this seems to be a UX vs. speed decision to me. It depends on the situation and goals, but personally I'd lean towards the "Update" button.
2
The trade-off here seems to be between code simplicity and performance. The simpler solution would just be to make a request for each Thing regardless of whether it has been updated/newly created (for the Thing's that previously existed and haven't changed, no harm will be done - they simply won't get changed).
The more complex but more performant approach would be to keep track of whether or not a Thing has been updated/newly created. You could add a flag called dirty to Thing's to keep track of this.
When a user clicks to create a new Thing, the new Thing would be given a flag of dirty: true.
When you query to get all things from the database, they should all have dirty: false (whether you store the dirty property in the database or simply append it on the server/front end is up to you).
When a user changes an existing Thing, the dirty property would be set to true.
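For instance, the flag could be set from an ng-change handler on the inputs; a tiny, purely illustrative sketch (the markDirty name is mine):

// wired up in the template with something like: ng-change="markDirty(thing)"
$scope.markDirty = function(thing) {
  thing.dirty = true;
};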
Then, using the dirty property you could only make requests for the Thing's that are dirty:
$scope.things.forEach(function(thing) {
  if (thing.dirty) {
    // make request
  }
});
The right solution depends on the specifics of your situation, but I tend to err on the side of code simplicity over performance.
3
If you're using Mongoose, the default behavior is to add an _id field to created documents (it's also the default behavior of MongoDB itself). So if you haven't overridden this default behavior, and if you aren't explicitly preventing this _id field from being sent back to the client, it should exist for Thing's that have been previously created, thus allowing you to distinguish them from newly created Thing's (because newly created Thing's won't have the _id field).
With this, you can conditionally call create or update like so:
$scope.things.forEach(function(thing) {
  if (thing._id) {
    Thing.update(thing._id, thing);
  } else {
    Thing.create(thing);
  }
});
Alternatively, you could use a single route that performs "create or update" for you. You can do this by setting { upsert: true } in your update call.
In general, upsert will check to see if a document matches the query criteria... if there's a match, it updates it, if not, it creates it. In your situation, you could probably use upsert in the context of Mongoose's findByIdAndUpdate like so:
Thing.findByIdAndUpdate(id, newThing, { upsert: true }, function(err, doc) {
...
});
See this SO post.
@Adam Zemer neatly addressed concerns I raised in a comment; however, I disagree on some points.
Firstly, to answer the question of having an update button or not, you have to ask yourself: is there any reason why the user would want to discard his changes and not save the work he did? If the answer is no, then it is clear to me that the update button should not be there, and here is why.
To keep your user from losing his work, you would need to add confirmations if he attempts to change the page, close his browser, etc. On the other hand, if everything is continuously saved, he has the peace of mind that his work is always saved, and you don't have to implement anything to prevent him from losing it.
You reduce his workload: one less click for a task may seem insignificant, but he might click it many times just to be sure his work is saved. Also, if it's a recurrent task, it will definitely improve his experience.
Performance-wise and code-readability-wise, you make small requests and do not have to implement any complicated logic to do so. A simple ng-change on the inputs is enough.
To make it clear to him that his work is continuously saved, you can simply display "all your changes are saved" somewhere and change it to "saving changes..." when you make a request. For example uses, look at Office Online or Google Docs.
Then all you would have to do is use the upsert parameter on your MongoDB query to be able to create and update your things with a single request. Here is how your controller would look:
$scope.update = function(changedThing) { // Using ng-change, you pass the changed thing itself as a parameter
  $scope.saving = true; // To display the "saving..." message
  Thing.update({ // This service calls your method that updates with upsert
    title: changedThing.title,
    // other values here
  }).then( // If you made an http request, I suppose it returns a promise.
    function success() {
      $scope.saving = false;
      console.log('Thing added');
    },
    function error() {
      // handle errors
    });
};
First, some background:
I need to develop a web app that will, in the background, collect all mouse actions by a user (during a visit to a web page), store them in an appropriate format in a file, and then have a separate replay app that will be fed that file and will produce something like this:
Curves are mouse movements, circles are either clicks or staying stationary.
I more or less have a solution for the replay app.
I need a solution that captures the user's mouse actions and saves them in a file on the server.
For each user there should be a separate file. The format of the file is not predetermined, but the following would be reasonable:
<timestamp1> MOVE TO <x1>, <y1>
<timestamp2> MOVE TO <x2>, <y2>
<timestamp3> MOVE TO <x3>, <y3>
<timestamp4> CLICK
<timestamp5> RIGHT-CLICK
<timestamp6> MOVE TO <x6>, <y6>
<timestamp7> MOVE TO <x7>, <y7>
I wonder if you could help me with an approach to designing and implementing such mouse action capture. All best.
You can easily capture the mouse actions using the click, mousemove, etc. events; in the comments you mentioned you know how to do this, so I'll not detail it.
You can't directly 'open' a file on the server, since the code is executed on a completely different machine (i.e. the client), so what you'll need to do is send the data from the client to the server every second, or every few seconds.
There are several ways of doing this, here's one way:
Check (& get) a unique userId from document.cookie or localStorage; if there isn't one, generate one (using Date() and/or Math.random()).
Bind events to capture the mouse actions; these events write data (in the format you want) to the array window.captureMouse.
Send an Ajax request to the server every second (depending on the number of users and the speed of the server, you may want to change the interval).
A piece of example code might illustrate the idea better (using jQuery)
var userId = fetchOrSetUserId(); // Make this function
var captureMouse = [];

$('#id').on('click', function(e) {
  captureMouse.push({
    event: 'click',
    target: $(this).attr('id'),
  });
});
// ... more events ...

var sendData = function() {
  // You probably need to do locking here, since captureMouse may be changed in an event before it's reset
  var send = captureMouse;
  captureMouse = [];
  jQuery.ajax({
    url: '/store-data',
    type: 'post',
    data: {
      userId: userId,
      captureMouse: JSON.stringify(send)
    },
    success: function() {
      // Once this request is complete, run it again in a second ... It keeps sending data until the page is closed
      setTimeout(sendData, 1000);
    }
  });
};

// Start sending data
sendData();
On your server, you'll get captureMouse as POST data; you will need to decode the JSON and append it to a file (which is identified using the userId parameter).
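A minimal server-side sketch, assuming an Express app with form-encoded body parsing (the /store-data path matches the Ajax call above; the capture directory and the sanitizing of userId are my own choices):

const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();
app.use(express.urlencoded({ extended: true })); // jQuery sends the POST body form-encoded

app.post('/store-data', (req, res) => {
  // strip anything that could be used for path tricks from the user-supplied id
  const userId = String(req.body.userId).replace(/[^a-zA-Z0-9_-]/g, '');
  const events = JSON.parse(req.body.captureMouse);
  const lines = events.map(e => JSON.stringify(e)).join('\n') + '\n';
  fs.appendFile(path.join(__dirname, 'capture', userId + '.log'), lines, err => {
    if (err) return res.sendStatus(500);
    res.sendStatus(200);
  });
});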
I'm looking for a solution for dealing with an issue of state between models using backbone.js.
I have a time tracking app where a user can start/stop jobs, and it will record the time the job was worked on. I have a job model which holds the job's data and whether it is currently 'on'.
Only 1 job can be worked on at a time, so if a user starts a job, the currently running job must be stopped. I'm wondering what the best way to do this is. I mean, I could simply toggle each job's 'on' parameter accordingly and then call save on each, but that results in 2 requests to the server, each with a complete representation of a job.
Ideally it would be great if I could piggyback additional data in the save request, similarly to how it's possible to send extra data in a fetch request. I only need to send the id of the currently running job, and since this really is unrelated to the model, it needs to be sent alongside the model, not as part of it.
Is there a good way to do this? I guess I could find a way to maintain a reference to the current job server side if need be :\
When you call a model's save function, the first parameter is an object of the data that's going to be saved. Instead of just calling model.save(), create an object that has the model data plus your extra stuff.
Inside of your method that fires off the save:
...
var data = this.model.toJSON();
data.extras = { myParam: someData };
this.model.save(data, {
  success: function(model, response) {
    console.log('hooray it saved: ', model, response);
  }
});
...