Listen to server changes - JavaScript

A server has certain variables whose values change on a regular basis. These changes don't emit events or anything. My JavaScript application receives the values through HTTP requests to the server, where I can access the current variable values.
Since I always want the latest value, my approach so far is a plain GET request inside an interval.
But this does not seem right...
Does anyone have a better idea? I can't use sockets since the server isn't emitting anything, right?
Thanks!

In a Node.js app (server), you could:
set up a socket.io connection with your client
create an EventEmitter A
subscribe to that event emitter
use setInterval to check your variables
when one of your variables has changed -> A.emit('variable-name', newValue)
You receive the event from A and send it to your client through socket.io (see the sketch below)
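
A minimal sketch of that setup, assuming socket.io and a hypothetical getCurrentValue() accessor for the variable you want to watch:

var http = require('http').createServer();
var io = require('socket.io')(http);
var EventEmitter = require('events').EventEmitter;

var A = new EventEmitter();
var lastValue;

// poll the variable once per second and emit only when it changed
setInterval(function () {
  var current = getCurrentValue(); // hypothetical: however you read the server variable
  if (current !== lastValue) {
    lastValue = current;
    A.emit('variable-name', current);
  }
}, 1000);

// forward every change to all connected clients
A.on('variable-name', function (newValue) {
  io.emit('variable-name', newValue);
});

http.listen(3000);

On the client, socket.on('variable-name', ...) then receives each new value without any extra GET requests.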

Related

Is there a way to refresh static files without server restart?

My simple Node.js app serves static files, e.g. html/txt
app.use(express.static('public'))
I want to reflect my changes to those files without restarting the app.
I.e. I want to be able to modify any txt/help/read.me static file so that it is available right away to the clients, without restarting the app. How do I do it?
To determine if a file has changed, you can use fs.watch from the built-in fs module. It works like this: fs.watch('filename_or_directory', function(event, file) {}) . The callback's first parameter is "event", a string which is either "rename" or "change", depending on what type of change happened. The second parameter is the filename that changed.
For the client to automatically know that a change has occurred on the server is a bit more complex. There needs to be some form of communication with the server; you can do this with polling or with web sockets. If you go the socket route, you can use a library like socket.io or ws to establish a connection between server and client; when the server notices a change in a file from fs.watch, it can broadcast that change as a JSON "event object" to all clients, which can then receive the message and determine how to proceed (reload the current page, request updated data via AJAX, ignore it because it's an unrelated file, etc.).
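
A rough sketch of the socket route, assuming socket.io is running alongside the Express app (the 'static-file-changed' event name is just an example):

var fs = require('fs');
var express = require('express');
var app = express();
var http = require('http').createServer(app);
var io = require('socket.io')(http);

app.use(express.static('public'));

// watch the static directory and broadcast every change to all clients
fs.watch('public', function (event, filename) {
  io.emit('static-file-changed', { event: event, file: filename });
});

http.listen(3000);

Each client then decides in its 'static-file-changed' handler whether to reload, re-fetch via AJAX, or ignore the change.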
If you go the polling route, you don't need any web socket libraries. You'd just keep track of the timestamp of each change from fs.watch in an array, then set up a route like /api/getChanges or something. You have the client, at regular intervals, post the timestamp of the last client update to that route, and the server can respond with all change objects in the array after that time.
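
A sketch of the polling route, assuming an Express app and an in-memory array of change records (the /api/getChanges path just follows the example name above):

var fs = require('fs');
var express = require('express');
var app = express();

var changes = []; // records written by fs.watch

fs.watch('public', function (event, filename) {
  changes.push({ file: filename, event: event, time: Date.now() });
});

// the client sends the timestamp of its last update and gets back
// every change recorded since then
app.get('/api/getChanges', function (req, res) {
  var since = Number(req.query.since) || 0;
  res.json(changes.filter(function (c) { return c.time > since; }));
});

app.listen(3000);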
Note: Express doesn't need to know that a file has changed. It will re-read files as they're requested. It's the client that needs to know when to refresh.
Fixed this way:
app.use(express.static('public', {etag: false}))

Jsf get ajax queue

Is it possible to get the current jsf-ajax-queue in javascript?
I need to detect if there is currently any jsf-ajax request running.
The jsf.ajax object does not seem to expose this information.
I don't want to add event listeners because this is only for Selenium testing and not needed in production.
There's indeed no public API for that, as confirmed by the jsf.ajax jsdoc.
Your best bet is to let Selenium programmatically register a global ajax listener beforehand via jsf.ajax.addOnEvent and jsf.ajax.addOnError. This way you can keep track of started, completed and errored requests yourself.
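A sketch of what such a listener could look like when injected before the test runs; the window.activeJsfAjaxRequests counter name is made up for this example:

// keep a counter of in-flight JSF ajax requests for Selenium to poll
window.activeJsfAjaxRequests = 0;

jsf.ajax.addOnEvent(function (data) {
  if (data.status === 'begin') {
    window.activeJsfAjaxRequests++;
  } else if (data.status === 'success') {
    window.activeJsfAjaxRequests--;
  }
});

jsf.ajax.addOnError(function (data) {
  // 'success' never fires for failed requests, so decrement here instead
  window.activeJsfAjaxRequests--;
});

Selenium can then wait until window.activeJsfAjaxRequests is back to 0 before interacting with the page.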

Running code from Socket.io notification

I'm running a NodeJS server which is sending notifications to the clients when somebody does something, for example, when a client deletes a row from a grid, Socket.io informs the rest of the clients that a row got deleted.
In that example, I could add something like actionType: rowdeleted to the socket.io message and then just detect the actionType on the client side and refresh the grid. Anyway, the problem is that there can be an infinite number of actions (and new ones can be added), so I can't code a function for each action type on the client side.
Then I thought maybe I could send some code via socket.io and make the client run it, but I'm not sure that's the best way to do what I want. Also, how would the clients run that code? Via eval?
I'm open to any suggestion :)
Have you considered something similar, but without eval? You clearly must have the code to execute somewhere, presumably on the server side. Why not create a way to let the client know which script/code/action to fetch and execute?
I have used something similar out of a similar need. The action type referenced a script at a specific path on my server (/js/actions/ACTION.js). Upon getting the command to run the action, the client would check whether it already had the action; if not, it would go and get it. After that it would run the action from the script. RequireJS is good for this kind of thing. It keeps track of which actions you have and which you don't, and it makes sure to fetch an action it doesn't have before running a function that needs it.
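
A rough sketch of that idea with RequireJS, assuming action modules live under /js/actions/ (with the RequireJS baseUrl pointing at /js) and each module exports a run(payload) function; the names are illustrative:

// client side: look the action up by name and load it on demand
socket.on('action', function (msg) {
  // msg = { actionType: 'rowdeleted', payload: { ... } }
  require(['actions/' + msg.actionType], function (action) {
    // RequireJS caches modules, so each action script is only fetched once
    action.run(msg.payload);
  });
});

// /js/actions/rowdeleted.js
define(function () {
  return {
    run: function (payload) {
      // remove the row from the grid, refresh the view, etc.
    }
  };
});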
eval is evil (c)
so I can't code a function for each action type on the client side.
there's no point emitting events from the server if they won't be handled on the client(s),
so have a client handler function for each type of event your server is emitting.
Otherwise bind on all events and dispatch them from one handler (see the sketch below).
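
For example, a plain handler map on the client keeps one function per action type without resorting to eval (the action names here are illustrative):

var handlers = {
  rowdeleted: function (payload) { /* remove the row from the grid */ },
  rowadded: function (payload) { /* append the row to the grid */ }
};

socket.on('action', function (msg) {
  var handler = handlers[msg.actionType];
  if (handler) {
    handler(msg.payload);
  } else {
    console.warn('No handler for action type', msg.actionType);
  }
});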

How to bind server side events on client objects and vice versa with meteor

Is it possible to directly bind server side events to client side objects in meteor?
I would like to update a view for example when a server side event triggers. On the other hand I'd like to fire a server side method when a user clicks a view item.
I could use Meteor#methods for all the events but that seems odd.
Or can I specify an event handler, for example using an EventEmitter, outside the client and server scope so that it is available on both sides, and trigger/bind events on that very object?
Somewhat confused about this, I am thankful for hints in the right direction.
Regards
Felix
Update:
Using Meteor#methods works great in case user events should be mapped to server side actions. The other way around is still unclear. Asynchronous actions on the server side could persist their results in a collection which is pub/sub'ed to the client, which in turn could update some view due to the reactive context. But that's odd, because persisting that kind of info is slow and wastes space and time. Any suggestions?
I believe you can use the Collection.observe on the server side to 'observe' events on the Collection as clients are inserting, updating, removing, etc... That might be a start if you are focused on Collections alone. I used it like a sort of server side event loop to watch for collection changes.
When a user clicks on something in a view, try binding a Template event to the view's CSS selector and then calling a Meteor method which notifies the server of the event. See the examples of binding a key handler and/or button handlers to a Template.entry.events map, which then call a Meteor method notifying the server that something happened.
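
A rough sketch of both directions, using a hypothetical Tasks collection and a removeTask method (Blaze-era Template event map and Meteor.methods syntax):

Tasks = new Meteor.Collection('tasks'); // shared between client and server

if (Meteor.isServer) {
  // server side: observe the collection as clients change it
  Tasks.find().observe({
    added: function (doc) { console.log('task added', doc._id); },
    removed: function (doc) { console.log('task removed', doc._id); }
  });

  Meteor.methods({
    removeTask: function (taskId) {
      Tasks.remove(taskId);
    }
  });
}

if (Meteor.isClient) {
  // client side: bind a click on a view item to the server method
  Template.entry.events({
    'click .delete-task': function () {
      Meteor.call('removeTask', this._id);
    }
  });
}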
What about storing the progress in Session? You could do something like this:
Template.progress.value = function () {
    return Session.get('progress');
};
Then, whenever you update the Session on the server, the client template will automatically get those changes.
Out of curiosity, how exactly are you performing asynchronous actions on the server? I'm still trying to figure that out.

Node.js: Connecting to a Server Using Sockets

I'm just starting to play with Node.js today, and thought I'd start with what I thought would be a simple script: connecting to a server via sockets, sending a bit of data, and receiving it back. I'm creating a command line utility. Nothing in the browser.
An example of a server would be memcached, beanstalkd, etc. It seems the net module is the right tool for the job, but I'm still a bit fuzzy on the Node.js way of doing things. Some help would be appreciated.
Update #1
Let me see if I can break this down into a couple of smaller questions. I hate even asking questions like this, but the Node.js documentation is very sparse, and most documentation written 6 months ago is already outdated.
1) So I can use net.stream.write() to send data to the remote server, but I don't know how to get a response back. I'm not even sure how to test when write() is finished, because it doesn't take a callback.
2) A few clues on how the whole event.emit thing works would be great. I think that's really the keystone I'm missing in this whole thing.
Update #2
Here's where I'm still confused about implementing a client program. Let me diagram a typical send request => get response system:
1) I bind callbacks to the net module to get responses and other events, including the necessary bindings to get a response from the server.
2) I use stream.write() to send a request to the server.
3) I then do nothing, because my bound "data" event will get the response from the server.
Here's where things get tricky. Suppose I call stream.write() twice before my bound "data" event is called. Now I have a problem. When the "data" event does happen, how do I know which of the 2 requests it's a response for? Am I guaranteed that responses will take place in the same order as requests? What if responses come back in a different order?
First of all, let's make clear what an EventEmitter is. JavaScript, and therefore Node.js, is asynchronous. That means that instead of having to wait for incoming connections on a server object, you add a listener to the object and pass it a callback function, which then, "as soon" as the event happens, gets executed.
There's still waiting here and there going on in the background but that has been abstracted away from you.
Let's take a look at this simple example:
var net = require('net');

// #1) create a new server object, and pass it a function as the callback
var server = net.createServer(function (stream) {
  // #2) register a callback for the 'connect' event
  stream.on('connect', function () {
    stream.write('hello\r\n'); // greet the client as soon as it connects
  });
  // #3) register a callback for the 'data' event
  stream.on('data', function (data) {
    stream.write(data); // echo the incoming data back
  });
  // #4) register a callback for the 'end' event
  stream.on('end', function () {
    stream.write('goodbye\r\n');
    stream.end();
  });
});
// #5) make the server listen on localhost:8124
server.listen(8124, 'localhost');
So we create the server and pass it the callback function; this function is not yet executed. Passing the function here is basically a shortcut for adding a listener for the connection event of the server object. After that we start the server at #5.
Now what happens in the case of an incoming connection?
Since the function we passed to createServer was bound to the connection event, it now gets executed.
It adds the connect, data and end event listeners to the stream object (which represents the individual connection) by hooking up callbacks for the events.
After that, the stream fires the connect event, so the function passed at #2 gets executed and writes hello\r\n to the stream. How does the function know which stream it should write to? Closures are the answer: the function inherits the scope it was created in, so inside the function, stream still references the individual connection that triggered the very callback we're in right now.
Now the client sends some data over the connection, which makes the stream object emit its data event; since we bound a function to this event at #3, we now echo the incoming data back to the client.
In case the client closes the connection, the function we've bound at #4 gets called, which writes goodbye\r\n and after that closes the connection from our side.
Does this make things a little bit clearer? Well, it definitely makes the whole thing a lot easier. Node, just like JavaScript inside browsers, is single threaded. There's only one thing happening at any given point in time.
To describe it simply, all these callbacks end up in a global queue and are then called one after another, so this queue may (abstracted) look like this:
| connection event for a new stream
| data event for stream #12
| callback set via setTimeout
v close event of yet another stream
These now get executed top to bottom; nothing will ever happen in between them. There's no chance that, while you're doing something in the callback bound to the data event, something else will happen and magically change the state of the system. Even if there is a new incoming connection on the server, its event will get queued up and will have to wait until everything before it, including the data event you're currently in, finishes.
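
Coming back to the client side of the original question, here is a minimal sketch with the net module, assuming the protocol you talk to (memcached, beanstalkd, ...) answers requests in the order they were sent on a single connection, so a FIFO queue of callbacks can match responses to requests:

var net = require('net');

var pending = []; // callbacks waiting for a response, oldest first

var client = net.createConnection(11211, 'localhost'); // e.g. a local memcached

client.on('connect', function () {
  send('version\r\n', function (response) {
    console.log('first response:', response);
  });
  send('stats\r\n', function (response) {
    console.log('second response:', response);
    client.end();
  });
});

client.on('data', function (chunk) {
  // NOTE: a real client must buffer chunks and split on the protocol's
  // message boundaries; this sketch assumes one response per 'data' event
  var callback = pending.shift();
  if (callback) callback(chunk.toString());
});

function send(command, callback) {
  pending.push(callback);
  client.write(command);
}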
