I've just started learning Node.js and was very interested in its real-time capabilities, especially with Socket.io. Since then, I've written a very basic script to connect to Twitter's streaming server and broadcast tweets to all connected clients.
To build that, I used http.createClient to connect to stream.twitter.com and added in the relevant response and data event handlers. Everything works quite well.
Obviously, Twitter's Streaming API pretty much outputs an infinitely loading webpage, which is why using a data event handler works fairly well with it. However, is it possible to make other types of websites 'streamable'?
For example, if a client (browser) updates a website periodically using an XMLHttpRequest, would it be possible to track the output of those requests using the HTTP API of Node.js, or a similar Node.js extension?
Thanks.
Websites do not periodically use XMLHttpRequest. Clients periodically send XMLHttpRequests to a URL.
A simple call to http.request(options, callback) with the correct headers should emulate XHRs. Most of these servers will also accept normal POST or GET requests.
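For instance, here is a rough sketch of such a call; the host, path, and header are hypothetical stand-ins for whatever the site's XHR actually sends (copy the real values from your browser's network inspector):

var http = require('http');

// Hypothetical: replay the site's periodic XHR from Node.
var req = http.request({
  host: 'example.com',
  path: '/updates.json',
  method: 'GET',
  // Many sites mark their XHRs with this conventional header:
  headers: { 'X-Requested-With': 'XMLHttpRequest' }
}, function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () { console.log(body); });
});
req.end();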
If you want node.js to connect to a server and simulate a browser, then something like zombie would serve you well. It claims to support XMLHttpRequest.
The best option for you would be to use web sockets between your dashboard and the node server. That way node will be notified immediately that something has updated on your dashboard (I am assuming you can modify your dashboard a bit to accept such connections; that won't be difficult as long as you have access).
Then you can use long polling at the client end, i.e. send a request to the node server and wait. Node will receive the request and register an event for it. The moment it receives an update from the dashboard, it fires the event, which sends the response to all the waiting clients one by one.
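A minimal sketch of that server-side pattern, using Node's built-in EventEmitter; the /wait and /push routes and the event name are hypothetical:

var http = require('http');
var EventEmitter = require('events').EventEmitter;

var updates = new EventEmitter();
updates.setMaxListeners(0); // many clients may be waiting at once

http.createServer(function (req, res) {
  if (req.url === '/wait') {
    // Long poll: hold the response until the dashboard pushes an update.
    updates.once('dashboard-update', function (data) {
      res.writeHead(200, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify(data));
    });
    // (A real version would also drop the listener if the client disconnects.)
  } else if (req.url === '/push') {
    // The dashboard hits this when something changes.
    updates.emit('dashboard-update', { at: Date.now() });
    res.end('ok');
  }
}).listen(3000);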
I would recommend taking a look at http://github.com/andrewdavey/vogue. It does something similar, though the functionality is of course different.
I am working on a home automation hub -- a Raspberry Pi running locally that displays weather info, controls my lights, etc. It is "networked" (and I use that term loosely) to a website via a shared MongoDB. Both the site and the hub are running Node.js/Express servers.
Essentially, I am looking to be able to enter text into a field on my website and then display it on my hub.
I'm struggling to figure out how to pass data between them. I can think of a couple ways that might get it done, but the only way I know I could get working is to implement some sort of Mongo watcher/listener to watch for changes on a specific collection. Essentially, you enter the text into the site, that updates the document in Mongo, the watcher informs the locally-running hub, which then fetches and displays the new content.
This seems hacky. Is there a better way? Is this something socket.io could manage? Maybe I'm overthinking it? Help!
You can use Socket.io, WebSocket or a plain TCP socket to connect the two servers together and communicate that way. Or you can use a queue system like ZeroMQ or RabbitMQ. Or you can simply make an HTTP request from one server to the other whenever you want it to grab new data - you could even send that data right in the request.
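To make the last option concrete, here is a hedged sketch of one server sending the new text right in the request body; the hub's address and /display route are assumptions:

var http = require('http');

function sendToHub(text) {
  var body = JSON.stringify({ text: text });
  var req = http.request({
    host: 'hub.local', // hypothetical address of the Pi
    port: 3000,
    path: '/display',
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Content-Length': Buffer.byteLength(body)
    }
  });
  req.end(body); // the hub's Express route reads and displays this
}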
It would be much easier if you used Redis that supports pub/sub, see:
https://redis.io/topics/pubsub
or CouchDB that supports the changes feed:
http://docs.couchdb.org/en/2.0.0/api/database/changes.html
or RethinkDB that supports changefeeds:
https://rethinkdb.com/docs/changefeeds/javascript/
I don't think Mongo supports anything like that.
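For illustration, the Redis pub/sub route looks roughly like this with the classic (pre-v4) node_redis API; the channel name is a hypothetical choice:

var redis = require('redis');

// On the hub (subscriber side):
var sub = redis.createClient();
sub.on('message', function (channel, message) {
  console.log('new text to display:', message);
});
sub.subscribe('hub-updates');

// On the website (publisher side), when the form is submitted:
var pub = redis.createClient();
pub.publish('hub-updates', 'Hello from the website');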
I'm building a chat website that uses WebSockets (Socket.io) to send and receive messages from the server. My website has to use WebSockets, and now my problem is: for other transmissions, like checking the username at login, fetching JSON and updating the DOM, and other stuff, can I use the same technology (WebSockets) or do I have to use Ajax? I know that with WebSockets, the server and client keep a persistent connection.
What is the best way? Is using WebSockets not good for these purposes? Why?
You can use websockets. The difference is that with websockets the client is always connected. You'll have a handler that handles messages (which could be just JSON blobs, maybe with some kind of messageType field) as they stream in from the client.
This means the server-side handling is basically the same, except that instead of serving your responses over different HTTP requests (via different routes), you dispatch each request to the appropriate handler with something not much more complicated than a switch statement. Any results are then sent back to the client over the websocket, which has a similar handling mechanism.
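A sketch of that dispatch; the 'message' event and the messageType field are hypothetical conventions of this sketch, not anything socket.io mandates:

var io = require('socket.io')(3000);

io.on('connection', function (socket) {
  socket.on('message', function (msg) {
    switch (msg.messageType) {
      case 'checkUsername':
        // Reply only to the asking client:
        socket.emit('message', { messageType: 'usernameResult', ok: true });
        break;
      case 'chat':
        // Broadcast chat messages to everyone:
        io.emit('message', { messageType: 'chat', text: msg.text });
        break;
      default:
        socket.emit('message', { messageType: 'error', text: 'unknown type' });
    }
  });
});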
One downside is that not all browsers support websockets, so if you need to support a fallback path to plain JSON over HTTP, it's certainly easier to use those fallback JSON handlers for the auxiliary requests (since you'll be writing them anyway).
Otherwise the differences are probably marginal. I'd be more concerned about code cleanliness.
I just started learning Node.js, and as I was learning about the fs.watchFile() method, I was wondering whether a chat website could be built efficiently with it (and fs.writeFile()), as opposed to, for example, Socket.IO, which is stable, but I believe not 100% stable (it has several fallbacks, including Flash).
fs.watchFile could perhaps also be used to keep histories of the chats quite simply (as JSON would be used on the spot).
The chat files could be formatted in JSON in such a way that only the last chatter's message is brought up to the DOM (or whatever to make it efficient to 'fetch' messages when the file gets updated).
I haven't tried it yet as I still need to learn more about Node, and even more to be able to compare it with Socket.IO, but what's your opinion about it? Could it be an efficient/stable way of doing chats?
fs.watchFile() can be used to watch changes to a file in the local filesystem (on the server). This will not solve your need to update all clients' chat messages in their browsers. You'll still need web sockets, AJAX or Flash for that (or socket.io, which handles all of those).
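For reference, this is all fs.watchFile gives you; note that the callback runs on the server only, so a separate transport is still needed to reach the browsers (the filename is hypothetical):

var fs = require('fs');

fs.watchFile('chat.json', { interval: 1000 }, function (curr, prev) {
  if (+curr.mtime !== +prev.mtime) {
    console.log('chat.json changed on disk');
    // ...you would still have to push this to the clients somehow.
  }
});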
What you would typically do on the client is try to use Web Sockets. If the browser does not support them, try XMLHttpRequest. If that fails, fall back to Flash. It's a lot of programming to do, and it has to be handled by the node.js server as well. Socket.io does that for you.
Also, socket.io is pretty stable. The fallback to Flash is not due to its instability but due to the lack of browser support for better solutions (like Web Sockets).
Storing chat logs as flat-file JSON is not a good idea, because if you are going to manipulate the files, you would have to parse and serialize entire JSON objects, which would become very slow as the size of the object increased. The watch methods of the filesystem module also don't work on all operating systems.
You also can't compare Node.js to Socket.IO, because they are entirely different things. Socket.IO is a Node module for realtime transport between the browser and the server. What you need depends on what you're doing. If you need chat history, then you should use a database such as MongoDB or MySQL. Watching files for changes is not an efficient approach; you should just send messages as they are received.
In conclusion: no, using fs.watchFile() and fs.writeFile() is a very bad idea, because race conditions would occur due to concurrent file writes; besides, fs.watchFile() uses polling to check whether a file has changed. You should instead use Socket.IO and push messages to other clients / store them in a database as they are received.
You can use the long polling method, using JavaScript's setTimeout and setInterval.
Long polling
Basically, long polling works on the Ajax request and the server's response time.
The server will respond after a certain time (say, after 50 seconds) if there is no notification or message; otherwise it responds immediately with the data. On the client side, as soon as the client gets a response, its JavaScript makes another request for new updates and waits for the response again. This process repeats endlessly for as long as the server is running.
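A minimal client-side sketch of that loop; the /updates endpoint is hypothetical and assumed to hold the request open until there is data or ~50 seconds pass:

function poll() {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/updates');
  xhr.onload = function () {
    if (xhr.status === 200 && xhr.responseText) {
      console.log('update:', xhr.responseText); // handle the new data
    }
    poll(); // immediately start the next long poll
  };
  xhr.onerror = function () {
    setTimeout(poll, 5000); // back off briefly on network errors
  };
  xhr.send();
}
poll();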
I have X amount of activity sensors connected to a server that inserts data into a database every time a sensor is triggered. What I'm trying to do is create a web interface with a blueprint of the facility (an SVG), and whenever a sensor is triggered, besides the db insert, I want it to show some sort of alert on my blueprint. For that I need to keep an open connection to the server, I think.
I was thinking of using web sockets, but it might be overkill since I only need to retrieve data from the server. But running an ajax call every second doesn't sound very efficient either. Are there any other alternatives?
Thank you
Some potential choices include:
WebSocket
Adobe® Flash® Socket
AJAX long polling
AJAX multipart streaming
Forever Iframe
JSONP Polling
Which transport you actually end up using will depend on your requirements for browser support and on the technology you are using on the server to handle these requests. The transport choice may also depend on your network topology - what types of load balancers you need to integrate with, proxies, etc.
There are many libraries available on both the client and server sides, many of which support more than one of these transports.
For example (not an exhaustive list):
socket.io for Node.js
WebSocket
Adobe® Flash® Socket
AJAX long polling
AJAX multipart streaming
Forever Iframe
JSONP Polling
SignalR for an ASP.NET backend
WebSockets
Server-Sent Events
ForeverFrame
Long Polling
Atmosphere for a Java backend
WebSockets
Server Side Events (SSE)
Long-Polling
Forever frame
JSONP
IMO - WebSockets are NOT overkill for this type of problem and would lend themselves nicely to this type of application.
Without specifically discussing frameworks or knowing what is running in the backend of your server(s), we have a few options to consider for the frontend:
Websockets
Websockets are designed for bidirectional communication, although it is kind of shocking how many users are surfing the web with a browser that doesn't support websockets. I always recommend a fallback for this, such as the other methods listed below.
SSE
SSE is an HTML5 spec and is still shaky at best. Try scrolling on a page while an SSE event fires... It may be a little easier on the backend, but it sometimes hangs on the client side, since it runs in the same thread that the DOM runs in.
Long Polling
Keeps your connection open. It doesn't scale well with PHP, but performs swimmingly with Python+Twisted or Node.js on the backend.
Good Old Ajax
Keep your requests small and you still have a scalable solution. Yes, a full GET request is the most expensive, but it is supported in just about every browser rolled out in the past ten years. It is also worth noting that GET requests are easy to scale horizontally with more hardware.
In a perfect world:
You would break up your application into a few components, operating behind a reverse proxy such as Nginx, and then use Node.js + Socket.IO to handle the realtime aspects of your app.
Another option would be to use small Ajax requests, and offer websocket support for the browsers that support it. This is advice specifically for a PHP backend.
WebSocket is certainly not overkill. On the contrary: with websockets, you have a bi-directional communication channel, which means the server can initiate communication whenever it sees fit (e.g. when sensor data changes).
In a previous project, I used node.js together with socket.io to monitor 50+ sensors. Data was updated in real-time in a browser. The data was visualized using smoothie.js.
Whenever a sensor value was updated, it was communicated to the browser. Some sensors only updated once a minute, others once a second, ...
Polling would have been overkill, because it would retrieve all data for all sensors, even those that had not been updated yet.
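A hedged sketch of that setup; the 'sensor-update' event name and the trigger hook are assumptions:

var http = require('http');
var server = http.createServer();
var io = require('socket.io')(server);

// Call this from wherever the sensor's database insert happens:
function onSensorTriggered(sensorId, value) {
  // Push only the sensor that changed, to every connected browser.
  io.emit('sensor-update', { sensorId: sensorId, value: value });
}

server.listen(3000);

On the browser side, a socket.io client listening for 'sensor-update' would then highlight the matching element in the SVG blueprint.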
I had a similar problem and did a lot of research on this. As I understand it, there are three main options:
Short polling: Have an endpoint that your javascript client pings every second. This is the worst option, because the pings add up to one second of latency to your communication, and depending on how you implement it, the endpoint could query the database every second, adding unnecessary overhead.
Long polling: Have an endpoint that your javascript client pings that holds the connection until a) the event occurs or b) the connection times out. If the endpoint returns a response, the client gets the event information. If the endpoint does not return a response, no event has occurred, and the client sends a new request. This is a good option because the events can immediately trigger the response to the client, assuming you have an asynchronous interprocess communication layer (like 0MQ) to send the message without any sort of polling.
Websocket: Have your javascript client connect to a websocket server, which will send a message to your client immediately upon the event trigger.
I think a websocket is your best option, because it accommodates immediate communication of the event without all the request/response overhead. And most importantly, this is exactly what websockets are designed to do! As such, you will probably have to write the least amount of custom code with this solution.
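On the browser side, the websocket option is only a few lines; the URL and the render handler here are hypothetical:

function connect() {
  var ws = new WebSocket('ws://localhost:3000');
  ws.onmessage = function (msg) {
    var event = JSON.parse(msg.data);
    render(event); // hypothetical UI update; fires immediately on push
  };
  ws.onclose = function () {
    setTimeout(connect, 1000); // simple reconnect
  };
}
connect();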
There are two great commercial services that might work for you.
Firebase - a JavaScript hierarchical database and realtime messaging/synchronization platform; uses websockets and has other fallbacks
PubNub - a realtime message passing and queue system; uses websockets
I am writing a debug/admin node server that allows users to execute a long-running process on the machine. I want to stream the output of the child process to the form they began the action from.
I can do this with sockets, but I have to have the client subscribe to a channel, and I have to post messages to the whole channel when they only have to do with the one client.
I'd prefer to be able to stream the http body down to the client. I can do this fairly easily with node: just keep writing to the request's socket, call end when I'm done.
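That approach, sketched out; 'long-task' is a hypothetical command standing in for the long-running process:

var http = require('http');
var spawn = require('child_process').spawn;

http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  var child = spawn('long-task');
  // Flush each chunk of output to the client as it arrives:
  child.stdout.on('data', function (chunk) { res.write(chunk); });
  child.stderr.on('data', function (chunk) { res.write(chunk); });
  child.on('close', function () { res.end(); }); // done: close the body
}).listen(3000);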
Is there any way to use XMLHttpRequest to call a web service, have it fire events whenever new data is available, and a final event when it closes? Is this possible with jQuery?
Note that this isn't really the same use case as normal real-time updates, for which sockets are a good choice. This is a single request. I just want to get the response in pieces.
What I was hoping for isn't possible: you can't make an XHR HTTP request, keep it open, and parse chunks at a time.
Here is a summary of people's suggestions:
1. Use socket.io anyway, and change your architecture to support pushing events.
2. Use socket.io, but make requests through it as if you were hitting URLs. Make a little URL router on the server side of socket.io and stream stuff down all you want.
3. Keep the initial html page open and parse it as you go (not feasible for my implementation).
4. (3), but in a hidden iframe.
I went with 2.
As an update to this question: nowadays you can use Server-sent events (SSE). That way, you don't need to do anything particularly special on the server side or set up websockets, which are overkill when you don't need full duplex. And XHR keeps the entire response in memory, which is non-ideal for large files. I had the same question, and I answered it here:
How to process streaming HTTP GET data?
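A minimal SSE sketch, assuming a hypothetical /events route and a once-a-second payload:

var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive'
  });
  var timer = setInterval(function () {
    // Each 'data: ...' block is delivered to the client as it arrives.
    res.write('data: ' + JSON.stringify({ at: Date.now() }) + '\n\n');
  }, 1000);
  req.on('close', function () { clearInterval(timer); });
}).listen(3000);

In the browser, new EventSource('/events') then fires an onmessage event per chunk, with no need to buffer the whole response the way XHR does.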
Some years ago, I used "JavaScript" streaming over an open HTTP response (years before Ajax appeared).
The idea here: write chunks of
<script type="text/javascript">do js stuff here</script>
for each step of the process you want the client to react to.
It may still work.
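A sketch of that technique served from a Node server; parent.onChunk is a hypothetical handler defined in the page hosting the (hidden) iframe that loads this response:

var http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, { 'Content-Type': 'text/html' });
  res.write('<html><body>');
  var n = 0;
  var timer = setInterval(function () {
    // The browser executes each script chunk as soon as it arrives.
    res.write('<script>parent.onChunk(' + (n++) + ');</script>');
    if (n > 5) {
      clearInterval(timer);
      res.end('</body></html>');
    }
  }, 1000);
}).listen(3000);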