I'm trying to stream images from a Node.js server to a client web page. I've started with socket.io, but my implementation is fairly memory-intensive (and possibly leaking, as I'm not fluent in JavaScript). I'm just prototyping at this point with 10 sample images:
Server Side
Responds to a socket.io trigger with the following function that "streams" 10 images to the client at roughly 100ms intervals.
var fs = require('fs'); // assumed to be required at the top of the module

socket.on('img_trigger', function(data) {
  var img_num = 0;
  var timeoutHandle = null;

  function startTimeout() {
    stopTimeout();
    if (img_num < 10) {
      timeoutHandle = setTimeout(updateStream, 100);
    }
  }

  function stopTimeout() {
    clearTimeout(timeoutHandle);
  }

  function updateStream() {
    var file = './sampleframes/sample-' + img_num + '.png';
    fs.readFile(file, function(err, file_buff) {
      if (err !== null) {
        console.log('readFile error: ' + err);
      } else {
        socket.emit('img_stream', { buffer: file_buff });
      }
      file_buff = null;
      ++img_num;
    });
    startTimeout();
  }

  // kicks off first image
  startTimeout();
});
Client Side
Capture the raw buffer data and generate a PNG with an <img> element.
socket.on('img_stream', function(data) {
  var img_data = arrayBufferToBase64(data.buffer);
  var panel = $('#frame-panel');
  $('#frame').html('<img src="data:image/png;base64,' +
    img_data + '" width="' + panel.width() + '" height="' +
    panel.height() + '" />');
});
If I trigger the server once, it works, but not well: memory usage goes up significantly, and the server crashes after several triggers. Can I make my code here more efficient, or should I try a new approach?
I've looked into using Node's File Streams, socket.io-streams, and even Binary.js (though I hesitate to require our clients to run very modern browsers), and they all look promising, but I don't quite know which would be best for my use case. Any help or guidance would be greatly appreciated.
The web interface I'm developing is for an FPGA (Zynq-7000) based camera running PetaLinux with Node.js cross-compiled for the ARM processor, so I don't have a lot of server-side resources to work with. As such, I'd like to have the client-side do as much of the processing as possible. Eventually, streaming video would be incredible, but I'd be satisfied with reading and displaying successive frames at a reasonable rate.
This may be due to a memory leak within the socket.io library (see here for a description, and here for a proposed fix).
To fix this, download and use the latest version of socket.io from here.
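As an aside, socket.io 1.0 and later can transmit a Node Buffer as a binary frame, which lets the client skip the base64 conversion entirely. A rough sketch of what the receiving side might then look like, assuming the buffer arrives as an ArrayBuffer and the browser supports Blob and URL.createObjectURL:

socket.on('img_stream', function(data) {
  // data.buffer arrives as an ArrayBuffer when the server emits a Buffer
  var blob = new Blob([data.buffer], { type: 'image/png' });
  var url = URL.createObjectURL(blob);
  var img = document.createElement('img');
  img.onload = function() {
    URL.revokeObjectURL(url); // release the blob once the image has decoded
  };
  img.src = url;
  $('#frame').empty().append(img);
});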
Related
I've built an Angular/Express/Node app that runs in Google Cloud and currently uses a JSON file as the data source for my application. For some reason (and this only happens in the cloud), when saving data through an Ajax call and writing it to the JSON file, everything seems to work fine. However, when refreshing the page, the server (sometimes!) sends me the version from before the edit. I can't tell whether this is an Express, Node, or even Angular problem, but what I know for sure is that I'm checking the JSON that comes in the response from the server, and it really is sometimes the modified version and sometimes not, so it most probably isn't Angular cache-related.
The GET:
router.get('/concerts', function (request, response) {
  delete require.cache[require.resolve('../database/data.json')];
  var db = require('../database/data.json');
  response.send(db.concerts);
});
The POST:
router.post('/concerts/save', function (request, response) {
  delete require.cache[require.resolve('../database/data.json')];
  var db = require('../database/data.json');
  var concert = request.body;
  console.log('Received concert id ' + concert.id + ' for saving.');
  if (concert.id != 0) {
    var indexOfItemToSave = db.concerts.map(function (e) {
      return e.id;
    }).indexOf(concert.id);
    if (indexOfItemToSave == -1) {
      console.log('Couldn\'t find concert with id ' + concert.id + ' in database!');
      response.sendStatus(404);
      return;
    }
    db.concerts[indexOfItemToSave] = concert;
  } else if (concert.id == 0) {
    concert.id = db.concerts[db.concerts.length - 1].id + 1;
    console.log('Concert id was 0, adding it with id ' + concert.id + '.');
    db.concerts.push(concert);
  }
  console.log("Added stuff to temporary db");
  var error = commit(db);
  if (error)
    response.send(error);
  else
    response.status(200).send(concert.id + '');
});
This probably doesn't say much, so if someone is interested in helping, you can see the issue live here. If you click on "modify" for the first concert, change the programme to something like "asd", and then save, everything looks fine. But if you refresh the page a few times (usually up to 6-7 tries are needed), the old, unchanged programme is shown. Any clue or advice greatly appreciated, thanks.
The solution: do not use local files to store data in the cloud! That is what databases are for!
What was actually the problem?
The problem was caused by the fact that App Engine had 2 VM instances running for my application. The POST request would be sent to one instance, which did its job, saved the data by modifying its local JSON file, and returned a 200. However, after a few refreshes, load balancing would route the GET to the other machine, which has its own copy of the source code, including the initial, unmodified JSON. I am now using a MongoDB instance, and everything seems to be solved. Hopefully this discourages people who attempt to do the same thing I did.
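For anyone making the same switch, here is a minimal sketch of what the two routes might look like backed by MongoDB instead of a JSON file (the connection URL, database name, and 'concerts' collection are hypothetical; this uses the 2.x-era API of the official mongodb Node driver):

var MongoClient = require('mongodb').MongoClient;
var db; // shared connection, opened once at startup

MongoClient.connect('mongodb://localhost:27017/concertsdb', function(err, database) {
  if (err) throw err;
  db = database;
});

router.get('/concerts', function(request, response) {
  db.collection('concerts').find().toArray(function(err, concerts) {
    if (err) return response.status(500).send(err.message);
    response.send(concerts);
  });
});

router.post('/concerts/save', function(request, response) {
  var concert = request.body;
  db.collection('concerts').updateOne(
    { id: concert.id },
    { $set: concert },
    { upsert: true }, // insert the concert if the id doesn't exist yet
    function(err) {
      if (err) return response.status(500).send(err.message);
      response.status(200).send(concert.id + '');
    }
  );
});

Because every instance talks to the same database, the load balancer can route a GET to any machine without serving stale data.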
Basically, what I'm trying to do is pass a parameter through the URL to the PHP code. However, it seems that in the body of the onmessage event handler, I can't change the source. Here's the code below:
var source = new EventSource("get_message.php?latest_chat_ID=0");
var i = 0;

$(source).on("message", function (event) {
  var data = event.originalEvent.data;
  ++i;
  source = new EventSource("get_message.php?latest_chat_ID=" + i);
  // $.post("get_message.php", {latest_chat_ID: 0}, function (data, status) {});
  $("#messages").html($("#messages").html() + data);
});
I was wondering:
How do I rectify this problem?
Are there other ways to send data to a PHP page? (I contemplated using jQuery's $.post() function, but wouldn't that execute the script twice: once from firing the EventSource event and once from the $.post() request?)
I also understand that alternative technologies exist that are better suited for bidirectional communication, such as WebSockets and Node.js-based libraries. However, most of my base code is written with an SSE implementation in mind, and I'd like to stick to that.
If you want to continue using SSE, I think you'll need to rewrite what you have along the lines of the code below. The reason your current code doesn't work is that you are only ever listening to the first EventSource; reassigning the variable does not make jQuery re-bind the handler to the new one. Plus, I would probably skip jQuery for this, since it's going to try to cache things you don't want cached.
var listenToServer = function(i) {
      source = new EventSource("get_message.php?latest_chat_ID=" + i);
      source.onmessage = function(event) {
        // a plain EventSource delivers a native event, so the payload is
        // event.data (event.originalEvent only exists on jQuery-wrapped events)
        var data = event.data;
        $messages.html($messages.html() + data);
        listenToServer(i + 1);
      };
    },
    $messages = $('#messages'),
    source;

listenToServer(0);
I also went ahead and cached $('#messages') so you're not creating new jQuery objects over and over, and I left source outside of the function so that you don't have to worry as much about garbage collection with the various EventSources.
Right now, I have a Node.js server that can stream data in response to a GET request using the stream API, with Transfer-Encoding set to 'chunked'. The data can be on the order of 10 to 30 MB (it is sometimes 3D models).
On the browser side, I want to process the data as I'm downloading it: I want to display the data on a canvas while the download is in progress, so you can see the 3D model appear, face by face, as the data comes in. I don't need duplex communication, and I don't need a persistent connection, but I do need to process the data as soon as it arrives, rather than waiting for the entire file to finish downloading. After the browser downloads the data, the connection can close.
How do I do this?
jQuery Ajax only calls back once all the data has been received.
I also looked at portal.js (which was jquery-streaming) and socket.io, but they seem to assume a persistent connection.
So far, I've been able to hack together a solution using a raw XMLHttpRequest, firing a callback when readyState >= 2 && status == 200 and keeping track of the last position read. However, that keeps all of the downloaded data in the raw XMLHttpRequest, which I don't want.
There seems to be a better way to do this, but I'm not sure what it is. Anyone have suggestions?
oboe.js is a library for streaming responses in the browser.
However, that keeps all the data downloaded in the raw XMLHttpRequest, which I don't want.
I suspect this may be the case with oboe.js as well, and it is potentially a limitation of XMLHttpRequest itself; I'm not sure, as I haven't worked directly on this type of use case. I'm curious to see what you find out with your efforts and the other answers to this question.
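For reference, a minimal sketch of what consuming a chunked JSON response with oboe.js might look like (the URL, the 'faces.*' pattern, and drawFace are hypothetical and depend on your model format):

oboe('/models/stream')
  .node('faces.*', function(face) {
    // fires for each face as soon as it is parsed,
    // without waiting for the whole response
    drawFace(face); // hypothetical canvas renderer
  })
  .done(function(model) {
    console.log('finished loading model');
  })
  .fail(function(error) {
    console.log('stream failed', error);
  });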
So I found the answer, and it's Server-sent events. They basically enable one-way HTTP streams that the browser can handle a chunk at a time. It can be a little tricky because some existing stream libraries are broken (they don't expect "\n" inside your data, and hence you get partial messages), or have little documentation. But it's not hard to roll your own (once you figure it out).
You can define your sse_transform like this:
// file sse_stream.js
var Transform = require('stream').Transform;
var util = require('util');

util.inherits(SSEStream, Transform);

function SSEStream(option) {
  Transform.call(this, option);
  this.id = 0;
  this.retry = (option && option.retry) || 0;
}

SSEStream.prototype._transform = function(chunk, encoding, cb) {
  var data = chunk.toString();
  if (data) {
    this.push("id:" + this.id + "\n" +
      data.split("\n").map(function(e) {
        return "data:" + e;
      }).join("\n") + "\n\n");
    //"retry: " + this.retry);
  }
  this.id++;
  cb();
};

SSEStream.prototype._flush = function(next) {
  this.push("event: end\n" + "data: end" + "\n\n");
  next();
};

module.exports = SSEStream;
Then on the server side (I was using express), you can do something like this:
sse_stream = require('sse_stream')

app.get '/blob', (req, res, next) ->
  sse = new sse_stream()
  # It may differ here for you, but this is just a stream source.
  blobStream = repo.git.streamcmd("cat-file", { p: true }, [blob.id])
  if req.headers["accept"] is "text/event-stream"
    res.type('text/event-stream')
    blobStream.on("end", () -> res.removeAllListeners()).stdout
      .pipe(
        sse.on("end", () -> res.end())
      ).pipe(res)
  else
    blobStream.stdout.pipe(res)
Then on the browser side, you can do:
source = new EventSource("/blob")
source.addEventListener('open', (event) ->
console.log "On open..."
, false)
source.addEventListener('message', (event) ->
processData(event.data)
, false)
source.addEventListener('end', (event) ->
console.log "On end"
source.close()
, false)
source.addEventListener('error', (event) ->
console.log "On Error"
if event.currentTarget.readyState == EventSource.CLOSED
console.log "Connection was closed"
source.close()
, false)
Notice that you need to listen for the 'end' event, which is sent from the server in the transform stream's _flush() method. Otherwise, EventSource in the browser is just going to request the same file over and over again.
Note that you can use libraries on the server side to generate SSE. On the browser side, you can use portal.js to handle SSE. I just spelt things out, so you can see how things would work.
I have built an offline app in PhoneGap. The JSON data gets pushed and gets where it needs to go, but there is always a bunch of images that need to follow, and the HTML5 cache just will not do it, seeing as it clears when the app is closed. I have been breaking my nugget on this for quite a while now.
The issue is that you want to check whether an image exists on the file system. I've chosen to do so with an "async: false" AJAX call, and therefore have the code running up to a point. I haven't used the PhoneGap (almost hack) way of writing the file with overwrite set to false (which produces an error), seeing as that was another async function I didn't want to deal with (wherein lies the problem: async).
Another problem is that the PhoneGap browser handles at best 3 downloads at a time (I looked it up). My previous function suffered from this debacle, as it would just cut off downloading images if the number of simultaneous downloads was too high, which was around 20 to 40 images at a time (depending on size). That is, of course, not strange, seeing as download speed plummets when you divide it over multiple simultaneous downloads.
So the question is, how to build:
A function that loops through the JSON data (image variables).
Downloads the images three at a time (and sets the metadata, because Apple says it has to be set).
Deletes from disk those that are not needed anymore (this last one is a known problem all over the net as far as I've read, so we'll keep it optional, and because by now I really don't care about space anymore).
Runs a function when all images are downloaded.
My code so far:
var datadir = "";
var pics_at_the_time = 0;
var external_url_pics = "http://Folder on server where images are";
// new_config.pics holds the JSON data. Built: '[ key (same as id) { "id": 1234567890, "tld":"jpg" }'. id+'.'+tld = the image name.
window.requestFileSystem( // get the root folder path
LocalFileSystem.PERSISTENT,
0,
function(fileSystem) {
datadir = fileSystem.root.fullPath;
},
function() {
console.log("Could not get root FS");
}
);
function fetch_images() {
var len = new_config.pics.length; // amount of pictures that need to be here
var all_in_counter = 0;
$.each(new_config.pics,function(index,val){ // loop through all pics
pic_exists(val);
});
}
function pic_exists(val) {
$.ajax({ // pic on disk or not
async:false,
url: 'file://'+datadir+'/'+val.id+'.'+val.tld, //or your url
success: function(){
var obj = val.id;
delete new_config.pics.obj;
},
error: function(){
var obj = val.id;
delete new_config.pics.obj;
downloadImage(val);
}
});
}
function downloadImage(val){
if(pics_at_the_time < 3) { // only 3 at a time. else wait for a download to finish
pics_at_the_time++;
var ft = new FileTransfer();
ft.download(
external_url_pics+val.id+'.'+val.tld,
datadir + "/" + val.id+'.'+val.tld,
function(entry) {
if(debug_console) { console.log("download complete: " + entry.name); }
entry.setMetadata(function(metadata) { } , function(error) { console.log("Could not set meta data: "+val.id); }, { "com.apple.MobileBackup": 1}); // no apple cloudbackup for these pics. We can always re-download apparently
pics_at_the_time--;
fetch_images();
},
function(error) {
if(debug_console) { console.log("download error target " + error.target); }
pics_at_the_time--;
fetch_images();
});
}
}
As you can probably tell, the code is not very sophisticated, and it definitely does not check for already existing images. Although this works, it is far from perfect, seeing as it re-loops through the whole bunch every time, which at first seemed like a good idea; but now I'm having second thoughts.
Any help is obviously appreciated.
I'm making a chat script using jQuery and JSON, but my hosting provider suspended it due to a 'resources usage limit'. I want to know whether it is possible (and how) to reduce these requests. I read one question in which they mention an Ajax timeout, but I'm not very good at Ajax. The code is:
function getOnJSON() {
  var from;
  var to;
  var msg_id;
  var msg_txt;
  var new_chat_string;
  // Getting the data from the JSON file
  $.getJSON("/ajax/end.emu.php", function(data) {
    $.each(data.notif, function(i, data) {
      from = data.from;
      to = data.to;
      msg_id = data.id;
      msg_txt = data.text;
      if ($("#chat_" + from).length === 0) {
        $("#boxes").append('...some stuf...');
        $('#' + from + '_form').submit(function() {
          var contactForm = $(this);
          // $(this + 'selector') is not valid jQuery; scope the lookups to the form instead
          var valor = contactForm.find('input:text').val();
          var destinatary = contactForm.find('input[type=hidden]').val();
          var reponse_id = destinatary + "_input";
          if (!valor) {
            return false;
          } else {
            $.ajax({
              url: "/ajax/end.emu.php?ajax=true",
              type: contactForm.attr('method'),
              data: contactForm.serialize(),
              success: function(data) {
                var responsed = $.trim(data);
                if (responsed != "success") {
                  alert("An error occurred while posting your message");
                } else {
                  $('#' + reponse_id).val("");
                }
              }
            });
            return false;
          }
        });
        $('#' + from + '_txt').jScrollPane({
          stickToBottom: true,
          maintainPosition: true
        });
        $('body').append('<embed src="http://cdn.live-pin.com/assets/pling.mp3" autostart="true" hidden="true" loop="false">');
      } else {
        var pane2api = $('#' + from + '_txt').data('jsp');
        var originalContent = pane2api.getContentPane().html();
        pane2api.getContentPane().append('<li id="' + msg_id + '_txt_msg" class="chat_txt_msg">' + msg_txt + '</li>');
        pane2api.reinitialise();
        pane2api.scrollToBottom();
        $('embed').remove();
        $('body').append('<embed src="http://cdn.live-pin.com/assets/pling.mp3" autostart="true" hidden="true" loop="false">');
      }
    });
  });
}
The limit is 600 requests per 5 minutes, and I need to poll almost every second. I had already paid for a year and they offer no refunds; also, I can't modify the server, I just have access to cPanel.
Well, 600 requests per 5 minutes is pretty restrictive if you want to make one request per second for each user. Essentially, each user will make 60 requests per minute, or 300 per 5 minutes. In other words, even if you optimize your script to combine the two requests into one, at maximum you can have two users on your site ;) Not much, I guess...
You have two options:
Stick with making a chat system through Ajax requests and change the hosting provider. This might actually be cheaper if you don't have the skills to do option 2.
Forget about making an Ajax request to poll (and potentially another to push) every second. Implement something around WebSockets, long polling, or even XMPP.
If you go that route, I would look at socket.io for a transparent library that uses WebSockets where they are supported and falls back to long polling and other transports elsewhere. For the XMPP way, there is the excellent Strophe.js. Note that both routes are much more complex than your Ajax requests and will require a lot of server-logic changes.
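To give a sense of scale, a minimal socket.io chat looks something like the sketch below (the port and event names are made up for illustration):

// server.js
var io = require('socket.io')(3000);

io.on('connection', function(socket) {
  socket.on('chat_message', function(msg) {
    // push to every connected client instead of each client polling
    io.emit('chat_message', msg);
  });
});

// client
var socket = io('http://localhost:3000');
socket.on('chat_message', function(msg) {
  $('#messages').append($('<li>').text(msg));
});
$('#chat_form').submit(function() {
  socket.emit('chat_message', $('#chat_input').val());
  return false;
});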
I don't think that checking every second is really a good idea; in my opinion, for an online chat, checking every 2-3 seconds should be more than enough.
To make fewer requests, you can also check for user activity on the client side: if the window is inactive, you can lengthen the polling interval, going back to 2-3 seconds when the user becomes active again. That will allow you to save resources and requests per minute.
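A minimal sketch of that idea, assuming getOnJSON() from the question is the polling function:

var ACTIVE_INTERVAL = 2500; // ~2.5 s while the user is looking at the page
var IDLE_INTERVAL = 10000;  // slow down to 10 s when the window is inactive
var pollInterval = ACTIVE_INTERVAL;

$(window).on('focus', function() { pollInterval = ACTIVE_INTERVAL; });
$(window).on('blur', function() { pollInterval = IDLE_INTERVAL; });

(function poll() {
  getOnJSON(); // the polling function from the question
  setTimeout(poll, pollInterval); // the interval is re-read on every tick
})();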
I'm working on a project right now that requires keeping the UI in sync with server events. I've been using long polling, which does indeed reduce the number of Ajax calls, but then it puts the burden on the server to listen for the event that the client is interested in, which isn't fun either.
I'm about to switch over to socket.io, which I will set up as a separate push service:
existing server --> pushes to socket.io server --> pushes to subscribing client
ggozad's response is good; I also recommend WebSockets. They only work in newer browsers, so if you want to make the chat available in all browsers you will need a small Flash bridge (Flash can communicate with sockets very easily, and it can also call JavaScript functions and be called from JavaScript). Flash also offers P2P if you are interested: http://labs.adobe.com/technologies/cirrus/
Also, for the server side you can look into Node.js, if you are a JavaScript fan like me :)
To complete my response: there is no way to make an Ajax-based chat in which you are limited to 600 requests/5 min (2 requests/second), want to make a request per second, and want more than two users.
Solution: switch to sockets or P2P.
I recommend calling that paid service from the server side using a single thread (as an API proxy). You can still poll with 600 requests/5 min in this thread. Then every client makes Ajax requests to poll or long-poll your API proxy without limitation.
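A rough sketch of that proxy idea in Node.js (the URL, port, and one-second cadence are illustrative): a single loop polls the limited endpoint, caches the latest payload, and any number of clients read from the cache without touching the upstream limit.

var http = require('http');

var latest = '{}'; // cached copy of the most recent upstream response

// one upstream request per second = 300 requests/5 min, under the 600 cap
setInterval(function() {
  http.get('http://example-host.com/ajax/end.emu.php', function(res) {
    var body = '';
    res.on('data', function(chunk) { body += chunk; });
    res.on('end', function() { latest = body; });
  }).on('error', function(err) {
    console.log('upstream poll failed: ' + err.message);
  });
}, 1000);

// clients hit this server instead of the rate-limited host
http.createServer(function(req, res) {
  res.writeHead(200, { 'Content-Type': 'application/json' });
  res.end(latest);
}).listen(8080);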