I'm creating an 8-bit unsigned JavaScript typed array:
var myArray = new Uint8Array(64);
I manipulate this array on both client and server, then send it over a socket.io connection. We are writing a game, so the data sent over the wire should be as small as possible. Since socket.io does not support sending binary data, is it worth bothering with JavaScript typed arrays, or should we just use normal JavaScript arrays? Will they still be smaller than a native JS array once serialized?
NOTE: I will assume that by "client" you mean the browser. If not, please clarify with more details.
Socket.io does not support binary data, mainly because it offers different transports, and many of them do not support it.
However, native websockets DO support Blobs and ArrayBuffers.
If you really want to go with binary data for efficiency (which, I agree, is the way to go in your case), I think you should consider using websockets instead of socket.io.
The bad:
Only ~55% of users browse the web with a browser that supports websockets.
You wouldn't have the conveniences socket.io offers, such as channels and its emit and on methods.
The good:
The WebSocket API is extremely simple.
It will be much more efficient on the wire. Normally, arrays are transferred by serializing them to a JSON string on one end and parsing that string on the other. This means you're actually sending a string representation of your array! With binary frames you instead send exactly the bytes you expect (in a more predictable manner, without checking string lengths before sending, and in a more "protocol"-like way if desired).
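For a concrete sense of the difference, here is a minimal sketch (the endpoint URL is made up): the 64-byte array from the question travels over a raw WebSocket as exactly 64 bytes, while the JSON route sends a string roughly twice that length.

// Raw WebSocket: send the 64 bytes as-is
var ws = new WebSocket('ws://example.com/game'); // hypothetical endpoint
ws.binaryType = 'arraybuffer';

var state = new Uint8Array(64);
ws.onopen = function () {
    ws.send(state.buffer); // 64 bytes on the wire
};

// JSON route for comparison: a string like "[0,0,0,...]"
var asJson = JSON.stringify(Array.from(state));
console.log(asJson.length); // 129 characters for 64 zero bytes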
If you decide to use WS, you could check this: http://www.adobe.com/devnet/html5/articles/real-time-data-exchange-in-html5-with-websockets.html
Else you can just go with JSON.
Truth be told, if you do stick with JSON for socket.io and "universal" support, enable the flash transport too, and disable the slower transports if the game requires low latency.
Related
I am sending the following data to the server:
var loc = {
    type : 'exloc',
    ms : this.var1.move,
    ppx : this.var1.ppx,
    ppy : this.var1.ppy,
    Name : "red",
    rid : this.gv.GlobalInfoHolder.CurrentRoomID
};
I can also send this data in the following form:
var obj = "type=exloc,ms="+this.var1.move+",ppx="+this.var1.ppx+",ppy="+this.var1.ppy+",Name=red,rid="+this.gv.GlobalInfoHolder.CurrentRoomID;
Then I can split the data apart on the commas I've inserted.
Another way would be to encode the object as JSON, send it to the server, and decode it there.
So I wanted to ask: what would be the best solution for this? It would also help if someone could tell me how to analyze the size in bytes of each of these variants, including the JSON solution.
I want to go with the solution that gives me the smallest payload. I am doing this for performance and efficiency, so if anyone wants to suggest something, he/she is most welcome. I am considering this because I am working on a real-time game which has to share data between client and server very quickly (in milliseconds).
Just get a network analyzer program (like the free Fiddler, or many, many other programs available for the same thing) and look at exactly what is being sent across the wire. I do this all the time to debug client/server problems. It's really easy and shows you exactly what is being sent between client and server.
JSON is not meant to be the most storage-efficient format, so if you're trying to optimize for size, you already know it's probably not the best option: formats with less redundancy (a single copy of each field name vs. one per record) are nearly always going to be more compact.
In addition, you should first figure out whether any of this matters. If your data is small overall, then the performance difference between 40 bytes and 80 bytes is probably too small to even measure on a single client. On the other hand, if you had thousands of records to transmit, then the size of each record could be enough to make a difference.
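If you want a quick local comparison before reaching for a network analyzer, a sketch along these lines works (assuming Node.js; the sample values standing in for this.var1 and friends are made up):

// Hypothetical sample values standing in for the this.var1 / this.gv fields
var move = 3, ppx = 120, ppy = 245, roomId = 17;

var asJson = JSON.stringify({
    type: 'exloc', ms: move, ppx: ppx, ppy: ppy, Name: 'red', rid: roomId
});
var asDelimited = 'type=exloc,ms=' + move + ',ppx=' + ppx +
                  ',ppy=' + ppy + ',Name=red,rid=' + roomId;

// Buffer.byteLength gives the UTF-8 size in bytes (Node.js);
// in the browser, new Blob([str]).size does the same job.
console.log('JSON:', Buffer.byteLength(asJson, 'utf8'), 'bytes');
console.log('Delimited:', Buffer.byteLength(asDelimited, 'utf8'), 'bytes');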
I am trying to code a multiplayer game demo in JavaScript using no libraries. Everything is going pretty well, but to get better performance moving forward I am going to have to minimize the data I send over my websockets, or I won't be able to do much. I have been thinking about the best way to do this. I am using Node.JS + Express + Socket.IO.
At first I was sending the keyboard state of all the keys from each client to the server, and I quickly narrowed this down to true/false values for only the keys I was using. But now I am thinking that I should really be assigning small integer values (0, 1, 3, 4) to each allowed input state (each possible combination of inputs) and simply sending that value to the server.
I have more experience in statically typed languages such as C++, Java, etc., so I know how I would do this in those languages. But basically what I want to know is: given a small number of possible input states, what is the best way to send this data using JavaScript on both ends? It will be going into a JSON object. Is there any way for me to send a single byte?
https://gist.github.com/1437195 No idea if this works, but it could be the most optimized solution. I've never played with byte arrays, and I have no clue whether they play well with socket.io.
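For what it's worth, a minimal sketch of the bitmask idea from the question, assuming four tracked keys: give each key one bit, and every possible combination of inputs fits in a single small integer.

// One bit per key: all 16 combinations of 4 keys fit in one number (0-15)
var KEY_UP    = 1 << 0,  // 1
    KEY_DOWN  = 1 << 1,  // 2
    KEY_LEFT  = 1 << 2,  // 4
    KEY_RIGHT = 1 << 3;  // 8

// keys is a plain object of booleans, e.g. { up: true, right: true }
function packInput(keys) {
    var state = 0;
    if (keys.up)    state |= KEY_UP;
    if (keys.down)  state |= KEY_DOWN;
    if (keys.left)  state |= KEY_LEFT;
    if (keys.right) state |= KEY_RIGHT;
    return state; // e.g. up + right -> 9
}

// On the receiving end, test a key with bitwise AND:
function isPressed(state, keyFlag) {
    return (state & keyFlag) !== 0;
}

Sent through socket.io as a plain number, that state costs one or two characters in the JSON payload instead of one boolean per key.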
I have a grid in a browser.
I want to send rows of data to the grid via JSON, but the browser should continuously parse the JSON as it receives it and add rows to the grid as they are parsed. Put another way, the rows shouldn't be added to the grid all at once after the entire JSON object is received -- they should be added as they are received.
Is this possible? Particularly using jQuery, Jackson, and Spring 3 MVC?
Does this idea have a name? I only see bits of this idea sparsely documented online.
You can use Oboe.js which was built exactly for this use case.
Oboe.js is an open source Javascript library for loading JSON using streaming, combining the convenience of DOM with the speed and fluidity of SAX.
It can parse any JSON as a stream, is small enough to be a micro-library, doesn’t have dependencies, and doesn’t care which other libraries you need it to speak to.
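A minimal sketch of what that looks like (the URL, the 'rows' shape of the response, and addRowToGrid are all assumptions):

// Fires once per completed row object, while the response is still downloading
oboe('/api/grid-rows')
    .node('rows.*', function (row) {
        addRowToGrid(row); // hypothetical function that appends to the grid
    })
    .done(function () {
        console.log('all rows received');
    });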
You can't parse incomplete or invalid JSON using the browser's JSON.parse. If you are streaming text, it will invariably end up trying to parse invalid JSON at some point, which will cause it to fail. Streaming JSON parsers do exist, though; you might be able to find one that suits your needs.
The easiest approach in your case would still be to send a complete JSON document for each row.
Lazy.js is able to parse "streaming" JSON (demo).
Check out SignalR.
http://www.hanselman.com/blog/AsynchronousScalableWebApplicationsWithRealtimePersistentLongrunningConnectionsWithSignalR.aspx
March 2017 update:
Websockets allow you to maintain an open connection to the server that you can use to stream data to the table. You could encode individual rows as JSON objects and send them, and each time one is received you can append it to the table. This is perhaps the optimal way to do it, but it requires websockets, which might not be fully supported by your technology stack. If websockets are not an option, you can try requesting the data in the smallest chunks the server will allow, but this is costly, since every request has overhead and you would end up making multiple requests to get all the data.
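A minimal client-side sketch of that approach (the endpoint and table id are assumptions): each websocket message carries one row as JSON, which is parsed and appended as soon as it arrives.

var ws = new WebSocket('ws://example.com/rows'); // hypothetical endpoint
var table = document.getElementById('grid');     // hypothetical table element

ws.onmessage = function (event) {
    var row = JSON.parse(event.data); // one complete row per message
    var tr = document.createElement('tr');
    Object.keys(row).forEach(function (key) {
        var td = document.createElement('td');
        td.textContent = row[key];
        tr.appendChild(td);
    });
    table.appendChild(tr); // the row shows up without waiting for the rest
};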
AFAIK there is no way to start parsing an HTTP response before it has finished, and no way to parse a JSON string partially. Also, getting the data is orders of magnitude slower than processing it, so it's not really worth doing.
If you need to parse large amounts of data your best bet is to do the streaming approach.
I have a simple 2D array in JavaScript. I want to pass this array to an ASP.NET page.
I wanted to know what would be the better option: JSON or XML.
The metric is speed and the size of the data. In some cases, the array may be large.
Thank You.
The metric is speed and size of the data.
JSON is faster than XML in terms of parsing speed, and smaller than XML in terms of size.
XML is bloated to allow you to represent and validate structures.
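A rough illustration with a small 2D array (the XML shape shown is just one plausible encoding):

var grid = [[1, 2, 3], [4, 5, 6]];

// JSON: compact, and parsed natively with JSON.parse on the other end
var asJson = JSON.stringify(grid);
// -> "[[1,2,3],[4,5,6]]" (17 characters)

// One plausible hand-built XML encoding of the same data:
var asXml = '<rows><row><c>1</c><c>2</c><c>3</c></row>' +
            '<row><c>4</c><c>5</c><c>6</c></row></rows>';
// -> 83 characters for the same six numbers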
However, there are various BSON formats around, where people take JSON and then hand-optimise the storage format, sometimes excessively. (BSON is binary JSON.)
Some BSON spec I picked from Google.
Bison, a JavaScript parser for some arbitrary BSON format.
Now, if you really have bottlenecks with transferring data (which you probably don't), you may want to use WebSockets to send data over TCP rather than HTTP, thus reducing the amount of traffic and data you send.
Of course, you only care about that if you're making, say, X000 requests per second.
JSON should be your best bet. Sending an XML payload can be a big pain, as sometimes you have to add new configuration just to allow XML to be sent as form data to the server. Generally it is not a recommended practice, due to security concerns.
I have a simple piece of data that I'm storing on a server, as a plain string. It is kind of ridiculous, but it looks like this:
name|date|grade|description|name|date|grade|description|repeat for a long time
This string can be up to 1.4mb in size. The idea is that it's a bunch of student records, just strung together with a simple pipe delimiter. It's a very poor serialization method.
Once this massive string is pushed to the client, it is split along the pipes into student records again, using JavaScript.
I've been timing how long it takes to create and split these strings on the client side. The times are actually quite good: the slowest run I've seen on a few different machines is 0.2 seconds for 10,000 'student records', which has a final string size of ~1.4mb.
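A minimal benchmark sketch of that build/split round trip (the field values are made up):

// Build 10,000 fake student records and join them with pipes
var fields = [];
for (var i = 0; i < 10000; i++) {
    fields.push('Student' + i, '2009-01-01', 'A', 'some description text');
}
var big = fields.join('|');

console.time('split');
var parts = big.split('|'); // back to individual fields
console.timeEnd('split');   // typically a few milliseconds on modern engines

console.log(parts.length);  // 40000 fields, 4 per record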
I realize this is quite bizarre; I'm just wondering if there are any inherent problems with creating and splitting such large strings using JavaScript. I don't know how different browsers implement their JavaScript engines. I've tried this on the 'major' browsers, but I don't know how this would perform on earlier versions of each.
Mostly just looking for any comments on this; it's more for fun than anything else!
Thanks
String splitting for 1.4mb of data is not a problem on decent machines; instead, you should worry about the internet connection speed of your users. I've tried to do spell checking with an 800kb dictionary (half your data size), and the main issue was loading time.
It also looks like your student records could be put in a database, and you might not need to load everything up front. So how about paginating the user records, or using AJAX requests to search for certain user names?
If it's a really large string, it may pay to repeatedly slice it with string.slice(from, to) so you only process a smaller subset at a time; appending the individual items to the end of the output with list.push() or something similar might work.
String split methods are probably the most efficient way of doing this though, even in IE. Processing individual characters using string.charAt(x) is extremely slow and will often show a security error as it stalls the browser. Using string split methods would certainly be much faster than splitting using regular expressions.
It may also be possible to encode the data using a JSON array; some newer browsers such as IE8/WebKit/FF3.5 have fast JSON parsing built in via JSON.parse(data). But using eval(json) may overflow the browser if there's enough data, so it's probably a bad idea. It may pay to compare the two for performance, though.
A much better approach in a lot of cases is to use AJAX and only load some of the data at once from the server, which would also save download time.
Besides S. Mark's excellent comments about local vs. transfer speed and the tip to re-encode using AJAX, I suggest a (long-term) move away from JavaScript in the browser (assuming that's where it runs) to either a non-browser implementation of JS (or possibly another language).
Browser-based JS seems a weak link in a data-transfer chain and nothing I would want to run unmonitored, since browsers are upgraded from time to time and breaking your JS transfer might be an unanticipated side effect!