I was testing the performance of my Node.js server and found that this small piece of code takes too much time:
socket.emit("a",{a1:something,a2:veryBigArray});
The problem is that my array is very big and socket.io must encode it to JSON.
My array is something like:
veryBigArray=[{x:0,y:0},{x:0,y:1},{x:1,y:1},{x:1,y:2}.......];
I am interested mainly in server performance; I want the server to be free to continue its other work as soon as possible.
I generate the array myself before sending it to the client, so I have no problem completely changing the structure of the data being sent. Maybe the array could be compressed somehow; I have read about ArrayBuffer.
What is the best (fastest for the server) way of sending a big array of coordinates to the client (browser) using Socket.io?
If you must encode the array to JSON then I don't think an ArrayBuffer will help. However, if you know the precise data structure of the array to be sent and can predict it server-side, you can come up with your own efficient encoding/decoding schema.
In your example, your data is simply an array of x and y value pairs where each value is an integer. An extremely simple but possibly fruitful approach would be to strip the data of the predictable (unnecessary) x and y keys and encode it as a simple CSV (comma-separated value) string.
For example:
[{x:0,y:0},{x:0,y:1},{x:1,y:1},{x:1,y:2}]
would encode as the string:
0,0,0,1,1,1,1,2
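As a rough sketch of that encoding and the matching client-side decoding (encodePoints and decodePoints are hypothetical helper names, just for illustration):

function encodePoints(points) {
  // Flatten [{x, y}, ...] into "x,y,x,y,..." with no keys or brackets.
  const flat = [];
  for (const p of points) {
    flat.push(p.x, p.y);
  }
  return flat.join(",");
}

function decodePoints(str) {
  // Rebuild the array of {x, y} objects on the client.
  const nums = str.split(",").map(Number);
  const points = [];
  for (let i = 0; i < nums.length; i += 2) {
    points.push({ x: nums[i], y: nums[i + 1] });
  }
  return points;
}

encodePoints([{x:0,y:0},{x:0,y:1},{x:1,y:1},{x:1,y:2}]); // "0,0,0,1,1,1,1,2"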
The most efficient approach, however, would probably be to abandon socket.io for your own custom WebSocket interface that can send binary data directly instead of encoding it as JSON. JSON is inherently going to be radically inefficient for sending a large dataset compared to sending encoded binary.
Edit: It looks like socket.io can send binary data, so I would explore that along with some kind of efficient encoding/decoding scheme tailored to your dataset.
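For instance, a minimal sketch of the binary route, assuming a socket.io version that accepts ArrayBuffers as the edit above suggests (packPoints is a hypothetical helper; the payload fields mirror the question):

function packPoints(points) {
  // Two 32-bit integers per point: x, then y.
  const packed = new Int32Array(points.length * 2);
  points.forEach((p, i) => {
    packed[2 * i] = p.x;
    packed[2 * i + 1] = p.y;
  });
  return packed.buffer; // the underlying ArrayBuffer
}

socket.emit("a", { a1: something, a2: packPoints(veryBigArray) });

// Client side: view the bytes directly instead of re-parsing JSON.
socket.on("a", (msg) => {
  const coords = new Int32Array(msg.a2);
  // coords[0] and coords[1] are the first point's x and y.
});

This avoids both the JSON encoding cost on the server and the parse cost on the client.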
Related
I'm building a web app, and need to be able to encode user-generated data as concisely as possible before transmitting it to my server. In the past I've used Flash, and it had a very neat system where for any class that you want to serialize, you could write a pair of functions that would describe exactly how to serialize the data. For example:
out.writeShort(session);
out.writeUnsignedInt(itemID);
out.writeObject(arbitraryData);
out.writeShort(score);
You would have to write an equivalent function to read bytes from the serialized data and build the class from it.
Once data is serialized it could be encoded into a Base64 string for safe network transmission to the server.
I can't figure out how to do this in JavaScript. JSON is nice and easy, but it's incredibly wasteful, sending all the object key/value pairs, and unless I'm mistaken everything is encoded as a string, so the value false is encoded as the string "false"?
Any advice on how to implement this in JavaScript would be greatly appreciated! Use of libraries is fine so long as they work both in Node and in the browser.
Look at this answer. You can use the BSON format (Binary JSON), which doesn't have the drawbacks of JSON that you mentioned.
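If you'd rather hand-roll the layout Flash-style, here is a minimal sketch using the standard DataView API, which works in both Node and browsers; the fixed field layout is an assumption for illustration, and an arbitrary-object field like writeObject would still need JSON or BSON:

function serialize(session, itemID, score) {
  const buf = new ArrayBuffer(8); // 2 + 4 + 2 bytes
  const view = new DataView(buf);
  view.setInt16(0, session);  // like out.writeShort(session)
  view.setUint32(2, itemID);  // like out.writeUnsignedInt(itemID)
  view.setInt16(6, score);    // like out.writeShort(score)
  return buf;
}

function deserialize(buf) {
  const view = new DataView(buf);
  return {
    session: view.getInt16(0),
    itemID: view.getUint32(2),
    score: view.getInt16(6),
  };
}

The resulting ArrayBuffer can then be Base64-encoded for safe transmission, as the question describes (via btoa in the browser or Buffer in Node).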
I have been working with JSONP in my GWT application. When my server sends a JSON string, I can get it in the form of a JavaScriptObject on the client side.
My problem is that my JSON has complicated structures: maps, nesting, lots of different keys. That makes it a big pain to extract the data (I might have to write a few hundred functions to extract the data for every key one by one, plus some complicated code to fill the maps).
I am considering a few solutions:
Encode and send the whole JSON string as a normal string to the client (as the value of a simple JSON string). My only worry is that the encoded strings may be a few times longer than the originals and could easily exceed the 2k length limit.
Convert the JavaScriptObject back into a pure string (similar to the one I sent from the server).
Once I have a pure string, I will parse it using some JSON parser / methods into structures I find convenient.
My questions:
1) How do I convert a JavaScriptObject back into the pure / original JSON string?
2) Any idea about solutions?
Many thanks
1) Convert JavaScriptObject to JSON: JsonUtils.stringify(yourJSO)
Convert JSON to JavaScriptObject: JsonUtils.safeEval(jsonString);
2) Did you think about using AutoBeans? Check out the GWT page.
I am sending the following data to the server.
var loc = {
type : 'exloc',
ms : this.var1.move,
ppx : this.var1.ppx,
ppy : this.var1.ppy,
Name : "red",
rid : this.gv.GlobalInfoHolder.CurrentRoomID
};
and I can also send this data in the following form:
var obj = "type=exloc,ms="+this.var1.move+",ppx="+this.var1.ppx+",ppy="+this.var1.ppy+",Name=red,rid="+this.gv.GlobalInfoHolder.CurrentRoomID;
Then I can split the data apart on the commas I've inserted.
Another way would be to encode the object to json then send the data to server and decode.
So I wanted to ask: what would be the best solution for this? It would also help if someone could tell me how to analyze the size in bytes of each of these variables, including the JSON solution.
I want to go with the solution that gives me the smallest size. I am doing this for performance and efficiency, so if anyone wants to suggest something, he/she is most welcome. I am considering this because I am working on a real-time game which has to share data between client and server very quickly (in milliseconds).
Just get a network analyzer program (like the free Fiddler or many, many other programs available for the same thing) and look at exactly what is being sent across the wire. I do this all the time to debug client/server problems. It's really easy and shows you exactly what is being sent between client and server.
JSON is not meant to be the most storage-efficient format, so if you're trying to optimize for size, you already know it's probably not the best option: formats with less redundancy (a single copy of each field name vs. many) are nearly always more compact.
In addition, you should first figure out whether any of this matters. If your data is small overall, then the performance difference between 40 bytes and 80 bytes is probably too small to even measure on a single client. On the other hand, if you had thousands of records to transmit, then the size of each record could be enough to make a difference.
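As a quick sanity check without a network analyzer, you can also measure each candidate encoding directly in Node; Buffer.byteLength reports the size in bytes of a string as it would go over the wire (UTF-8 by default). This sketch assumes the loc object from the question is in scope:

const asJson = JSON.stringify(loc);
const asCsv = "type=exloc,ms=" + loc.ms + ",ppx=" + loc.ppx +
  ",ppy=" + loc.ppy + ",Name=red,rid=" + loc.rid;

console.log("JSON bytes:", Buffer.byteLength(asJson));
console.log("CSV bytes:", Buffer.byteLength(asCsv));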
I have a grid in a browser.
I want to send rows of data to the grid via JSON, but the browser should continuously parse the JSON as it receives it and add rows to the grid as they are parsed. Put another way, the rows shouldn't be added to the grid all at once after the entire JSON object is received -- they should be added as they are received.
Is this possible? Particularly using jQuery, Jackson, and Spring 3 MVC?
Does this idea have a name? I only see bits of this idea sparsely documented online.
You can use Oboe.js which was built exactly for this use case.
Oboe.js is an open source Javascript library for loading JSON using streaming, combining the convenience of DOM with the speed and fluidity of SAX.
It can parse any JSON as a stream, is small enough to be a micro-library, doesn’t have dependencies, and doesn’t care which other libraries you need it to speak to.
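A sketch of how this use case might look with Oboe; the URL and the "rows.*" pattern are assumptions about the response shape (e.g. { rows: [...] }), and addRowToGrid is a hypothetical grid helper:

oboe("/api/grid-rows")
  .node("rows.*", function (row) {
    // Fires once per row as soon as that row has been parsed,
    // before the rest of the document has arrived.
    addRowToGrid(row);
  })
  .done(function () {
    // The complete document has now been received.
  });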
You can't parse incomplete or invalid JSON using the browser's JSON.parse. If you are streaming text, it will invariably try to parse invalid JSON at some point, which will cause it to fail. There exist streaming JSON parsers out there; you might be able to find something to suit your needs.
The easiest way in your case would still be to send a complete JSON document for each row.
Lazy.js is able to parse "streaming" JSON (demo).
Check out SignalR.
http://www.hanselman.com/blog/AsynchronousScalableWebApplicationsWithRealtimePersistentLongrunningConnectionsWithSignalR.aspx
March 2017 update:
Websockets allow you to maintain an open connection to the server that you can use to stream data to the table. You could encode individual rows as JSON objects and send them, and each time one is received you can append it to the table. This is perhaps the optimal way to do it, but it requires using websockets, which might not be fully supported by your technology stack. If websockets are not an option, then you can try requesting the data in the smallest chunks the server will allow, but this is costly, since every request has an overhead and you would end up making multiple requests to get all the data.
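A minimal sketch of that websocket approach using the browser's native WebSocket API (the URL and addRowToGrid are placeholders):

const ws = new WebSocket("ws://example.com/rows");

ws.onmessage = (event) => {
  // Each message is assumed to carry one row as a standalone JSON object,
  // so it can be parsed and appended as soon as it arrives.
  const row = JSON.parse(event.data);
  addRowToGrid(row);
};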
AFAIK there is no way to start parsing an HTTP request before it's finished, and there is no way to parse a JSON string partially. Also, getting the data is orders of magnitude slower than processing it, so it's not really worth doing.
If you need to parse large amounts of data your best bet is to do the streaming approach.
I have a simple 2D array in JavaScript. I want to pass this array to an ASP.NET page.
I wanted to know what would be the best option, going with JSON or XML.
The metric is speed and size of the data. In some cases, the array may be quite large.
Thank you.
The metric is speed and size of the data.
JSON is faster than XML in terms of speed. It's smaller than XML in terms of size.
XML is bloated to allow you to represent and validate structures.
However, there are various BSON formats around where people take JSON and aggressively hand-optimise the storage format. (BSON is binary JSON.)
A BSON spec I picked up from Google
Bison, a JavaScript parser for some arbitrary BSON format.
Now if you really have bottlenecks with transferring data (which you probably don't), you may want to use WebSockets to send data over TCP rather than HTTP, thus reducing the amount of traffic and data you send.
Of course, you only care about that if you're making, say, X000 requests per second.
JSON should be your best bet. Sending the XML datatype can be a big pain, as sometimes you have to add new configuration just to allow XML to be sent as form data to the server. Generally it is not recommended practice, due to security concerns.
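For what it's worth, the JSON route can be as simple as the following sketch; the endpoint URL is a placeholder for whatever page or handler receives the data:

const grid = [[0, 0], [0, 1], [1, 1], [1, 2]];

fetch("/MyPage.aspx/ReceiveGrid", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(grid), // e.g. "[[0,0],[0,1],[1,1],[1,2]]"
});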