Encoding for JSON cookie data? - javascript

Do you really need to encode the output of JSON.stringify when saving it in a cookie?
I tested it out, and IE8+ and Chrome work just fine without encoding (encodeURIComponent) the actual data. The problem with encoding is that you are limited to 4096 bytes per cookie, and many of the characters in the JSON get percent-encoded, causing a large increase in byte size.
I would be saving variables like this; it doesn't seem like encoding is really required.
(only 345 bytes)
{
  "s": {
    "p": "1",
    "c": "nameofsomething",
    "i": "56456,54115,878451,651451,65156,878941,5165165,54545,22115,874845",
    "t": "1407515818100",
    "gcid": "CPOa-ZTbpL4CFWdo7AodTnQA3A",
    "k": "54154154"
  },
  "o": {}
}
When I extract the cookie, everything is preserved. I know it is best practice to encode, but saving the bytes and keeping the cookie readable would be better.
The encoded version would look like this (545 bytes, an increase of about 200 bytes because of the encoding):
%7B%22s%22%3A%7B%22p%22%3A%221%22%2C%22c%22%3A%22nameofsomething%22%2C%22i%22%3A%2256456%2C54115%2C878451%2C651451%2C65156%2C878941%2C5165165%2C54545%2C22115%2C874845%22%2C%22t%22%3A%221407515818100%22%2C%22gcid%22%3A%22CPOa-ZTbpL4CFWdo7AodTnQA3A%22%2C%22k%22%3A%2254154154%22%7D%2C%22o%22%3A%7B%7D%7D

You are asking about two different things here:
JSON ENCODING
If you are trying to store simple data, then no, JSON encoding is not required. You can manually store key and value pairs in a cookie with no difficulty. Cookies are very good at this.
If, on the other hand, you have a complex object like your example, then JSON encoding is the best way. If you don't use JSON encoding, you'll need some other way to handle key/value encoding, and by the time you end up handling nested objects you'll have just created a poor version of JSON. So use JSON and save yourself the headaches.
encodeURIComponent
Not needed. A cookie is not a URI, nor is it a URI component.
Other
JSON encoding will handle any Unicode escaping that needs to be done.
Right off the bat, I'm having a hard time imagining any encoding that needs to be done that wouldn't already be handled by JSON.
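For illustration, here is a minimal sketch of saving and reading a JSON object in a cookie without encodeURIComponent; the cookie name "state" and the helper names are assumptions, not part of the question:

function saveState(obj) {
  // A literal ";" in a value would break the cookie; the JSON here
  // contains commas and quotes, which browsers tolerate in practice
  // (as the asker's own tests in IE8+ and Chrome suggest).
  document.cookie = "state=" + JSON.stringify(obj) + "; path=/";
}

function loadState() {
  var match = document.cookie.match(/(?:^|;\s*)state=([^;]*)/);
  return match ? JSON.parse(match[1]) : null;
}

saveState({s: {p: "1", k: "54154154"}, o: {}});
loadState(); // -> {s: {p: "1", k: "54154154"}, o: {}}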

Related

Creating Binary Data Structures in Javascript

I'm building a web app and need to encode user-generated data as concisely as possible before transmitting it to my server. In the past I've used Flash, which had a very neat system where, for any class you wanted to serialize, you could write a pair of functions describing exactly how to serialize the data. For example:
out.writeShort(session);
out.writeUnsignedInt(itemID);
out.writeObject(arbitraryData);
out.writeShort(score);
You would have to write an equivalent function to read bytes from the serialized data and build the class from it.
Once data is serialized it could be encoded into a Base64 string for safe network transmission to the server.
I can't figure out how to do this in JavaScript. JSON is nice and easy, but it's incredibly wasteful, sending all the object key/value pairs, and unless I'm mistaken everything is encoded as a string? So the value false is encoded as the string "false"?
Any advice on how to implement this in JavaScript would be greatly appreciated! Use of libraries is fine so long as they work both in Node and in the browser.
Look at this answer. You can use the BSON format (Binary JSON), which doesn't have the drawbacks of JSON you mentioned.
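If you'd rather hand-roll it, here is a minimal sketch of the Flash-style approach using DataView; the fixed field layout (2 + 4 + 2 bytes) and the function names are assumptions for illustration:

function serialize(session, itemID, score) {
  var buf = new ArrayBuffer(8);      // 2 + 4 + 2 bytes
  var view = new DataView(buf);
  view.setUint16(0, session);        // like out.writeShort(session)
  view.setUint32(2, itemID);         // like out.writeUnsignedInt(itemID)
  view.setUint16(6, score);          // like out.writeShort(score)
  return buf;
}

function deserialize(buf) {
  var view = new DataView(buf);
  return {
    session: view.getUint16(0),
    itemID: view.getUint32(2),
    score: view.getUint16(6)
  };
}

// Base64-encode for transport (browser; in Node you would use Buffer):
var bytes = new Uint8Array(serialize(12, 3456789, 42));
var b64 = btoa(String.fromCharCode.apply(null, bytes));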

Effective send big array of coordinates using Socket.io

I was testing the performance of my node.js server and I noticed that this small piece of code takes too much time:
socket.emit("a",{a1:something,a2:veryBigArray});
The problem is that my array is very big and socket.io has to encode it as JSON.
My array is something like:
veryBigArray=[{x:0,y:0},{x:0,y:1},{x:1,y:1},{x:1,y:2}.......];
I am interested mainly in server performance; I want the server to be able to get back to its work as soon as possible.
Since I generate the array before sending it to the client, I have no problem completely changing the structure of the data being sent. Maybe the array could be compressed somehow. I have read about ArrayBuffer.
What is the best (fastest for the server) way of sending a big array of coordinates to the client (browser) using Socket.io?
If you must encode the array to JSON then I don't think an ArrayBuffer will help. However, if you know the precise data structure of the array to be sent and can predict it server-side, you can come up with your own efficient encoding/decoding schema.
In your example, your data is simply an array of x and y value pairs where each value is an integer. An extremely simple but possibly fruitful approach would be to strip the data of the predictable (unnecessary) x and y keys and encode it as a simple CSV (comma-separated value) string.
For example:
[{x:0,y:0},{x:0,y:1},{x:1,y:1},{x:1,y:2}]
would encode as the string:
0,0,0,1,1,1,1,2
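A minimal sketch of that encoding and its inverse (the helper names are assumptions):

function encodeCoords(points) {
  // [{x:0,y:0},{x:0,y:1}] -> "0,0,0,1"
  return points.map(function (p) { return p.x + "," + p.y; }).join(",");
}

function decodeCoords(csv) {
  var nums = csv.split(",").map(Number);
  var points = [];
  for (var i = 0; i < nums.length; i += 2) {
    points.push({x: nums[i], y: nums[i + 1]});
  }
  return points;
}

// socket.emit("a", {a1: something, a2: encodeCoords(veryBigArray)});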
The most efficient approach, however, would probably be to abandon socket.io for your own custom WebSocket interface that can send binary data directly instead of encoding it as JSON. JSON will inherently be radically inefficient for sending a large dataset compared to sending encoded binary.
Edit: It looks like socket.io can send binary data, so I would explore that along with some kind of efficient encoding/decoding scheme tailored to your dataset.
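Since socket.io accepts ArrayBuffer payloads, a sketch of that binary route might look like this (assuming integer coordinates):

function packCoords(points) {
  var arr = new Int32Array(points.length * 2);
  for (var i = 0; i < points.length; i++) {
    arr[2 * i] = points[i].x;
    arr[2 * i + 1] = points[i].y;
  }
  return arr.buffer;   // an ArrayBuffer is sent as binary, not JSON
}

// socket.emit("a", packCoords(veryBigArray));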

Javascript string compression for URL hash parameter

I'm looking to store a lot of data in a URL hash parameter without exceeding URL character limits.
Are there any conventional ways of compressing string length which could then be decoded on another page load?
I've seen LZW encoding used for similar solutions; however, would special characters be valid for this use?
LZW encoding technically works; you'll just need to convert the LZW-encoded binary into URL-safe base64, so that the output doesn't contain special characters. Here's an MDN article on base64 in JavaScript; the URL-safe variant of base64 just replaces + with - and / with _. Of course, you're not likely to reduce the size of your string by much by doing this, unless the data you want to store is extremely compressible.
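As a minimal sketch, converting between standard and URL-safe base64; the padding handling is a common convention, and lzwCompressedString is a placeholder, not something from the answers here:

function toUrlSafe(b64) {
  return b64.replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");
}

function fromUrlSafe(s) {
  var b64 = s.replace(/-/g, "+").replace(/_/g, "/");
  while (b64.length % 4) b64 += "=";   // restore stripped padding
  return b64;
}

// location.hash = "#" + toUrlSafe(btoa(lzwCompressedString));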
You can look at smaz or shoco, which are designed for the compression of short strings. Most compression methods don't really get rolling until well after your URL length limit, so you need a specialized compressor for this case if you expect to get any gain. You can then encode the binary result using a scheme like Base 64 or a more efficient coding that uses all of the URI-safe characters.

Decoding cp1251 to UTF-8 in javascript

How do you decode cp-1251 to UTF-8 in JavaScript?
The cp-1251 data comes from a datafeed, and it needs to be decoded on the JS client side.
There is no way to change the server-side output, since it comes from a third party, and for various reasons I would not use any server-side programming to convert the datafeed into another datafeed.
(Assuming that by "UTF-8" you meant the JS strings in their native encoding...)
Depending on the format your 'cp-1251' data is in and depending on the browsers you need to support, you can choose from:
TextDecoder.decode() API (decodes a sequence of octets from a typed array, like Uint8Array; see the sketch after this list) - if you're using web sockets, you can get an ArrayBuffer out of it to decode.
https://github.com/mathiasbynens/windows-1251 operates on something it calls 'byte strings' (JS strings consisting of characters like \u00XY, where 0xXY is the encoded byte).
build the decoding table yourself (example)
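A minimal sketch of the TextDecoder route, assuming the feed can be fetched as an ArrayBuffer (the URL is hypothetical):

var decoder = new TextDecoder("windows-1251");

fetch("/datafeed")                       // hypothetical endpoint
  .then(function (res) { return res.arrayBuffer(); })
  .then(function (buf) {
    var text = decoder.decode(new Uint8Array(buf));
    console.log(text);                   // a native JS string
  });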
Note that in most cases (not something as low-level as websockets though) it might be easier to read the data in the correct encoding before it ends up as a JS string (for example, you can force XMLHttpRequest to use a certain encoding even if the server misreports the encoding).

Passing array from javascript to ASP.NET

I have a simple 2D array in JavaScript. I want to pass this array to an ASP.NET page.
I wanted to know what would be the better option: JSON or XML.
The metric is speed and size of the data. In some cases, the array may be large.
Thank you.
The metric is speed and size of the data.
JSON is faster than XML in terms of speed. It's smaller than XML in terms of size.
XML is bloated to allow you to represent and validate structures.
However, there are various BSON formats around, where people take JSON and then heavily hand-optimise the storage format. (BSON is binary JSON.)
Some BSON spec I picked from Google
Bison, a JavaScript parser for some arbitrary BSON format.
Now if you really have bottlenecks with transferring data (which you probably don't), you may want to use WebSockets to send data over TCP rather than HTTP, thus reducing the amount of traffic and data you send.
Of course, you only care about that if you're making, say, X000 requests per second.
JSON should be your best bet. Sending the XML datatype can be a big pain, as you sometimes have to add new configuration just to allow XML to be sent as form data to the server. Generally it is not a recommended practice, due to security concerns.
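For completeness, a minimal sketch of posting a 2D array as JSON; the handler URL is a placeholder, not a real endpoint:

var grid = [[1, 2, 3], [4, 5, 6]];

fetch("/ArrayHandler.ashx", {            // hypothetical ASP.NET handler
  method: "POST",
  headers: {"Content-Type": "application/json"},
  body: JSON.stringify(grid)
}).then(function (res) { return res.text(); })
  .then(console.log);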
