More bandwidth-efficient data formats than XML & JSON? [closed] - javascript

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
When transferring large amounts of columnar data from a web server to the browser in JSON or XML format, column names are repeated unnecessarily for every row. What I have to transfer is basically CSV-format data - i.e. multiple rows, each with the same field order.
For example, in XML I'd be transferring:
<row>
    <name>Frank</name>
    <city>New York</city>
</row>
<row>
    <name>Brian</name>
    <city>Jerusalem</city>
</row>
etc.
The same goes for JSON: field names are repeated unnecessarily, which adds a lot of extra bytes for no reason.
My question is: are there other standard formats, supported by libraries in JavaScript and/or .NET, that I can use to transfer this kind of "CSV-like" data set more efficiently?

Unless you're talking about a LOT of data, it probably doesn't matter that much (bandwidth is cheap, or so I hear). However, if you're trying to squeeze as much data as possible into the available bandwidth, you do have options.
You could use whatever format you want (XML, JSON), then compress it. For example, you could use JSZip. Advantage: you don't have to transform your basic data structure.
Also, there's nothing to stop you from using CSV if that's the format that makes the most sense for your application. It's best to use a library such as jquery-csv to handle annoyances like properly quoting strings. Disadvantage: this isn't really compression, just a format without a lot of overhead.
All things told, I would probably go with the first approach.

I guess your JSON format is like this:
[
    {
        "name": "Frank",
        "city": "New York"
    },
    {
        "name": "Brian",
        "city": "Jerusalem"
    }
]
Why not simply change it to:
[
    ["name", "city"],
    ["Frank", "New York"],
    ["Brian", "Jerusalem"]
]
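That row-oriented layout can be produced and consumed with a few lines of plain JavaScript; a minimal sketch (the function names are my own, not from any library):

```javascript
// Pack an array of uniform objects into a header row plus value rows,
// and unpack it back into objects on the receiving side.
function pack(rows) {
  const keys = Object.keys(rows[0]);
  return [keys, ...rows.map(r => keys.map(k => r[k]))];
}

function unpack([keys, ...rows]) {
  return rows.map(vals =>
    Object.fromEntries(keys.map((k, i) => [k, vals[i]]))
  );
}

const people = [
  { name: "Frank", city: "New York" },
  { name: "Brian", city: "Jerusalem" }
];
const packed = pack(people);
// packed is [["name","city"],["Frank","New York"],["Brian","Jerusalem"]]
```

The savings grow with the row count, since each field name is sent exactly once.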

I had a similar problem when transferring large amounts of data via XML between Flash (running on the client, similar to JavaScript) and .NET (running on the server, obviously).
The method I ended up using was:
XML > string without whitespace > zip compress > base64 encode
And then on the other side, do the reverse.
In your case, you could just do:
CSV > string > zip > base64
From memory, I was able to get some really big payloads (approx. 2 MB) down to less than 50 KB.

Related

Which is better when sending Javascript array to PHP script with jQuery: Using JSON or just let jQuery convert it to query string? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 7 years ago.
I'm using jQuery's $.ajax function with the POST method to send a JavaScript array to a PHP script (AFAIK, the GET method would be analogous):
var fruits = [
    {
        color: "yellow",
        taste: "sour"
    },
    {
        color: "blue",
        taste: "semi-sweet"
    }
];
$.ajax({
    url: "php/script.php",
    data: {
        fruits: fruits
    },
    type: "POST"
});
After making the request, the $_POST variable in PHP is populated with:
$_POST['fruits'][0]['color']
$_POST['fruits'][0]['taste']
$_POST['fruits'][1]['color']
$_POST['fruits'][1]['taste']
Alternatively, in the $.ajax call, I could send the data as stringified JSON:
data: {
    fruits: JSON.stringify(fruits)
},
In this scenario, the data would be sent as a single string (performance gain?), but I would have to use PHP's json_decode() function on the $_POST['fruits'] argument:
$fruits = json_decode($_POST['fruits'], true);
After using it, I would be able to access the data in a similar way:
$fruits[0]['color']
$fruits[0]['taste']
$fruits[1]['color']
$fruits[1]['taste']
I wonder if these methods are similarly good, or if one of them is better and why. Thank you in advance for sharing your knowledge :)
"Better" is a very fluid concept. You have to consider several factors, and I will try to list some of them.
Readability:
The JSON format is highly readable and may map nicely to backend models, which may help you keep a better mental model.
On the other hand, the query string is not far behind readability-wise, but it looks less like an object.
Performance:
The query string content length is 127 bytes.
The JSON content length is 138 bytes.
That's an eleven-byte difference, which may be critical on huge systems.
Intuition:
Your data exchange seems well suited to JSON.
You send an array of objects, and although it is easy to POST it the way you showed, I think JSON is slightly easier to debug and construct.
JSON is more of an enterprise/REST (API) way to go. While it's more modern, clean and dynamic, it might be overkill for you.
If you're doing a simple POST with 2 or 3 strings, go right ahead.
One thing to note: posting null values is only really possible with JSON, if you want that - hence it's the more versatile option; also, as stated, nested data is easier with JSON. But simple contact forms shouldn't bother with this unless you're using Angular or something.
It's most common to use JSON for POST.
A few reasons why:
query strings need to be URL-encoded
it's rather difficult to represent complex data (e.g. nested objects or even nested arrays) in query strings
JSON maps to native objects on the client
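To make the difference concrete, here is a sketch that puts a simplified PHP-style bracket-notation encoder (the formEncode helper is my own stand-in, not jQuery's actual $.param) next to JSON.stringify:

```javascript
// PHP-style form encoding, roughly what jQuery produces for nested data.
function formEncode(value, prefix, out = []) {
  if (value !== null && typeof value === 'object') {
    for (const [k, v] of Object.entries(value)) {
      formEncode(v, prefix ? prefix + '[' + k + ']' : k, out);
    }
  } else {
    out.push(encodeURIComponent(prefix) + '=' + encodeURIComponent(value));
  }
  return out.join('&');
}

const fruits = [{ color: 'yellow', taste: 'sour' }];

const asForm = formEncode(fruits, 'fruits');
// asForm: "fruits%5B0%5D%5Bcolor%5D=yellow&fruits%5B0%5D%5Btaste%5D=sour"

const asJson = JSON.stringify(fruits);
// asJson: '[{"color":"yellow","taste":"sour"}]'
```

The URL-encoded brackets (%5B, %5D) are where the query-string overhead comes from; it grows with every level of nesting, while JSON's syntax stays flat.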

Persist complex array in Javascript [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 7 years ago.
Hello, I am having a hard time understanding what I need to do in order to persist a complex array I have. The array is filled through an AJAX call to a JsonResult and has a couple of strings and two objects, something like this:
{
    Name
    Age
    Surname
    Object 1
    Object 2
}
This array is filled whenever I load my "main page". Now when I change to, let's say, page B, I would still like to have access to those values.
I know that it's possible to use JSON.stringify, but I would like to maintain the array without needing to convert it to a string, because I would then need to convert the numbers back to numbers.
Any suggestions are greatly appreciated.
The key here is "when I change to page B I would still like to have access to those values".
The JavaScript scope (that is, all the variables and functions you want to use) only lives as long as the page does. So if you refresh, the scope is killed (all your variables disappear)!
So, how to persist your information? As the commenters have said, you've got some options.
Traditionally, cookies are used - there's a lot of tutorials on how to do that. They store a key, a value, and an expiration.
The HTML5 API introduced browser storage (localStorage and sessionStorage), which is generally better than cookies but less widely supported (although support is pretty good now in 2015).
You can store it on the server using a server-side language like PHP, Ruby, Java, etc. and pass it back to the page when the page is rendered.
Basically, JavaScript cannot keep variables by itself if the page is refreshed. You've got to use one of the above options, each of which is an interesting learning curve by itself (outside the scope of this question).
I'd recommend, in order:
starting with session storage if you're just experimenting
cookies if you want to build a resilient solution
server-side stuff if you want to take the red pill.
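As a side note on the asker's worry about numbers: JSON.parse restores numbers as numbers, so a stringify/parse round trip through storage loses nothing. A minimal sketch (the in-memory fallback exists only so the snippet also runs outside a browser):

```javascript
// Use sessionStorage when available; otherwise a tiny in-memory stand-in.
const storage = (typeof sessionStorage !== 'undefined') ? sessionStorage : (() => {
  const m = new Map();
  return {
    setItem: (k, v) => m.set(k, String(v)),
    getItem: k => (m.has(k) ? m.get(k) : null)
  };
})();

const person = { Name: 'Frank', Age: 42, Surname: 'Smith' };
storage.setItem('person', JSON.stringify(person));

// On "page B": read it back.
const restored = JSON.parse(storage.getItem('person'));
// restored.Age is the number 42 again, not a string.
```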
Incidentally, your notation is not valid JavaScript - arrays are written as
["foo", "bar", "etc"]
and JavaScript objects (which can be used as associative arrays) look like
{ "key": "value", "gandhi": "india", "mandela": "south africa" }

JSON design best practices [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 4 years ago.
I have designed a JSON representation of a mailbox so that I can look up mails easily, for example mailjson[UID].Body.
However, after looking at AngularJS and Ember (templating MVC JS engines), it seems that the JSON should be in this format:
[{
    "id": 1,
    "body": "Blah blah blah..."
},
{
    "id": 2,
    "body": "More blah foo blah"
},
{
    "id": 3,
    "body": "Hopefully you understand this example"
}]
And then there is some findAll(id) function that iterates through the JSON to grab the item with the id one wants. So now I'm wondering: does my JSON design have merit? Am I doing it wrong? Why don't people use the dict-lookup design I'm using with my JSON?
Any other tips to make sure I have a good data structure design, I would be grateful.
The best practice for storing a big table in JSON is to use an array.
The reason is that when parsing a JSON array into an in-memory array there is no speed penalty for building a map. If you need an in-memory index on several fields for quick access, you can build it during or after loading. But if you store the JSON the way you did, you have no option to load quickly without building a map, because the JSON parser will always have to build the huge map of ids that your structure dictates.
The structure in which you keep data in memory does not have to be (and should not be) the same as the structure of storage on disk, because there is no way to serialize/deserialize JavaScript's internal map structure. If there were, you would serialize and store indexes just like MS SQL Server stores tables and indexes.
But if the framework you are using forces you to have the same structure in memory and on disk, then I support your choice of using the id as a key in one big object, because it is then easier to pass the id in JSON requests to and from the server to act on an email item or update its state in the UI, assuming that both server and browser keep a significant list of email items in memory.
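In other words, ship the array and build the index yourself after parsing; a minimal sketch (the variable names are illustrative):

```javascript
// Wire format: a plain array, cheap to parse.
const mails = [
  { id: 1, body: 'Blah blah blah...' },
  { id: 2, body: 'More blah foo blah' },
  { id: 3, body: 'Hopefully you understand this example' }
];

// In-memory index by id, built once after loading.
const byId = new Map(mails.map(m => [m.id, m]));

// O(1) lookup, without forcing the JSON parser to build a huge keyed object.
const body = byId.get(2).body;
// body === 'More blah foo blah'
```

The same pattern extends to multiple indexes (by sender, by date) built from one pass over the same array.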

JSON response objects: "pretty" keys and larger response or short keys and smaller response? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
My real-time web app makes AJAX requests to obtain JSON-encoded data responses.
The returned data is usually an array of objects.
As the array often has a lot of elements (and although the data is gzipped by the server), in order to keep the response size to a minimum I'm keeping the keys in the response very short.
For example, instead of description: I use d:, instead of width: I use w:, and so on...
Doing so reduces the size of the response but, on the client side, very short, non-human-readable keys make the JavaScript code (which accesses the objects) less readable.
The only solution seems to be to reparse the response and rebuild the objects with pretty keys, or to replace the keys in the received objects. But this may hurt the JavaScript code's performance, resulting in more delay...
Is there a better solution?
EDIT:
As Björn Roberg suggested in his comment, I've made a comparison:
pretty-response.json 459,809 bytes
short-response.json 245,881 bytes
pretty-response.json.zip 28,635 bytes
short-response.json.zip 26,388 bytes
So, as the response is compressed by the server, the difference is really minimal.
Still, the pretty response requires the server to compress 450 KB of data, while the short response is just 240 KB.
Does this impact server performance (and is there a way to measure it)?
Since you are considering converting the short keys back to long keys on the client side, you are clearly concerned with the bandwidth requirements for the data and not the memory requirements on the client.
I've generated some files containing random data and three keys (description, something and somethingElse). I've also dumped the data through sed to replace those keys with d, s and e.
This results in:
750K long-keys
457K short-keys
HTTP has support for compression, and all significant clients support this with gzip. So, what happens if we gzip the files:
187K long-keys.gz
179K short-keys.gz
There is very little to choose between them, since gzip is rather good at compressing repeated strings.
So, just use HTTP compression and don't worry about munging the data.
gzip is also a really fast algorithm, so the impact it will have on server performance is negligible.
Maybe you could try protocol buffers and see if that makes any difference. They were developed to be faster and lighter than many other serialization formats (e.g. XML and JSON).
Other formats exist that share the same goals, but protocol buffers, aka protobufs, are the one that sprang to my mind.
Refer to this answer for a nice comparison.
You can use the decorator pattern to wrap the objects upon retrieval from the array.
However, given that you probably want to access all the objects returned (why would the server return objects the client doesn't need?), it would probably be no slower, and possibly faster, to simply convert them to objects with longer field names on retrieval from the array.
If you are going to retrieve each object multiple times, you could even go through the array and replace them one by one, to avoid having to repeatedly convert them.
All these options have a performance cost, but it may not be significant. Profile!
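The bulk-conversion option mentioned above is only a few lines; a sketch with an illustrative key map (both the map and the function name are my own):

```javascript
// Map short wire keys to readable names once, right after parsing the response.
const keyMap = { d: 'description', w: 'width' };

function expandKeys(rows, map) {
  return rows.map(row =>
    Object.fromEntries(
      Object.entries(row).map(([k, v]) => [map[k] || k, v])
    )
  );
}

const response = [{ d: 'a chair', w: 40 }, { d: 'a table', w: 120 }];
const readable = expandKeys(response, keyMap);
// readable[0] is { description: 'a chair', width: 40 }
```

A single pass like this costs O(rows × keys) up front and keeps the rest of the client code readable.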
Compress your JSON on the server with Dean Edwards' Packer: http://dean.edwards.name/packer/
Library for compression: http://dean.edwards.name/download/#packer
You can also check your JSON size with an online tool to see whether it actually shrinks.
If you want your code to be readable and still use short keys, you could use bracket notation to access members:
var descriptionKey = 'd';
var widthKey = 'w';
//...
var description = yourObject[descriptionKey];

How should I organize a large JavaScript dialog object? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I have a very large JavaScript object I'm using to store RPG dialog, containing several large top-level objects. The dialog is formatted similarly to JSON.
When organizing the dialog, should I put each of those top level categories in their own function?
Or maybe in their own separate JavaScript files?
Does keeping all the text in one JavaScript object (var json = ...) affect performance?
Any organization tips would be appreciated!
You should generally try to limit coupling whenever possible, so it would make sense to separate these into separate functions - but see point 3 about databases.
You can separate them into separate JavaScript files if the object gets too big to manage easily in one. You can always have a build process that lumps them all together later (check the HTML5 Boilerplate for more on this technique).
It probably isn't going to make much difference for performance at this point. I might suggest using something like MongoDB in the future as you get more data, though. Take a look at Meteor to get some inspiration. Using a database can simplify your code if you do it right.
Addressing question 1: this sounds like JSONP? Slightly different.
I would suggest that performance differences for parsing would be negligible. However, if you are using JSONP and injecting <script/> elements into the DOM, file size becomes a factor, and it would be wise to group related data into separate files so that it can be retrieved selectively.
Obviously, at the top level, if you are not using JSONP, you can separate your objects into an array or one keyed object.
I saw your question before the edit, and to me it wasn't only about JSON. I understand the reasoning behind meagar's edit, but I don't agree with it. I think you should really step out of the technical details and focus on what your data is. You are writing the story of your game in a JSON file. Is that really appropriate or convenient? I personally don't think so. I mean, if you're comfortable working with this so far, sure, knock yourself out. Myself, I would use plain and simple text files to hold that dialog (1 line in a file = 1 line of dialog, with name of file = name of NPC or cutscene). They're much more appropriate for storytelling, IMHO.
What I mean is, write that dialog in something that is meant for writing. JSON is a format for data-interchange, sure, it's human-readable. That doesn't mean it should be human-written.
Then, when you have your story, if you want your game to consume that data in JSON form, write some kind of wrapper around this bunch of files. Sure, it looks like more work, but I think it's more flexible. Well, that's what I would do anyway. Best of luck.
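The wrapper this answer suggests can be very small; a sketch under the "one line per dialog line, filename = NPC name" convention (the function name and object shape are my own):

```javascript
// Turn a plain-text dialog file into the JSON-shaped object the game consumes.
function parseDialog(npcName, fileText) {
  return {
    npc: npcName,
    lines: fileText.split('\n').filter(line => line.trim() !== '')
  };
}

const frank = parseDialog('frank', 'Hello there.\nNice weather today.\n');
// frank.lines holds the two dialog lines, blank lines stripped.
```

A build step could run this over the whole dialog directory and emit one combined JSON file for the game to load.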
