I have a grid in a browser.
I want to send rows of data to the grid via JSON, but the browser should continuously parse the JSON as it receives it and add rows to the grid as they are parsed. Put another way, the rows shouldn't be added to the grid all at once after the entire JSON object is received -- they should be added as they are received.
Is this possible? Particularly using jQuery, Jackson, and Spring 3 MVC?
Does this idea have a name? I only see bits of this idea sparsely documented online.
You can use Oboe.js which was built exactly for this use case.
Oboe.js is an open source Javascript library for loading JSON using streaming, combining the convenience of DOM with the speed and fluidity of SAX.
It can parse any JSON as a stream, is small enough to be a micro-library, doesn’t have dependencies, and doesn’t care which other libraries you need it to speak to.
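A minimal sketch of how Oboe.js can be wired up for this; the '/api/rows' URL, the 'rows.*' pattern and addRowToGrid are assumptions about your endpoint, JSON shape and grid code:

```js
oboe('/api/rows')
    .node('rows.*', function (row) {
        // Called once per row, as soon as that row has been parsed out of the stream.
        addRowToGrid(row);   // placeholder for your own grid-append function
    })
    .done(function () {
        console.log('all rows received');
    })
    .fail(function (error) {
        console.error('streaming failed', error);
    });
```

Each node callback fires while the response is still downloading, which is exactly the "add rows as they arrive" behaviour asked about.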
You can't parse incomplete or invalid JSON with the browser's JSON.parse. If you are streaming text, it will invariably end up trying to parse invalid JSON at some point, which will cause it to fail. Streaming JSON parsers do exist, though, so you may be able to find one that suits your needs.
The easiest approach in your case would still be to send a complete JSON document for each row and parse each one as it arrives.
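As a rough sketch of that approach, assuming the server emits one complete JSON document per line (newline-delimited JSON) from a hypothetical /api/rows.ndjson endpoint, the browser can parse each finished line on every progress event; addRowToGrid is a placeholder for your grid code:

```js
var xhr = new XMLHttpRequest();
var processed = 0;

function drain(isDone) {
    var lines = xhr.responseText.split('\n');
    // The last element may be a partial line; only take it once the response is complete.
    var upTo = isDone ? lines.length : lines.length - 1;
    for (; processed < upTo; processed++) {
        if (lines[processed]) {
            addRowToGrid(JSON.parse(lines[processed]));  // each line is a complete JSON document
        }
    }
}

xhr.open('GET', '/api/rows.ndjson');
xhr.onprogress = function () { drain(false); };
xhr.onload = function () { drain(true); };
xhr.send();
```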
Lazy.js is able to parse "streaming" JSON (demo).
Check out SignalR.
http://www.hanselman.com/blog/AsynchronousScalableWebApplicationsWithRealtimePersistentLongrunningConnectionsWithSignalR.aspx
March 2017 update:
Websockets allow you to maintain an open connection to the server, which you can use to stream data to the table. You could encode individual rows as JSON objects and send them, and each time one is received you can append it to the table. This is perhaps the optimal way to do it, but it requires websockets, which might not be fully supported by your technology stack. If websockets are not an option, you can try requesting the data in the smallest chunks the server will allow, but this is costly, since every request has an overhead and you would end up making multiple requests to get all the data.
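A hedged sketch of the websocket variant; the endpoint URL, the row's 'name' field and the #grid table markup are all assumptions:

```js
// Assumes a server that pushes one JSON-encoded row per websocket message.
var socket = new WebSocket('ws://example.com/rows');

socket.onmessage = function (event) {
    var row = JSON.parse(event.data);       // one complete row per message
    var tr = document.createElement('tr');
    tr.textContent = row.name;              // 'name' is a made-up column
    document.querySelector('#grid tbody').appendChild(tr);
};

socket.onerror = function (err) {
    console.error('WebSocket error', err);
};
```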
AFAIK there is no way to start parsing an HTTP response before it has finished, and there is no way to parse a JSON string partially. Also, getting the data is orders of magnitude slower than processing it, so it's not really worth doing.
If you need to parse large amounts of data your best bet is to do the streaming approach.
Related
What is the most efficient way in JavaScript to parse huge amounts of data from a file?
Currently I use JSON.parse to deserialize an uncompressed 250 MB file, which is really slow. Is there a simple and fast way to read a lot of data in JavaScript from a file without looping through every character? The data stored in the file is only a few floating-point arrays.
UPDATE:
The file contains a 3D mesh with 6 buffers (vertices, UVs, etc.). The buffers also need to be presented as typed arrays. Streaming is not an option because the file has to be fully loaded before the graphics engine can continue. Maybe a better question is how to transfer huge typed arrays from a file to JavaScript in the most efficient way.
I would recommend a SAX-based or streaming parser for this kind of data in JavaScript.
DOM-style parsing would load the whole thing into memory, which is not the way to go for large files like the one you mentioned.
For JavaScript-based SAX parsing (of XML) you might refer to
https://code.google.com/p/jssaxparser/
and
For JSON you might write your own; the following link demonstrates how to write a basic SAX-based parser in JavaScript:
http://ajaxian.com/archives/javascript-sax-based-parser
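To make the SAX idea concrete, here is a purely illustrative sketch; createSaxJsonParser is a hypothetical factory standing in for whatever streaming parser you write or adopt, not a real library call:

```js
// Hypothetical SAX-style JSON parser: handlers fire per token, so values can be
// consumed as they arrive without ever building the whole 250 MB object tree.
var floats = [];

var parser = createSaxJsonParser({
    onValue: function (value) {
        if (typeof value === 'number') floats.push(value);  // keep only the numbers we care about
    },
    onEnd: function () {
        console.log('parsed', floats.length, 'numbers');
    }
});

// Feed the parser incrementally, chunk by chunk, as data arrives.
var json = '{"verts": [0.1, 0.2, 0.3], "uvs": [0.5, 0.5]}';
parser.write(json.slice(0, 20));
parser.write(json.slice(20));
parser.end();
```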
Have you tried encoding it as binary and transferring it as a blob?
https://developer.mozilla.org/en-US/docs/DOM/XMLHttpRequest/Sending_and_Receiving_Binary_Data
http://www.htmlgoodies.com/html5/tutorials/working-with-binary-files-using-the-javascript-filereader-.html#fbid=LLhCrL0KEb6
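If you go the binary route, a minimal sketch might look like this; the /mesh.bin URL and the assumption that the file is a raw buffer of 32-bit floats are both made up for illustration:

```js
var xhr = new XMLHttpRequest();
xhr.open('GET', '/mesh.bin');
xhr.responseType = 'arraybuffer';     // ask the browser for raw bytes, not text

xhr.onload = function () {
    // A typed-array view over the bytes: no JSON parsing step at all.
    var verts = new Float32Array(xhr.response);
    console.log('loaded', verts.length, 'floats');
};
xhr.send();
```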
There isn't a really good way of doing that, because the whole file is going to be loaded into memory, and we all know browsers are prone to memory problems with data of that size. Can you not instead add some paging for viewing the contents of that file?
Check if there are any plugins that allow you to read the file as a stream, that will improve this greatly.
UPDATE
http://www.html5rocks.com/en/tutorials/file/dndfiles/
You might want to read about the new HTML5 APIs for reading local files. You will still have the issue of downloading 250 MB of data, though.
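A hedged sketch of reading a local file in slices with the File API, so the whole 250 MB never has to sit in one string at once; the input element id, the chunk size and handleChunk are placeholders:

```js
// Assumes <input type="file" id="picker"> somewhere on the page.
document.getElementById('picker').addEventListener('change', function (e) {
    var file = e.target.files[0];
    var chunkSize = 1024 * 1024;                 // 1 MB per slice
    var offset = 0;
    var reader = new FileReader();

    reader.onload = function () {
        handleChunk(reader.result);              // placeholder for your per-chunk processing
        offset += chunkSize;
        if (offset < file.size) readNext();
    };

    function readNext() {
        reader.readAsArrayBuffer(file.slice(offset, offset + chunkSize));
    }
    readNext();
});
```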
I can think of one solution and one hack.
SOLUTION:
Extending the split-the-data-into-chunks idea: it boils down to the HTTP protocol. REST rests on the notion that HTTP has enough "language" for most client-server scenarios.
You can set up a request header on the client (Content-len) to establish how much data you need per request.
Then on the backend you have some options: http://httpstatus.es
Reply with a 413 if the server is simply unable to get that much data from the db.
Reply with a 417 if the server is able to reply, but not within the requested size (Content-len).
Reply with a 206 and the provided chunk, letting the client know "there is more where that came from" (a minimal client loop for this is sketched below).
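A minimal client-side loop for the chunking scheme above might look like this; the /api/rows endpoint, the offset/limit parameters, appendRows and the exact status-code handling are assumptions about your backend:

```js
function fetchChunk(offset) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/api/rows?offset=' + offset + '&limit=1000');

    xhr.onload = function () {
        if (xhr.status === 206) {                       // partial content: keep going
            appendRows(JSON.parse(xhr.responseText));   // appendRows is your own UI code
            fetchChunk(offset + 1000);
        } else if (xhr.status === 200) {                // final chunk
            appendRows(JSON.parse(xhr.responseText));
        } else {
            console.error('server refused chunk:', xhr.status);  // e.g. 413 / 417
        }
    };
    xhr.send();
}

fetchChunk(0);
```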
HACK:
Use a WebSocket to get the binary file, then use the HTML5 File API to load it into memory.
This is likely to fail, though, because it's not the download causing the problem but the parsing of an almost endless JS object.
You're out of luck in the browser. Not only do you have to download the file, you'll have to parse the JSON regardless. Parse it on the server, break it into smaller chunks, store that data in the db, and query for what you need.
The title says it all, but basically I use AJAX to get information, and I don't know whether I should have PHP create the HTML, which then gets returned to the client so JavaScript just plugs it in, or whether the server should just send all the information as JSON and have JavaScript (jQuery) create all the HTML that holds it.
Which one is more efficient?
Thanks
I would say it is better practice to serve only the JSON data. Why?
Well, perhaps, you want to hook up a different type of client to your data service.
Maybe you create a mobile app, and it needs the same data, but wants to display it differently.
If you are providing the HTML markup as well, then now your mobile app has to parse the data it wants out of the HTML structure, rather than just dealing with the data right away.
On an efficiency scale, that depends on what you consider efficient.
For example, it would be efficient from a bandwidth perspective to only send the JSON. However, it would be more efficient from a processing standpoint on the target client to simply give it an HTML string to display.
If you are considering ever having different clients accessing the same data, though, then you want to create a single data interface that serves JSON (in your case), and allow the client to decide how to present that data.
Separation of concerns.
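As a small illustration of the JSON-only approach, the server returns plain data and jQuery builds the markup on the client; the /api/items endpoint and the name/price fields are made up:

```js
$.getJSON('/api/items', function (items) {
    var $list = $('#items').empty();
    $.each(items, function (i, item) {
        // The client decides how the data is presented.
        $list.append($('<li>').text(item.name + ': ' + item.price));
    });
});
```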
I'm a fairly well versed programmer, so learning new technologies shouldn't be that big of an issue. That being said I'm currently attempting to make a card game in HTML5 using canvas/javascript etc.
The current question I have is what to use to store instances of the cards. I was thinking about using XML to store the card data, but I'd like to limit the amount of work the browser has to do so the game runs more smoothly. I've heard JSON is a good alternative, but I'm just looking for suggestions. Thanks!
JSON is better in my opinion.
You can serialize objects to JSON on the server side and send the JSON string to the client (browser); your client will then be able to parse the JSON string into regular JavaScript objects using JSON.parse.
This way you won't need to walk through XML to find particular nodes; you'll just work with the data in a more convenient way, using native JavaScript objects and arrays.
Also, in most cases JSON will be more compact than XML, so it can save bandwidth and speed up data loading.
Data types matter here too: JSON represents data types correctly (numbers, booleans, strings), while XML stores everything as strings, so you'd need additional attributes to record the type during serialization and restore it during deserialization.
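A quick illustration of the data-type point, using a made-up card object:

```js
var card = { rank: 2, suit: 'hearts', faceUp: false };

var wire = JSON.stringify(card);      // '{"rank":2,"suit":"hearts","faceUp":false}'
var copy = JSON.parse(wire);

console.log(typeof copy.rank);        // "number", no extra type attributes needed
console.log(typeof copy.faceUp);      // "boolean"
```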
I am not sure how to do this without a framework, but what I would do is use Backbone.js and create a model of what an instance would look like, e.g. {CardNumber: '2', CardColor: 'red', CardClass: 'hearts'}. Then I would create a collection to hold all these models; see Backbone collections.
So I would store all this data client side, and possibly provide the user with an option to save the game, to persist this data to a database. This stores it as JSON and then when you persist it to the database, you can serialize it to get the individual components.
If you don't want to save to the db and do not want to use a framework, try stack/queue implementations in JavaScript. See: How do you implement a Stack and a Queue in JavaScript?
I hope that answers your question.
Stick to JSON, because JSON is just a string representation of plain JS objects and browsers are very comfortable with it. JS has no good XML handling, and that would be too expensive.
Use HTML5 localStorage for keeping data until you really need to sync with the server. Frequent server operations will cause your game to suffer. Use bulk data transfers instead of many small server connections (for example at the start and the end).
Consider using a game library if the canvas graphics are intense. I used http://jawsjs.com some time back, but there should be better libs available out there. Selectively render only the dynamic objects, not everything on the canvas.
JSON in conjunction with localStorage is a great way to go.
There are libraries available to serialize and deserialize JavaScript objects and allow you to store and retrieve them from localStorage. A simple GitHub search is a good way to start.
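Plain JSON.stringify/JSON.parse with localStorage is often enough; a small sketch, with 'gameState' as an arbitrary key and a made-up state shape:

```js
var state = { deck: [{ rank: 'K', suit: 'spades' }], score: 42 };

// localStorage only stores strings, so serialise on the way in...
localStorage.setItem('gameState', JSON.stringify(state));

// ...and parse on the way out (guard against a missing key on first run).
var saved = JSON.parse(localStorage.getItem('gameState') || 'null');
```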
I have a simple 2D array in javascript. I want to pass this array to a ASP.NET page.
I wanted to know what would be the best option, going with JSON or XML.
The metric is speed and the size of the data. In some cases, the array may be quite large.
Thank You.
The metric is speed and size of the data.
JSON is faster than XML in terms of speed. It's smaller than XML in terms of size.
XML is bloated to allow you to represent and validate structures.
However, there are various BSON formats around where people take JSON and then hand-optimise the storage format excessively (BSON is binary JSON).
Some BSON spec I picked from google
Bison, a JavaScript parser for some arbitrary BSON format.
Now, if you really have bottlenecks with transferring data (which you probably don't), you may want to use WebSockets to send the data over TCP rather than HTTP, thus reducing the amount of traffic and data you send.
Of course, you only care about that if you're making, say, X000 requests per second.
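For what it's worth, posting the 2D array as a JSON body with jQuery might look like the sketch below; 'MyPage.aspx/SaveGrid' is only a placeholder for your ASP.NET endpoint, which would need to accept a JSON request:

```js
var grid = [[1, 2, 3], [4, 5, 6]];

$.ajax({
    url: 'MyPage.aspx/SaveGrid',            // placeholder endpoint
    type: 'POST',
    contentType: 'application/json; charset=utf-8',
    data: JSON.stringify({ rows: grid }),   // serialise once, on the client
    success: function (result) {
        console.log('server accepted', result);
    }
});
```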
JSON should be your best bet. Sending an XML datatype can be a big pain, since sometimes you have to add new configuration just to support XML being sent as form data to the server. Generally it is not a recommended practice due to security concerns.
I need a mechanism for storing complex data structures created in client-side JavaScript. I've been considering using the stringify method to convert the JavaScript object into a string, store it in the database, and then pull it back out and use the reverse parse method to give me the JavaScript object back.
Is this just a bad idea or can it be done safely? If it can, what are some pitfalls I should be sure to avoid? Or should I just come up with my own method for accomplishing this?
It can be done and I've done it. It's as safe as your database.
The only downside is it's practically impossible to use the stored data in queries. Down the track you may come to wish you'd stored the data as table fields to enable filtering and sorting etc.
Since the data is user created make sure you're using a safe method to insert the data to protect yourself from injection attacks (don't just blindly concatenate the data into a query string).
It's fine so long as you don't deserialize using eval.
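In other words, the safe round trip is JSON.stringify on the way into the database and JSON.parse on the way back out; a small sketch with a made-up structure:

```js
var structure = { nodes: [1, 2, 3], meta: { owner: 'user42' } };

var stored = JSON.stringify(structure);   // save this string in a text column

// Later, when reading it back:
var revived = JSON.parse(stored);         // throws on malformed input instead of executing it, unlike eval
```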
Because you are using a database, it means you need a server-side language to communicate with the database. Any data you have is easily converted to and from JSON with most server-side languages.
I can't imagine a proper usecase unless you have a sh*tload of javascript, it needs to be very performant, and you have exhausted all other possibilities such as caching, query optimization, etc...
Another downside of doing this is that you can't easily query the data in your database, which is always nice to have when you want to do any kind of reporting.
And what if your json structure changes? Will you update all the scripts in your database? Or will you force yourself to cope with the changes in the parsing code?
Conclusion
Imho it is not dangerous to do so but it leaves little room for manageability and future updates.