I have an online service where users can create JSON-backed documents. These are stored on a server, and other users can load them; the JSON is then decoded exactly as it was submitted. Are there any security risks if a user tampers with the JSON before submitting it and injects arbitrary JavaScript, which is then executed in the viewers' browsers? That's what I need to know: whether arbitrary execution of JavaScript from a JSON string is even possible.
This depends entirely on a) whether you're scrubbing the JSON on the server side, and (even more) on b) how you're decoding the JSON on the client side when you load it again.
Any code that uses eval() to deserialize the JSON into a Javascript object is open to exactly the attack you describe.
Any code that uses JSONP to load the JSON (i.e. passing the JSON as a Javascript literal to a named callback function) is open to the attack you describe (it's effectively the same as using eval()).
Most robust JSON-parsing mechanisms (e.g. json2.js, the jQuery $.parseJSON function, or native JSON.parse() functions in browsers that support it) will not accept JSON that doesn't follow the JSON specification. So if you're using a library to parse the JSON string, you may be safe.
No matter how you intend to load the JSON on the client side, it is good practice to scrub any user-submitted content on the server side. In this case, you might use server-side code to check that the JSON is valid (e.g. using json.loads(user_submitted_json) in Python, and catching errors).
So with some care on both the server side and the client side, you should be able to do this safely.
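For illustration, here is a minimal client-side sketch of the safe approach, assuming the user's document arrives as a string in userJson (the variable name and the handleDocument helper are hypothetical):

// Unsafe: eval() executes anything, including injected code
// var doc = eval('(' + userJson + ')');

// Safer: JSON.parse() accepts only strict JSON and throws on anything else
try {
    var doc = JSON.parse(userJson);
    handleDocument(doc);
} catch (e) {
    // Malformed (possibly malicious) input never executes; it just fails to parse
}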
<plug shameless="true">
json-sans-eval is designed to parse JSON efficiently without the security problems of eval().
This JSON parser does not attempt to validate the JSON, so it may return a result given syntactically invalid input; but it does not use eval, so it is deterministic and is guaranteed not to modify any object other than its return value.
There are a number of JSON parsers in JavaScript at json.org. This implementation should be used whenever security is a concern (when JSON may come from an untrusted source), speed is a concern, and erroring on malformed JSON is not a concern.
</plug>
JSON has traditionally been parsed using an eval() statement, which is about as insecure as it is possible to get. If you allow this, your application will be insecure.
Related
I have a grid in a browser.
I want to send rows of data to the grid via JSON, but the browser should continuously parse the JSON as it receives it and add rows to the grid as they are parsed. Put another way, the rows shouldn't be added to the grid all at once after the entire JSON object is received -- they should be added as they are received.
Is this possible? Particularly using jQuery, Jackson, and Spring 3 MVC?
Does this idea have a name? I only see bits of this idea sparsely documented online.
You can use Oboe.js which was built exactly for this use case.
Oboe.js is an open source Javascript library for loading JSON using streaming, combining the convenience of DOM with the speed and fluidity of SAX.
It can parse any JSON as a stream, is small enough to be a micro-library, doesn’t have dependencies, and doesn’t care which other libraries you need it to speak to.
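For example, a minimal sketch wiring Oboe.js to the grid; the /rows URL, the rows.* JSONPath pattern, and addRowToGrid are assumptions for illustration:

oboe('/rows')
    .node('rows.*', function (row) {
        // Fires once per completed row object, while the response is still streaming
        addRowToGrid(row);
    })
    .done(function () {
        // The complete JSON document has now arrived
    });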
You can't parse incomplete or invalid JSON using the browser's JSON.parse. If you are streaming text, it will invariably try to parse invalid JSON at some point, which will cause it to fail. There are streaming JSON parsers out there; you might be able to find something to suit your needs.
The easiest way in your case would be to send a complete JSON document for each row, as sketched below.
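A minimal sketch of that idea, assuming the server streams newline-delimited JSON (one row object per line) from a hypothetical /rows endpoint; addRowToGrid is also hypothetical:

var xhr = new XMLHttpRequest();
var parsedUpTo = 0;
xhr.open('GET', '/rows');
xhr.onprogress = function () {
    var text = xhr.responseText;
    var lastNewline = text.lastIndexOf('\n');
    if (lastNewline < parsedUpTo) return; // no new complete line yet
    // Everything up to the last newline is a run of complete lines
    text.slice(parsedUpTo, lastNewline).split('\n').forEach(function (line) {
        if (line) addRowToGrid(JSON.parse(line)); // each line is a complete JSON document
    });
    parsedUpTo = lastNewline + 1;
};
xhr.send();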
Lazy.js is able to parse "streaming" JSON (demo).
Check out SignalR.
http://www.hanselman.com/blog/AsynchronousScalableWebApplicationsWithRealtimePersistentLongrunningConnectionsWithSignalR.aspx
March 2017 update:
Websockets allow you to maintain an open connection to the server that you can use to stream data to the table. You could encode individual rows as JSON objects and send them, and each time one is received you can append it to the table, as in the sketch below. This is perhaps the optimal way to do it, but it requires WebSockets, which might not be fully supported by your technology stack. If WebSockets are not an option, you can try requesting the data in the smallest chunks the server will allow, but this is costly, since every request has an overhead and you would end up making multiple requests to get the data.
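A minimal sketch of the WebSocket approach, assuming the server pushes one JSON-encoded row per message at a hypothetical ws://example.com/rows endpoint; addRowToGrid is hypothetical:

var socket = new WebSocket('ws://example.com/rows');
socket.onmessage = function (event) {
    var row = JSON.parse(event.data); // each message carries one complete row
    addRowToGrid(row);
};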
AFAIK there is no way to start parsing an HTTP response before it has finished, and no way to parse a JSON string partially with the built-in parser. Also, getting the data is orders of magnitude slower than processing it, so it's not really worth doing.
If you need to parse large amounts of data your best bet is to do the streaming approach.
I would like to be able to store a tree-like structure in a cookie. Ideally, I would like something that easily serializes/deserializes a plain JavaScript object.
JSON might be a good option, but a quick googling did not turn up a mainstream approach for serializing to JSON from JavaScript.
What is the best way to approach the problem?
UPD
Related questions bubbled up "Javascript / PHP cookie serialization methods?", which suggests using Prototype's Object.toJSON. I would prefer to stay with jQuery.
UPD2
It turned out that window.JSON.stringify might actually suffice in my case, but the Douglas Crockford library mentioned below seems like a good fallback for browsers where the JSON property of the global object is not present.
JSON is your friend.
A free and recognized implementation made by Douglas Crockford is available as json2.js.
I have used this method to read and store to HTML5's local storage without any problems.
JSON is undoubtedly a good option. To have it work cross-browser, include this file in your page: https://github.com/douglascrockford/JSON-js/blob/master/json2.js. Then use JSON.stringify() to convert the object to a string for storage, and JSON.parse() to retrieve the object from the cookie.
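As a minimal sketch (the cookie name tree is arbitrary; encodeURIComponent keeps the cookie value legal):

var tree = { name: 'root', children: [{ name: 'leaf' }] };

// Store: serialize the object and put it in a cookie
document.cookie = 'tree=' + encodeURIComponent(JSON.stringify(tree));

// Retrieve: find the cookie and deserialize it back into an object
var match = document.cookie.match(/(?:^|; )tree=([^;]*)/);
var restored = match ? JSON.parse(decodeURIComponent(match[1])) : null;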
Be aware that a single cookie is typically limited to about 4 KB, which any JSON-ified tree could hit, so you might want to preprocess your data before converting to JSON (e.g. replacing booleans with 1s and 0s, abbreviating property names) and post-process to reverse these changes after retrieving it from your cookie.
If the amount of data you're storing is really large, it may be better to store a session/identifier cookie that is used to retrieve the data from the server via an AJAX request (or, if you need a quick response on page load, to output the data into a script tag), and to save the data directly to the server via AJAX requests instead of using a cookie.
One more JSON serialization implementation as a jQuery plugin: http://code.google.com/p/jquery-json/
I want to build a multi-tenant cloud app. My stack is JavaScript/JSON end to end: the user inputs data in the browser, which jQuery turns into JSON and sends to my Node.js server, which in turn stores it as JSON in CouchDB. When fetching data, the JSON goes the other way around. If the user injects something into this JSON, is there anywhere in the above stack where it is actually evaluated? If yes, I need to sanitize it. How robust is JSON sanitization? Or would a sandbox help, and how robust is that?
This is a multi-tenant environment, and a lot of secret data of users and companies will be there.
Look at Caja or node-validator.
Caja is an implementation of Google's Caja sanitizer.
node-validator is a Node.js validator/sanitizer; there is also an Express wrapper for it.
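For instance, a minimal sketch of escaping user-supplied strings with the validator module (method names have varied across versions of this library, so treat this as illustrative):

var validator = require('validator');

function sanitizeField(value) {
    // escape() replaces characters like <, >, & and quotes with HTML entities,
    // so stored strings can't carry markup back into other users' browsers
    return validator.escape(String(value));
}

var clean = sanitizeField('<script>alert(1)</script>'); // markup comes back HTML-escaped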
Good luck
I suggest defense-in-depth (i.e. multiple overlapping security mechanisms). Richard and Pasha both make excellent suggestions.
Something else to do is use CouchDB's data validation features. You write a validate_doc_update function in JavaScript; this function runs for every change to the database and can decide whether the data is acceptable.
Validation runs deep, in the CouchDB server itself. Therefore, if you have a good validation function, it is impossible for bad data to be stored at all.
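A minimal sketch of such a function; the required type field and the error messages are assumptions for illustration:

function (newDoc, oldDoc, userCtx) {
    // Reject writes from unauthenticated users
    if (!userCtx.name) {
        throw({ unauthorized: 'You must be logged in to save documents.' });
    }
    // Reject documents that don't declare a type
    if (!newDoc._deleted && typeof newDoc.type !== 'string') {
        throw({ forbidden: 'Every document must have a string "type" field.' });
    }
}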
Node.js uses JSON.parse to evaluate JSON data. JSON.parse implements the strict JSON syntax, which does not allow functions to be declared within the data string. It also means that keys must be double-quoted strings, and values can only be a Boolean, Number, String, Array, Object, or null.
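You can see the difference directly; the second call below throws instead of smuggling code through:

JSON.parse('{"a": 1, "b": [true, null]}');      // OK: strict JSON data
JSON.parse('{"a": function () { alert(1); }}'); // throws SyntaxError: functions aren't JSON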
As far as I know, it is considered bad practice to eval() JSON in JavaScript, for security reasons. I can understand this concern if the JSON comes from another server.
But if the JSON is provided by my own server and is created using PHP's json_encode (let us assume it is not buggy), is it legitimate to simply use eval() to read the JSON in JS, or are there any security problems I currently can't think of?
I really don't want to deal with dynamically loading a JSON parser and would be glad to simply use eval().
PS: I will obviously use the native JSON object if it is available, but want to fall back to eval() for IE/Opera.
In your scenario, the question becomes: where is PHP getting the JSON you will eval() from? Is that channel secure, and free from potential user manipulation? What if you don't control that channel directly?
There are a number of ways that your security may be compromised.
Man-in-the-middle attacks could theoretically alter the contents of the data being delivered to the client.
Your server traffic could be intercepted elsewhere and different content could be provided (not quite the same as a MITM attack).
Your server could be compromised and the data source could be tampered with.
and these are just the simple examples. XSS is nasty.
"an ounce of prevention is worth a pound of cure"
Besides the obvious security issues:
Native JSON is faster
You don't need to "load" a JSON parser; it's just another function call to the JavaScript engine
Tip:
In ASP.NET, using JSON for dates is considered problematic because DateTime parsing differs between the server and the client, so we use a special function to deserialize dates in JavaScript. I'm not sure if PHP has the same issue, but it's worth mentioning.
Check out this: http://blog.mozilla.com/webdev/2009/02/12/native-json-in-firefox-31/
So at least for Firefox you can use the built-in JSON parser.
Seriously? Some of the guys here are paranoid. If you're delivering the JSON and you know it's safe, it's ok to fallback(*) to eval(); instead of a js lib for IE. After all, IE users have much more to worry about.
And the man-in-the-middle argument is bullsh*t.
(*) the words fallback and safe are in bold because some people here didn't see them.
Question: I'm using eval to parse a JSON return value from one of my WebMethods.
I prefer not to add jquery-json because the transfer volume is already quite large.
So I parse the JSON return value with eval.
Now rumor has it that this is insecure. Why?
Nobody can modify the JSON return value unless they hack my server, in which case I would have a much larger problem anyway.
And if they do it locally, JavaScript only executes in their browser.
So I fail to see where the problem is.
Can anybody shed some light on this, using this concrete example?
function OnWebMethodSucceeded(JSONstrWebMethodReturnValue)
{
    var result = eval('(' + JSONstrWebMethodReturnValue + ')');
    ... // Adding result.xy to a table
}
The fundamental issue is that eval can run any JavaScript, not just deserialize JSON-formatted data. That's the risk when using it to process JSON from an untrusted or semi-trusted source. The frequent trick of wrapping the JSON in parentheses is not sufficient to ensure that arbitrary JavaScript isn't executed. Consider this "JSON" which really isn't:
function(){alert('Hi')})(
If you had that in a variable x and did this:
var result = eval("(" + x + ")");
...you'd see an alert -- the JavaScript ran. Security issue.
If your data is coming from a trusted source (and it sounds like it is), I wouldn't worry about it too much. That said, you might be interested in Crockford's discussion here (Crockford being the inventor of JSON and a generally knowledgeable JavaScript person). Crockford also provides at least three public-domain parsers on this page that you might consider using: his json2.js parser and stringifier, which when minified is only 2.5k in size but still uses eval (it just takes several precautions first); his json_parse.js, a recursive-descent parser that does not use eval; and his json_parse_state.js, a state-machine parser (again not using eval). So you get to pick your poison. (Shout out to Camilo Martin for pointing out those last two alternatives.)
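Applied to the handler from the question, the eval-free version is a one-line change (assuming native JSON or json2.js is available):

function OnWebMethodSucceeded(JSONstrWebMethodReturnValue)
{
    var result = JSON.parse(JSONstrWebMethodReturnValue);
    // ... add result.xy to the table as before
}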
Increasingly, JSON parsing and encoding is available natively in modern browsers. This gives your application secure JSON functionality without needing to load an additional library.
You can test for native JSON support by doing something like this:
var native_JSON_exists = typeof window.JSON === 'object';
You should load up a JSON parsing library like Douglas Crockford's (linked by T.J. Crowder, above), or functionality available via a framework, for browsers that don't have native support. (But you should at least use native JSON in browsers that support it, to protect users lucky enough to have modern browsers.)
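A minimal sketch of that feature test plus a fallback (the script path is an assumption):

if (typeof window.JSON !== 'object') {
    // Native JSON is missing; pull in Crockford's json2.js, which defines
    // JSON.parse and JSON.stringify only if they don't already exist
    var script = document.createElement('script');
    script.src = '/scripts/json2.js'; // hypothetical path to a local copy
    document.getElementsByTagName('head')[0].appendChild(script);
}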
Bear in mind, JSON is a subset of JavaScript's syntax, so strings that work in a JavaScript eval statement may not work in proper JSON parsing. You can test your JSON strings for errors using JSLint (http://www.jslint.com/).