Can my JavaScript code be edited at runtime by (malicious) users?

Can my JavaScript code be edited at runtime by (malicious) users, even when it is uploaded in a web hosting site?
For example, if I declare a variable in my script, something like:
var myvalue = 2;
I want to know if it can be edited to:
var myvalue = 1;

Short answer: yes.
Anyone can open the browser's Developer Tools and change values, execute arbitrary code, remove or change or edit anything they like.
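For example, with the variable from the question, a user could type the following straight into the DevTools console on the live page:

// Typed into the browser's DevTools console:
myvalue = 1;          // overwrites the original value of 2
console.log(myvalue); // 1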
So if there is anything crucial in your application where an invalid value could cause a security or data-validation issue, then any such data submitted to the server (including data derived from that value) must be re-validated using server-side code - which of course cannot be changed by the user - before being accepted.
P.S. Bear in mind that any edits to the code or variable values will only persist until the next time the page is re-loaded. When the page is refreshed, the JavaScript and HTML files will be downloaded again from the server and all code and variable values are reset to their starting state. Assuming there are no other security vulnerabilities in your server, then a malicious user cannot edit the original source code files which are stored there. They can only change the copy which gets loaded into the browser.

Related

How to preserve userscript modifications in Chrome after an asynchronous content update on an .aspx webpage

I'm trying to automate the workflow of a webpage for my company's inventory system. The page is generated by the server-side logic of an ASP.Net page, which I don't have access to. The page has several fields on it to allow you to enter a new container barcode, the item that should go in the container, etc. Each of these fields has an onchange event listener hooked up to it which calls the page's __doPostBack() function to verify the entered data. If the data is verified, the page code is re-served with the data entered so far, and focus is set to the next field on the form.
I want to automate this page with a userscript in Chrome. I started by using ViolentMonkey to inject a custom script, but I could only get the script to trigger on the initial load, not after each data entry. After this, I tried using Chrome Local Overrides to change __doPostBack() to try to capture the data I need to automate the page. That also only works once; after a field is filled and loses focus and new HTML is served, it overwrites Chrome's local copy.
I think that my problems are being caused by an asynchronous refresh of the entire page contents, which wipes out the injected userscript and Chrome's Local Override without triggering the normal page refresh listeners in Chrome Overrides or ViolentMonkey to re-inject the modified code. Does anyone have any thoughts on how I could modify the JavaScript in such a way that it would persist after the page content is replaced with new HTML?
P.S. I don't think the code itself is relevant to this particular problem, but if anyone thinks it would be helpful to share a limited section of the client-side code, let me know.
Edit 1: Here's a more in-depth view of what I'm trying to accomplish, and the progress I've made so far.
My Original Plan
The user loads the page. ViolentMonkey injects a userscript which issues a series of prompts, collecting data on the range of new barcodes that the user would like entered into the system. (Specifically, the barcode prefix, the starting barcode number, and the ending barcode number.) These values are stored in localStorage.
After this data has been collected and validated by the user, the page loads normally. For reference, the form looks something like this: [form screenshot omitted]
The user fills out the fields as normal. After each field is filled out (with the exception of the Container Description field), the page pushes focus to the next field. (For example: <script language="javascript"> try { document.getElementById('txtContDesc').focus() } catch (e) { } </script>. The id of the field to focus is dynamically changed via the server logic.)
I need to collect the User Badge, Container Type, and Destination Barcode values so that I can refill them later when I automate the form. My original plan was to add an onfocus event listener to the Container Description field, since focus will be shifted to it once the Destination Barcode field has been verified. I will know at that point that the user has successfully entered a valid entry for each of the fields above the Container Description field, and I would then be able to collect these values and store them in localStorage.
Once I have all the data needed for the form, I would pilot the form using the userscript in ViolentMonkey and the data stored in localStorage, to persist data across page refreshes.
Other Alternatives:
The eventListener idea on an element doesn't work, because ASP.NET updates the page with fresh code every time a field is verified, wiping out the listener. It also doesn't trigger a refresh, so ViolentMonkey doesn't rerun my userscript.
My other thought was to modify __doPostBack(). The __doPostBack() function looks like this (as far as I can tell):
<script type="text/javascript">
    var theForm = document.forms['formNewContainer'];
    if (!theForm) {
        theForm = document.formNewContainer;
    }
    function __doPostBack(eventTarget, eventArgument) {
        console.log("Form submitted");
    }
</script>
It is called on verified fields with the following onchange handler:
onchange="javascript:setTimeout('__doPostBack(\'ctl00$newContPage$txtBarcode\',\'\')', 0)"
My goal would be to modify __doPostBack() to save the information I need to localStorage before executing the rest of __doPostBack() unchanged.
(Note: __doPostBack() here looks incredibly simplistic, so I think I'm missing some information about how ASP.NET works. That's outside the scope of this question, though, unless it's relevant to what I'm trying to do.)
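A minimal sketch of such a wrapper (the txtBarcode id is a guess based on the onchange handler above, so treat it as a hypothetical):

// Keep a reference to the page's original function, then replace it with
// a wrapper that saves the field values before delegating.
var originalDoPostBack = window.__doPostBack;
window.__doPostBack = function (eventTarget, eventArgument) {
    var barcodeField = document.getElementById('txtBarcode'); // hypothetical id
    if (barcodeField) {
        localStorage.setItem('barcode', barcodeField.value);
    }
    return originalDoPostBack(eventTarget, eventArgument);
};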
I was able to successfully modify __doPostBack() in this way using Chrome Local Overrides to serve myself a local copy of the page on page load, instead of the server version. But this only works for the first __doPostBack() request. After the first request, the server serves me new code. As with ViolentMonkey, the lack of a refresh trigger prevents Chrome Local Overrides from re-serving my local copy, and I'm served code without the __doPostBack() modification.
So that's where I'm at. I'll try adding a global listener like #wOxxOm suggested, and see where that gets me.
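For reference, a document-level listener along those lines might look like the sketch below (the referenced comment isn't shown here, so this is only a guess at what was suggested):

// Listeners attached to document survive the page body being replaced,
// since they don't live on the individual fields.
document.addEventListener('focusin', function (e) {
    if (e.target && e.target.id === 'txtContDesc') {
        // Per the plan above, the earlier fields are verified by the time
        // Container Description receives focus, so collect them here.
        console.log('Container Description focused; collect values now');
    }
});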
I ended up using a Chrome extension called "Run Javascript" (it has an elephant for its logo), which runs the JavaScript code even on AJAX requests.
Link: https://chrome.google.com/webstore/detail/run-javascript/lmilalhkkdhfieeienjbiicclobibjao/
I don't see how this is possible at all. You need to work with the people who created that web page.
ASP.NET and the server-side code will be EXTENSIVE .NET code (C# or VB.NET). Each of those events you trigger will set variables and server-side session (or viewstate) values for the code-behind to run.
That's how ASP.NET pages work. You post back, the page travels up to the server, THEN the .NET code-behind runs. That code will modify the page, modify controls, and modify the view state for that page. And after that code runs (say, on a button click), your client side will receive a whole new fresh page - which blows away any JavaScript you try to inject (you would have to re-inject each time). But it gets worse: quite a bit of that code-behind also checks whether the page settings have been messed with, and will often reject the postback if they have.
About the only way to do this would be to write some desktop software that "houses" or "hosts" a full COM-object copy of the web page, and to automate that page from there (and even then, you're still fighting a losing battle).
Hint:
Web development, business logic, and a functional business application is NOT some simple markup and JavaScript (despite what that lame 2-week HTML course tells you).
This is an application - an ASP.NET application. Trying to think of it as just some markup and JavaScript is actually quite silly here. That is not how you write or build business solutions for a company.
If you can't write and modify the code on the web-server side of things, then find out if that site has some kind of web API or the like.
But really - this is silly. Unless this is some simple college project, or some hacked-up HTML page with a bit of JavaScript, forget this approach - you're dealing with FAR too much server-side code-behind.
In fact, as noted, ASP.NET has quite a few built-in features that check whether the page being posted back has been messed with, and you can never really be sure that the values you set trigger the proper code-behind that sets up row values, database primary-key values, and a WHOLE boatload of state values - values that are probably saved 100% in server-side Session() class objects, and never exposed to the client.
Trying to modify such a system, or assuming you can automate it, with only client-side tools is not going to work - it's just not.
The code-behind runs, it re-processes the page with .NET code, and then it sends the whole page back down - all with new state values etc. This is not some lame HTML + JavaScript page, but a full server-side code-driven system written in C# .NET code.

Does everyone share the same JavaScript file?

So I'm using JavaScript and Ajax to connect to a database through a PHP file, but something came to mind.
If a user logs in, the user data will be stored in my JavaScript file titled UserProces.js as:
var Username = "James";
var Age = "25";
(Data obtained through a query in a PHP file: RetrieveUserData.php)
If, 1 minute after James logged in, another user named Amy logs in, will Amy's name and age values affect the values for James? Since there is only one UserProces.js.
Of course NOT! Each user gets their own local copy of the JavaScript file.
The server sends each client that requests the page a copy of the JavaScript file it has stored. That copy is then in their browser, running there. Any changes to variables are made in that copy in their browser. They have no way (well, unless you set up something special) to change the original file on the server. Think of it like this:
I'm a teacher with a test document on my computer (this is the JavaScript file on the server). For each student who comes into the class and asks to take the test (a client requesting the page), I print off a copy on my printer and hand it to them. They then write their name on the test and fill in answers (assign values to variables). A student doing this doesn't affect anyone else in the class because they aren't changing the original document; they are just editing their own copy.
Not a perfect analogy, obviously, but pretty darn close.
Also, addressing a comment made earlier, you probably aren't accessing the service "through a php file". You are using a php file to generate a copy of the web page for the user to view. Again, printing off a copy for the user, but in this case the php file gives a special set of instructions for exactly what should be "printed off".
Each user will load the same script file, but all variables, objects, and everything else are stored per browser, and your browser doesn't even share that info between websites, which prevents one website from having access to variables on another website.
So, the final answer is no. They will not share any info. They just load the same "base".

Possible to cache JSON to increase performance / load time?

I'm using a JSON file to autopopulate a drop down list. It's by no means massive (3000 lines and growing) but the time taken to refresh the page is becoming very noticeable.
The first time the page is loaded, the JSON is read; the option the user has selected dictates which part of the JSON is used to populate the drop-down.
It's then reloaded on every refresh or menu selection after that. Is it possible to somehow cache the values to prevent the need for it to be reloaded time and time again?
Thanks.
EDIT: More Info:
It's essentially a unit converter. The JSON holds all the details. When a user selects 'Temp', for example, a call is made and the lists are populated. Once a conversion is complete you can spend all day running temp conversions and they'll be fine, but every time a user changes the conversion type, say to Length, the page refreshes and takes a noticeable amount of time.
Unfortunately, I don't know of a standardized global caching mechanism in PHP. This article says that Optimizer Plus, a third party accelerator, is being included in core PHP starting in version 5.5. Not sure what version you are using but you could try that.
On a different note, have you considered file storage as andrew pointed out? I think it combined with $_SESSION could really help you in this case. Let me give you an example that would work with your existing JSON data:
Server Side
Store your JSON data in a .json file on your PHP server:
{
    "data": "some data",
    "data2": "more data",
    "data3": [
        ...
    ],
    etc.
}
Note: Make sure to properly format your JSON data. Remember all strings must be enclosed in double quotes ".
In PHP, use an if statement to decide the appropriate action:
error_reporting(E_ALL);
ini_set("display_errors", "On");
session_start();

if (isset($_SESSION['dataCache'])) {
    echo json_encode($_SESSION['dataCache']);
} else {
    $file = 'data.json';
    if (!is_file($file) || !is_readable($file)) {
        die("File not accessible.");
    }
    $contents = file_get_contents($file);
    $_SESSION['dataCache'] = json_decode($contents, true);
    echo $contents;
}
So let's dig into the above code a little more. Here's what we are doing in a nutshell:
Turn on error reporting and start session support.
Check to see if we've already read the file for this user.
If so, pull the value from storage, echo it out, and exit. If not, continue below.
Save off the file name and do a little error checking to ensure PHP can find, open and read the contents of the file.
Read the file contents.
Save the decoded JSON, which is an associative array because of the `true` parameter passed to `json_decode`, into your `$_SESSION` variable.
Echo the contents to the screen.
This will save you the time and hassle of parsing out JSON data and/or building it manually on the server. It will be cached for the user's session so that they can use it throughout.
Client Side
I assume you are using Ajax to fetch the information? If not, correct me, but I was assuming that's where some of your JavaScript comes into play. If so, you may consider this:
Store the returned data in localStorage on the user's browser when it's returned from the server:
$.ajax({
    ...
    success: function (res) {
        localStorage.setItem("dataCache", JSON.stringify(res));
    },
    ...
});
Or if you use promise objects:
$.ajax({
    ...
}).done(function (res) {
    localStorage.setItem("dataCache", JSON.stringify(res));
});
When you need to read it you can do a simple test:
var data;
// getItem returns null if the item is not in local storage.
// Since null is falsy, the condition evaluates to false in that case.
if (localStorage.getItem("dataCache")) {
    data = JSON.parse(localStorage.getItem("dataCache"));
} else {
    // Make the ajax call, fetch the object, and store it in localStorage
    // in the success or done callbacks as described above.
}
Notes:
localStorage is an HTML5 feature, so it's not fully supported on all browsers yet. Most of the major ones support it, however, even as far back as IE8 (I think). There is, though, no standardized limit on how much these browsers are required to hold per site; around 5 MB per origin is typical.
It's important to take that into consideration: a sufficiently large JSON string could run up against that limit. Still, you could use this as a start. Combined with the server-side solution, you should see a performance increase.
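As a precaution, here is a minimal sketch of a defensive write, assuming res is the ajax response from the snippets above:

// localStorage throws when the per-site quota is exceeded,
// so wrap writes in a try/catch and degrade gracefully.
try {
    localStorage.setItem("dataCache", JSON.stringify(res));
} catch (e) {
    // Quota exceeded (the error name varies by browser); skip local caching.
    console.warn("Could not cache data locally:", e);
}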
Hope this helps.
I use the browser's cache to ensure that my large chunk of JSON is only downloaded once per session. I program in ASP.NET, but I'm sure PHP has the same mechanisms:
On session start, I generate a random string as a session key for my dynamic JavaScript. This key gets stored in the ASP.NET session state under the key JsonSessionID. That way I can refer to it in my page markup.
I have a "generic http handler" (an ashx file) that when called by the browser, returns a .js file containing my JSON.
In my HTML I include the dynamic script:
<script type="text/javascript" src="/dynamicJSON.ashx?v=<%= JsonSessionID %>"></script>
The browser will automatically cache any URLs included as scripts. The next time the browser is asked to load a cached script from a URL, it will just load up the file from the local disk. This includes dynamic pages like this.
By adding the ?v= in there, I ensure that the JSON is updated once per session.
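The same idea in plain JavaScript might look like the sketch below (sessionKey stands in for the server-rendered JsonSessionID value; the handler URL is taken from the markup above):

// Hypothetical: the server renders sessionKey into the page once per session.
var sessionKey = "abc123";
var script = document.createElement("script");
script.src = "/dynamicJSON.ashx?v=" + encodeURIComponent(sessionKey);
document.head.appendChild(script);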
Edit
I just realized that your JSON is probably static. If that's the case, you can just put your JSON into a static .js file that you include in your HTML, and the browser will cache it.
// conversionData.js
var conversionData = { "a":1,"b":2,"c":3 };
When you include the conversionData.js, the conversionData variable will be in scope with the rest of your page's JavaScript that dynamically updates the drop-downs.
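For instance, a minimal sketch of that dynamic update (the select element's id and the shape of conversionData are assumptions, not taken from the original post):

// Assumes a <select id="unitList"> element and that conversionData maps
// unit names to conversion factors, e.g. { "Celsius": 1, ... }.
var select = document.getElementById("unitList");
Object.keys(conversionData).forEach(function (unit) {
    var option = document.createElement("option");
    option.value = unit;
    option.textContent = unit;
    select.appendChild(option);
});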
Edit 2
If you are serving static files, this blog post has a good pattern for cache-busting based on the file's date modified property. i.e. the file is only downloaded when it is changed on the server.
I have yet to find a good method for cache-busting JSON created via database lookup tables, other than per-session, which isn't ideal because the database could change mid-session.
Once you've got your JSON data decoded into an object you can just keep the object around; it should persist until a page reload at least.
If you want to persist between reloads you might want to look at HTML5's localStorage etc.
You would need to come up with an age strategy, maybe just dump the current date in there with it as well so you can compare that and expire as needed.
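A minimal sketch of such an age strategy (the key name and lifetime are illustrative):

var MAX_AGE_MS = 60 * 60 * 1000; // one hour; adjust as needed

function saveCache(data) {
    // Store the payload together with the time it was saved.
    localStorage.setItem("dataCache", JSON.stringify({
        savedAt: Date.now(),
        payload: data
    }));
}

function loadCache() {
    var raw = localStorage.getItem("dataCache");
    if (!raw) return null;
    var entry = JSON.parse(raw);
    // Expire the entry if it's older than the allowed age.
    if (Date.now() - entry.savedAt > MAX_AGE_MS) {
        localStorage.removeItem("dataCache");
        return null;
    }
    return entry.payload;
}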
I would suggest storing your JSON data in a session. On first page load you can write a script to fetch your JSON data, then store it in a session.
On each page load/refresh afterwards you can check your session to decide what to do - use the session data, or fetch your JSON data again.
This approach suits me for small-scale data (for example: an array of products - colors - sizes - prices).
Based on your data, you should test your loading times.
Here is a simple hack:
Make a GET request to a PHP file with a parameter like "bla-bla.html" or "bla-bla.css"... well, you know, it makes the browser think it is not PHP but rather "html" or "css", and the browser will cache it.
To verify that the trick is working, go to the "network" tab of the browser dev panel and look at the "type" column along with "transferred" - instead of seeing php there and the actual size, you will find "html" and "(cached)".
This is also good to know when you're passing parameters like "blah-blak.html" to a PHP file and expect it will not be cached. Well, it will be cached.
Tested on FireFox Quantum 57.0.1 (Mac 64bit)
P.S.
Chrome 63 on Mac is capable of recognising the real file type in this situation, so it cannot be fooled.
Thinking out of the box here: if your list has 3000 lines and growing (as you said), is it possible for you to establish its maximum size? Let's say the answer is 10,000 items (max); then do you really need an Ajax call at all? You could transfer the data straight away with the page (depending on your architecture, of course, you could come up with a different solution).
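A minimal sketch of shipping the data with the page itself (the payload shape is illustrative):

<script>
    // Hypothetical: the server renders the JSON payload directly into the
    // page, so no Ajax round-trip is needed after load.
    var conversionData = { "Temp": { "C": 1, "F": 1.8 }, "Length": { "m": 1, "ft": 3.281 } };
</script>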

Can JavaScript variables be easily modified maliciously?

I am setting up a quiz that uses boolean variables for correct/incorrect and then passes those variable values to a PHP script via Ajax for processing and storing in a database.
How easily could someone override the values set by my code after finding the var names with "view source"?
Yes.
You should send the answers to the server and let the server grade the quiz.
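A minimal sketch of that pattern (the grade_quiz.php endpoint and payload are hypothetical; the server-side grading itself is omitted):

// Send the raw answers, never a client-computed score; the server
// holds the answer key and does the grading.
$.ajax({
    url: "grade_quiz.php", // hypothetical endpoint
    type: "POST",
    dataType: "json",
    data: { quizId: 42, answers: JSON.stringify(["b", "d", "a"]) },
    success: function (result) {
        console.log("Server-computed score:", result.score);
    }
});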
They can do it easily using the Chrome/Firebug console by issuing a JavaScript command there, like:
var your_var_name = 60;
You must also have backend validation to prevent this.
For testing purposes you can use the Firefox add-on Firebug or Chrome's developer tools. Change your JavaScript variable via the console or inspect element, and then perform the button action that posts your data. On the server side you should make sure that the posted value is one of the valid answer options for that question.

HTTP cookie between two HTML pages

I have two HTML pages. After entering a few inputs, users will be redirected from the first page to the second page. Before redirecting the user to the second HTML page (using window.location = "new HTML URL"), I persist a few of the user inputs in a cookie using the document.cookie DOM API.
When I am on the second HTML page, I cannot retrieve the value from this cookie. I think that since the document object has changed on the new HTML page, my cookie values have become inaccessible.
Can someone tell me how to retrieve, in one HTML page, the value of a cookie persisted by JavaScript on another HTML page, i.e. a cookie written by HTML page A's JavaScript, read from HTML page B's JavaScript?
I don't have any server-side code, so I cannot take advantage of server-side logic. Also, I am not supposed to pass the values in the URL. So I need a solution in plain JavaScript and HTML.
If someone has a better solution, please let me know. Thanks
Try using localStorage instead of cookies:
// set your values in the first page
localStorage.setItem('itemKey', 'values');
// on the second page, retrieve them
var values = localStorage.getItem('itemKey');
You can use the jStorage plugin for cross-browser behaviour.
Also refer to this question for storing objects instead of strings.
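For example, a minimal sketch of the stringify/parse round-trip for objects (the key and values are illustrative):

// localStorage only stores strings, so objects must be serialized.
var user = { name: "James", age: 25 };
localStorage.setItem("user", JSON.stringify(user));

// Later, on the second page:
var restored = JSON.parse(localStorage.getItem("user"));
console.log(restored.name); // "James"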
JAAulde is on point with his answer.
For what the OP is trying to do, something like PHP would be great; in that case I wouldn't bother with cookies just to pass data between two pages, that's just silly. However, if true persistence were needed and the data requirements were simple, cookies would be the way to go, even while using a language such as PHP.
Those are rather draconian constraints; is this a class project? That said, there aren't any other ways to do what you're attempting, save for an ugly and highly insecure hack of the DOM.
