Javascript function execution on link click? - javascript

I have a link that, when a user clicks on it, loads a different page as normal but also executes a JS function that autofills a specific text box on that different page. Is it better practice to use jQuery or JavaScript to do this? How can I do this using either one of them?

You can't do this from the source page.
It's a security feature. Imagine if you wrote a JS function that went to an online banking page and auto-filled a bank transfer using the user's current cookie. That's why you can't.
If you control the other page then the sequence you can use is:
Save data to the server;
Go to the new page with a JS redirect;
The new page is loaded from the server;
While loading the page, the data that was saved is retrieved from the server and used to populate the text box.
So it can be done from the server but only if you save it there. The only way of doing that is using Ajax.
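A minimal sketch of that first sequence, assuming jQuery is available and a hypothetical /save-prefill endpoint on your server that stores the value (for example in the session):
$('#myLink').on('click', function (e) {
    e.preventDefault();
    var target = this.href;
    // Save the value server side first, then redirect with JS once the save succeeds.
    $.post('/save-prefill', { value: 'text to prefill' })
        .done(function () {
            window.location.href = target;
        });
});
The target page then reads the saved value server side and renders it into the text box.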
An alternative approach is:
Instead of a JS redirect, submit the page back to the server;
The server saves whatever data it needs to;
The server sends back an HTTP redirect to the new page;
The new page uses the saved data to construct the new page with the populated text box.

At the end of the script, add return false;. This will make the browser run the script without following the link.
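For example, a minimal sketch (myFunction is a placeholder for whatever you want to run):
<a href="other-page.html" onclick="myFunction(); return false;">Click me</a>
With the return false; the browser runs myFunction() but does not follow the link; drop it if you do want the navigation to happen afterwards.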
Edit (after seeing your edit):
Is it better practice to use jQuery or JavaScript to do this? How can I do this using either one of them?
jQuery is a JavaScript library, so it doesn't matter whether you use plain JavaScript or jQuery, as long as you are happy with the result.
And as for your saying that you successfully manipulated a page from the redirecting page... I don't see how that is possible.

Related

How to preserve userscript modifications in Chrome after an asynchronous content update on an .aspx webpage

I'm trying to automate the workflow of a webpage for my company's inventory system. The page is generated by the server-side logic of an ASP.Net page, which I don't have access to. The page has several fields on it to allow you to enter a new container barcode, the item that should go in the container, etc. Each of these fields has an onchange event listener hooked up to it which calls the page's __doPostBack() function to verify the entered data. If the data is verified, the page code is re-served with the data entered so far, and focus is set to the next field on the form.
I want to automate this page with a userscript in Chrome. I started by using ViolentMonkey to inject a custom script, but I could only get the script to trigger on the initial load, not after each data entry. After this, I tried using Chrome Local Overrides to change __doPostBack() to try to capture the data I need to automate the page. That also only works once; after a field is filled and loses focus and new HTML is served, it overwrites Chrome's local copy.
I think that my problems are being caused by an asynchronous refresh of the entire page contents, which wipes out the injected userscript and Chrome's Local Override without triggering the normal page refresh listeners in Chrome Overrides or ViolentMonkey to re-inject the modified code. Does anyone have any thoughts on how I could modify the JavaScript in such a way that it would persist after the page content is replaced with new HTML?
P.S. I don't think the code itself is relevant to this particular problem, but if anyone thinks it would be helpful to share a limited section of the client-side code, let me know.
Edit 1: Here's a more in-depth view of what I'm trying to accomplish, and the progress I've made so far.
My Original Plan
The user loads the page. ViolentMonkey injects a userscript which issues a series of prompts, collecting data on the range of new barcodes that the user would like entered into the system. (Specifically, the barcode prefix, the starting barcode number, and the ending barcode number.) These values are stored in localStorage.
After this data has been collected and validated by the user, the page loads normally. For reference, the form looks something like this:
The user fills out the fields as normal. After each field is filled out (with the exception of the Container Description field), the page pushes focus to the next field. (For example: <script language="javascript"> try { document.getElementById('txtContDesc').focus() } catch (e) { } </script>. The id of the field to focus is dynamically changed via the server logic.)
I need to collect the User Badge, Container Type, and Destination Barcode values so that I can refill them later when I automate the form. My original plan was to add an onfocus event listener to the Container Description field, since focus will be shifted to it once the Destination Barcode field has been verified. At that point I will know that the user has successfully entered a valid entry for each of the fields above the Container Description field, and I would then be able to collect these values and store them in localStorage.
Once I have all the data needed for the form, I would pilot the form using the userscript in ViolentMonkey and the data stored in localStorage, to persist data across page refreshes.
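A rough sketch of that collection step - txtContDesc is taken from the snippet above, but the other field IDs are assumptions, not taken from the real page:
document.getElementById('txtContDesc').addEventListener('focus', function () {
    // By the time focus lands here, the fields above have been verified by the server.
    localStorage.setItem('userBadge', document.getElementById('txtUserBadge').value);     // assumed id
    localStorage.setItem('containerType', document.getElementById('txtContType').value);  // assumed id
    localStorage.setItem('destBarcode', document.getElementById('txtDestBarcode').value); // assumed id
});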
Other Alternatives:
The eventListener idea on an element doesn't work, because ASP.NET updates the page with fresh code every time a field is verified, wiping out the listener. It also doesn't trigger a refresh, so ViolentMonkey doesn't rerun my userscript.
My other thought was to modify __doPostBack(). The __doPostBack() function looks like this (as far as I can tell):
<script type="text/javascript">
    var theForm = document.forms['formNewContainer'];
    if (!theForm) {
        theForm = document.formNewContainer;
    }
    function __doPostBack(eventTarget, eventArgument) {
        console.log("Form submitted");
    }
</script>
It is called on verified fields with the following onchange handler:
onchange="javascript:setTimeout('__doPostBack(\'ctl00$newContPage$txtBarcode\',\'\')', 0)"
My goal would be to modify __doPostBack() to save the information I need to localStorage before executing the rest of __doPostBack() without changing it.
(Note: __doPostBack() here looks incredibly simplistic, so I think I'm missing some information about how ASP.NET works here. This is outside of the question though, unless it's relevant for what I'm trying to do.)
I was able to successfully modify __doPostBack() in this way using Chrome Local Overrides to serve myself a local copy of the page on page load, instead of the server version. But this only works for the first __doPostBack() request. After the first request, the server serves me new code. Like with ViolentMonkey, the lack of a refresh trigger prevents Chrome Local Overrides from re-serving my local copy, and I'm served code without the __doPostBack() modification.
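For what it's worth, the usual way to "save first, then run the original" without editing the served file is to wrap the function - a sketch, assuming __doPostBack already exists when this runs (and it would still be wiped whenever the server re-serves the page, as described above):
var originalDoPostBack = window.__doPostBack;
window.__doPostBack = function (eventTarget, eventArgument) {
    // Save whatever is needed before the real postback fires.
    localStorage.setItem('lastPostBackTarget', eventTarget);
    return originalDoPostBack.apply(this, arguments);
};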
So that's where I'm at. I'll try adding a global listener like #wOxxOm suggested, and see where that gets me.
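For reference, a "global" listener in this sense usually means delegating from document, which keeps working after the page body is replaced because the handler is attached above the nodes that get swapped out - a sketch of that idea (not necessarily wOxxOm's exact suggestion):
document.addEventListener('focusin', function (e) {
    // Fires for every focus change, including on elements created after this script ran.
    if (e.target && e.target.id === 'txtContDesc') {
        console.log('Container Description focused; the earlier fields should be verified now.');
    }
});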
I ended up using a Chrome extension called "Run Javascript" (it has an elephant for its logo), which runs the JavaScript code even on AJAX requests.
Link: https://chrome.google.com/webstore/detail/run-javascript/lmilalhkkdhfieeienjbiicclobibjao/
I don't see how this is possible at all. You need to work with the people that created that web page.
ASP.NET and the server-side code will be EXTENSIVE .NET code (C# or VB.NET). Each of those events you trigger will set variables and server-side session (or viewstate) values for the code behind to run.
That's how ASP.NET pages work. You post back, the page travels up to the server, THEN the .NET code behind runs. That code will modify the page, modify controls, and modify the view state for that page. And after that code runs (say on a button click), your client side will receive a whole new fresh page - that will blow out any JavaScript you try to inject (you would have to re-inject each time). But it gets worse, since quite a bit of that code behind also checks, and often will NOT tolerate, that the page settings have been messed with, and the postback will be rejected.
About the only way to do this would be to write some desktop software, and that software would "house" or "host" a full "COM" object copy of the web page, and thus you automate that given page (and even then, you're still fighting a losing battle).
Hint:
Web development, business logic, and a functional business application are NOT some simple markup and JavaScript (despite what that lame 2-week HTML course tells you).
This is an application, an ASP.NET application. Trying to think of this as just some markup and JavaScript is actually quite silly here. It's not how you write or build business solutions for a company.
If you can't write and modify the code and the web server side of things then find out if that site has some kind of web api or whatever.
But really - this is silly, and unless this is some simple college project, or some hacked-up HTML page and some JavaScript? Forget this approach - you're dealing with FAR too much server-side code and code behind on the server.
In fact, ASP.NET, as noted, has quite a few built-in features that check whether the page being posted back has been messed with, and you can never really be sure that you have set the values, or that the proper code behind has run to set up row values, database primary key values, and a WHOLE boatload of state values that are probably 100% saved in server-side session()-based class objects - objects that are never exposed to the client side.
Trying to modify, or assuming you can create or modify, such a system with only client-side tools is not going to work - it's just not.
When the code behind runs, it re-processes the page with .NET code and then sends the whole page back down - all with new state values etc. This is not some lame HTML + JavaScript, but a full server-side code-driven system written in C# .NET code.

Is it possible to make a HTTP POST to a standard ASP.Net Web Form using XMLHttpRequest/FormData?

Let's suppose we have an ASP.Net Web Form, Page.aspx, in which we do the following:
<script>
    $(document).ready(function () {
        // grab the standard ASP.Net form
        var form = document.forms['ctl01'];
        form.addEventListener("submit", function (event) {
            event.preventDefault();
            sendData(form);
        });
    });

    function sendData(form) {
        const xhr = new XMLHttpRequest();
        const fd = new FormData(form);

        xhr.addEventListener("load", function (event) {
            document.open();
            document.write(event.target.response);
            document.close();
        });

        xhr.addEventListener("error", function (event) {
            alert('Error!');
        });

        xhr.open("POST", "Page.aspx");
        xhr.send(fd);
    }
</script>
The reason for this setup is I want to take advantage of the XMLHttpRequest progress event to erm, show some progress indication because the postback may include files that take some time to upload.
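For the upload direction specifically, the progress listener goes on xhr.upload rather than on xhr itself - a minimal sketch that could slot into sendData() above:
xhr.upload.addEventListener("progress", function (event) {
    if (event.lengthComputable) {
        var percent = Math.round((event.loaded / event.total) * 100);
        console.log("Uploaded " + percent + "%");   // or update a progress bar here
    }
});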
The load event handler works great. As a result of the POST I get the contents of Page.aspx again and replace my current document. So it seems that some kind of POST actually does happen BUT, there is one problem. In Page.Load(), the Request.Form and Request.Files collections are empty so I can't process the form/files.
I tried adding the following header but without much luck:
xhr.setRequestHeader('Content-type', 'application/x-www-form-urlencoded');
Do you think it is possible to make a successful POST (with page receiving data) using XMLHttpRequest/FormData, or is there some fundamental limitation that prevents this from happening for ASP.Net pages?
Thanks!
Well, any ajax call can certainly run some code behind, but since the web page IS STILL sitting on the client side in the browser, things like controls and the page state are NOT available.
So you don't want a post back, but now you're asking for a post back? (I am confused). I mean, either you post back the whole page (a standard event post back), or you drop the controls and things in question into an update panel, and then ONLY that part of the page is posted back. I fail to see any advantage of trying to send "more" of the page in an ajax call when the WHOLE idea is to NOT send the page in the first place, right?
I mean, if you need some extra values in the ajax call, then you have to get/grab those bits and parts from the page, and include that information in your ajax call. (perhaps as a json string).
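For example, a sketch of grabbing a couple of values from the page and sending them along as JSON to a page method - SaveValues and the control ids are assumed names, not something the page already exposes:
$.ajax({
    type: "POST",
    url: "Page.aspx/SaveValues",                 // assumed [WebMethod] on the code behind
    contentType: "application/json; charset=utf-8",
    dataType: "json",
    data: JSON.stringify({
        barcode: $("#txtBarcode").val(),         // assumed control id
        notes: $("#txtNotes").val()              // assumed control id
    })
});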
Without a post back, viewstate and all of the controls are STILL just sitting on the user's desktop in the browser. The code behind, and even the page class object + code, ARE OUT OF scope at this point in time. Only upon a post back does the WHOLE page travel up to the server - code behind runs - you have use of the full controls on the page, and THEN the whole page travels back down to the client side (and this pretty much means that JavaScript code will have to re-start!!!).
However, if you need a few parts and values in a page and don't want a full post back? Then simply put those parts into an update panel. You can then, in JavaScript, for example do this:
var myAspNetButton = document.getElementById("Button1");
myAspNetButton.click();
The above will save you a ton of work, and you won't have to wire up a bunch of js and web methods, since the js code simply CLICKS on your button (which is inside the update panel). In this case, of course, a whole-page post back does NOT occur, but the page load and events do in fact fire - these so-called "partial" page post backs mean that the code behind is LIMITED to the information (controls) inside of that update panel.
However, as noted, if you do a full post back, then the browser page NOW travels up to the server - and that pretty much means any js code client side is toast and now can't run, since a whole NEW fresh copy of the web page is about to travel down to the client side again - and that re-starts your js code.
As noted, you can do a partial page post back with an update panel. And in js you can fire a "click" - or in fact MOST events - of asp.net controls on that page.
But, then again?
You don't want a full-page post back, and you likely don't want all of the controls and the whole page to travel up to the server. But then again you're wondering why you can't use or access controls on the page with ajax calls? Well, as noted, the server-side code behind is OUT of scope and OUT of context when you make ajax calls. The web page does NOT exist server side. We don't know if the user turned off their computer, or will never do anything in that client-side browser and web page. The server at that point in time has lost ALL KNOWLEDGE of that web page. So any ajax call does not have use of the controls on the page, and does not even have use of viewstate either.
This tends to mean that, say, when using an ajax system to upload files? Well, you can't store the status in the web page server side - since the page DOES NOT really exist at that point in time. So you can call some web methods, and about the ONLY way to keep some values in context is to use session(), since that does not need the web page, or the view state, to function and work.
The major downside of session(), of course, is that if some user has two tabs open, or even two different browsers open? Well, session() is SHARED between those pages - so while session() is great, it is also shared between ALL copies of web pages for that given user - and thus you need to add code to separate out each session "set" of values, or simply hope that the user will not have two pages in operation for such file uploads.
But to answer your question?
You can achieve partial page post backs by using an update panel. And thus you can have timer code or js code client side continue to run, since a full page life-cycle does NOT occur. In other words, you control what part of the web page is sent up to the server side by using an update panel.
If you don't use an update panel, then any ajax calls you make WILL have to pass the data from the browser side, since it's STILL just sitting on the user's desktop, and the code behind can't grab, reach out to, see, or even KNOW that the web page exists client side.
So you either pass extra values from the web page with your ajax calls, OR YOU can use an update panel, drop controls inside, and then the partial page post back will ONLY send up and have use of what you want inside of that panel. So you have two really great choices.
And in either case (a full page post back) or a partial one?
Grab a reference to the client-side asp.net button, and fire off a .click() event. You can, I suppose, wire up all kinds of __doPostBack calls in js, but with update panels and the click() trick, you have a choice of how much of the page gets sent up, and it is all pretty much automatically wired up for you, saving a TRUCKLOAD of work that you would have to manually write and wire up if you don't use an update panel to control this.
So you get that "partial" page post back, and in that case the code and events inside of that update panel can update/see/use/modify controls in that update panel, but anything outside of that update panel will NOT have traveled up to the server.
And if you don't use an update panel, then any ajax call is just that - a direct call to the server side - but the web page STAYS client side - thus the on-load, any of the controls or objects, in fact the WHOLE form class object that represents that web page, IS STILL SITTING client side - so, as noted, no on-load, no code behind can touch or even see or know about the values of controls on that page, and as noted there is also no ViewState either.
The WHOLE idea of ajax calls is that you did not want and never did want the page to travel up to the server, and then be re-rendered, and then re-sent back down to the client side. But you need to be 100% crystal clear here:
Without a page post back (or a partial one with update panels), the web page does NOT exist any more server side. Web pages are stateless, and once the round trip has occurred (web page up to server - code behind runs - page sent back to client), then as far as the server is concerned (and you, the developer), that web page is GONE and DOES NOT exist anymore at all - it is out of scope, and from your point of view (and the server's point of view) that web page does NOT exist anymore the instant it has been sent back down to the client side. As noted, the only practical exception here is session() values - since they are not part of any given web page.
So, you have to decide if you want a partial page post back to get at and modify some values with server side code.
Or you pass the values with your ajax calls, and the returned values can then update the browser controls. And of course, once you do eventually do that full page post back, the code behind can certainly see + use any controls that the client js code changed - but it can only do so with that full page post back, or, as noted, with the controls limited to an update panel if we are talking about a partial page post back (update panel).
You either have to include additional data in your ajax calls, or consider using a partial page post back to send up part of the web page if you need to modify that part of the page with code behind. Or, as noted, return information with your ajax call, and then update the client side. There's not really an in-between choice here.

Run PHP function when img clicked

I'm wondering if there's a way to perform a function when an image is clicked in PHP? I know it can be done in Javascript. For example, when foo.png is clicked, the following function will run.
function example() {
    header('Location: example.php');
}
PHP is run on the server, and is used before and while creating the page -- and then it is submitted to the browser for rendering.
Once the page is rendered and displayed to the user, all PHP processing has stopped and the page has been served to the client. Server code no longer runs - its time has passed.
Once in the browser, only client-side code can run (javascript). Particularly when interacting with the user, client side code is all you can use.
However, you can also use client-side code (javascript/jQuery) to interact with the user (detect a click or mouse movement) and then use AJAX to send data over to another server-side PHP file. The PHP file will "wake-up" as it receives the data, and it can do some additional server-side stuff -- such as use the received data to perform a DB look-up, then take the new DB data and create some HTML code and send that back to the page. This new HTML code can be injected onto the page in the success (or .done() ) function of the AJAX code block and new data can appear on the page without refreshing or navigating away from the current page.
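A minimal sketch of that round trip, assuming jQuery, an image with id="foo", and a hypothetical lookup.php that returns an HTML fragment:
$('#foo').click(function () {
    $.post('lookup.php', { id: 123 })       // send whatever the PHP code needs
        .done(function (html) {
            $('#result').html(html);        // inject the returned HTML without reloading
        });
});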
But for actually detecting the user click, it's javascript/jQuery.
Note that PHP can also inject javascript along with the HTML - but once the code has been served to the browser for display, it is only the javascript that can interact with the user, not the PHP.
With javascript/jQuery, you can do what you want like this (the code is correct but the example will not work properly - SO will not navigate to the Google webpage):
$('#myImg').click(function(){
    window.location.href = 'http://google.com';
});
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
<p>Click below for Google</p>
<img id="myImg" src="http://placeimg.com/50/50/animals" />

Save jquery modified page permanently

I want to save modifications made on an HTML page (modifications made with jQuery), PERMANENTLY! I have read that this becomes possible by sending an Ajax call and saving it in a table in a database, but what do you actually save in the database? The URL of the page? And what do you retrieve back in the Ajax call so that your modifications actually stay on the page?
This is a Spring MVC based web application, in case this information is needed.
I have no clue how to start, or whether to even start trying to save it, because I have also read that this might not be possible, as we're talking about client-side modifications.
Modification that I am trying to make:
function versionOne() {
    $('#title').addClass('text-center');
    $('#title').css({"margin-top":"0px","color":"black", "font-size":"45px"});
    $('#title').append('<hr>');
    $('#content').addClass('col-md-6');
    $('#content').css({"margin-top":"80px","font-size":"20px", "text-align":"center"});
    $('#picture').addClass('col-md-6');
    $('#picture').css({"border-radius":"25px", "margin-top":"50px"});
}
I'd be grateful for some suggestions!
Thanks :)
Saving the whole page won't work in most cases since it's very hard to also save the JavaScript state. So while you can save a static copy of the page without JavaScript with $('html').html(), that doesn't get you very far (or causes more trouble than it's worth).
What you want is "preferences". The site should remember some specific values. The usual approach is to load the preferences from the database before the server sends the page to the client. Apply them to the elements of the page and send the result to the browser. That way, the page looks as expected when the user sees it.
When the user changes these settings, use JavaScript to update the page and send the changes as AJAX requests to the server to persist them in the database.
When the user returns to the page, the code above will make sure that the page now looks as before.
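A sketch of the client-side half, assuming a hypothetical /preferences endpoint in the Spring MVC app that persists the values for the current user:
function saveLayoutPreference(prefs) {
    $.ajax({
        url: '/preferences',               // assumed Spring MVC endpoint
        type: 'POST',
        contentType: 'application/json',
        data: JSON.stringify(prefs)
    });
}
function versionOne() {
    $('#title').addClass('text-center');
    // ...apply the rest of the styling shown in the question...
    saveLayoutPreference({ layout: 'versionOne' });   // persist the choice
}
On the next request, the server reads the stored preference and renders the page with that layout already applied.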

How do I get url that is hidden by javascript on external website?

How do I get url that is hidden by javascript on external website?
ex: http://royaldesign.se/Att_Dricka.aspx
This URL stays constant as you navigate through the pages, so the page content is loaded by JavaScript.
The link location of a page looks like this:
javascript:__doPostBack('ctl00$masterContent$DataPager2$ctl00$ctl00','')
javascript:__doPostBack('ctl00$masterContent$DataPager1$ctl00$ctl01','')
javascript:__doPostBack('ctl00$masterContent$DataPager1$ctl00$ctl02','')
.....
Is there a way to analyze (manually or by PHP script) the function __doPostBack to find out about the urls?
Thx in advance
Those values are not hidden; the __doPostBack method posts back to the page itself. The values passed to __doPostBack represent the HTML IDs of the controls doing the postback.
The page you're looking at is written in ASP.NET, by the way, not PHP.
You can use your browser's debug tools to see what data is being passed back to the server via JavaScript.
The __doPostBack javascript function is used to submit data to an asp.net page.
The first parameter to the function is the event target. This is the ClientID of the control that is being clicked.
Asp.net uses this value to raise a Click event on the server when the page gets submitted.
You can call this __doPostBack function via javascript yourself to get the same behavior as a user clicking it.
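For example, pasting this into the browser console on that page should trigger the same postback as clicking the corresponding pager link (the target is taken from the links in the question):
__doPostBack('ctl00$masterContent$DataPager1$ctl00$ctl01', '');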
I gave some tips on "scraping" ASP.net pages on this other question: curl script just filling up the form not submitting it
The basics of simulating a POST request using CURL are discussed here: PHP + curl, HTTP POST sample code?
I would also add that if the site you are "scraping" from is owned by someone you are on friendly terms with (and not e.g. a competitor!) you may be able to save a lot of time by asking nicely for the content, or a static URL that gives you the content.
