This is going to be a difficult one to explain, but basically I have an Angular application with a login form that runs a function on submit, which sends an AJAX request to the server to do the login.
Now, I'm not using ng-submit but hijacking the normal submit attribute like this:
<form my-form submit="controllername.doSubmit()">
I then have an Angular directive called "my-form" which uses {require: 'form'} in its directive definition object and then does this in the postLink function:
element.bind('submit', function(event)
{
// Removed for brevity
scope.$apply(scope.submit);
});
So, basically, this form-submit code was written a long time ago and does a lot of other things by default, like triggering form validation, so I don't want to rewrite any of it or go back to using ng-submit. Aside from anything else, I have a few big apps using this code which would need to change a lot.
Anyway, it all works fine on the surface. But if I fill in the login form, then do some other stuff (including filling in other forms set up the same way), then leave my laptop for a few days and come back to the page, somehow all the form data has been added to the URL bar, after the ? and before the #, including the password in plain view!
I'm not sure why this doesn't happen straight away, only after un-sleeping the PC, and not always. The other weird thing is that the names of the parameters are not the original ones (email, password) but the names of the parameters of the first form currently on the page (actionStatus, required), so Chrome is obviously getting very confused.
My instinct tells me that when the form is submitted, the form data is being stored somewhere for later, because I'm not cancelling the default action of the form correctly when I run my JavaScript function, and because it's a single-page application that form data never leaves memory. The browser then thinks it's gone to a new page and puts that data in the URL, but it gets the names wrong because the forms on the page have changed.
Sorry I can't provide more code, just a fairly woolly description, but I don't know what else to say really; it's all very strange.
I finally have an answer to this and the answer is totally unexpected.
I use a plugin called BrowserSync to sync the multiple browsers I test on, and it has a feature called form syncing, which was switched on.
I had Chrome open, and when I opened up another browser (typically in the morning when the computer first woke up) I had to log in. Because the two browsers were on different pages, BrowserSync was copying my login details from Firefox into the first form it found in my existing Chrome window. It was also putting the contents of Firefox's login form into Chrome's URL.
Crazy.
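If anyone else hits this, the fix is to switch form syncing off. A minimal sketch of the relevant BrowserSync setting, the ghostMode option in bs-config.js (option names per the BrowserSync docs):

module.exports = {
    ghostMode: {
        clicks: true,
        scroll: true,
        forms: false // stop BrowserSync mirroring form input between synced browsers
    }
};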
I'm trying to automate the workflow of a webpage for my company's inventory system. The page is generated by the server-side logic of an ASP.Net page, which I don't have access to. The page has several fields on it to allow you to enter a new container barcode, the item that should go in the container, etc. Each of these fields has an onchange event listener hooked up to it which calls the page's __doPostBack() function to verify the entered data. If the data is verified, the page code is re-served with the data entered so far, and focus is set to the next field on the form.
I want to automate this page with a userscript in Chrome. I started by using ViolentMonkey to inject a custom script, but I could only get the script to trigger on the initial load, not after each data entry. After that, I tried using Chrome's Local Overrides to change __doPostBack() to try to capture the data I need to automate the page. That also only works once; after a field is filled and loses focus and new HTML is served, it overwrites Chrome's local copy.
I think that my problems are being caused by an asynchronous refresh of the entire page contents, which wipes out the injected userscript and Chrome's Local Override without triggering the normal page refresh listeners in Chrome Overrides or ViolentMonkey to re-inject the modified code. Does anyone have any thoughts on how I could modify the JavaScript in such a way that it would persist after the page content is replaced with new HTML?
P.S. I don't think the code itself is relevant to this particular problem, but if anyone thinks it would be helpful to share a limited section of the client-side code, let me know.
Edit 1: Here's a more in-depth view of what I'm trying to accomplish, and the progress I've made so far.
My Original Plan
The user loads the page. ViolentMonkey injects a userscript which issues a series of prompts, collecting data on the range of new barcodes that the user would like entered into the system (specifically, the barcode prefix, the starting barcode number, and the ending barcode number). These values are stored in localStorage.
After this data has been collected and validated by the user, the page loads normally. For reference, the form looks something like this (screenshot of the form not reproduced here):
The user fills out the fields as normal. After each field is filled out (with the exception of the Container Description field), the page pushes focus to the next field. (For example: <script language="javascript"> try { document.getElementById('txtContDesc').focus() } catch (e) { } </script>. The id of the field to focus is dynamically changed via the server logic.)
I need to collect the User Badge, Container Type, and Destination Barcode values so that I can refill them later when I automate the form. My original plan was to add an onfocus event listener to the Container Description field, since focus is shifted to it once the Destination Barcode field has been verified. At that point I know the user has successfully entered a valid value for each of the fields above the Container Description field, and I can collect those values and store them in localStorage.
Once I have all the data needed for the form, I would pilot the form using the userscript in ViolentMonkey and the data stored in localStorage, to persist data across page refreshes.
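A rough sketch of that plan as a userscript (txtContDesc comes from the page source above; the other field ids are guesses and would need to match the real markup):

document.getElementById('txtContDesc').addEventListener('focus', function () {
    // Snapshot the verified fields before the next postback replaces the DOM
    localStorage.setItem('containerFormData', JSON.stringify({
        userBadge: document.getElementById('txtUserBadge').value,          // hypothetical id
        containerType: document.getElementById('ddlContainerType').value, // hypothetical id
        destBarcode: document.getElementById('txtDestBarcode').value      // hypothetical id
    }));
});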
Other Alternatives:
The eventListener idea on a single element doesn't work, because ASP.NET replaces the page with fresh code every time a field is verified, wiping out the listener. It also doesn't trigger a refresh, so ViolentMonkey doesn't rerun my userscript.
My other thought was to modify __doPostBack(). The __doPostBack() function looks like this (as far as I can tell):
<script type="text/javascript">
    var theForm = document.forms['formNewContainer'];
    if (!theForm) {
        theForm = document.formNewContainer;
    }
    function __doPostBack(eventTarget, eventArgument) {
        console.log("Form submitted");
    }
</script>
It is called on verified fields with the following onchange handler:
onchange="javascript:setTimeout('__doPostBack(\'ctl00$newContPage$txtBarcode\',\'\')', 0)"
My goal would be to modify __doPostBack() to save the information I need to localStorage before executing the rest of __doPostBack() unchanged.
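Something like this wrapper is what I have in mind (just a sketch; the localStorage key and what gets saved are placeholders):

var originalDoPostBack = window.__doPostBack;
window.__doPostBack = function (eventTarget, eventArgument) {
    // Save what I need before the postback replaces the page
    localStorage.setItem('lastPostBackTarget', eventTarget); // placeholder
    return originalDoPostBack.apply(this, arguments);
};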
(Note: __doPostBack() here looks incredibly simplistic, so I think I'm missing some information about how ASP.NET works. That's outside the scope of this question, though, unless it's relevant to what I'm trying to do.)
I was able to successfully modify __doPostBack() this way using Chrome Local Overrides to serve myself a local copy of the page on page load, instead of the server version. But this only works for the first __doPostBack() request. After the first request, the server serves me new code. As with ViolentMonkey, the lack of a refresh trigger prevents Chrome Local Overrides from re-serving my local copy, and I'm served code without the __doPostBack() modification.
So that's where I'm at. I'll try adding a global listener like #wOxxOm suggested, and see where that gets me.
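For reference, my understanding of the global-listener idea is event delegation on document, which should survive the DOM being swapped out (a sketch; focusin is used because, unlike focus, it bubbles up to document):

document.addEventListener('focusin', function (e) {
    if (e.target.id === 'txtContDesc') {
        // This document-level listener survives ASP.NET replacing the fields,
        // so it fires even after new HTML is served
        console.log('Container Description focused; snapshot the form here');
    }
});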
I ended up using a Chrome extension called "Run Javascript" (it has an elephant for its logo), which runs the JavaScript code even on AJAX requests.
Link: https://chrome.google.com/webstore/detail/run-javascript/lmilalhkkdhfieeienjbiicclobibjao/
I don't see how this is possible at all. You need to work with the people who created that web page.
ASP.NET and the server-side code will be EXTENSIVE .NET code (C# or VB.NET). Each of those events you trigger will set variables and server-side session (or viewstate) values for the code behind to run.
That's how ASP.NET pages work. You post back, the page travels up to the server, THEN the .NET code behind runs. That code will modify the page, modify controls, and modify the view state for that page. And after that code runs (say, on a button click), your client side will receive a whole new fresh page - and that will blow out any JavaScript you try to inject (you would have to re-inject each time). But it gets worse, since quite a bit of that code behind also checks whether the page settings have been messed with, often will NOT tolerate it, and the postback will be rejected.
About the only way to do this would be to write some desktop software, and that software would "house" or "host" a full COM-object copy of the web page, and you would thus automate that given page (and even then, you're still fighting a losing battle).
Hint:
Web development, business logic, and a functional business application is NOT some simple markup and JavaScript (despite what that lame 2-week HTML course tells you).
This is an application, an ASP.NET application. Trying to think of this as just some markup and JavaScript is actually quite silly here. It's not how you write, or build, business solutions for a company.
If you can't write and modify the code and the web-server side of things, then find out if that site has some kind of web API or the like.
But really, this is silly. Unless this is some simple college project, or some hacked-up HTML page with a bit of JavaScript? Forget this approach - you're dealing with FAR too much server-side code behind on the server.
In fact, as noted, ASP.NET has quite a few built-in features that check whether the page being posted back has been messed with. And you can never really be sure that the values you set are picked up, or that the proper code behind runs to set up row values, database primary-key values, and a WHOLE boatload of state values - values that are probably 100% saved in server-side session()-based class objects, objects that are never exposed to the client side.
Trying to modify, or assuming you can create or modify, such a system with only client-side tools is not going to work - it's just not.
When the code behind runs, it re-processes the page with .NET code and then sends the whole page back down - all with new state values etc. This is not some lame HTML + JavaScript page, but a full server-side code-driven system written in C# .NET code.
So I recently signed up with anti-captcha and have been testing with the https://github.com/ad-m/python-anticaptcha/blob/master/examples/recaptcha_selenium.py script.
I cannot get past a reCAPTCHA that has no submit button (hidden or visible) nor a clear way to submit for verification. I've used the anti-captcha firefox plugin, so I know it can be passed. But I am stuck at the point of doing this manually myself.
I thought this was going to be a helpful answer, but it doesn't go into depth. I am able to get the job.get_solution_response() token and enter it into the required textfield, but I cannot submit the "form."
Does anyone have success with this? I am also looking to do this in a headless version of the browser. Would a solution be different based on headless vs non-headless?
BTW, realtor(dot)com is the website I am having trouble with. If I am not allowed to post this site, please let me know so I can remove it.
I went back into the source code of the site and found they are using a function as a callback for verification...
I saw that they are using a function called solvedCaptcha and injecting a variable named payload. So here is how I solved it:
NOTE: Make sure the driver is currently on the reCAPTCHA page.
driver.execute_async_script("var payload = '<enter the job.get_solution_response() here>'; solvedCaptcha(payload);")
This async script then calls the page's verification and it reloads the blocked window.
NOTE: This CAPTCHA did not have a submit button and was not placed in a form. So using .submit() on an item within the driver would not work.
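For context, a sketch of what that injected line amounts to on the page (solvedCaptcha is that site's own callback; setting g-recaptcha-response is the standard step for reCAPTCHA integrations, though whether this particular site needs it is an assumption):

var payload = 'TOKEN_FROM_job.get_solution_response'; // placeholder token
document.getElementById('g-recaptcha-response').value = payload; // standard reCAPTCHA textarea
solvedCaptcha(payload); // the site's own verification callback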
Try sending keys of \n or Keys.ENTER to the text field. This was my solution in my projects, and it's a generic one.
In all of the sample code I have seen, it appears that the only function of onbeforeunload is to serve up an alert dialog box before the person leaves the page. Is that the only thing that can be triggered by the event, or is it possible to do something else, like an unobtrusive function that sends off partial form data?
I am trying to capture abandoned shopping carts in Yahoo! Small Business and unfortunately I do not have access to any server side scripting, so I'm forced to work client-side only.
I was also thinking of doing an ajax posting of data after the email field was changed, and then comparing the list of all forms partially submitted against completed carts to determine which were incomplete.
You can save the partial form data in localStorage. Then, when another page is loaded, you could check for the presence of that data and AJAX it to the server, deleting it from localStorage on success. Or you might be able to just use that data in JavaScript, without involving the server, but that depends on your setup.
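A sketch of that approach, assuming a hypothetical form id and capture endpoint:

window.addEventListener('beforeunload', function () {
    var form = document.getElementById('checkoutForm'); // hypothetical id
    localStorage.setItem('abandonedCart', JSON.stringify({
        email: form.elements['email'].value,
        savedAt: Date.now()
    }));
});
// On the next page load, forward anything captured earlier, then clear it
window.addEventListener('load', function () {
    var saved = localStorage.getItem('abandonedCart');
    if (saved) {
        fetch('/capture-abandoned-cart', { method: 'POST', body: saved }) // hypothetical endpoint
            .then(function () { localStorage.removeItem('abandonedCart'); });
    }
});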
<body onbeforeunload="return ('You will lose all your data')" onunload="alert('You have gone away!')">
</body>
onbeforeunload is for the alert box; onunload is for anything else.
You can technically fire off an ajax event, but there is no guarantee that it will complete before the page is actually reloaded.
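One hedged aside: navigator.sendBeacon was added to browsers for exactly this case; it queues a small POST that is allowed to outlive the page:

window.addEventListener('beforeunload', function () {
    var emailField = document.querySelector('input[name="email"]'); // hypothetical field
    var data = new Blob([JSON.stringify({ email: emailField.value })], { type: 'text/plain' });
    navigator.sendBeacon('/capture-abandoned-cart', data); // hypothetical endpoint
});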
We are trying to implement a web page where refreshing the page does not resubmit the form. How can we achieve that? Is there any JavaScript or HTML that can do it WITHOUT an external JavaScript library (jQuery, Dojo, or ExtJS)?
The reason for this design is that the form ties a unique relation to the current data, which means it cannot be done twice; but for security reasons we have to use POST instead of GET, and after the action we still want the user to be able to perform a similar action on the same page for another relation. So how do we avoid a consequence like that?
Thanks.
Suppose the form's action submits it to submit_form.php. That file can handle the data and do whatever it needs to do. Then, in its response, it can redirect the browser to a separate page (you'll have to look up the exact method of doing this in whatever language you write your POST handler in). This separate page can show the results of the form submission using session variables or some other method.
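That's the Post/Redirect/Get pattern. A minimal sketch of it in Node/Express rather than PHP (the route names are made up):

const express = require('express');
const session = require('express-session');
const app = express();
app.use(express.urlencoded({ extended: false }));
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: false }));
app.post('/submit_form', function (req, res) {
    // ...create the unique relation here...
    req.session.result = 'Created relation for ' + req.body.item;
    res.redirect(303, '/result'); // 303 makes the browser follow up with a GET
});
app.get('/result', function (req, res) {
    // Refreshing this page repeats a harmless GET, not the POST
    res.send(req.session.result || 'Nothing submitted yet.');
});
app.listen(3000);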
Internet Explorer (with default settings, which I generally assume will be in effect on the desktops of the Great Unwashed) seems to dislike the idea of accepting attachment content in an HTTP response if the corresponding request wasn't made directly from a user action (like a "click" handler, or a native form submit). There are probably more details and nuances, but that's the basic behavior that's frustrating me.
It seems to me that this situation is common: the user interface in front of some downloadable content — say, a prepared PDF report — allows for some options and inputs to be used in the creation of the content. Now, as with all forms that allow the user to stipulate how an application does something, it's possible that the input will be erroneous. Not always, but sometimes.
Thus there's a dilemma. If the client tries to do something fancy, like run an AJAX transaction to let the server vet the form contents, and then resubmit to get the download, IE won't like that. It won't like it because the actual HTTP transaction that carries the attachment back will happen not in the original user-action event handler, but in the AJAX completion callback. Worse, since the IE security bar seems to think that the solution to all one's problems is to simply reload the outer page from its original URL, its invitation to the user to go ahead and download the suspicious content won't even work.
The other option is to just have the form fire away. The server checks the parameters, and if there's anything wrong it responds with the form-container page, peppered appropriately with error messages. If the form contents are OK, it generates the content and ships it back in the HTTP response as an attached file. In this case (I think), IE is happy because the content was apparently directly requested by the user (which is, by the way, a ridiculously flimsy way to tell good content from bad content). This is great, but the problem now is that the client environment (that is, the code on my page) can't tell that the download worked, so the form is still just sitting there. If my form is in some sort of dialog, then I really need to close that up when the operation is complete — really, that's one of the motivations for doing it the AJAX way.
It seems to me that the only thing to do is equip the form dialogs with messaging that says something like, "Close this when your download begins." That really seems lame to me because it's an example of a "please push this button for me" interface: ideally, my own code should be able to push the button when it's appropriate. A key thing that I don't know is whether there's any way for client code to detect that a form submission has resulted in an attachment download. I've never heard of a way to detect that, but that'd break the impasse for me.
I take it you're submitting the form with a different target window; hence the form staying in place.
There are several options.
Keep the submit button disabled and do ongoing validation in the background, polling the form for changes to fields and then firing off the validation request for a field as it changes. When the form is in a valid state, enable the button; when it isn't, disable the button. This isn't perfect, as there will tend to be a delay, but it may be good enough for whatever you're doing.
Do basic validation that doesn't require round-trips to the server in a handler for the form's submit event, then submit the form and remove it (or possibly just hide it). If the further validation on the server detects a problem, it can return a page that uses JavaScript to tell the original window to re-display the form.
Use a session cookie and a unique form ID (the current time from new Date().getTime() would do); when the form is submitted, disable its submit button but keep it visible until the response comes back. Make the response set a session cookie with that ID indicating success/failure. Have the window containing the form poll for the cookie every second or so and act on the result when it sees it. (I've never done this last one and am not immediately seeing why it wouldn't work; there's a sketch of it after this answer.)
I expect there are about a dozen other ways to skin this cat, but those are three that came to mind.
(Edit) If you're not submitting to a different target, you might want to go ahead and do that -- to a hidden iframe on the same page. That (possibly combined with the above or other answers) might help you get the user experience you're looking for.
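A minimal sketch of the cookie-polling option (the third one above), assuming the server echoes the submitted token back as a downloadToken cookie on the attachment response; all the names here are made up:

var token = String(new Date().getTime());
document.getElementById('downloadTokenField').value = token; // hypothetical hidden input posted with the form
var submitButton = document.getElementById('submitBtn');     // hypothetical button id
submitButton.disabled = true;
var poll = setInterval(function () {
    if (document.cookie.indexOf('downloadToken=' + token) !== -1) {
        clearInterval(poll); // the attachment response arrived, so the download has started
        submitButton.disabled = false;
        // close or hide the form dialog here
    }
}, 1000);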
There are a number of really good reasons IE does this, and I'm sure it's not something anyone would argue with - so the main objective is to get around it somehow to make things better for your users.
Sometimes it's worth re-thinking how things are done. Perhaps disable the button, use JavaScript to check when all the fields are filled out, and fire off an AJAX request once they are. If the AJAX was successful, enable the button. This is but one suggestion; I'm sure there will be more...
Edit: more...
Do a simple submission (non-AJAX), and if the checks fail, send a page back rather than an attachment. The page sent back could contain all the information originally submitted (plus whatever error message applies) so the user doesn't need to fill out the entire form again. And I'm also sure there will be more ideas...
Edit: more...
I'm sure you've seen this type of thing before - and yes, it is an extra click (not ideal, but not hard)... an "if your download fails, click here". In this case, do it as you want to do it, but add a new link/button to the page when the AJAX returns, so that if the download failed, the user can submit the already-validated form from a "direct user action". And I'm sure I'll think of more (or someone else will)...
I have been fighting a similar issue for a while. In my case, posting to a hidden iframe didn't work if my web app was embedded in an iframe on another site (third-party cookie issues) unless our site was added to the Trusted Sites list.
I have found that I could break the download up into a POST and GET sequence. The POST returns a short-lived GUID that can be used in a GET request to initiate the download. The POST can do the form validation as well as return the GUID in a successful response. Once the client has the GUID, you can set the src property of a hidden iframe element to the download URL. The browser sees the 'Content-Disposition: attachment' header and gives the user a download ribbon to download the file.
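A sketch of that sequence from the client side (the endpoints and response shape are assumptions):

var formData = { reportId: 42 }; // hypothetical payload
fetch('/reports', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(formData)
})
.then(function (res) { return res.json(); })
.then(function (body) {
    // body.guid is the short-lived token returned by the validated POST
    var iframe = document.createElement('iframe');
    iframe.style.display = 'none';
    iframe.src = '/reports/download?guid=' + encodeURIComponent(body.guid);
    document.body.appendChild(iframe); // Content-Disposition: attachment triggers the download
});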
So far it appears to work in all the latest browsers. Unfortunately, it requires you to modify your server-side API for downloading the file.