Creating an auto-login script for a website? - javascript

We have an intranet website that requires me to enter login information every time, and it's annoying.
I want to create some sort of automatic login for this website.
I tried creating a two-frame page, with one frame showing the login page and the other holding my own HTML. From my HTML I wanted to enter values into the textboxes in the other frame, but got an "access denied" JS error.
Any other ideas?
(I know I can record a macro, but unfortunately I can't install any third-party applications on the internal network. I could also have used RoboForm...)
Oh yeah, and it has to work on the worst browser ever, Internet Explorer 6...

Maybe a local HTML file with a form that mimics the login form and auto-submits (e.g. onload="form1.submit();")? Then you can run this file, it will auto-submit, and you'll be redirected to where you want to go (the form's action).
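A minimal sketch of such a local file, assuming (purely as an example) that the intranet login form posts fields named "username" and "password" to /login.aspx; copy the real action URL and field names from the actual login page's source:

    <!-- autologin.html: open this local file and it submits the login form for you.
         The action URL and field names are placeholders; replace them with the
         ones used by the real intranet login page. Should work even in IE6. -->
    <html>
      <body onload="document.form1.submit();">
        <form name="form1" method="post" action="http://intranet.example/login.aspx">
          <input type="hidden" name="username" value="myuser">
          <input type="hidden" name="password" value="mypassword">
        </form>
      </body>
    </html>

Keep in mind the password ends up stored in plain text in that file.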

A Greasemonkey script might work.

Related

(Semi-)Automatically logging in to websites

I want to automatically log in to specific websites, e.g. the groupware web interface at work. My browser (Chrome on Linux, if that matters) saves passwords for me, but I want a complete auto-login, so that I don't even have to click the "login" button anymore.
I have investigated multiple ways to approach this, but none of them has turned out to be satisfactory:
1. Use a Tampermonkey JavaScript which clicks the "login" button on the website
I wrote a custom JavaScript which was supposed to just click the "submit" button once I load the login page; Chrome was supposed to fill in the password fields. The idea sounded pretty straightforward. However, this is bad for two reasons.
On the one hand, I cannot use Chrome's saved password. Chrome has a policy that the password field already displays the circles, but the password is not actually filled in and is also not accessible from JavaScript until the user has performed a gesture such as clicking (see this Chromium issue), which kind of defeats the purpose of my JavaScript. I could work around this by additionally saving the password in localStorage (security wouldn't be compromised, as the saved passwords are not encrypted either), but this doesn't feel good.
On the other hand, this breaks an (imho) significant security feature of Chrome. It is the same feature mentioned above, the one which prevents XSS attacks from stealing login passwords. With my script, whenever I load the login page the password would be filled in and it would log me in.
So what I would rather want is a special (if possible local) page which I can bookmark, but which will (probably) never be known to anyone performing an XSS attack on me.
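For reference, a minimal sketch of the kind of userscript described in approach 1; the @match URL, the selectors and the localStorage key are all hypothetical:

    // ==UserScript==
    // @name     Auto-login (sketch)
    // @match    https://groupware.example.com/login*
    // @grant    none
    // ==/UserScript==
    (function () {
        'use strict';
        // Hypothetical workaround: the password is kept in localStorage by the
        // script itself, because Chrome's saved password is not readable here
        // before a user gesture. This is exactly the trade-off discussed above.
        var pw = localStorage.getItem('groupware_password');
        var pwField = document.querySelector('input[type="password"]');
        var button = document.querySelector('button[type="submit"], input[type="submit"]');
        if (pw && pwField && button) {
            pwField.value = pw;
            button.click();
        }
    })();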
2. Use a local HTML page which loads the login page, fills out the form and logs me in
This is a simple idea and would accomplish my goal, but of course it doesn't work because of the same-origin policy.
3. Use a script/program
This would theoretically work. I could write a program which downloads the login page, reads the form, submits it and then transfers the cookies (or the login URL, if the form uses GET) to the browser. However, this would be a major piece of work, especially in the case where the form uses the POST method (I'd have to transfer cookies to a possibly running instance of Chrome).
Plus, I'd have to somehow tie this program to a local web server, or turn it into an extension, so I could access it from within my browser. After all, opening a shell and typing a command is not really easier than clicking a login button.
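To illustrate the first half of approach 3, a rough sketch in Node.js (20+, built-in fetch); the URL and field names are made up, and the genuinely hard part, handing the resulting cookie to a running Chrome instance, is not shown:

    // Sketch: POST the login form and print the session cookie the server sets.
    const loginUrl = 'https://groupware.example.com/login';   // placeholder URL

    async function login(user, password) {
        const body = new URLSearchParams({ username: user, password: password });
        const res = await fetch(loginUrl, {
            method: 'POST',
            body: body,
            redirect: 'manual'   // keep the login response itself, with its Set-Cookie header
        });
        console.log('Status:', res.status);
        console.log('Session cookies:', res.headers.getSetCookie());
    }

    login('me', 'secret');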
4. Use cookies
This is not really an approach, but I mention it here for completeness' sake. By default, Chrome removes all cookies when I exit the browser. I can configure it to keep the cookies of specific websites so I don't have to log in again when I restart it. Some websites use only session cookies, though, so closing the last tab already (correctly) removes the cookies and I have to login again. As a result, cookies only solve my problem for a few websites, but not all.
So my question is: Is there an easier way to accomplish automatic log-in without having to circumvent security features or write a large program?
P.S.: I know, this is a lot of effort to get around clicking a single button every now and then :)

Crawling a website fails because javascript isn't enabled

I use Abot for crawling.
I want to crawl a website that appears to block any request that doesn't have JavaScript enabled.
It's a PHP page, and I get a "Please activate javascript to view this site." message instead of the real site.
How does a page know whether JavaScript is enabled or not? (And in your opinion, do you think I can overcome that?)
Thanks
Unless you crawl the site with a JavaScript-enabled client, you are out of luck.
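As for the detection part, a common pattern (a sketch of one way sites do it, not necessarily what this particular site does) is to serve only the warning in the static HTML and let a script replace it, so a client that never executes JavaScript never sees anything else:

    <!-- The static markup contains only the warning. -->
    <div id="content">Please activate javascript to view this site.</div>
    <script>
        // Runs only in JavaScript-enabled clients: swap in (or fetch) the real content.
        document.getElementById('content').innerHTML = 'Real site content goes here...';
        // Variants set a cookie here and reload, so the server knows JS is available.
    </script>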

How can I prevent saving/downloading web page?

I was wondering if there was a way to prevent a user from saving/downloading a web page? Specifically, I mean not letting them have access, on their own machine, to the data displayed through my web application.
I've heard that this is not possible since the browser must have access to the source code/data, but at the same time, I've noticed that if I log in to my Gmail account, open an email and save the page, that saved page doesn't work when I try to open it on my computer. Furthermore, if I click "view source", I can see that even the source does not contain the entire email message, even though the email is open in my browser.
How is it possible for Gmail to prevent me from seeing that email data?
That's what's called rendering pages using dynamic data without refreshing the page (AJAX). The entire page source is not downloaded in one go; components within the page request data asynchronously to display content. Try googling it and you will find more information.
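A tiny sketch of the idea: the HTML you save contains only an empty placeholder, and the actual message is fetched by script after the page loads (the endpoint below is made up):

    // The static HTML only contains <div id="message"></div>; the data arrives
    // later via an asynchronous request, so "Save page as..." captures the empty
    // shell rather than the message. The URL is a made-up example.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/mail/api/message?id=12345', true);
    xhr.onload = function () {
        document.getElementById('message').innerHTML = xhr.responseText;
    };
    xhr.send();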
In view source you can only see the HTML, CSS and JavaScript code. No one can copy any server-side code (PHP) from view source.
You can't stop anyone from seeing your HTML and CSS in the browser, since there is always the view-source option.
The most you can do is disable right-click on your page. That can be done through JavaScript.
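For completeness, the usual way to disable the context menu; note this is only a deterrent, since view-source and the developer tools still work:

    // Suppress the right-click context menu. Anyone can still press Ctrl+U or
    // open the developer tools, so this hides nothing; it only discourages.
    document.oncontextmenu = function () {
        return false;
    };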

Open local html file in current window with Javascript Bookmarklet

I'm trying to build a sample bookmarklet to grab the current web page's source code and pass it to a validator. The validator is not an online website, but a folder with a bunch of JavaScript and HTML files. I'm trying to open the file:///C:/Users/Electrifyings/Desktop/Validator/Main.html file with the help of the JavaScript bookmarklet code below and put the source code into the textarea in the newly opened window, but it is not working for some reason that I'm not aware of.
Here is the sample code with algorithm:
javascript:(function(){var t = document.body.innerHTML;window.open('file:///C:/Users/RandomHero/Desktop/test.html',_self);document.getElementById("validator_textarea")=t;})()
Here are the steps:
Grab current web page source code in a variable.
Open locally stored HTML web page in current or new window or new tab (either way is fine with me, but no luck)
Put the source code from the variable into the validator textarea of the newly opened HTML file.
I have tried the above code with a lot of variations, but got stuck on the part where it opens the new window. Either it's not opening the new window at all, or it is opening a blank window without loading the file.
Would love to get some help with this issue, thanks a lot.
Oh and btw,
Windows 7 x64; tried IE, Firefox and Chrome, all latest stable builds. I guess it's not a browser-side issue, but something related to the JavaScript code not opening a URI with the file:/// protocol. Let me know if any more details are needed. :)
You wouldn't want a webpage you visit to be able to open up file://c:/Program Files/Quicken/YourSensitiveTaxInfo right? Because then if you make a mistake and go to a "bad" website (either a sleazy one or a good one that's been compromised by hackers), evil people on the intarweb would suddenly have access to your private info. That would suck.
Browser makers know this, and for that reason they put VERY strict limits to prevent Javascript code from accessing files on a user's local computer. This is what is getting in the way of your plan.
Solutions?
build the whole validator into the bookmarklet (not likely to work unless it's really small)
put your validator code up on the web somewhere
write a plug-in (because the user has to choose to install a plug-in, they get much more freedom than webpages ... even though for Firefox, Chrome, etc. plug-ins are basically just Javascript)
** Edit **
Extra bonus solution, if you don't limit yourself to a purely-client-side implementation:
Have your bookmarklet add a normal (HTML) form to the page.
Also add an iframe to the page (it's ok if you hide it with CSS styling)
Set the form's target attribute to point to the iframe. This will make it so that, when the user submits the form and the server replies back to that submission, the server's reply will go to the (hidden) iframe, instead of replacing the page as it normally would.
Add a file input to your form - you won't be able to access the file within that input using Javascript, but that's ok because your server will be doing the accessing, not your bookmarklet.
Write a server-side script which takes the form submissions, reads the file that came with it, and then parrots that file back as the response. In other words, you'll have a URL that you can POST to, and when it sees a file in the POST's contents, it will respond back with the contents of that file.
Now that you've got all that, the user can pick their validator file using the file input and upload it to your server; your server will respond back with the file it just got, and that file will appear as the contents of the iframe.
And now that you finally have the file that you worked so hard to get (inside your iframe), you can do $('#thatIframe').html() and voilà, you have access to your file. You can save the current page's source and then replace the whole page with that uploaded file (and then pass the saved page source back to the new validator page), or you can do whatever else you want with the contents of the uploaded validator file.
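A rough sketch of what the bookmarklet part could inject for the steps above (the upload URL is a made-up placeholder for the server-side echo script; collapse the function to a single javascript: line for the actual bookmark):

    // Injects a file-upload form that posts to a (hypothetical) echo endpoint,
    // targeting a hidden iframe so the reply doesn't replace the current page.
    (function () {
        var iframe = document.createElement('iframe');
        iframe.name = 'validatorFrame';
        iframe.style.display = 'none';          // hidden target for the server's reply
        document.body.appendChild(iframe);

        var form = document.createElement('form');
        form.method = 'POST';
        form.action = 'http://your-server.example/echo-upload';   // placeholder script
        form.enctype = 'multipart/form-data';
        form.target = 'validatorFrame';         // reply lands in the iframe
        form.innerHTML = '<input type="file" name="validator">' +
                         '<input type="submit" value="Load validator">';
        document.body.appendChild(form);
    })();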
Of course, if the file doesn't vary from computer to computer, you can make all of that much simpler by just having a server that sends the validator file back; this could be a pure Apache server with no logic whatsoever, as all it would have to do is serve a static file.
Either way though, if you go with this approach and your new file upload script is not on the same server as your starting webpage, you will have a new security problem: cross-domain script limitations. However, these limitations are much less strict than local file access ones, so there are ways to work around them (JSONP, cross-site policy files, etc.). There are already tons of great Stack Overflow posts explaining these techniques, so I won't bother repeating them here.
Hope that helps.

Creating a testing script which 'skips' file uploads

I am creating a testing script for my team. So far it works fine: it goes to the login form and attempts to log in using the details you entered; if that fails, it goes to a job description and attempts to apply for a job, selecting a random answer for each screening question until it gets to the actual application form. Unfortunately the application form includes a file upload control, which I don't appear to be able to skip over. Does anyone have any idea if I could actually skip over it or somehow click it? I am using FF7, and from previous posts I can see that apparently FF4 allowed it, but FF7 doesn't appear to make file uploads clickable through JS... any ideas? Thanks in advance.
Regards,
Richard
JavaScript can't access file upload inputs, for security reasons (e.g. to prevent malicious uploading of a user's files). For this kind of testing you should use a browser automation tool like Selenium (http://seleniumhq.org/) or, in web-service form, https://saucelabs.com/.
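For illustration, a minimal sketch using the Node.js selenium-webdriver bindings; the URL, selectors and file path are placeholders. Selenium fills a file input by sending it the file path as keys, so there is no need to click through the OS file dialog:

    // Sketch: drive the application form with selenium-webdriver (Node.js).
    const { Builder, By } = require('selenium-webdriver');

    (async function run() {
        const driver = await new Builder().forBrowser('firefox').build();
        try {
            await driver.get('http://jobs.example.com/application-form');   // placeholder URL
            // Type a path into the file input instead of clicking it:
            await driver.findElement(By.css('input[type="file"]'))
                        .sendKeys('/path/to/dummy-cv.pdf');
            await driver.findElement(By.css('button[type="submit"]')).click();
        } finally {
            await driver.quit();
        }
    })();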
