Page loader setup is too fast in localhost environment? - javascript

I'm testing a jQuery page loader (queryloader2) offline, but even with frequent browser cache clearing I can't see the preloading script in full action; it zips by so quickly I don't know if it's even doing anything.
How can I simulate online speeds?
Thanks, I'll take my answer offline. Ditto.
Stack Overflow:
Your question has been identified as a possible duplicate of another question. If the answers there do not address your problem, please edit to explain in detail the parts of your question that are unique.
(I've searched for a duplicate question and an existing answer that would prevent me from asking. Thanks.)

You can simulate a slower network connection with Chrome Developer Tools. The mode is intended for mobile device emulation, but you can turn off the screen resizing and just use the network throttling feature.
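If you'd rather bake the slowdown into your local setup instead, here is a minimal sketch of an artificial-latency middleware for a Node/Express dev server. The port, the public directory, and the delay value are illustrative assumptions, not part of queryloader2:
// Hypothetical Express dev server that delays every response,
// so the preloader stays visible long enough to observe.
const express = require('express');
const app = express();
const SIMULATED_LATENCY_MS = 1500; // assumed value; tune to taste
app.use((req, res, next) => {
  setTimeout(next, SIMULATED_LATENCY_MS); // hold each request back
});
app.use(express.static('public')); // serve the site under test
app.listen(8080, () => console.log('Slow dev server on http://localhost:8080'));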

Related

Check if custom protocol is installed

I am working with Safe Exam Browser (SEB), software that provides a proctored mode. When I want to open a website in SEB, I just use the seb:// protocol instead of https:// in the URL. But if SEB is not installed, the browser only prints a console error saying the scheme does not have a registered handler. I have checked a lot of other Stack Overflow answers and found that we have to use different hacky solutions for each browser (onfocus in Chrome, etc.); but all those answers were very old, so I wanted to know if there is any easy implementation for this problem in JavaScript as of now. Your help is appreciated.
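There is still no standard API for this, so the old heuristics remain the usual approach. A hedged sketch of the most common one: attempt to launch the protocol, then assume the handler is absent if the page never loses focus. The timeout value and the seb:// URL below are illustrative:
// Heuristic only: an installed handler usually steals focus from the page.
function checkProtocolHandler(url, timeoutMs) {
  return new Promise(function (resolve) {
    var handled = false;
    var onBlur = function () { handled = true; }; // external app took focus
    window.addEventListener('blur', onBlur);
    window.location.href = url; // try to launch the custom protocol
    setTimeout(function () {
      window.removeEventListener('blur', onBlur);
      resolve(handled); // true => handler is probably installed
    }, timeoutMs || 2000);
  });
}
// Usage: checkProtocolHandler('seb://example.com').then(function (installed) { /* ... */ });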

Why CasperJS and browsers show different behaviors with CAPTCHA? [duplicate]

Is there any way to consistently detect PhantomJS/CasperJS? I've been dealing with a spate of malicious spambots built with it and have been able to mostly block them based on certain behaviours, but I'm curious if there's a rock-solid way to know if CasperJS is in use, as dealing with constant adaptations gets slightly annoying.
I don't believe in using CAPTCHAs. They are a negative user experience, and reCAPTCHA has never worked to block spam on my MediaWiki installations. As our site has no user registrations (it's an anonymous discussion board), we'd need a CAPTCHA entry for every post. We get several thousand legitimate posts a day, and a CAPTCHA would see that number divebomb.
I very much share your take on CAPTCHAs. I'll list what I have been able to detect so far for my own detection script, which has similar goals. It's only partial, as there are many more headless browsers.
It is fairly safe to use exposed window properties to detect/assume these particular headless browsers:
window._phantom (or window.callPhantom) //phantomjs
window.__phantomas //PhantomJS-based web perf metrics + monitoring tool
window.Buffer //nodejs
window.emit //couchjs
window.spawn //rhino
The above is gathered from the JSLint docs and from testing with PhantomJS.
Browser automation drivers (used by BrowserStack and other web capture services for snapshots):
window.webdriver //selenium
window.domAutomation (or window.domAutomationController) //chromium based automation driver
These properties are not always exposed, and I am looking into other, more robust ways to detect such bots, which I'll probably release as a full-blown script when done. But that mainly answers your question.
Here is another fairly sound method to detect JS capable headless browsers more broadly:
if (window.outerWidth === 0 && window.outerHeight === 0) { /* headless browser */ }
This should work well because these properties are 0 by default in headless browsers even if a virtual viewport size is set; by default they can't report the size of a browser window that doesn't exist. In particular, PhantomJS doesn't support outerWidth or outerHeight.
ADDENDUM: There is, however, a Chrome/Blink bug with the outer/inner dimensions. Chromium does not report those dimensions when a page loads in a hidden tab, such as when restored from a previous session. Safari doesn't seem to have that issue.
Update: It turns out iOS Safari 8+ has a bug reporting outerWidth & outerHeight as 0, and a Sailfish WebView can too. So while it's a signal, it can't be used alone without being mindful of these bugs. Hence, a warning: please don't use this raw snippet unless you really know what you are doing.
PS: If you know of other headless browser properties not listed here, please share in comments.
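For convenience, here is one way the property checks listed in this answer could be rolled into a single helper. Treat each hit as a signal only, since a bot can delete these properties; the function name is mine:
// Sketch consolidating the checks above into one pass.
function detectHeadlessSignals() {
  var signals = [];
  if (window._phantom || window.callPhantom) signals.push('phantomjs');
  if (window.__phantomas) signals.push('phantomas');
  if (window.Buffer) signals.push('nodejs');
  if (window.emit) signals.push('couchjs');
  if (window.spawn) signals.push('rhino');
  if (window.webdriver) signals.push('selenium');
  if (window.domAutomation || window.domAutomationController) signals.push('chromium automation');
  if (window.outerWidth === 0 && window.outerHeight === 0) signals.push('zero outer dimensions'); // see caveats above
  return signals; // empty array => no known signal
}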
There is no rock-solid way: PhantomJS and Selenium are just software used to control browser software, instead of a user controlling it.
With PhantomJS 1.x in particular, I believe there is some JavaScript you can use to crash the browser by exploiting a bug in the version of WebKit it uses (it is equivalent to Chrome 13, so very few genuine users should be affected). (I remember this being mentioned on the PhantomJS mailing list a few months back, but I don't know if the exact JS to use was described.) More generally, you could combine user-agent matching with feature detection: if a browser claims to be "Chrome 23" but does not have a feature that Chrome 23 has (and that Chrome 13 did not have), get suspicious.
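A hedged sketch of that user-agent vs. feature-detection idea. The specific feature chosen here, unprefixed requestAnimationFrame, is my assumption of something a Chrome-13-era WebKit would lack:
// If the UA string claims a modern Chrome but a feature that
// version shipped is missing, the client may be lying about itself.
function uaFeatureMismatch() {
  var match = /Chrome\/(\d+)/.exec(navigator.userAgent);
  if (!match) return false; // doesn't claim to be Chrome at all
  var major = parseInt(match[1], 10);
  // Unprefixed requestAnimationFrame appeared around Chrome 24;
  // an old engine masquerading as newer Chrome would lack it.
  if (major >= 24 && typeof window.requestAnimationFrame !== 'function') {
    return true; // claimed version and feature set disagree
  }
  return false;
}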
As a user, I hate CAPTCHAs too. But they are quite effective in that they increase the cost for the spammer: he has to write more software or hire humans to read them. (That is why I think easy CAPTCHAs are good enough: the ones that annoy users are those where you have no idea what they say and have to keep pressing reload to get something you recognize.)
One approach (which I believe Google uses) is to show the CAPTCHA conditionally. E.g. users who are logged-in never get shown it. Users who have already done one post this session are not shown it again. Users from IP addresses in a whitelist (which could be built from previous legitimate posts) are not shown them. Or conversely just show them to users from a blacklist of IP ranges.
I know none of those approaches are perfect, sorry.
You could detect Phantom on the client side by checking the window.callPhantom property. The minimal client-side script is:
var isPhantom = !!window.callPhantom;
Here is a gist with proof of concept that this works.
A spammer could try to delete this property with page.evaluate, and then it depends on who is faster. After you've run the detection, you reload the page with the post form, including a CAPTCHA or not depending on the detection result.
The problem is that you incur a redirect that might annoy your users. This will be necessary with every client-side detection technique, which can in turn be subverted and changed with onResourceRequested.
Generally, I don't think this is possible, because you can only detect on the client and send the result to the server. Adding the CAPTCHA and the detection step in a single page load does not really add anything, as the detection could be removed just as easily with phantomjs/casperjs. Defense based on the user agent also doesn't make sense, since it can easily be changed in phantomjs/casperjs.
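To make that detect-then-reload flow concrete, a minimal sketch. The botcheck query parameter is an illustrative name; the server would decide whether to render a CAPTCHA based on it, knowing the value itself can be forged:
// Run the client-side check, then reload the form page once,
// passing the result so the server can choose to add a CAPTCHA.
var isPhantom = !!window.callPhantom || !!window._phantom;
if (window.location.search.indexOf('botcheck=') === -1) {
  window.location.search = 'botcheck=' + (isPhantom ? '1' : '0');
}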

Remote Debug Website

Is there a way to remotely debug a website?
I've just finished putting together a website that has some jQuery animations. The site works fine on every machine/configuration I've tested it on.
One of the people the site needs to work for, however, reports that the animations don't work, which effectively breaks the website.
I strongly suspect his company's network is the root of the problem; however, diagnosing this is challenging as he is not a technical user, and guiding him through the WebKit inspector/console, etc. is not really an option.
Ideally I'd like to be able to 'capture' the network/javascript logs from IE or Chrome so that I can inspect them and attempt to work out what's gone wrong.
Aside:
I'm using an off-the-shelf WordPress theme (http://theme.co/x/) for the site, so I expect the code is good.
While it doesn't seem possible to remotely capture and inspect the network or JavaScript logs from another machine's browser, there are a number of services that let you add automatic error reporting to your JavaScript code, which you can then inspect to find the root of the problem.
Examples of these are Errorception and Raygun.
As far as I have found, there aren't any similar tools for monitoring network performance/loading specifically, although a custom script taking a similar approach could be written to detect whether specific items have loaded.
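If you'd rather not adopt a service, the core of what Errorception and Raygun do can be hand-rolled in a few lines. A sketch, assuming a hypothetical /log-error endpoint on your own server:
// Forward uncaught errors to your server so you can inspect
// what happens on the remote user's machine.
window.onerror = function (message, source, line, column) {
  var payload = JSON.stringify({
    message: message,
    source: source,
    line: line,
    column: column,
    userAgent: navigator.userAgent
  });
  if (navigator.sendBeacon) {
    navigator.sendBeacon('/log-error', payload); // survives page unloads
  } else {
    var img = new Image(); // old-browser fallback (works in IE)
    img.src = '/log-error?d=' + encodeURIComponent(payload);
  }
};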

Slow down browser rendering

Is there a way to slow down browser DOM rendering and JS execution during development, so we can see which parts of the website are too JS-intensive and might be slow on slower machines? Maybe an extension for Chrome/Firefox on Linux/OSX?
Some clarification:
It's not about the connection or testing the speed of the browser! It's just for our developers to see which parts of the page render slowly or are "glitchy". For example, when you use Ajax and are loading something, you show a loader, but the loaded part appears immediately after the loader does. We want to see that in slow motion, like when you press SHIFT in OSX while doing Exposé.
PS. I did find some articles on delaying the Internet connection, but that's not enough in this case.
PPS. Loading everything in VMs didn't work for us.
PPPS. Using slow-down code like that proposed in "Javascript code for making my browser slow down" is not the best option, in my opinion.
Converting what @z0r said in the comments into an answer:
In Chrome, open devtools and select the Performance tab
Make sure Screenshots is checked
Press the record button (or hit Ctrl + E)
Do your activity
Stop recording
Hover over the timeline to see the screenshots of the screen as things change.
Use the timeline or profiler in your browser's inspector. There you can see which functions slow things down.
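To make specific suspect spans easy to find in those timeline recordings, you can label them with the User Timing API. A small sketch, where renderHeavyWidget is a hypothetical stand-in for the code under suspicion:
// Marks show up as labelled spans in the DevTools Performance timeline.
performance.mark('render-start');
renderHeavyWidget(); // hypothetical expensive function
performance.mark('render-end');
performance.measure('heavy-widget-render', 'render-start', 'render-end');
// The measurements can also be read back programmatically:
performance.getEntriesByType('measure').forEach(function (m) {
  console.log(m.name + ': ' + m.duration.toFixed(1) + 'ms');
});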
The accepted answer is good; I use and recommend Chrome Dev Tools as well.
As an alternative to Chrome Dev Tools:
Several 'website performance analysis' services offer timeline views. Run some internet searches and you'll find various free and paid options.
Try webpagetest.org
It's open source, highly regarded, and has been running for years. It may offer different information than Chrome Dev Tools, or present it in a different way.
In the test results, click "Filmstrip View".

conflicts between javascript code and browser addon/extension

I am the owner and developer of an e-commerce website.
Every single day some potential customers call us because they cannot order; we investigate a little and inevitably discover they can't because of some JS errors.
We check their browser addons/extensions, disable some or all of them, and the JS errors disappear.
The JS errors are always different from each other, and the addons/extensions vary; it happens with Chrome, IE, and Firefox alike. Usually they are some sort of coupon/deals addon/extension like DealSpy.
I don't have any data to support this, but I believe these cases have spiked since we moved to AngularJS.
I am wondering if there's anything I can do. I guess I can't disable their addons/extensions programmatically from my code, but can I somehow catch those errors and manage them?
Any advice from anyone who has faced the same or a similar issue?
There is likely no way to answer this properly, as it depends on the code of your site and the add-ons breaking it, none of which you supply...
Anyway, try to reproduce the error and contact the add-on authors in question.
There might be ways to work around particular issues, but that would be case-by-case and dependent on the actual code.
Also, in case of Firefox add-ons, if you encounter an add-on that just "breaks the web" (or part of it: your website, maybe others), consider filing a Tech Evangelism :: Add-ons bug. Mozilla and/or the add-ons editors team may then take appropriate steps according to the Add-on guidelines.
Not sure if there is something similar for Chrome... Their Help section just says to contact the actual extension author. So unless it is a security-sensitive bug, you shouldn't expect any assistance from Google.
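As for the question's ask about catching those errors: since the site uses AngularJS, one hedged option is to decorate $exceptionHandler so that failures inside Angular code paths at least get logged somewhere you can see them. The module name 'app' and the /log endpoint below are illustrative assumptions:
// Observe (not fix) errors thrown inside AngularJS code paths.
angular.module('app').config(['$provide', function ($provide) {
  $provide.decorator('$exceptionHandler', ['$delegate', function ($delegate) {
    return function (exception, cause) {
      $delegate(exception, cause); // keep Angular's default behaviour
      try {
        var img = new Image(); // beacon-style logging, no XHR needed
        img.src = '/log?msg=' + encodeURIComponent(String(exception && exception.message));
      } catch (e) { /* never let logging itself break the page */ }
    };
  }]);
}]);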
