The following trick for getting a page's HTML content after the page's JavaScript has run works pretty well when placed in the onNewPicture() callback of a WebView's PictureListener:
browser.loadUrl("javascript:window.HTMLOUT.showHTML('<head>'+document.getElementsByTagName('html')[0].innerHTML+'</head>');");
However, for some reason it works only the first time the page is loaded. That is, the application's first such WebView.loadUrl() call gets a completely rendered version of the page.
Thereafter, if I reload/refresh the page (same exact URL), the output of HTMLOUT.showHTML() appears to be the original HTML+JavaScript, before the page was rendered.
The strange thing is that visually, on the WebView itself, all the content is there! (albeit after a significant delay... I can see the WebView's hourglass spinning; perhaps it takes too long for the JavaScript to be re-processed?)
This seems to suggest either an initialization problem (in my code), a bug in WebView, or some caching principle that is well known to experienced web programmers but with which I am not yet familiar.
But then it gets even more interesting: Subsequent calls to WebView.loadUrl() result in the aforementioned failure multiple (3-10) times until... the page is miraculously fully rendered again! (and then multiple failures again, and so on)
Which may suggest a timing problem?
Any suggestion on how to debug or troubleshoot this?
You have to inject the JavaScript after the page loads! Took me forever to figure it out.
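For example (a minimal sketch, assuming the same HTMLOUT JavaScript interface from the question; browser is the WebView from the original snippet):

    // Sketch: inject the extraction script from onPageFinished(), which
    // fires after every completed load, instead of from onNewPicture().
    browser.setWebViewClient(new android.webkit.WebViewClient() {
        @Override
        public void onPageFinished(android.webkit.WebView view, String url) {
            view.loadUrl("javascript:window.HTMLOUT.showHTML("
                    + "'<head>'+document.getElementsByTagName('html')[0].innerHTML+'</head>');");
        }
    });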
I'm seeing an odd bug in IE that I'm not seeing in Chrome. Specifically, this involves some JS code not firing when a (Telerik) wizard is navigated back to its first step.
When the user clicks their "Previous" button, some data isn't being properly loaded. Hitting F12 and bringing up the developer console has shown me the following Warning:
DOM7011: The code on this page disabled back and forward caching. For more information, see: http://go.microsoft.com/fwlink/?LinkID=291337
OK, so I went to the link provided and noticed that the documentation states:
In order to be cached, webpages must meet these conditions:
...
- The F12 Developer tools window isn't open
This is a problem, because when I use the navigation buttons within my wizard WHILE the dev window is open, it behaves properly, just as it does in Chrome.
How can I debug my related JavaScript so I can figure out what's going on? Also, I understand what caching is, but I'm not exactly sure what this is about, and I have no idea why Chrome behaves differently. Is there a way that I can force IE to behave like Chrome and turn on (or off) whatever features are causing this issue?
Yuck. Back to old school debugging for you.
Short of putting the whole browser into a Windows debugger, you can pretty much forget about setting breakpoints. All you can do is log.
If you are lucky and your problem isn't too deep, you can use a sprinkling of simple alert() statements to let you know the state of things at various stages in your code. One nice thing is that you can serialize objects now pretty nicely; for example, you can do JSON.stringify(this), which will probably give you a giant output, which you can copy and paste into your IDE and unpack. A major upside to doing this is that the alert will block, so you can take your time studying the output. A major downside to this is that race conditions are now much more likely.
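For example, a checkpoint might look like this (someStateObject stands in for whatever you actually want to inspect; note that JSON.stringify(window) itself will throw on circular references):

    // Blocks execution until dismissed, so you can study the state at leisure.
    alert('checkpoint A: ' + JSON.stringify(someStateObject));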
Alternatively, you can add a <textarea> to the page and throw your JSON.stringify(this) results into that. Because this means extra DOM mutations, it also increases the odds of race conditions, but not by much. If race conditions are a possibility, you can do something like this:
(function () {
    // Capture the state synchronously, before anything else mutates it.
    var currentState = JSON.stringify(this);
    setTimeout(function () {
        // Write it out later, so the DOM mutation happens well clear
        // of the code being debugged.
        document.querySelector('textarea').value = currentState;
    }, 1000);
}).call(this); // .call(this) so `this` is the object under inspection, not window
Even though these are now asynchronous, if you use this multiple times in sequence, they will execute in that same sequence (unless you change the timeout period).
If you are doing actual page navigations (and not just changing the URL with pushState()), then actually reading those logs is going to be a problem. The solution is to put the page in a frame and write the content out to a sibling frame. As long as both frames are running on the same domain, you will have no problem pushing the data into the sibling frame. If you can't put them on the same domain, you are kind of screwed.
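A rough sketch of that sibling-frame setup (the frame names, file paths, and log element are all made up for illustration; log.html is assumed to contain an empty <pre id="out">):

    <!-- Parent page: the app frame navigates freely; the log frame never reloads. -->
    <iframe name="app" src="/app.html"></iframe>
    <iframe name="log" src="/log.html"></iframe>

    // Inside app.html (same domain as log.html): append log lines to the sibling.
    function log(msg) {
        var out = window.parent.frames['log'].document.getElementById('out');
        out.textContent += msg + '\n';
    }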
I have a PHP page that hangs for 3-10 seconds after the page loads; you can't even scroll up or down or close the tab when this happens (the Chrome loading gif still loops, though). It happens in Chrome and IE.
Chrome Timeline: http://imgur.com/wF5Pioz,KRbnxIm#0
It shows ContentVeil.js repeating over and over. I think it is client-side(?). I searched with grepWin for ContentVeil, with no luck, and it doesn't show up in the Chrome Network tab.
Chrome Profile: Second image, from above link.
I think this shows the issue is in the anonymous function from meta-boxes.min.js, line 1.
meta-boxes.min.js: http://pastebin.com/yqtJyqB1
Unfortunately, line one is a function that encapsulates the whole script. I don't know JS very well; I tried removing each function one by one, but that just created more errors.
Any ideas on how I could find the source of the problem would be much appreciated.
It's part of the Evernote web clipping extension; it hooks DOM events, causing massive slowdowns if you are making a large number of DOM changes.
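If the extension's hooks are the bottleneck, batching DOM writes reduces how many mutation events fire. A generic sketch, not specific to meta-boxes.min.js (the container id is made up):

    // Build everything off-document in a fragment, then insert it once,
    // so DOM-mutation hooks fire once instead of once per row.
    var fragment = document.createDocumentFragment();
    for (var i = 0; i < 1000; i++) {
        var row = document.createElement('div');
        row.textContent = 'row ' + i;
        fragment.appendChild(row);
    }
    document.getElementById('container').appendChild(fragment);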
I ran across this code in some HTML and I am unsure what the point of it is:
onresize="window.location.reload(false);"
I am not very proficient in JavaScript, but it looks like it basically just... reloads the browser window when it is resized? Does that even work? It seems odd.
It does reload the page as you suspect, although the argument false that is passed to reload indicates that the page should be reloaded from the cache if possible, to minimize load time. See Mozilla's window.location documentation for more details.
I suspect this is to force the browser to re-layout the page in case resizing the window changes how it renders. Typically that should not be necessary, especially if the site is employing responsive web design techniques, but some sites may do it anyway.
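If the reload is only there to re-run layout code, a debounced resize handler is the usual alternative. A sketch, where relayout() is a hypothetical stand-in for whatever the page recomputes:

    // Re-run layout code shortly after resizing stops, instead of reloading.
    var resizeTimer;
    window.addEventListener('resize', function () {
        clearTimeout(resizeTimer);
        resizeTimer = setTimeout(function () {
            relayout(); // hypothetical: whatever this page recalculates on resize
        }, 200);
    });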
I agree with Stuart. Do you own this code? In other words, can you make changes to the code and deploy it to some test environment? If so, try commenting out that line and see what happens. Maybe this line was added way back when some particular browser couldn't handle window resize events correctly. Maybe it's not needed anymore.
I have a webapp that, when loaded for the first time, has a long initialization sequence. Basically it calls an external API to get loads of data, which it caches upon completion using the HTML5 localStorage API.
The issue is, it never gets through initialization in Mobile Safari on the first attempt. At around the same point each time, my AJAX calls just stop firing. When I refresh the page, it starts initialization over again, but this time gets through.
If I clear the browser cache and start this process over again, it is always the same. Fail on first attempt, succeed on subsequent refreshes.
I'm aware that there are certain barriers in place in Mobile Safari to prevent large consumption of data unless in direct response to user input (such as the HTML5 audio tag not being able to 'autoplay').
I'm wondering if there is something similar in place for loading web pages for the first time that immediately consume large amounts of data. And by refreshing, Mobile Safari takes that as your explicit permission to do so.
Anyone know?
I suggest you start with a simple, quick-loading base html file, which will give your user something to look at right away -- even if it's just a simple "Loading...".
Then use ajax to get your "loads of data," using window.onload for example. Ideally give your user something to read or interact with so they don't notice the wait, or a progress indicator, to know the site is actually working.
People are impatient, and when faced with a blank screen and the browser loading indicator not making progress, they're going to assume your site is broken within a few seconds.
The "certain barriers...to prevent large consumption of data" are probably there for exactly this reason: to improve the user experience and prevent monstrous web pages.
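A minimal sketch of that pattern (the /api/data endpoint, chunk count, and status element are made up):

    // After the shell page has rendered, fetch the data in small chunks,
    // updating a progress indicator and caching each piece as it arrives.
    window.onload = function () {
        var total = 10; // assumed number of chunks
        function loadChunk(i) {
            if (i >= total) {
                document.getElementById('status').textContent = 'Done';
                return;
            }
            var xhr = new XMLHttpRequest();
            xhr.open('GET', '/api/data?chunk=' + i); // hypothetical endpoint
            xhr.onload = function () {
                localStorage.setItem('chunk-' + i, xhr.responseText);
                document.getElementById('status').textContent =
                    'Loading ' + (i + 1) + '/' + total;
                loadChunk(i + 1);
            };
            xhr.send();
        }
        loadChunk(0);
    };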
On a page like jsFiddle that executes user-inputted JavaScript, is there a way to stop / disrupt "problem" scripts running in an iframe?
The major class of problem scripts would be infinite loops, I think. Browsers deal with multiple alerts quite well, but a script like for (var i = 0; ++i; i < 100) { /* do stuff */ } will run forever (the condition and increment are swapped, so the loop's test is ++i, which is always truthy).
How can I either detect and not run, or run and stop, say after 10 seconds of running, a script?
Removing the iframe is fine, but I only want to remove it if the script is still running after 10 seconds, but I don't want to remove it if the script has stopped running.
Here is how I imagine the page will work. If you have a better solution, let me know...
The input page contains a textarea and a blank iframe. The user enters their script into the textarea, and when ready they click on run. Then (backstage) a separate page is created that contains the user script in executable form on an HTML page. The src of the iframe is set to the page with the executable code. This all happens dynamically without a page refresh.
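Roughly, I imagine the run button doing something like this (the element ids and the /run endpoint are placeholders; the server side that wraps the script in an HTML page is assumed):

    // Point the iframe at a server-generated page that embeds the
    // user's script. No refresh of the outer page is needed.
    document.getElementById('run').onclick = function () {
        var code = document.getElementById('editor').value;
        document.getElementById('sandbox').src = '/run?script=' + encodeURIComponent(code);
    };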
I haven't used this jsandbox script, but it appears to have what you want.
If one script freezes on a page, other scripts will not continue to run. Therefore, there is no way to detect whether another script has stopped running without using a custom plugin or something. Browsers do not use multithreading in that way.
You could set a timeout in the main window which stops / deletes the script after 10 seconds. Then you just have to clear the timeout when the script has finished (just add a line like this to the iframe script: window.parent.clearTimeout(window.parent.timeoutName) -- I don't know if it works, but it should).
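A sketch of that idea, with the iframe signalling completion via postMessage instead of reaching into the parent directly (the names are illustrative, and it assumes the parent's timers still get a chance to run while the iframe script executes):

    // Parent page: remove the sandbox iframe unless the script
    // reports completion within 10 seconds.
    var killTimer = setTimeout(function () {
        var frame = document.getElementById('sandbox');
        frame.parentNode.removeChild(frame);
    }, 10000);
    window.addEventListener('message', function (e) {
        if (e.data === 'done') clearTimeout(killTimer);
    });

    // Appended after the user script in the generated iframe page:
    window.parent.postMessage('done', '*');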
I think this would largely depend on the script and how browsers handle scripts in iframes.
Let's say there's a while(true) in the iframe.
The browser may either lock up, or crash the tab (like what Chrome does), or it might lock up the iframe. If it locks up or crashes the tab, there's nothing you can do with JS itself to prevent it, other than attempting static analysis on the script to find possibly problematic statements (Static analysis to find problematic scripts like that would never be foolproof)
Assuming the browser only locks up the iframe or does something else while still allowing the scripts in the main page to do things, removing the iframe after a certain period of time is an option.
The browser might also display the "Script is slow" popup. In this case, it will most likely either completely shut down all scripts in the entire tab or just in the iframe. If just the iframe, the other scripts in the tab could still clean up after it after the predefined period of time.
Another alternative would be to pre-evaluate the script in a separate runtime where you can detect things like that yourself. You could run the script in, say, Rhino, and determine if it takes too long to process, or something similar.
I don't know if this would work exactly, but you might get something like it working with a little tinkering. I take it you are taking in JavaScript as text and then eval()ing it, right? You could parse it, or maybe even just use a regex, to rewrite every for, for..in, and while loop (and function declaration) to include some logic that checks whether the code has been running for 10 seconds; if it has, it returns or breaks or something. The code would likely freak out afterward and probably start throwing errors, but at least the script would stop.
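A rough sketch of that rewriting idea using a regex (deliberately fragile: it will mangle loop keywords inside strings or comments and ignores do..while, but it shows the shape):

    // Prepend a start time, then inject a time check at the top of every
    // for/while body so a runaway loop bails out after 10 seconds.
    function instrument(source) {
        var check = 'if (Date.now() - __start > 10000) break;';
        var rewritten = source.replace(/(\b(?:for|while)\s*\([^)]*\)\s*\{)/g, '$1 ' + check);
        return 'var __start = Date.now();\n' + rewritten;
    }

    eval(instrument(userScript)); // userScript: the text from the textarea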