I'm trying to include a file using ng-include.
The side-bar appears in the UI, but when I view the page source, the HTML tags of the side-bar don't appear.
That is because when you view the source of an HTML page in any browser, the browser performs a fresh GET of the original document and displays that source code. Since AngularJS injects elements into the DOM dynamically (and because it is "just" JavaScript, after all), the original HTML generated by the server side is not modified. To see the generated source, use a developer tool of your choice, e.g. the F12 Developer Tools in IE. You may also want to read up on the role JavaScript plays in the whole lifecycle of web page rendering.
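As a quick alternative to the element inspector, you can dump the current DOM from the browser console; a minimal sketch (not specific to AngularJS):

// Paste into the console on the rendered page: this logs the live DOM,
// including anything ng-include has injected, rather than the server response.
console.log(document.documentElement.outerHTML);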
I am currently running the following JavaScript on some of my pages to dynamically set the page's title tag, pulling it from the h1 tag contained on that page.
document.title = document.getElementsByClassName("Category-H1")[0].innerHTML;
However, when I view the page's source, it still shows the old title tag, even after the JavaScript has changed it.
It was suggested to me that I might need to add an onload event on the body; I'm not sure whether this would help, or which onload event I would need to run.
Any other ideas? Suggestions?
Thanks - Alex
The 'source' is just that ... it is the code returned from the server. In most cases, to see DOM changes made after loading, you will need to look at the code in Firebug or the Chrome Inspector.
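On the onload question above: wrapping the assignment in a DOMContentLoaded handler makes sure the h1 exists before the script runs, though it still will not change what "view source" shows. A minimal sketch, assuming the Category-H1 class from the question:

// Run the title update once the DOM has been parsed, so the heading exists.
document.addEventListener("DOMContentLoaded", function () {
    var heading = document.getElementsByClassName("Category-H1")[0];
    if (heading) {
        document.title = heading.innerHTML;
    }
});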
I'm using a src-less iframe to develop a JavaScript widget, to protect my CSS from the parent page.
It works perfectly in Chrome/Safari. However, in Firefox I can see the content being added for a few milliseconds, and then the iframe becomes empty.
If I inspect the HTML I see an empty head and an empty body; however, if I inspect innerHTML through the console I can see that it has the right content...
I'm sorry I can't give you code, as it's hard to pull out the relevant parts: I can tell you that I access the iframe with jQuery's contents() and then find the body or the head.
Any ideas, please?
I managed to make an example: http://jsbin.com/arenat/2/edit#javascript,html,live
It's just some code pulled out to show the issue: it works in Chrome but not in Firefox (10.0.1). Hope it's enough.
When you add the frame to the DOM, it starts loading about:blank asynchronously. Then you modify the DOM in the iframe ... and then the asynchronous load completes and replaces the document you modified.
I suggest either using an onload handler on the iframe to do your manipulation after the about:blank has finished loading or using document.open() and document.close() to force cancellation of the async load.
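For illustration, a rough sketch of the document.open()/document.close() approach; the frame creation and the widget markup here are placeholders, not the asker's actual code:

// Create the src-less iframe and write its document explicitly.
// Calling open() discards the pending about:blank load, so Firefox
// will not replace the written content afterwards.
var frame = document.createElement("iframe");
document.body.appendChild(frame);

var doc = frame.contentWindow.document;
doc.open();
doc.write("<!DOCTYPE html><html><head><title>widget</title></head><body>");
doc.write("<div id='widget'>widget content</div>");
doc.write("</body></html>");
doc.close();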
It's working with a timeout:
http://jsbin.com/arenat/9/edit
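For the record, the timeout workaround looks roughly like this; the #widget-frame id and the 100 ms delay are placeholders:

// Delay the injection until Firefox has finished loading about:blank
// into the frame, so the written content is not replaced.
setTimeout(function () {
    jQuery("#widget-frame").contents().find("body")
        .html("<div>widget content</div>");
}, 100);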
It may be trivial, but I have the following code that dynamically adds some data to the href attribute of an <a> element, with important variables that are passed to a PHP-generated page in a pop-up window.
jQuery(document).ready(function () {
    // Append the pop-up/iframe parameters to the link's existing href.
    var url = jQuery("a.special-links").attr("href");
    var data = "?iframe=true&width=800&height=350&format=popup";
    jQuery("a.special-links").attr("href", url + data);
});
When I inspect the page with Firebug, the <a> element has the right href, but the link does not work. When I look at the page source, I see that the href's data part was in fact not added!
Is this a runtime problem or something else? Thanks for any clues...
You do not see the changes when you view the source of the page, because when you view the source you are viewing the actual HTML document that was downloaded by the browser when you visited the URL. This can NEVER be changed by JavaScript.
When you write JavaScript, you change something called the Document Object Model, aka the DOM. This is an in-memory data structure built by the browser as a result of parsing the HTML document, and it is what Firebug lets you inspect.
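To confirm that the change really was applied, read the live DOM from the console instead of using view source; a quick check with the selector from the question:

// This reads the live DOM, so it reflects the jQuery modification;
// "view source" will still show the original href the server sent.
console.log(jQuery("a.special-links").attr("href"));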
Given a webpage that uses lots of JavaScript to generate its HTML, how can I get the final computed HTML being parsed by the browser instead of the source HTML? In other words, presume a page has lots of tags surrounding JavaScript functions that, when called, return some HTML. When I view the source of the page, I see the script function calls, not the HTML they produce.
How could I get all of the HTML produced by a webpage?
I've noticed that Firebug appears to be able to see the HTML instead of the scripts, but it doesn't appear to have any way to save the whole page, only little segments of it.
Update:
Thanks for all the answers. However, I'm still not getting the HTML I see in Firebug's console with any of those techniques. For my example page, I'm using the 'Info' tab of my own Facebook profile. If you view source on that page, you'll see lots of scripts with the title 'big_pipe.onPageletArrive()'. However, if you look at it in Firebug, each of those function calls renders out to HTML. I tried right-clicking the tag in Firebug, the View Generated Source option in the Web Developer Toolbar, and the Chrome suggestion, but they all give me the script call, not the HTML.
Any other ideas?
Update 2:
When I said each of those functions renders out to HTML in Firebug, I wasn't quite correct. They only render out if I select them in the page and right-click -> Inspect Element; then it appears to render them out. So maybe my question has become: how do you get Firebug to automatically render out all of the HTML so that you can select and save it? (Or I'm open to any other solution for grabbing this HTML.)
With Firebug's HTML tab, you can right click on the <html> element, and click "Copy HTML".
You can do the same thing with Developer Tools in Chrome/Safari.
The Web Developer Toolbar for Firefox has a "View Generated Source" option which provides this functionality.
with (window.open("")) {
    document.open("text/html");
    document.write("<!--\n"); // for the live version, delete this line
    document.write(opener.document.documentElement.outerHTML.replace(/</g, "&lt;").replace(/>/g, "&gt;"));
    document.write("\n//-->"); // for the live version, delete this line
    document.close();
    document.title = "DOM Snapshot: " + opener.document.title;
    focus();
}
Open the console,
copy-paste the above code, and execute it.
It opens an empty page;
now inspect that page with right-click or F12,
copy the outerHTML of the comment,
and paste it wherever you want,
optionally removing the comment at the start and end.
If you want a live version that is clickable, then simply leave out the comment tags in the above code.
// Dump the document's markup into the #awesomeness container, inserting a
// line break after every closing tag for readability.
document.getElementById('awesomeness').textContent =
    document.documentElement.outerHTML.replace(/<\/\w+>/g, (e) => e + '\r\n');

<div id="awesomeness" style="overflow:scroll;width:100%;height:100%;white-space:pre;"></div>
so yea, use that...
I was having problems with a page generated by JavaScript: the content would only render if the page was scrolled down, so the copied HTML was incomplete. This happened to me with all the Chrome-based suggestions.
This issue was solved by the following trick:
Open a console, then set a zoom level that will render the entire page (or the desired contents), e.g.
javascript: document.body.style.zoom = 0.1
Copy the HTML as per other suggestions, e.g.
copy(document.querySelector('html').outerHTML)
When pasting, search the text for "zoom", revert the value to "1", and save the HTML.
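Put together, the trick looks roughly like this in the Chrome console (the 0.1 zoom factor is just an example, and copy() is a DevTools console helper):

// Zoom out so lazily rendered content is forced onto the page,
// then copy the full markup to the clipboard.
document.body.style.zoom = 0.1;
copy(document.querySelector('html').outerHTML);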
It is not possible in general. Here is an excerpt from my bookmarklet, which relies on the non-standard outerHTML:
with (window.open("")) {
    document.open("text/html");
    document.write("<PRE>");
    document.write(opener.document.documentElement.outerHTML.replace(/</g, "&lt;").replace(/>/g, "&gt;"));
    document.write("</PRE>");
    document.close();
    document.title = "DOM Snapshot: " + opener.document.title;
    focus();
}
Note: the DTD is missing and cannot be retrieved at all.