XML, XSLT and JavaScript

I'm having some trouble figuring out the "page load" architecture of a website.
The basic idea is that I would use XSLT to present it, but instead of doing it the classic way with the XSL tags, I would do it with JavaScript. Each link would therefore refer to a JavaScript function that changes the content and menus of the page.
The reason I want to do it this way is to have the option of letting JavaScript dynamically show each page using the data provided in the first, initial XML file, instead of making a complete server request for each specific page, which simply has too many downsides.
The basic problem is that, after searching the web for a way to access the "underlying" XML of the current document with JavaScript, I only find solutions for accessing external XML files.
I could of course just print all the XML data into a JavaScript array declared in the document header, but I believe this would be a very, very nasty solution. And ugly, for that matter.
My questions therefore are:
Is it even possible to do what I'm thinking of?
Would it be SEO-friendly to have all the website pages' content loaded initially in the XML file?
My alternative would be to dynamically load each page's content with AJAX on demand. However, I'm finding it difficult to come up with a way of doing that which is even remotely SEO-friendly; I can't imagine that a search engine would execute any JavaScript.
I'm very sorry if this is unclear, but it's really freaking me out.
Thanks in advance.

Is it even possible to do what I'm thinking of?
Sure.
Would it be SEO-friendly to have all the website pages' content loaded initially in the XML file?
No, it would be total insanity.
I can't imagine that a search engine would execute any JavaScript.
Well, quite. It's also pretty bad for accessibility: non-JS browsers, or browsers with a slight difference in JS implementation (e.g. new reserved words) that causes your script to throw an error, and boom! No page. And unless you provide proper navigation through hash links, usability will be terrible too.
All-JavaScript in-page content creation can be useful for raw web applications (infamously, GMail), but for a content-driven site it would be largely disastrous. You'd essentially have to build up the same pages from the client side for JS browsers and the server side for all other agents, at which point you've lost the advantage of doing it all on the client.
It's probably better to do it like Stack Overflow: primarily HTML-based, but with client-side progressive enhancement for useful tasks like checking the server for updates and showing the "this question has new answers" notice.
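As a rough sketch of that kind of enhancement (the URL, response shape and markup here are invented for illustration):

// Poll the server for new answers and show a notice; the page works
// fine without this script, it just won't update live.
function pollForUpdates(questionId) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', '/questions/' + questionId + '/updates', true);
  xhr.onload = function () {
    var data = JSON.parse(xhr.responseText);
    if (data.newAnswers > 0) {
      var notice = document.createElement('div');
      notice.className = 'update-notice';
      notice.textContent = 'This question has ' + data.newAnswers + ' new answers.';
      document.body.insertBefore(notice, document.body.firstChild);
    }
  };
  xhr.send();
}

setInterval(function () { pollForUpdates(42); }, 30000);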

Maybe the following scenario works for you:
1. A browser requests your XML file.
2. Once loaded, the XSLT associated with the XML file is executed. Result: your initial HTML is output, together with a script tag.
3. In the JavaScript, an AJAX call to the current location is made to get the "underlying" XML DOM. From then on, your JavaScript manages all the XML processing.
4. Make sure that in step 3 the XML is not loaded from the server again but is taken from the browser cache.
That's it.
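A minimal sketch of that setup, assuming the XML is served with cache-friendly headers (the file names and element names are made up):

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="page.xsl"?>
<!-- page.xml: the xml-stylesheet instruction makes the browser run the XSLT -->
<site>
  <page id="home">...</page>
  <page id="about">...</page>
</site>

// In the script tag emitted by page.xsl: re-request the current URL
// as XML; with a cacheable response there is no second round trip.
var xhr = new XMLHttpRequest();
xhr.open('GET', window.location.href, true);
xhr.onload = function () {
  var xmlDoc = xhr.responseXML;                    // the "underlying" XML DOM
  var pages = xmlDoc.getElementsByTagName('page'); // all page data, client-side
  // ... swap page content from here on without further server requests
};
xhr.send();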

Related

SEO for html single-page site via quasi-html content

Suppose I have a JavaScript-heavy single-page web application. My JavaScript renders the DOM directly from the model/data source (JSON).
I came up with an approach: generate simple HTML from the data source on the backend. This HTML is required only for search engines to index. After the page is loaded, JavaScript will replace this quasi-HTML with the proper UI. The quasi-HTML can be hidden from the layout with display:none to avoid a performance penalty in the browser.
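Something like this (markup and IDs invented for illustration):

<!-- Server-rendered quasi-HTML: present in the source for crawlers,
     hidden from users by display:none -->
<div id="seo-content" style="display:none">
  <h1>Page title from the data source</h1>
  <p>Indexable description, generated from the same JSON as the real UI.</p>
</div>
<script>
  // Once the real UI has rendered, drop the placeholder entirely
  var seo = document.getElementById('seo-content');
  seo.parentNode.removeChild(seo);
</script>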
Will it work?
Also I am concerned about legitimacy of the approach.
Thoughts?
It should work, giving the search engines content to crawl even if they don't read JavaScript. That said, bots evolve and they read quite a bit of JavaScript nowadays. I've created a page that only has two sentences onBeforeLoad and uses Ajax to get the rest of the content, and I see Google indexing a lot of the keywords delivered by Ajax. A problem would be misleading the search bot, like putting in content irrelevant to the rest of your page; that is something the bot might pick up on at some point and penalize you for. "I am concerned about legitimacy of the approach" - I wouldn't be; keep the code valid and ride on.

Is it recommended to render your whole website with Javascript? (Optimizing) [duplicate]

This question already has answers here: Should I load an entire html page with AJAX?
I read somewhere that the pros print out only one line of HTML and one line of JavaScript per page, with the rest of the rendering done by the client. I found this very promising, so I thought I'd use the following structure to render pages:
<html>
<head>
{{Title}}
{{Meta tags}}
{{CSS via CDN}}
</head>
<body>
{{Dynamic JSON array containing the data of the current page}}
{{Javascript libraries via CDN}}
{{JS files that contain HTML templates via CDN}}
</body>
</html>
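In other words, the client side would do something like this (the template function here is a stand-in for whatever the CDN-hosted template files actually provide):

// Hypothetical sketch: render the inlined JSON through a template
// once the DOM is ready.
var pageData = { title: 'Products', items: ['Apple', 'Pear'] }; // the dynamic JSON array

function pageTemplate(data) { // stand-in for a CDN-hosted HTML template
  return '<h1>' + data.title + '</h1><ul><li>' +
         data.items.join('</li><li>') + '</li></ul>';
}

document.addEventListener('DOMContentLoaded', function () {
  document.body.innerHTML = pageTemplate(pageData);
});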
So the questions are:
Is it really a good practice?
Is it worth it to load the HTML templates via CDN?
SEO is secondary, but of course I'd render some necessary meta tags.
Thanks for your answers!
Is it really a good practice?
That's rather subjective. It depends on how much you value reliability, performance and cost.
You can get a performance boost, but you either:
Have a very fragile system that will completely break if a JS file fails to load for any reason, trips over a browser bug, etc., or
Have to start by building all your logic server side and then duplicate all the work client side, using pushState and friends to have workable URIs (a sketch below).
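For the pushState route, a minimal sketch (renderPage is an assumed client-side rendering function, not a real API):

// Intercept internal link clicks, update the address bar without a
// reload, and render the view client-side.
document.addEventListener('click', function (e) {
  var link = e.target.closest('a[data-page]');
  if (!link) return;
  e.preventDefault();
  history.pushState({ page: link.getAttribute('data-page') }, '', link.href);
  renderPage(link.getAttribute('data-page'));
});

// Handle back/forward so the URIs stay workable.
window.addEventListener('popstate', function (e) {
  renderPage(e.state ? e.state.page : 'home');
});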
SEO is secondary, but of course I'd render some necessery meta tags.
Leaving aside questions of meta tags being necessary for SEO… rendering them with client side JavaScript is pointless. If the content is rendered only with client side JS then search engines won't see it at all.
Is it worth it to load the HTML templates via CDN?
Again, it depends. Using a CDN can cause your static files to be delivered faster, but they are an added expense and require a more complex build system for your site (since you have to deploy to multiple servers and make sure the published URIs match up).
Of course, this is good practice (if SEO really is of secondary importance):
Dynamically loading a JSON array containing the data of the current page
Loading JavaScript libraries via CDN
Loading the JS files that contain HTML templates via CDN
Besides, you can minify your JavaScript and gzip it.
As far as performance is concerned, client-side rendering saves a server round trip per interaction, so pages can feel much faster.
There are of course pros and cons of rendering website in the client.
Pros:
You can reuse a given template. No need to ask the server to render a given UI element, so it's faster.
When using tools like Meteor.js you can even go further: when re-rendering a template, only the parts that changed are replaced.
You can include a given module/subpage on demand, so you can still avoid loading all the data at the first load.
When not rendering the website on the server, it can handle more requests.
Websites are more dynamic. The user gets a feeling of immediacy when switching pages.
Cons:
It's not SEO-friendly, but there are easy-to-use tools that help deal with it (there is one for Meteor.js).
The calculation is easy :). Use dynamic JS website rendering :).
It makes your initial render slower (browsers are extremely well optimized for rendering HTML), which can potentially affect your search rankings, and it is somewhat less amenable to caching. Twitter tried a JavaScript-and-JSON-only architecture and ended up going back to serving a prerendered page along with the JavaScript app because it gave better perceived response times. (Again, the actual response times aren't necessarily better, but the user sees the response sooner.)

How do I render an html file in javascript?

OK, I am using JavaScript server side, including node.js. Because of performance issues, we have decided to move one page to being rendered server-side, not client-side, so the server returns a stream of fully rendered HTML back to the client.
I have seen this question and the related answers, but wondered if this was the best or right approach. In particular, what is the most appropriate way to render a page, and run all of the javascript on it within a js or node.js call?
Ideas that I have looked at:
Call the javascript code directly on the page, and invert everything to make it generate the html items needed. As this is urgent, I would rather avoid re-writing any more than I have to.
Render a document, with a simple iframe to generate the html. But how do I point to the page in the iframe, as I am server side? Surely this is just adding another level of abstraction to the same problem.
Using the ideas detailed above, but I am wondering whether this is the right route, given some of the problems I have seen encountered with it.
EDIT: Just to clarify - I want to, in effect, load the html page in a browser, let it finish rendering, and then capture the entire generated html for passing through to the client (saving the time to render on the client).
This is a simple example that does server-side templating (no express): https://github.com/FissionCat/handlebars-node-server-example
This is an example that serves html, js, css and an mp3 (but doesn't use express or any templating): https://github.com/FissionCat/Hue-Disco
There's some pretty useful documentation found here: http://www.hongkiat.com/blog/node-js-server-side-javascript/
Like you said, avoiding lots of rewriting is a bonus.
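In the same spirit as the first example, here's a minimal no-Express sketch (assuming handlebars has been installed via npm):

// Render the page fully on the server and stream finished HTML back.
var http = require('http');
var Handlebars = require('handlebars');

var template = Handlebars.compile(
  '<html><body><h1>{{title}}</h1><p>{{body}}</p></body></html>');

http.createServer(function (req, res) {
  var html = template({ title: 'Hello', body: 'Rendered server-side.' });
  res.writeHead(200, { 'Content-Type': 'text/html' });
  res.end(html);
}).listen(3000);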
The information provided in the article might be of some help.

Safely parse/work with HTML from XMLHttpRequest

I'm writing code (right now it's a Chrome extension, though I may make it cross-browser, or try to get the site owners to include the enhancement) that works with the contents of a particular <div> on a webpage that the user is viewing, and any pages that are part of the same discussion thread. So I find links to the other pages of the thread, and get them with XMLHttpRequest. What I want to do is just be able to use .getElementsByClassName('foo') on the resulting page.
I know I can do that by loading the results of the request into a div (i.e. Optimal way to extract a URL from web page loaded via XMLHTTPRequest?). However, while figuring out the best way to do this, I read that there are security concerns (MDN - Safely Parsing Simple HTML to DOM).
In this case, I'm not sure that matters much, since the extension would just load a page from the same comment thread that the user was already looking at, but I'd still like to do this the right way.
So what's the right way to work with HTML from an XMLHttpRequest?
P.S. If the best answer is jQuery, then tell me that, but I've yet to start using jQuery, and would also like to know the fundamentals here.
Edit: I don't know why I phrased things the way I did, but let me be clearer: I'm really hoping for a non-jQuery answer. I've been trying to learn the basics of JavaScript before learning jQuery, and I'd prefer not to import a whole framework to call one function when I don't understand what I'm doing. That may seem irrational, but it's what I'm doing for the moment.
Since you say you're not opposed to using jQuery, you should look at the load function. It loads html from the address you specify, then places it into the matched elements. So for example
$("#formDiv").load("../AjaxContent/AdvSearchForm.aspx?ItemType=" + ItemType);
Would load the HTML from ../AjaxContent/AdvSearchForm.aspx and then place it in the div with the id of formDiv.
Optional parameters exist for passing data to the server with the request, and also a callback function.
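For completeness, the callback form looks roughly like this (URL and selectors reused from above; the error handling is illustrative):

// Pass data to the server and run a callback once the fragment is in the DOM.
$('#formDiv').load(
  '../AjaxContent/AdvSearchForm.aspx',
  { ItemType: ItemType },                  // sent to the server with the request
  function (responseText, textStatus) {
    if (textStatus === 'error') {
      alert('Could not load the search form.');
      return;
    }
    // The loaded HTML can now be queried like any other DOM content:
    $('#formDiv .foo').addClass('highlight');
  }
);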

Using Javascript to render data onload

This post probably will need some modification. I'll do my best to explain...
Basically, as a tester, I have noticed that programmers who use template-based web back ends sometimes push a lot of stuff into onload handlers, which then do things like load menu items, change display values in forms, etc.
For example, a page that displays your network configuration loads with blank (or dummy) values for the IP info, then an onload function loads a block of variables that sets the values once the page is rendered (sketched below).
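Something like this (the variable and field names are invented):

<input id="ip-address" value="0.0.0.0">
<script>
  // Inlined by the template engine:
  var netConfig = { ip: '192.168.0.10', mask: '255.255.255.0' };
  // The dummy value above is only replaced once this handler runs:
  window.onload = function () {
    document.getElementById('ip-address').value = netConfig.ip;
  };
</script>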
My experience (and gut feeling) is that this is a really bad practice, for a couple reasons.
1- If the page is displayed in an environment where JavaScript is off (such as using "Send Page"), the page will not display properly in that environment.
2- The HTML page becomes very hard to diagnose, because what is actually on screen needs to be pieced together by executing the JavaScript in your head (this problem is less prominent with Firefox because of Firebug).
3- Most of the time, this is not being done via a standard practice or feature of the environment. In other words, there isn't a service on the back end; the back-end code looks just as spaghetti as the resulting HTML.
and, not really a reason, more a correlation:
I have noticed that most coders who do this are generally the ones with a lot of code-related bugs or critical integration bugs.
So, I'm not saying we shouldn't use JavaScript. I think what I'm saying is: when you produce a page dynamically, the dynamic behavior should be isolated to the back end, and you should avoid changing the displayed information after the page is loaded and rendered.
I think what you're saying is what we should be doing is Progressive Enhancement with JavaScript.
Also related: Progressive Enhancement with CSS, Understanding Progressive Enhancement and Test-Driven Progressive Enhancement.
So the actual question is: "What are the advantages/disadvantages of JavaScript content generation?"
Here's one: a lot of the things designers want are hard in straight HTML/CSS, or not fully supported. Using jQuery to do zebra tables with ":odd", for instance. Sometimes the server-side framework doesn't have a good way to accomplish this, so the cleanest code actually comes from splitting it up like that (see the sketch below).
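A small sketch of that kind of enhancement (the class names are made up; the table remains perfectly usable without JavaScript):

// Stripe every other row once the DOM is ready; pure presentation,
// so nothing breaks for non-JS visitors.
$(document).ready(function () {
  $('table.report tr:odd').addClass('alt-row');
});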
