Search engine indexing of single page applications - javascript

Alright, so I've been writing Backbone.js apps for over a year now and I love the framework model. I've learned how to avoid most of the pitfalls, but there's one area where I'm still quite weak as a single-page app developer: how to handle SEO for a public-facing app.
I'm working on a blog project, and the easiest solution to my mind is to have a server generated list of all blog entries visible as a link from the /blog section that is rendered on page load, and to ensure that when hitting a /blog/:id url, the server loads the blog content into the very first div on the page, which will be set as display:none.
My question is whether this should be sufficient for a good search engine index. SEO is still my weakest skill as a developer. Are there techniques for making sure a search engine crawls this content first and can use it for its more complex indexing?
Also, is there a way to blacklist the generated app content on the page, since I know Google has been testing crawling JavaScript apps? In my mind that could never be done at the level it needs to be without some sort of standard browser-level event that fires on a full page render or after all data has been loaded.
Anyway, I know this is a somewhat ambiguous question, but it could end up being useful to people in the future if we get a collection of good answers here.

Most of the major search engines (including Google) render the content they receive from the website, in our (Google's) case with something close to a headless browser, so whatever you serve to users the search engines will also get. Serving different content to search engines, however, will get you into a dangerous area called cloaking.
Hiding the content with display:none might backfire on you. We give hidden content much less weight in ranking.

Related

Should I use Next.js to allow the application to be indexed by google?

I created a web application for a family business using React.js a few months ago, but the website is only accessible by people who know the exact URL. It uses a Firebase backend and a React.js frontend.
I've used the Google crawler checker and it reports that the crawlers are able to access the website, along with a screenshot of the page. However, it is not indexed in Google search results.
I've read that SSR with Next.js is a possible solution to this, but I'm not really sure what it means. How can I get the website to show toward the top of the search results when the business name is searched on Google? Should I use Next.js over plain React for something like this?
Welcome to the massive world of Search Engine Optimization.
There are many ways to get your website to the top of the rankings on Google. To name a few:
A readable domain: look up how to put your website behind jtsapebusiness.com instead of weird-animal-123.firebase.etc.io.
Serving certain files like robots.txt that Google specifically looks for
Having meta tags
Having specific meta tags for each page (Next.js is great for this; see the sketch after this list)
Render time (server-side rendering is also great for this, but if your React app is small enough the performance difference shouldn't really matter, to be honest)
Page accessibility: Google can crawl single-page apps a lot better than it used to, but serving up each page individually via server-side rendering has a lot of perks.
How often your page is searched for and clicked on (tell your friends and family to search for you on Google and click on your website)
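As a rough illustration of the per-page meta tags point, here is a minimal sketch using Next.js's next/head component; the page, file name and tag values are made up for the example:

// pages/about.js - a hypothetical page; the tag values are placeholders
import Head from 'next/head';

export default function About() {
  return (
    <>
      <Head>
        <title>About Us | Example Family Business</title>
        <meta name="description" content="Short, page-specific summary that can appear as the search snippet." />
      </Head>
      <main>
        <h1>About Us</h1>
        <p>Server-rendered content that crawlers can read without running extra JavaScript.</p>
      </main>
    </>
  );
}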
These are just a few! Reading more about search engine optimization will help you come up with even more questions. If and when you do switch to Next.js, you will still be using React; you will just be writing it a little differently to fit a more server-side pattern.
The choice between "React or Next" will not matter much on its own. If you wanted to maximize your chances of ranking higher, I would go with Next. But I wouldn't want you to pick up a whole new technology if you already have the React app built; instead you would just need to add some search engine optimization sprinkles on top (some examples listed above).

SEO for an HTML single-page site via quasi-HTML content

Suppose I have a JavaScript-heavy single-page web application. My JavaScript renders the DOM directly from the model/data source (JSON).
I came up with an approach: generate simple HTML from the data source on the backend. This HTML is needed only for search engines to index. After the page loads, JavaScript replaces this quasi-HTML with the proper UI. The quasi-HTML can be removed from the layout with display:none to avoid a performance penalty in the browser.
Will it work?
Also I am concerned about legitimacy of the approach.
Thoughts?
It should work, since it gives the search engines content to crawl even if they don't read JavaScript. Bots are evolving and they read quite a bit of JavaScript nowadays: I've created a page that only has two sentences before load and uses Ajax to get the rest of the content, and I see Google indexing a lot of the keywords delivered by Ajax. A problem would be misleading the search bot, like putting in content irrelevant to the rest of your page - something the bot might pick up on at some point and penalize you for. "I am concerned about legitimacy of the approach" - I wouldn't be; keep your code valid and ride on.
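As a minimal sketch of the pattern being discussed (the element id, file name and renderRichUi function are made up for the example, and keep in mind the earlier warning that hidden or replaced content may be given less weight):

<!-- index.html: the backend renders a plain-HTML version of the data for crawlers -->
<div id="app">
  <h1>Blog</h1>
  <ul>
    <li><a href="/blog/1">First post title</a></li>
    <li><a href="/blog/2">Second post title</a></li>
  </ul>
</div>

// app.js: once the client-side app boots, it swaps the quasi-HTML for the rich UI
document.addEventListener('DOMContentLoaded', function () {
  var container = document.getElementById('app');
  renderRichUi(container); // hypothetical function standing in for your client-side views
});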

What is a good example of a strategy for achieving SEO-friendliness in a JavaScript-heavy application?

Intro
I know this has been asked before, but the questions I found were either too specific or too general to provoke the kind of answer I was looking for. The best possible answer I can imagine would be an example using Backbone and the least amount of server-side logic possible (no preferred language/framework there).
Problem
I am planning a JavaScript/Ajax-heavy application (Backbone + a mostly-JSON backend) that implements a faceted search. Take for example a faceted search for a simple shoe shop that lets you filter by color, brand and type of shoe and sort by price, size or whatever else.
Assume I am using Backbone or a similar framework on the client and a JSON service as a backend.
What would be a good strategy (trading off effort against result) to achieve SEO-friendliness as well as a snappy interface?
Resources
One solution that came to my attention is Hijax, reusing the client-side templates on the server side, as described here: http://duganchen.ca/single-page-web-app-architecture-done-right
Resources that I digested without reaching a final conclusion
http://code.google.com/intl/de-DE/web/ajaxcrawling/
https://stackoverflow.com/a/6194427/818846
http://www.quora.com/Search-Engine-Optimization-SEO/If-I-have-data-that-loads-using-json-JavaScript-will-it-get-indexed-by-Google?q=seo+javascript
The general point of SEO-friendliness: it should work without JavaScript.
It's also good for accessibility, so you should do it this way: if the user does not have JavaScript enabled (just like the search engine), it will still work.
If the user has JavaScript enabled (like any sane human being does), it will work with all the nifty JavaScript features you've added.
As a general usability rule of thumb: if it works, it should also work without JavaScript!
The solution in your first link sounds right. The main issue with a single-page app is that you have to render your templates on both sides, the backend and the frontend. Using Mustache or Google Closure Templates would be a good solution for this.
This is the same approach that was used for Google+: initially the page is rendered on the server and you load static HTML; after that, pages are rendered on the client side, but with the same templates as on the server.
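A rough sketch of the shared-template idea, assuming mustache.js on the client and a server that can render the same template string (the template, data and element id are made up for the example):

// shoe-card template, shared verbatim between the backend and the frontend
var shoeTemplate = '<div class="shoe"><h2>{{name}}</h2><p>{{brand}} - {{price}}</p></div>';

// client side, using mustache.js (e.g. from a Backbone view's render method)
var html = Mustache.render(shoeTemplate, { name: 'Runner X', brand: 'Acme', price: '$99' });
document.getElementById('results').innerHTML = html;

// on the server, the exact same template string is rendered with the same JSON before the
// page is sent, so crawlers and users without JavaScript get finished markup immediately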
Also remember that the search engines follow links much more often than they (ever?) complete forms.
This problem of enabling crawlers to see your database contents is called the "dark web", "invisible web", "deep web" or "hidden web". Blog post
So re your problem statement:
a facetted search of a simple shoe shop application that lets you filter color, brand and type of shoes and sort by price and size or whatever else.
I'd suggest that you include searches via a hierarchy of links in addition to searching via forms with select fields.
E.g., on a secondary menu, include all the different brands as individual links. Each link should then lead to a list of the products sold by that brand. The trick is to arrange things so that the link to an individual shoe takes you back to the first page (the rich one-page app) but showing that specific shoe. And the page should implement the Google Ajax-crawling recommendations that you reference in the OP.
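A minimal sketch of such a crawlable link hierarchy (the URLs are made-up examples):

<!-- secondary menu: plain links a crawler can follow -->
<nav>
  <a href="/brands/acme">Acme</a>
  <a href="/brands/contoso">Contoso</a>
</nav>

<!-- /brands/acme: a server-rendered list, each item linking back into the rich app -->
<ul>
  <li><a href="/shoes/runner-x">Runner X</a></li>
  <li><a href="/shoes/trail-y">Trail Y</a></li>
</ul>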

JavaScript isn't good for SEO, is it?

If I decided to use some JavaScript in my website, like
$('#body').load(URL);
or
$.get(URL, {param:value}, function(){ ... });
or
document.title = 'TEXT';
Is that good for SEO? Or am I better off using pure PHP to put the data on the page for SEO purposes?
The question of whether JavaScript is good for SEO or not is missing the point. We should pretty much assume that any content which is only available via JavaScript will not be crawled by the search engines. Google at least claims to be able to crawl some JavaScript-only content, but is fairly tight-lipped about what exactly it can crawl. Other search engines probably don't crawl it, and it's certainly the case that not all do. So assume it doesn't get crawled.
That doesn't mean it's bad for SEO.
If the content would contribute to your SEO, then loading it only via JavaScript is bad for SEO. If the content is neutral to SEO, then it's neutral for SEO. So the answer to your question really depends on the nature of your content. If the content is part of your SEO campaign, then stick with server-side HTML generation, be it PHP or some other method. Otherwise the question of SEO has no bearing on the decision to use JavaScript or not. Accessibility would be another thing to take into account; JavaScript-only content is terrible for that.
The larger search engines can and do render limited amounts of JavaScript. However, for SEO purposes your best bet is rendering the content via HTML rather than JavaScript. A good rule of thumb is to use HTML for content and basic content structure (e.g. paragraph text = p, lists = ul/ol, headings = h1/h2/h3, etc.), CSS for presentation, and JS for client-side programming. That said, always ensure a good user experience first. If you can do the above while providing a great user experience, great! If you can't, users come first. It's likely you can keep both users and bots happy 95% of the time if you take the time to do so.
Further reading (sorry, I can only post one link as a new user):
Matt Cutts Interview (Check out #26 on Google Javascript Rendering)
A spider's view of Web 2.0
I think first you should consider what SEO means. It means "Search Engine Optimization" ... how does a search engine get data in the first place for it to be optimized?
It does a GET on the page and whatever data is returned in the GET is processed. No JS engine. No POST data. So you should be optimizing for whatever data is returned on a GET.
Additionally, you tagged this with PHP, but the question has nothing to do with PHP.
Have you seen any of the questions on this list?
https://stackoverflow.com/search?q=javascript+seo
No sir, Google does not interpret Flash and JavaScript properly, so it may not crawl the areas that use JavaScript or Flash content. I suggest you keep your website simple, but if it is necessary to keep Flash/JavaScript content, then you should keep a text-based backup.
The first thing you should be asking is not what is good for SEO, but what is good for users. For users, loading data with JavaScript will give them an interactive page, where they can start seeing the page immediately while it is still loading and where the page can update without having to reload it.
From Google's Webmaster Guidelines and article on cloaking, you should not assume that crawlers can understand JavaScript. This does not mean that you should not use JavaScript on your website, but rather that you should provide the textual equivalent in noscript tags, for use both by users with JavaScript disabled and by crawlers, bearing in mind that the content of these noscript tags should be roughly equivalent to what is shown with JavaScript enabled; showing different content to users and to search engines is called "cloaking" and is frowned upon, to say the least.
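As a small illustration of the noscript suggestion (the element id and script name are made up for the example):

<div id="products"></div>
<script src="load-products.js"></script>
<noscript>
  <!-- roughly the same content the script would render, so crawlers and
       no-JavaScript users see an equivalent page rather than an empty div -->
  <ul>
    <li><a href="/products/1">Product one</a></li>
    <li><a href="/products/2">Product two</a></li>
  </ul>
</noscript>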
Google doesn't (yet) execute a page's JavaScript (JS). So if your JS replaces or creates content on a page, that content would normally be invisible to the crawlers (not good).
But the Googlers have implemented a URL scheme that enables your server to create pages (from the server, not from JS) for all the different variants of your JS page's content.
This solves the SEO problem of Ajax-powered pages, at least for Google searches...
See Crawlable Ajax
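For reference, that scheme works roughly like this; a sketch assuming an Express-style Node server, where renderSnapshot is a hypothetical helper:

// Under Google's Ajax-crawling scheme, a stateful URL such as
//   http://example.com/#!brand=acme
// is requested by the crawler as
//   http://example.com/?_escaped_fragment_=brand=acme
// so the server can answer with a pre-rendered HTML snapshot of that state.
var express = require('express');
var app = express();

app.get('/', function (req, res) {
  if (req.query._escaped_fragment_ !== undefined) {
    // crawler request: return a static HTML snapshot for the requested state
    res.send(renderSnapshot(req.query._escaped_fragment_));
  } else {
    // normal request: return the JavaScript application shell
    res.sendFile(__dirname + '/index.html');
  }
});

app.listen(3000);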
JavaScript, or any script for that matter, should never be used to house your site's content, ever! The entire web is driven by HTML and CSS, and in rare cases XML languages; everything else is a headache when it comes to SEO. Ask yourself this question: what exactly is SEO, and what is it that search engines are indexing? JavaScript and other programming/scripting languages are not content standards defined by the W3C the way HTML, CSS, and XML are, which makes them essentially worthless when it comes to indexing content. It's fine to use scripts to add extra functionality to your pages, embed apps like social networking plugins, and so on, but you should never use them to hold your website's HTML, CSS, or actual content, for any reason. Here's a link to a good article that explains why you should be using HTML and CSS and not a million scripts: optimizing webpages using proper html markup.
Scripts cause other problems besides code that is hard for search engines to decipher. For one, they are harder for browsers to process, causing pages to load much more slowly than "static" pages made with HTML and CSS. Pages made with PHP tend to create "dynamic" URLs that users and search engines cannot read, which is why Google recommends that people who use JSP or PHP for their webpages include a sitemap; otherwise your links may never be found and might as well not exist.
Stick to the conventions! Let's face it, we have standards for a reason. If every electronic component in your home had a different type of plug that required a special socket, and all those devices had differing voltage and amperage requirements, what would happen? You would essentially burn down your house, and you'd be spending five hours a day at the hardware store looking for special adapters to fit your wall sockets. If you plan on designing a website, use scripts only for embedding apps or connecting to a database, and use HTML and CSS to build "static" webpages. Also, use text links, as they are both human- and search-engine-readable, and easy to index and make sense of. Never use scripts for your links. Programming and scripting can be fun, but not on the internet.
Search engines index HTML, CSS, and content (multimedia, graphics, videos, text; that's it!); everything else is pointless and annoying to both users and search engines alike. For best results, use XML and design a custom language.
Google can crawl, index and rank JavaScript-generated content.
But... it uses an old Chrome version (42) with an old JavaScript rendering engine.
The consequence is that your JavaScript code needs to work in older browsers and older Chrome versions (version 42 and older). So no fancy ES6 features; you need to use polyfills or a transpiler such as Babel, for example.
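A minimal sketch of such a Babel setup, assuming @babel/preset-env and core-js; the exact target version is an assumption, so adjust it to whatever renderer you need to support:

// babel.config.js - transpile modern JavaScript down to what an old Chrome-based renderer understands
module.exports = {
  presets: [
    ['@babel/preset-env', {
      // assumed conservative target for the old Googlebot renderer; adjust as needed
      targets: { chrome: '41' },
      // pull in core-js polyfills only where the code actually uses newer features
      useBuiltIns: 'usage',
      corejs: 3,
    }],
  ],
};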
Although you can do a lot with JavaScript (like click events or injecting your mobile menu), it's still recommended to use a plain a href link instead of a button with a JavaScript event handler that navigates to a new page.
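For example (a trivial sketch; the URL is made up):

<!-- crawlable: the crawler can discover and follow the href -->
<a href="/shoes/runner-x">Runner X</a>

<!-- not crawlable on its own: navigation only happens if the script runs -->
<button onclick="location.href = '/shoes/runner-x'">Runner X</button>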
You can check Google's mobile-friendly testing tool (https://search.google.com/test/mobile-friendly) and review the errors/warnings/logs. If the rendered output looks as intended, Google will see your content.
In Search Console you can also request indexing of the page. Sometimes the JavaScript crawler comes first, sometimes the 'classic' crawler.
Double-check it some days afterward by googling a sentence or paragraph from your page.
There's no single answer as to whether one is better than the other. Content is content, and Google should rank your website, SPA, PWA, AMP site, PDF document, online doc, wiki page, and so on based on its content, not on the underlying technique.
If you are familiar with JavaScript, give it a go.
Regards, Peter

Factoring in SEO on a Flash Site

There have been many debates about this topic already here, but none of them fully answered my question so I figured I would pose it and hope I get one or two decent answers.
We're planning on relaunching our company website in the next few months. Our current site, for the most part, is text-driven, and because of this we rank very well on Google, Yahoo, and Bing for our primary keywords. We want to increase the "wow factor" of the site a bit (we're an interactive agency) but still maintain a majority of our search engine footing. The option to use Flash, AJAX, and other technologies that are not considered to be search engine friendly has come up numerous times in our meetings, and each time we have to evaluate what kind of impact it would have on us from an SEO perspective.
Assuming a good portion of the site content will be encapsulated within a Flash (SWF) file, what would be the best course of action for maintaining current rankings? I've read numerous times that Google indexes Flash files, but I am unsure to what extent. Further, is there a method of telling Google not to index a Flash file (through a variable or otherwise)?
Finally, I had an idea that seemed sound in theory and wanted to put it out into the world and see what type of feedback I receive on it:
Again, assuming the whole page is in a Flash file living on index.html, would it be possible to build out the site as normal (set up a logical directory structure, add content to static pages within said structure, etc.), specify paths to those static pages in a Google XML sitemap file, and have the spiders crawl only those pages (which are rich in content) while the user experiences some concoction of Flash/JavaScript/AJAX/etc.? If this works, what would be the pros/cons of this solution? Thanks for bearing with me on this slightly off-kilter question.
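For reference, such a sitemap file would simply list the static, content-rich URLs; a minimal sketch (the URLs are made-up examples):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/services/</loc>
    <lastmod>2010-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/portfolio/</loc>
  </url>
</urlset>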
Well, referencing Google, I found that they have made impressive strides in indexing Flash-based web pages. The only limitations I found from reading the article are that they are currently still limited in these three areas:
Googlebot does not execute some types of JavaScript. So if your web page loads a Flash file via JavaScript, Google may not be aware of that Flash file, in which case it will not be indexed.
We currently do not attach content from external resources that are loaded by your Flash files. If your Flash file loads an HTML file, an XML file, another SWF file, etc., Google will separately index that resource, but it will not yet be considered to be part of the content in your Flash file.
While we are able to index Flash in almost all of the languages found on the web, currently there are difficulties with Flash content written in bidirectional languages. Until this is fixed, we will be unable to index Hebrew-language or Arabic-language content from Flash files.
By the sounds of it, you won't run into any of these three problems. Based on this document, Flash sounds like a viable option for you.
Adobe has been working on its end as well to make SWFs more search-engine friendly. So with the combined efforts of Adobe and Google/Yahoo, if you take a dip in ranking now, within a year or two the search algorithms will be better than they are even today.
As far as preventing indexing goes, you should be able to add a simple
User-agent: *
Disallow: /directory/
Disallow: /directory/page.html
to your robots.txt file.
Andrew,
I've had to deal with this sort of thing a few times and I'd recommend maintaining both a Flash site (for users) and an HTML site (for search engines). Here's how you do it:
With whatever server-side stack you're using, set up some kind of switch that determines whether a particular request is for HTML or for whatever your Flash movie consumes (XML, JSON, another SWF, whatever). Every page on your site should be able to return both HTML and whatever you choose to feed your Flash movie. A query-string parameter like "requestType=Flash" will work just fine.
Put all of the content in your HTML pages in a div tag and make the div invisible with CSS. Use SWFObject to check whether the requesting browser supports Flash and, if it does, have SWFObject replace your HTML content with your Flash movie. Search engine spiders will ignore your scripts and simply crawl your HTML pages; if you'd like to show the HTML to users whose browsers don't support Flash (like mobile browsers), just make the HTML content visible after SWFObject has determined that the browser doesn't support Flash.
Once your Flash movie has loaded, have it request whatever data it needs from the server using the same URL as the page it was loaded on, but with the switch parameter above added.
Handle navigation from that point on with SWFAddress. When a user clicks a button to request a new page, pass the request through SWFAddress first, which will update the browser history using the hash mark trick, and then have your Flash movie make its request to the server.
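The SWFObject step might look roughly like this on the page; a sketch assuming SWFObject 2.x, with the file names, id and dimensions made up for the example:

<!-- server-rendered, crawlable content; this is what spiders (and non-Flash browsers) see -->
<div id="content">
  <h1>Our Services</h1>
  <p>Plain HTML copy of the page content.</p>
</div>

<script src="swfobject.js"></script>
<script>
  // when Flash 9+ is available, SWFObject replaces the div's contents with the movie;
  // spiders ignore the script and index the HTML above instead
  swfobject.embedSWF('site.swf', 'content', '960', '600', '9.0.0');
</script>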
I'm currently working on a site for a friend that uses this technique here (I should note, to protect my pride, that the site is still very much a work in progress):
http://www.casabarbuenosaires.com/
A browser request to any page on the site will first return the HTML representation of that page (you can view source in your browser to see that). SWFObject then replaces the HTML content with a Flash movie that loads a custom XML description of the same page which the Flash movie then constructs and displays.
I've worked on sites in the past that have used this technique and gotten excellent search engine results. Since you don't need to worry too much about what your HTML site looks like to humans, you can focus solely on what it looks like to search engines.
Another added benefit of building your site this way is that you are compelled to separate your site's content/copy from its visual representation. Throwing your entire site into a single SWF is generally NOT a good way to do that. It's much easier to maintain (or re-skin or scrap) a site when your content isn't all mixed up with your code.
Hope this helps,
Scott
