Factoring in SEO on a Flash Site - javascript

There have been many debates about this topic already here, but none of them fully answered my question, so I figured I would pose it and hope to get one or two decent answers.
We're planning on relaunching our company website in the next few months. Our current site, for the most part, is text-driven, and because of this we rank very well on Google, Yahoo, and Bing for our primary keywords. We want to increase the "Wow Factor" of the site a bit (we're an interactive agency) but still maintain a majority of our search engine footing. The option of using Flash, AJAX, and other technologies that are not considered search engine friendly has come up numerous times in our meetings, and each time we have to evaluate what kind of impact it would have on us from an SEO perspective.
Assuming a good portion of the site content will be encapsulated within a Flash (swf) file, what would be the best course of action for maintaining current rankings? I've read numerous times that Google indexes Flash files but I am unsure as to what extent. Further, is there a method of telling Google not to index a Flash file (through a variable or otherwise)?
Finally, I had an idea that seemed sound in theory and wanted to put it out into the world and see what type of feedback I receive on it:
Again, assuming the whole page is in a Flash file living on index.html: would it be possible to build out the site as normal (set up a logical directory structure, add content to static pages within said structure, etc.), specify paths to those static pages in a Google XML Sitemap file, and have the spiders crawl only those content-rich pages while the user experiences some concoction of Flash/JavaScript/AJAX/etc.? If this works, what would be the pros and cons of this solution? Thanks for bearing with me on this slightly off-kilter question.
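For reference, the sitemap half of that idea would just be a standard sitemaps.org file listing the static pages (the paths below are hypothetical placeholders, not our real structure):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Content-rich static pages for the spiders; the Flash experience lives on index.html -->
  <url><loc>http://www.example.com/services/</loc></url>
  <url><loc>http://www.example.com/work/case-studies.html</loc></url>
  <url><loc>http://www.example.com/about/</loc></url>
</urlset>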

Well, referencing Google's own documentation, I found that they have made impressive strides in indexing Flash-based web pages. The only limitations I found from reading the article are that they are still restricted in these three areas:
1. Googlebot does not execute some types of JavaScript. So if your web page loads a Flash file via JavaScript, Google may not be aware of that Flash file, in which case it will not be indexed.
2. We currently do not attach content from external resources that are loaded by your Flash files. If your Flash file loads an HTML file, an XML file, another SWF file, etc., Google will separately index that resource, but it will not yet be considered to be part of the content in your Flash file.
3. While we are able to index Flash in almost all of the languages found on the web, currently there are difficulties with Flash content written in bidirectional languages. Until this is fixed, we will be unable to index Hebrew-language or Arabic-language content from Flash files.
By the sound of it, you won't run into any of those three limitations. Based on this document, Flash sounds like a viable option for you.
Adobe has been working on its end to make SWFs more search engine friendly as well. So with the combined efforts of Adobe and Google/Yahoo, even if you take a dip in rankings now, within a year or two the search algorithms should handle Flash better than they do today.
As far as not indexing goes, you should be able to add a simple

User-agent: *
Disallow: /directory/
Disallow: /directory/page.html
# e.g. to block a SWF directly (hypothetical path):
Disallow: /flash/site.swf

to your robots.txt file. (Strictly speaking, robots.txt blocks crawling rather than indexing, but it is the standard way to keep a file's content out of the results.)

Andrew,
I've had to deal with this sort of thing a few times and I'd recommend maintaining both a Flash site (for users) and an HTML site (for search engines). Here's how you do it:
With whatever server-side stuff you're using, set up some kind of switch that determines whether a particular request is for HTML or for whatever your Flash movie consumes (XML, JSON, another SWF, whatever). Every page on your site should be able to return both HTML and whatever you choose to feed your Flash movie. A query string parameter like "requestType=Flash" will work just fine.
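As a rough illustration of such a switch, here is a minimal sketch in Node/Express (my choice of stack; the answer doesn't prescribe one, and the content store is a toy placeholder):

// server.js: every page can come back as crawlable HTML or as XML for the SWF
const express = require('express');
const app = express();

const pages = { home: 'Welcome to our interactive agency.' };  // toy content store

app.get('/:page', (req, res) => {
  const copy = pages[req.params.page] || 'Not found';
  if (req.query.requestType === 'Flash') {
    // The Flash movie asked for this page's data
    res.type('application/xml')
       .send('<?xml version="1.0"?><page><copy>' + copy + '</copy></page>');
  } else {
    // Browsers and search engine spiders get plain HTML
    res.send('<html><body><div id="content"><p>' + copy + '</p></div></body></html>');
  }
});

app.listen(3000);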
Put all of the content in your HTML pages in a div tag and make the div invisible with CSS. Use SWFObject to check whether the requesting browser supports Flash and, if it does, have SWFObject replace your HTML content with your Flash movie. Search engine spiders will ignore your scripts and simply crawl your HTML pages, and if you'd like to show the HTML to users whose browsers don't support Flash (like mobile browsers), just make the HTML content visible after SWFObject has determined that the browser doesn't support Flash.
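A sketch of what that can look like, assuming SWFObject 2.2's callback parameter (file names, IDs, and dimensions here are placeholders):

<!-- index.html -->
<style type="text/css">.seo-copy { display: none; }</style>
<script type="text/javascript" src="swfobject.js"></script>
<script type="text/javascript">
  // If Flash 9+ is present, the div below is replaced by the movie;
  // crawlers never run this script and simply see the HTML content.
  swfobject.embedSWF('site.swf', 'content', '960', '600', '9.0.0',
    false, {}, {}, {}, function (e) {
      if (!e.success) {
        // No Flash (e.g. a mobile browser): reveal the HTML version
        document.getElementById('content').className = '';
      }
    });
</script>
<div id="content" class="seo-copy">
  <h1>Page title</h1>
  <p>The same copy the Flash movie will display.</p>
</div>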
Once your Flash movie has loaded, have it request whatever data it needs from the server using the same URL as the page it was loaded on, but with the addition of the switch variable above.
Handle navigation from that point on with SWFAddress. When a user clicks a button to request a new page, pass the request through SWFAddress first, which will update the browser history using the hash mark trick, and then have your Flash movie make its request to the server.
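A rough sketch of the glue on the JavaScript side, assuming SWFAddress's setValue/getValue/onChange API (the loadPage method on the movie is a hypothetical ExternalInterface callback, not part of SWFAddress):

// Called by the Flash movie (via ExternalInterface) when the user clicks a button
function navigateTo(path) {
  SWFAddress.setValue(path);  // updates the hash (e.g. #/about/) and browser history
}

// Fired on any hash change: button clicks, back/forward, or a deep link
SWFAddress.onChange = function () {
  var movie = swfobject.getObjectById('content');  // the embedded SWF from above
  if (movie && movie.loadPage) {
    // Hypothetical method the SWF exposes; it then fetches the page's
    // data from the server using the requestType=Flash switch
    movie.loadPage(SWFAddress.getValue());
  }
};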
I'm currently working on a site for a friend that uses this technique here (I should note, to protect my pride, that the site is still very much a work in progress):
http://www.casabarbuenosaires.com/
A browser request to any page on the site will first return the HTML representation of that page (you can view the source in your browser to see it). SWFObject then replaces the HTML content with a Flash movie that loads a custom XML description of the same page, which the movie then constructs and displays.
I've worked on sites in the past that have used this technique and gotten excellent search engine results. Since you don't need to worry too much about what your HTML site looks like to humans, you can focus solely on what it looks like to search engines.
Another added benefit of building your site this way is that you are compelled to separate your site's content/copy from its visual representation. Throwing your entire site into a single SWF is generally NOT a good way to do that. It's much easier to maintain (or re-skin or scrap) a site when your content isn't all mixed up with your code.
Hope this helps,
Scott

Related

Search engine indexing of single page applications

Alright, so I've been writing Backbone.js apps for over a year now and I love the framework model. I've learned how to avoid all the pitfalls and such, but there's one area in which I'm still quite weak as a single-page app developer: how to SEO a public-facing app.
I'm working on a blog project, and the easiest solution to my mind is to have a server-generated list of all blog entries visible as links from the /blog section, rendered on page load, and to ensure that when hitting a /blog/:id URL, the server loads the blog content into the very first div on the page, which will be set to display:none.
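To make that concrete, here is a minimal sketch of the server side of that idea in Express (my own illustration; the route, data store, and IDs are placeholders):

const express = require('express');
const app = express();

const posts = { '1': { title: 'Hello world', body: 'First entry.' } };  // toy data

app.get('/blog/:id', (req, res) => {
  const post = posts[req.params.id] || { title: '', body: '' };
  res.send(
    '<!DOCTYPE html><html><body>' +
    // Server-rendered copy of the entry, present on first load but hidden:
    '<div style="display:none"><h1>' + post.title + '</h1><p>' + post.body + '</p></div>' +
    '<div id="app"></div>' +                 // the Backbone app renders here
    '<script src="/app.js"></script>' +
    '</body></html>'
  );
});

app.listen(3000);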
My question is whether this should be sufficient for a good search engine index. SEO is still my weakest skill as a developer. Are there techniques for making sure a search engine crawls this content first and is able to use it for its more complex indexing?
Also, is there a way to blacklist the generated app content on the page as I know Google has been testing crawling JavaScript apps? In my mind that could never be done at the level it needs to be without some sort of standard browser level event that can be triggered on a full page render or after all data has been loaded.
Anyways, this is more of an ambiguous ticket I know, but it could end up being useful to people in the future if we get a collection of good answers here.
Most of the major search engines (including Google) render the content they receive from the website, in our (Google's) case with something close to a headless browser, so whatever you serve to users, the search engines will also get. Serving different content to search engines, however, will get you into a dangerous area known as cloaking.
Hiding the content with display:none might backfire on you. We give hidden content way less weight in ranking.

JavaScript and iframes: dangerous for my website?

In order to see a live preview of another website that users link to on my website, I'm using iframes.
However, this is probably not the best solution, as the other website is loaded directly into mine, along with every JavaScript element etc. that is on the linked page.
My question: how dangerous is it to do such a thing? What is the worst case scenario that could happen? Could a linked site just by using JavaScript (or other technologies) do any serious harm to my site or my user's data?
And then, the second part of my question is, of course, about the website preview.
All I've found so far are scripts that consist of more than one PHP and JS file just to load a website preview picture.
Isn't there an easier way to do this? What do you suggest?
how dangerous is it to do such a thing?
Some websites do not like to be embedded using frames. Such websites can use frame-busting scripts to take over the full browser by forcing themselves into the topmost window. Aside from that, as long as your website and the website you are loading aren't from the same domain, the framed page won't be able to access your cookies, DOM, etc. So it's pretty safe in that respect.
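If the frame-busting scenario worries you, one mitigation in browsers that support it (an addition of mine, beyond what's discussed above) is the iframe sandbox attribute:

<!-- With an empty sandbox, the framed page cannot run scripts,
     submit forms, or navigate your top-level window -->
<iframe src="http://example.com/linked-page" sandbox></iframe>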
about the website preview
There aren't many foolproof mechanisms other than generating the preview image server-side, which I believe is what the scripts you've seen do.
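For illustration, the server-side approach boils down to loading the page in a headless browser and saving a screenshot. A sketch using Puppeteer (my choice of tool; the scripts mentioned above may work differently):

// npm install puppeteer
const puppeteer = require('puppeteer');

async function capturePreview(url, outputPath) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setViewport({ width: 1024, height: 768 });
  await page.goto(url, { waitUntil: 'networkidle2' });  // let the page settle
  await page.screenshot({ path: outputPath });
  await browser.close();
}

capturePreview('http://example.com', 'preview.png');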

Javascript isn't good for SEO, is it?

If I decided to use some JavaScript in my website, like
$('#body').load(URL);
or
$.get(URL, {param:value}, function(){ ... });
or
document.title = 'TEXT';
Is this good for SEO? Or is it recommended that I use pure PHP to output the page's data for SEO purposes?
The question of whether JavaScript is good for SEO or not is missing the point. We should pretty much assume that any content which is only available via JavaScript will not be crawled by the search engines. Google at least claims to be able to crawl some JavaScript-only content but is fairly tight-lipped about what exactly it can crawl. Other search engines probably don't crawl it, and it's certainly the case that not all do. So assume it doesn't get crawled.
That doesn't mean it's bad for SEO.
If the content would contribute to your SEO, then delivering it only via JavaScript is bad for SEO. If the content is neutral to SEO, then it's neutral for SEO. So the answer to your question really depends on the nature of your content. If the content is part of your SEO campaign, then stick with server-side HTML generation, be it PHP or some other method. Otherwise the question of SEO has no bearing on the decision to use JavaScript or not. Accessibility would be another thing to take into account: JavaScript-only content is terrible for that.
The larger search engines can and do render limited amounts of JavaScript. However, for SEO purposes your best bet is rendering the content via HTML rather than JavaScript. A good rule of thumb is to use HTML for content and basic structure (e.g. paragraph text = p, lists = ul/ol, headings = h1/h2/h3, etc.), CSS for presentation, and JS for client-side programming. With that being said, always ensure a good user experience first. If you can do the above while providing a great user experience, great! If you can't, users come first. It's likely you can keep both users and bots happy 95% of the time if you take the time to do so.
Further reading (sorry, I can only post one link as a new user):
Matt Cutts Interview (Check out #26 on Google Javascript Rendering)
A spider's view of Web 2.0
EDIT Added that for "a new user" ;) ~ drachenstern
I think first you should consider what SEO means. It means "Search Engine Optimization" ... how does a search engine get data in the first place for it to be optimized?
It does a GET on the page and whatever data is returned in the GET is processed. No JS engine. No POST data. So you should be optimizing for whatever data is returned on a GET.
Additionally, you tagged this with PHP, but the question has nothing to do with PHP.
Have you seen any of the questions on this list?
https://stackoverflow.com/search?q=javascript+seo
No sir, Google does not parse Flash and JavaScript reliably, so it may not crawl content delivered via JavaScript or Flash. I suggest you keep your website simple, but if Flash or JavaScript-heavy content is necessary, keep a text-based backup of it.
The first thing you should be asking is not what is good for SEO, but what is good for users. For users, loading data with JavaScript will give them an interactive page, where they can start seeing the page immediately while it is still loading and where the page can update without having to reload it.
From Google's Webmaster Guidelines and article on Cloaking, you should not assume that crawlers can understand JavaScript. This does not mean that you should not use JavaScript on your website, but rather that you should provide the textual equivalent in noscript tags, for use both by users with JavaScript disabled
as well as for crawlers, bearing in mind that the content of these noscript tags should be roughly equivalent to what was shown with JavaScript enabled; showing different content to users and to search engines is called "cloaking" and is frowned upon to say the least.
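A minimal sketch of that noscript pattern (the widget content here is a made-up example):

<div id="ticker"></div>
<script type="text/javascript">
  // Imagine this builds an interactive widget, perhaps from fetched data
  document.getElementById('ticker').innerHTML = '<p>ACME stock: $12.34</p>';
</script>
<noscript>
  <!-- Roughly the same content for crawlers and users without JavaScript -->
  <p>ACME stock: $12.34</p>
</noscript>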
Google doesn't (yet) execute a page's Javascript (JS). So if your JS replaces/creates content on a page then the content would normally be invisible to the crawlers (not good).
But the Googlers have implemented a URL hack that enables your server to create pages (from the server, not from JS) for all the different variants of your JS page's content.
This solves the SEO problem of Ajax-powered pages. At least for Google searches...
See Crawlable Ajax.
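In outline, that scheme (since deprecated by Google, but current at the time) had the crawler rewrite hash-bang URLs into an "escaped fragment" query string that your server answers with pre-rendered HTML. A sketch in Express (my stack choice; the snapshot content is a placeholder):

// Googlebot requests http://example.com/#!/products
// as http://example.com/?_escaped_fragment_=/products
const express = require('express');
const app = express();

app.get('/', (req, res) => {
  const fragment = req.query._escaped_fragment_;
  if (fragment !== undefined) {
    // Serve a static HTML snapshot of that Ajax state to the crawler
    res.send('<html><body><h1>' + fragment + '</h1><p>Snapshot content.</p></body></html>');
  } else {
    // Normal users get the Ajax-driven page
    res.send('<html><body><div id="app"></div><script src="/app.js"></script></body></html>');
  }
});

app.listen(3000);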
JavaScript, or any script for that matter, should never be used to house your site's content, ever! The entire web is driven by HTML and CSS, and in rare cases XML languages; everything else is a headache when it comes to SEO. Ask yourself this question: what exactly is SEO, and what is it that search engines are indexing? JavaScript and other programming/scripting languages are not document markup standards like those the W3C defines for the web, which means their output is essentially worthless when it comes to indexing content. On the other hand, HTML, CSS, and XML are real standards developed for the web! It's OK to use scripts to add extra functionality to your pages, embed apps like social networking plugins, and so on, but you should never use them to hold your website's HTML, CSS, or actual content, for any reason. Here's a link to a good article that explains why you should be using HTML and CSS, and not a million scripts: optimizing webpages using proper html markup.
Scripts cause other problems besides code that is hard for search engines to decipher. For one, they are harder for browsers to process, causing pages to load much more slowly than "static" pages made with HTML and CSS. Pages made with PHP tend to create "dynamic" URLs that users and search engines cannot read; this is why Google recommends that people who use JSP or PHP for their webpages include a sitemap, otherwise your links will never be found and might as well not exist.
Stick to the conventions! Let's face it, we have standards for a reason. If every electronic component in your home had a different type of plug that required a special socket, and all those devices had differing voltage and amperage requirements, what would happen? You would essentially burn down your house, and you'd be spending five hours a day at the hardware store looking for special adapters to fit your wall sockets. If you plan on designing a website, use scripts only for embedding apps or connecting to a database, and use HTML and CSS to build "static" webpages. Also, use text links, as they are both human- and search-engine-readable, and easy to index and make sense of. Never use scripts for your links. Programming and scripting can be fun, but not on the internet.
Search engines index HTML, CSS, and content (multimedia, graphics, videos, text; that's it!); everything else is pointless and annoying to both users and search engines alike. For best results use XML and design a custom language.
Google can crawl, index, and rank JavaScript-generated content.
But... it uses an old Chrome version (41) with an old JavaScript rendering engine.
The consequence is that your JavaScript code needs to work in that old Chrome version and in older browsers. So no fancy ES6 features; you need to use polyfills or a transpiler such as Babel, for example.
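For example, a Babel setup along these lines (a sketch assuming @babel/preset-env and core-js; the option names are Babel 7's) would transpile and polyfill for that rendering engine:

// babel.config.js
module.exports = {
  presets: [
    ['@babel/preset-env', {
      targets: { chrome: '41' },  // match Google's old rendering engine
      useBuiltIns: 'usage',       // inject core-js polyfills where features are used
      corejs: 3
    }]
  ]
};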
Although you can do a lot with JavaScript (like click events or injecting your mobile menu), it's still recommended to use a regular a href link rather than a button with a JavaScript event handler that calls a function to get to a new page.
You can check Google's mobile-friendly testing tool (https://search.google.com/test/mobile-friendly) and review the errors/warnings/logs. If the rendered output looks as intended, Google will be able to see your content.
In Search Console you can also request indexing of the page. Sometimes the JavaScript crawler comes first, sometimes the 'classic' crawler.
Double-check some days afterward by googling a sentence or paragraph from your page.
There's no single answer as to whether one is better or not. Content is content, and Google should rank your website, SPA, PWA, AMP site, PDF document, online doc, wiki page, and so on based on its content, not on the underlying technique.
If you are familiar with JavaScript, give it a go.
Regards, Peter

Javascript based redirect: will it hurt SEO?

I recently implemented a fix to create separate landing pages depending on whether or not the user has javascript enabled. Basically the way it works is this.
The default page is an HTML page with no JavaScript: a basic version of the site. Upon landing on it, a script checks whether JavaScript is enabled and, if so, sends the user to another page. That landing page is generated by routing the user's request through a JSP file that renders the page (header, footer, etc.). The final landing page is http://whatever.com/home.jsp if the user has JavaScript enabled.
My question is whether this will hurt SEO. Considering 99% of the world has JavaScript enabled, I would hate to compromise any SEO benefit to accommodate the 1% who don't enable JavaScript.
Hope that makes sense.
In general, searchbots should be treated as browsers with JS disabled. I think you can now imagine where they'll land.
This whole question is, by the way, completely unrelated to JSP. JSP is just a server-side view technology that provides a template to write HTML/CSS/JS in, with taglibs to control the page flow dynamically and EL to access backend data. All that web browsers and bots see (and thus all that counts for SEO) is its generated HTML output.
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Short version: if your JS sends them to entirely different content, it's probably bad, and Google may give you a hard time. Other than that, you should be good.
If the alternative version is an (almost) full-featured, full-content version, then it's perfectly OK.
Google even advises making alternative versions of Flash-only sites, for example, with regard to usability.
Read the Google FAQ.
You touch on two topics: one is described as "cloaking", the other as "duplicate content". With cloaking, you present different (optimized-with-bad-intent) content based upon identification of the client that accesses it, e.g. by inspecting the User-Agent header (Googlebot versus browser). You are not doing this; you just want to present content in the way that suits your client best, like a redirect to a page optimized for mobile clients ("m.example.com").
The other thing is how to avoid duplicate content. You can do that by indicating the original content source with a canonical tag; see here: http://googlewebmastercentral.blogspot.com/2009/02/specify-your-canonical.html
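For reference, the canonical tag is a single link element in the head of the duplicate page pointing at the preferred URL (the URL here is a placeholder):

<link rel="canonical" href="http://www.example.com/home" />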

URL Redirects for SEO (in Flash)?

I am creating a Flash site and am trying to make it SEO-friendly. I'm thinking a possible solution would be to render HTML to any search engine bot, or to anyone who needs accessibility, and render the Flash site for the rest of the users.
First question is, is this acceptable for google, and SEO in general?
This would mean redirecting Flash users from site.com/home.html to site.com/#/home, but only if they weren't a bot of some sort.
Second question is, is it possible to do this in javascript or rails?
I would do this by capturing the URL and checking who the user is (is it Google, or is it a human?). I'm just not sure how to do this with JavaScript or Rails, whatever is needed. Then, once I found "hey, this is Google", I would return the HTML page; if it was a user, I'd return Flash.
Would that work? Is there something better?
It'd be worth reading up on Google's policies toward cloaking, sneaky Javascript redirects, and doorway pages.
Personally, I'd build the site in HTML and use the Flash for progressive enhancement where appropriate.
It's not doable in JavaScript, because JavaScript is executed after the page is sent, so the damage is already done.
Your web server would have to recognize the Google user agent when the page request is made and serve a different page accordingly. That way you avoid the whole redirect nonsense entirely. I know you can configure most web servers to do that, but I don't know the required steps, and it depends on which web server you are using.
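Purely as an illustration of the idea (and bearing in mind the cloaking warnings in the other answers), a user-agent check might look like this in Node/Express (my stack choice; any web server can be configured similarly, and the page bodies are placeholders):

const express = require('express');
const app = express();

app.get('/home.html', (req, res) => {
  const ua = req.headers['user-agent'] || '';
  if (/Googlebot/i.test(ua)) {
    // Serve the static HTML version to the crawler
    res.send('<html><body><h1>Home</h1><p>Crawlable copy.</p></body></html>');
  } else {
    // Serve the Flash shell to everyone else
    res.send('<html><body><object data="site.swf" type="application/x-shockwave-flash"></object></body></html>');
  }
});

app.listen(3000);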
I'm not going to comment on the merits/demerits of flash based websites.
This is a form of SEO called cloaking that's widely considered unscrupulous (though your intended use doesn't sound malicious to me). It can get you banned by Google.
Have you looked in to using SWFAddress?
The Flash framework Gaia uses separate XHTML pages for its SEO solution. From its site:
"The Search Engine Optimization Scaffolding engine in Gaia creates an XHTML file for every page you specify in the site.xml, as well as a sitemap.xml file that follows sitemaps.org protocols.
The purpose of SEO Scaffolding is to provide search engines and non-Flash users with easy access to the content on your site, as well a convenient single data source for the copy on your site, organized by page.
This technique is white hat compliant, and is discussed on the Gaia forums."
More information here: http://www.gaiaflashframework.com/wiki/index.php?title=SEO
