Fully JavaScript-generated site SEO - javascript

The index.html includes only a div; all of the HTML is generated by JavaScript.
I know that one of the ways is to redirect search bots to another HTML page. I read this in an old post and I want to know if it is the best way, plus one or two tips for doing it (not how to redirect).
The site is built in Tumult Hype, so I can't place content in the HTML myself.

If you redirect the search bot to a different document, this is considered cloaking and may harm your ranking in Google.
Yes, Google is able to execute JS, but you should not dynamically generate the content on your site; it will hurt your rankings. One could use some kind of prerender.
Edit: of course you can dynamically generate content, but the main content should not depend on dynamic JavaScript.
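To make "some kind of prerender" concrete, here is a minimal dynamic-rendering sketch in Node/Express: known crawlers receive a pre-built HTML snapshot, while everyone else gets the normal JavaScript app. The bot pattern and the snapshots directory are assumptions for illustration only; as long as the snapshot matches what users see, Google documents this pattern as dynamic rendering rather than cloaking.

// Dynamic-rendering sketch: serve pre-rendered HTML to known crawlers,
// and the normal JavaScript app to everyone else. The bot list and the
// ./snapshots directory are assumptions, not a production setup.
const express = require('express');
const path = require('path');
const app = express();

const BOT_PATTERN = /googlebot|bingbot|facebookexternalhit|twitterbot/i;

app.use((req, res, next) => {
  if (req.method === 'GET' && BOT_PATTERN.test(req.headers['user-agent'] || '')) {
    // Hypothetical: one static snapshot per route, built ahead of time.
    const page = req.path === '/' ? 'index' : req.path.slice(1);
    return res.sendFile(path.join(__dirname, 'snapshots', page + '.html'));
  }
  next(); // regular visitors fall through to the JavaScript app
});

app.use(express.static(path.join(__dirname, 'public')));
app.listen(3000);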

Related

Should JS dynamically generate metadata/the whole page?

So I am going to have many pages that contain a bunch of text, which a JS and CSS file will convert into a fully styled web page. The text is usually going to be long, and since there are going to be many pages, I want to keep file sizes down. Since I also don't want to ruin quality, I have decided that my JS file will take the text and build a web page out of it. Side note: I am making tutorial pages, so I am going to use JS to generate the parts that appear on every tutorial page, like the lesson list, to lower file size.
I have noticed that the metadata (<head> content) usually takes up space that JS could generate instead, so I thought: why don't I just generate it with JS? But then the problem arose that some browsers might not parse it, or might be slow to parse it. So I am asking here on Stack Overflow:
Should JavaScript generate the metadata (and maybe almost the whole page, e.g. remove the <head> tag completely and generate it with JS)?
It depends on your desired result.
Google has improved its SEO mechanisms to render your page before indexing it; see here:
https://developers.google.com/search/docs/guides/javascript-seo-basics
However, other bots may not do the same, such as social media crawlers like Facebook's or Twitter's that read Open Graph meta tags, or other search engines like Baidu.
If a bot doesn't render your document, the JavaScript doesn't get executed and your meta tags aren't present.
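To illustrate: meta tags created like the following exist only after the script runs, so a crawler that reads the raw HTML without rendering never sees them (the title value is just a placeholder).

// These tags only exist in the DOM after the script executes; a crawler
// that reads the raw HTML without rendering it never sees them.
const ogTitle = document.createElement('meta');
ogTitle.setAttribute('property', 'og:title');
ogTitle.setAttribute('content', 'My Tutorial Page'); // placeholder value
document.head.appendChild(ogTitle);

document.title = 'My Tutorial Page'; // same caveat applies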
Additionally, if your initial document does not reference the stylesheets and other external resources up front, it takes a bit longer for the client. Imagine the process:
With head
fetch document
fetch resources
render content
Without head
fetch document
render content
fetch resources
re-render
That's way over-simplified but it demonstrates my point.
Alternative:
If your content is that dynamic, you might consider Server-Side Rendering (SSR) or pre-rendering.
You would build your pages programmatically and store/serve them all, or build them on the server side as they are requested.
https://developers.google.com/web/updates/2019/02/rendering-on-the-web
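As a rough illustration of the pre-rendering option, here is a minimal build script in Node that writes one complete HTML file per page from shared data, so crawlers receive finished HTML without running any JavaScript. The lesson data and output paths are made up.

// Build-time pre-rendering sketch: generate one complete HTML file per
// page from shared data instead of assembling pages in the browser.
// The lesson data and the dist/ output directory are hypothetical.
const fs = require('fs');

const lessons = [
  { slug: 'intro', title: 'Introduction', body: 'Welcome to the course.' },
  { slug: 'setup', title: 'Setup', body: 'Install the tools.' },
];

// The shared lesson list is generated once here, not by client-side JS.
const nav = lessons
  .map(l => '<li><a href="/' + l.slug + '.html">' + l.title + '</a></li>')
  .join('');

fs.mkdirSync('dist', { recursive: true });
for (const lesson of lessons) {
  const html = '<!DOCTYPE html><html><head><meta charset="utf-8">' +
    '<title>' + lesson.title + '</title>' +
    '<meta property="og:title" content="' + lesson.title + '">' +
    '</head><body><ul>' + nav + '</ul>' +
    '<article>' + lesson.body + '</article></body></html>';
  fs.writeFileSync('dist/' + lesson.slug + '.html', html);
}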

Dynamic HTML content pages like Dropbox and SoundCloud

Check out the source code of Dropbox's main page or any SoundCloud page. You can see they've got a lot of scripts going on, and little pure HTML content (article, main, p, div). I've been searching, and it seems that way of generating pages is called dynamic content/HTML (correct me if I'm wrong).
So, the function I think it has is to let you edit multiple separate external files in JavaScript (if that's the language used, since they're scripts) so that the HTML documents they're linked to are generated dynamically.
Also, another possible function would be to have one external document, let's say a navigation bar, that you place in multiple pages; when you have to update it, you just edit the external document and not each page (hooray!).
Questions:
Is it actually called dynamic content?
What languages does it require besides HTML, CSS, and JS? Like PHP or ASP (supposing any is necessary at all).
Does creating pages that way affect your website's positioning in Google negatively or positively? Since I think when Googlebot reaches the page, all it sees are scripts.
There are two subtly different definitions of the word dynamic, which may be confusing your search for information about this. I'll answer your questions separately for each.
Dynamic as in "generated from content held in a database"
For example, on this page your reputation score was fetched from Stack Overflow's database and injected into the HTML.
Yes, this would be referred to as dynamic content. In contrast to static content, which would just be fixed files, dynamic content would be built up from its parts for each user who requests it.
Your second set of languages (PHP, etc.) are what read from the database and spit out the corresponding HTML.
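As a sketch of this first kind of dynamic content, here is the same idea in Node/Express rather than PHP; the reputation lookup is a hypothetical stand-in for a real database query.

// First sense of "dynamic": the server reads a value from a database
// and injects it into the HTML before it reaches the browser.
const express = require('express');
const app = express();

// Hypothetical stand-in for a real database query.
async function getReputation(userId) {
  return 1234;
}

app.get('/user/:id', async (req, res) => {
  const rep = await getReputation(req.params.id);
  // Crawlers and browsers both receive finished HTML.
  res.send('<html><body><p>Reputation: ' + rep + '</p></body></html>');
});

app.listen(3000);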
Google's bot is smart: it can render pages and will see similar content to what you get in a browser. So generating pages dynamically instead of statically won't count against the site for SEO; dynamically generating lots of pages that are very similar might count against it though.
Dynamic as in "page content that updates without you having to refresh the whole page"
For example, as you wrote your question Stack Overflow tried to find similar questions and show them to you in case it had already been asked. JavaScript was sending a request to their server and updating part of the page in response.
This would also be referred to as dynamic content. The key difference is that it's JavaScript in the page that's making further calls to the server to fetch more content, which is what you're seeing on the minimalist sites you mention. This used to be called dynamic HTML (DHTML); more modern references are more likely to discuss it in terms of AJAX or "single page website".
Typically you'd have PHP or similar running on the web server, responding to the requests for content.
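The browser half of this second kind of dynamic content is a small script that fetches more content and patches it into the page. A sketch, with a made-up /similar-questions endpoint:

// Second sense of "dynamic": JavaScript in the already-loaded page asks
// the server for more content and swaps it in without a full reload.
// The /similar-questions endpoint and #similar element are hypothetical.
async function loadSimilarQuestions(title) {
  const res = await fetch('/similar-questions?q=' + encodeURIComponent(title));
  const items = await res.json(); // e.g. [{ url: '...', title: '...' }]
  document.querySelector('#similar').innerHTML = items
    .map(q => '<li><a href="' + q.url + '">' + q.title + '</a></li>')
    .join('');
}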
Again, Google's bot is smart enough to cope with this. That won't necessarily be the case for all search engines though.

Is there any possible way to get meta tags (Open Graph) from server-side PHP into an AngularJS SPA?

I have been searching for this for a long time but haven't been able to figure it out. I have a ready website built in AngularJS using all the best practices, and the server side is PHP CI.
Now what I am supposed to do is get the Open Graph meta tags into the head section.
I could easily manage it using jQuery, of course, but the problem arises when the Facebook scraper crawls the page.
As it is a single-page application, there is going to be only one head, hence it's not possible for me to put the tags in any other HTML anywhere. And as the shell is plain HTML, I cannot let PHP render the page.
I have tried to search for the answer and ultimately arrived at
http://www.michaelbromley.co.uk/blog/171/enable-rich-social-sharing-in-your-angularjs-app
But this is not possible for me to use.
I also read about the Facebook Open Graph pointer:
<link rel='opengraph' href='DESTINATION URL'>
But it says that all the basic tags need to be present in the source, and only the additional tags can be obtained from the destination URL.
Is there any way I can solve this problem?
Here is the easiest way beyond sharesocial.in
Follow this link
http://sharelinkgenerator.com/
You will get your work done here.
http://www.sharesocial.in directly allows us to do it. It allows you to do it for any page, any website, and for Facebook, LinkedIn, Google+ and WhatsApp.
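For completeness, the technique in the article linked in the question boils down to filling in the meta tags on the server before the SPA shell is delivered. A rough sketch of that idea in Node/Express follows; the <!--OG_TAGS--> placeholder and the getPageMeta() lookup are hypothetical, and the same pattern can be implemented in PHP CI.

// Server-side Open Graph injection for a SPA: replace a placeholder in
// index.html with real meta tags before sending it, so the Facebook
// scraper sees them without executing any JavaScript.
// The <!--OG_TAGS--> placeholder and getPageMeta() are hypothetical.
const express = require('express');
const fs = require('fs');
const app = express();

function getPageMeta(path) {
  // Stand-in for a real lookup (database, CMS, routing table, ...).
  return { title: 'Page title', description: 'Page description' };
}

// In a real app, serve static assets before this catch-all.
app.use((req, res) => {
  const meta = getPageMeta(req.path);
  const html = fs.readFileSync('index.html', 'utf8').replace(
    '<!--OG_TAGS-->',
    '<meta property="og:title" content="' + meta.title + '">' +
    '<meta property="og:description" content="' + meta.description + '">'
  );
  res.send(html);
});

app.listen(3000);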

Is there an option to tell crawlers / bots: "don't use JavaScript"?

I was searching for this subject with no results, so I decided to ask a question. I know that there is an option to make pages loaded by AJAX "crawlable", using www.example.com/#!somecontent. But is there an option (i.e. a meta tag or a robots.txt directive) that says: "Hey, robots, disable JavaScript!"?
It could be used, for example:
1) by online JavaScript games, which have a huge amount of JavaScript and nothing special for SEO and bots to crawl (saving bots memory and time)
2) to build a site using PHP, HTML, CSS (with meta tag changes, etc.) for robots, and then add some extra functionality via AJAX (for example, reloading only the content but NOT CHANGING META TAGS) that crawlers and bots don't need to analyze. In that case bots see the meta tags and the content; you prevent the default actions for anchors, users only reload content via AJAX, and the meta tags stay standard.
PS) The question isn't "you can do it better" or "you can rebuild the application another way". It is: can we suggest that bots disable JavaScript?
The quick answer is no, you cannot.
The long answer is: if you have more than one version of your content, use a canonical link element. On the version that uses JavaScript, point the canonical link to a page that works without JavaScript. You can append a parameter like no_js=1 to the URL and, on the server side, remove all the parts of your HTML that rely on JavaScript.
So, yes, with some work you can do it.
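A minimal sketch of that suggestion in Node/Express; the no_js=1 parameter follows the convention described above, and the page markup is made up.

// Sketch: the JavaScript version of the page declares a canonical link
// to a JS-free version; with no_js=1 the server strips the parts of the
// HTML that depend on JavaScript. The markup here is illustrative only.
const express = require('express');
const app = express();

app.get('/page', (req, res) => {
  const noJs = req.query.no_js === '1';
  const scripts = noJs ? '' : '<script src="/app.js"></script>';
  res.send('<!DOCTYPE html><html><head>' +
    '<title>Example page</title>' +
    '<link rel="canonical" href="/page?no_js=1">' +
    scripts +
    '</head><body>' +
    '<div id="content">Content both versions share.</div>' +
    '</body></html>');
});

app.listen(3000);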

Newbie question about JavaScript embed code?

I am a JavaScript newbie. I am trying to write a requirements document and need some help describing what I am looking for. We want our application to generate a JavaScript snippet like this:
<script src="http://www.jotform.com/jsform/10511502633"></script>
This will load a web form.
So my question is:
- How does a single script load an entire web form? Is this JSON?
- What is this called? Is it cross-browser JavaScript?
- Can anyone point me in the direction of learning more about what this is?
Thank you for your help!
The JavaScript file is just hosted on an external site. It appears to be dynamically generated, so feel free to use some fancy words ;) But basically, you just include it, as if it were on your own site.
You could say: "The application will generate the required script tags to include a dynamically generated JavaScript file from an external, third-party site."
Of course, you need to take special precautions for cases when the include won't work because the other site is not reachable (the site is down, DNS does not resolve, the file has moved to another web server, your application is on an intranet/behind a proxy or firewall...). Why not copy their file and mirror it locally? Or use a reliable content delivery network, like Google's or Amazon's.
There are many names for this type of inclusion, the most common being widget.
What does it actually do (a sketch follows below):
take an ID of some sort as a parameter
use the ID to fetch some specific data (most likely from a database)
generate some JS and HTML based on the ID/data
usually this involves iframes of some sort
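Here is the promised sketch of roughly what such a hosted script might contain; the embed URL is made up and JotForm's real script will differ.

// Rough sketch of a hosted embed script: it runs at the point where the
// <script> tag sits and injects an iframe there. The embed URL is made
// up; the real JotForm script will differ.
(function () {
  var formId = '10511502633'; // baked into the generated script's URL
  var iframe = document.createElement('iframe');
  iframe.src = 'https://forms.example.com/embed/' + formId;
  iframe.style.border = '0';
  iframe.width = '100%';
  iframe.height = '500';
  // document.currentScript is the <script> element being executed, so
  // the form appears exactly where the snippet was pasted.
  document.currentScript.parentNode.insertBefore(iframe, document.currentScript);
})();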
Using a script rather than a plain HTML iframe has multiple advantages:
you can change what is actually delivered to the users' browsers without changing the include
you can resize the iframe to fit certain predefined sizes
you can inject the necessary things into the including page (of course you need to make sure this is sanctioned)
We use this all the time and we have never regretted it.
If you don't want to build the widget infrastructure yourself you can always use one of the widget providers like widgetbox:
http://www.widgetbox.com/widgets/make/
With those you are up and running in no time.
This is typically called a script include.
Google has lots of these types of items, and even they call them by many names:
widgets, custom JavaScript, snippets, custom code, etc. It really depends on who you are writing for... I would go with "cross-platform embeddable JavaScript code", meaning that it would need to load all its dependencies. Also specify which browsers need to be supported and what should happen if the user has JavaScript turned off.
EDIT:
Actually, since we are talking about unique IDs, you will probably need two parts: the user/site-unique "cross-platform embeddable JavaScript code" and whatever server-side code supports it. Basically this is an API that is accessed using your own JavaScript widget. Feel free to point to examples in your requirements document; programmers love examples.
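The server-side half could be as small as an endpoint that looks up the form by ID and emits the embed script with the right data baked in. A hypothetical sketch in Node/Express:

// Hypothetical server-side half: generate the embed script per form ID.
const express = require('express');
const app = express();

app.get('/jsform/:id', (req, res) => {
  const height = 500; // stand-in for a database lookup of form settings
  res.type('application/javascript').send(
    "(function () {" +
    "  var f = document.createElement('iframe');" +
    "  f.src = '/embed/" + req.params.id + "';" +
    "  f.width = '100%'; f.height = '" + height + "';" +
    "  document.currentScript.parentNode.insertBefore(f, document.currentScript);" +
    "})();"
  );
});

app.listen(3000);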
