Here is my case: I use an external source to load HTML data into my page, and then I put the content of this HTML into a div. So, as soon as the page has loaded and the AJAX call has finished, I see the results.
It works OK, but now I have realised that this dynamically loaded content is not crawlable by the Google bot, and that is something I don't like :)
Is there any way to tell the Google bot that the page actually contains the content of that external page?
For example, if I loaded a page from http://external.com/test.htm into a div, could I use something like
<div id="dynamic"></div>
?
I hope you understand my question; if not, please leave a comment!
Thanks!
You might be interested in checking out the following document directly from Google, for a few concrete tips:
Google Code: Making AJAX Applications Crawlable
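Roughly, the scheme that document describes (sketched here with placeholder URLs and IDs of my own): you expose each AJAX state either as a #! URL or by adding a fragment meta tag, and when the crawler requests the corresponding _escaped_fragment_ URL your server returns an HTML snapshot that already contains the loaded content. A minimal sketch:
<!-- In the <head>: opt the page into the AJAX crawling scheme -->
<meta name="fragment" content="!">
<div id="dynamic"></div>
<script>
// Client side, as before: pull the external HTML into the div.
// A crawler that supports the scheme will instead request
//   http://www.example.com/?_escaped_fragment_=
// and expect your server to answer with an HTML snapshot that
// already contains the content of test.htm.
$(function () {
  // external.com can't be loaded cross-domain directly, so this assumes
  // a small same-origin proxy script of your own:
  $('#dynamic').load('/proxy.php?url=http://external.com/test.htm');
});
</script>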
Related
I'm experiencing some issues regarding an AJAX script I'm working on.
The page loads perfectly well, and all the scripts needed for basic page functionality are loaded the same way inside the AJAX script, but after a hash change jQuery behaves awkwardly.
Let's take this example.
The custom jQuery script writes an inline CSS property for a specific DIV at page load:
Now, I load the login page for example:
When I get back to the main page, the inline style disappears, and the basic functionality that was loaded stops working after the AJAX call:
Any experience with this? Does anyone have a clue why this happens, or even a guess? It seems as if the script unloads on page/hash change, which I don't believe, or enters a double loop and therefore doubles the classes in the HTML. I don't get it.
I have already searched a lot and gone through the code, and it is fine, because it works fine alongside plain HTML. I would appreciate some thoughts on this matter.
Thanks!
We've run into a problem here. We have a website which shows a header, content and a footer. Now, when clicking on a link, we want only a part of the site (the content part) to be refreshed, to save some traffic. The header and footer should stay as they are. We tried to use the following script for that:
http://www.dynamicdrive.com/dynamicindex17/ajaxcontent.htm
Well, the script is not doing great on our site: no other JS code gets executed at all, and the GET parameters are not displayed in the URL either. Everything else works with that script.
Do you know of any alternatives to this script?
Regards,
I agree, Ajax is the way.
You can include jQuery.
Have a look at these links.
http://api.jquery.com/jQuery.ajax/
In particular, the jQuery load() function:
$('#result').load('ajax/test.html');
a little jsFiddle
http://jsfiddle.net/james_nicholson/c6dpn/5/
Please note: $("#content").load("a/html/page/that/should/be/on/your/server");
.load() needs a page on the same server; PHP and/or HTML will do.
http://api.jquery.com/load/
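As a rough sketch of how that fits the header/content/footer layout you describe (the element IDs and the nav class are placeholders I made up):
<div id="header">...</div>
<div id="content">initial content rendered by the server</div>
<div id="footer">...</div>
<script>
$(function () {
  // Intercept clicks on navigation links and swap only #content,
  // leaving the header and footer untouched.
  $('a.nav').on('click', function (e) {
    e.preventDefault();
    $('#content').load(this.href); // the linked page must live on your own server
  });
});
</script>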
When you say 'no other JS code gets executed at all', do you mean at any time from when the page loads initially, or only when the content part is refreshed?
For dynamic adjustment in JavaScript I would recommend http://angularjs.org/ or http://knockoutjs.com/. Try them out and work through the tutorials, and you will see how easy it is to change everything without refreshing.
I'd say you only need AJAX if you fetch some data from a database. Otherwise it's enough to work with AngularJS or KnockoutJS. Have a look at them!
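For example, a minimal KnockoutJS sketch (the names are made up and this is only meant to show the idea): the DOM updates automatically whenever the observable changes, without any refresh.
<p>Current page: <span data-bind="text: currentPage"></span></p>
<button data-bind="click: showAbout">Show about</button>
<script src="knockout.js"></script>
<script>
  var viewModel = {
    currentPage: ko.observable('home')
  };
  viewModel.showAbout = function () {
    viewModel.currentPage('about'); // writing to the observable updates the DOM
  };
  ko.applyBindings(viewModel);
</script>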
Regards
According to the jQuery documentation for $.ajax, "included script tags are evaluated when inserted in the DOM." When I use AJAX to grab the content of a dialog box, which in turn contains a script tag for displaying a reCAPTCHA box, the reCAPTCHA box does not appear when the content is added to the DOM. According to Firebug, the script tag is also missing from the added content. Navigating directly to the dialog content displays the reCAPTCHA just fine.
Does anyone know why this may be occurring and/or know a workaround? Any help would be greatly appreciated.
Code in action can be viewed here:
https://dustinhendricks.com/
Then click "Register Now".
Does the script try to do things like document.write()? That won't work when loaded dynamically, only on the initial page load.
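For illustration (a contrived snippet, not taken from your code): document.write() appends to the page while it is still being parsed, but called after the page has finished loading it implicitly reopens the document and throws away what is already there.
window.onload = function () {
  // During initial parsing this would simply append a paragraph.
  // After the load event it implicitly calls document.open() first,
  // so the whole page is replaced by just this paragraph.
  document.write('<p>everything else is gone now</p>');
};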
In general, scripts contained in HTML that is inserted via innerHTML are not evaluated consistently across browsers. To work around that, jQuery actually looks for script tags and executes them manually.
But that doesn't really matter. Even if jQuery didn't do that, the script would be running in a different type of context than it normally is when a page is loading for the first time. It's not really an 'inline script' anymore, and a lot of 3rd party scripts were written with it being 'inline' as an assumption. You'll have to figure out what the script is doing and find a way to call it with dynamic content.
Take a look at the Google reCAPTCHA AJAX API. Using the CAPTCHA this way should fix your problem.
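If I remember the (older) reCAPTCHA AJAX API correctly, it looks roughly like this; the key, element ID and theme are placeholders:
<!-- Load the reCAPTCHA AJAX API once in the main page, not in the ajaxed dialog -->
<script src="http://www.google.com/recaptcha/api/js/recaptcha_ajax.js" type="text/javascript"></script>
<div id="recaptcha_container"></div>
<script>
  // Call this after the dialog content has been inserted into the DOM:
  Recaptcha.create('YOUR_PUBLIC_KEY', 'recaptcha_container', {
    theme: 'red',
    callback: Recaptcha.focus_response_field
  });
</script>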
I hope it helps!
First of all: I hope the following question is not too generic.
I have a small problem, I can't think of a good solution, and I was hoping someone here is able to help me.
This is my situation:
I am using AJAX to dynamically load pages. My main site is index.php, and once I click on a navigation link, the AJAX script replaces the content of index.php with new content and adds a hash to the URL. For instance:
If I click on the link to about.php, the script adds #about.php to the URL and loads the content from about.php into index.php. It works great :) However, there is a small issue that I would like to resolve:
Let's say we start by navigating to index.php#about.php directly. This means the content of index.php is visible for 2-3 seconds and then gets replaced with the content from about.php, and I would like to avoid that.
I came up with a few ideas, but none of them are really great:
1) Hide the content -> then make the AJAX call -> show the content again once the AJAX call has completed
Downside: The content is still visible for a second.
2) Hide the content with CSS and show it after the AJAX call
Downside: This would work perfectly, but users without JavaScript (and Googlebot) will only see an empty page.
3) Use an empty index.php, put its content in main.php, and automatically load main.php via AJAX on page load.
Downside: This would work too, but again, users without JS and Googlebot will just see an empty page when they visit index.php.
That's all I can think of, and none of the three solutions are good, because I am worried the SEO value will decrease dramatically if I have an empty index.php (I could accept that users without JS get nothing to see).
P.S. I read somewhere that if you have display:none in an external CSS file and block that file with robots.txt, Googlebot won't know the difference, but I am worried that may not be the case. Does anyone have experience with this?
Edit: I guess my whole question comes down to this:
Do you think hiding the whole content of index.php with CSS (and then showing it with JS) will be a huge no-go for SEO, or will it be okay with Googlebot (after all, the content is still in the source, just not visible to the user)?
If you used query strings instead of the hash you could have index.php load the correct content at the server level.
A plugin like history.js can help you push URLs to the browser so that you still get your AJAX browsing.
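A rough sketch of the combination, assuming links like <a class="nav" href="index.php?page=about"> (the class, parameter name and #content ID are my own placeholders): without JavaScript (and for Googlebot), index.php?page=about renders the right content on the server; with JavaScript you fetch just the content part and push the same URL so the address bar stays in sync.
$('a.nav').on('click', function (e) {
  e.preventDefault();
  var url = this.href;
  $('#content').load(url + ' #content > *'); // fetch the page, keep only its content part
  history.pushState(null, '', url);          // history.js wraps this with fallbacks for older browsers
});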
Wow, where to start... First of all, the page 'blink', as I'll call it, is 2-3 seconds for you, but it depends entirely on the user's computer, how fast it executes the JavaScript, and how fast the AJAX call returns, so the delay could be much larger.
Second, I wouldn't worry about Googlebot seeing any of the AJAX content. While it's true that Googlebot does try to fiddle with some JavaScript, it won't make the AJAX call the way a normal browser would. I'd be very surprised if Googlebot ever saw any of your AJAX-loaded data.
Googlebot does a fantastic job of figuring out what content is delivered to a user via HTML/CSS when they visit your page. It also figures out whether something is displayed or not, and does a good job of deciding whether that content is just stuffing or something that really matters.
You're worried about what someone without JavaScript will see, even though navigating your site relies entirely on JavaScript. These two concerns don't seem to reconcile.
You've got PHP available. My suggestion is to forgo the AJAX stuff you're trying to do and do it in PHP. You can just as easily script the same behavior in PHP as you can in AJAX.
SEO NOTES:
If you're looking for solid SEO results, I suggest making the static (non-JavaScript) page as SEO-friendly as possible. I like to 'pick the low-hanging fruit', like making sure the page has one and only one H1 and that it contains the most important keywords. SEOmoz is one of the best sites I've found for SEO information.
I have two HTML pages (main and details): the main page consists of a table and an empty div. When the user clicks on a table row, the empty div is filled via AJAX from the other page (the details page).
On the details page I want to load a Google Map. I would also like that page to work by itself (standalone), not just via AJAX.
So here is my problem:
To use Google Maps I have to include this script in the head of the HTML:
<script src="http://maps.google.com/maps/api/js?sensor=false" type="text/javascript"></script>
If I include it in the details page, it works fine standalone, but it doesn't work when I load it via AJAX from the main page: the request to the Google server hangs and never completes.
On the other hand, if I include it in the main page, AJAX works fine, but the details page is not operational on its own, since it's missing a vital include.
I'd really like to keep it in the details page, since it makes much more sense to have it there. Is there any way I can load the script into the main page from the details page?
In general, what is the best approach to including JavaScript when using AJAX? Keep everything in the main page? Or is there a mechanism to load everything into the main page but keep the code in the AJAXed pages?
By the way, I'm using jQuery, but that is not really important. This is a design issue, not a library problem.
Since you are not using an iframe, it is best to include the JavaScript in the main page rather than the details page; that way the JS will work, and the script is loaded only once. I think this will fix the AJAX issue for you.
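If you still want the details page to work standalone as well, one possible compromise (just a sketch, with a made-up map container ID) is to have both pages check whether the Maps API is already present and load it on demand through its callback parameter; the usual synchronous include relies on document.write internally, which is likely why it hangs when injected after page load.
function initMap() {
  new google.maps.Map(document.getElementById('map'), {
    center: new google.maps.LatLng(0, 0),
    zoom: 2,
    mapTypeId: google.maps.MapTypeId.ROADMAP
  });
}
function ensureMapsLoaded() {
  if (window.google && google.maps) {
    initMap(); // API already included, e.g. by the main page
  } else {
    // Load the API asynchronously; it calls initMap() once it is ready.
    var s = document.createElement('script');
    s.src = 'http://maps.google.com/maps/api/js?sensor=false&callback=initMap';
    document.body.appendChild(s);
  }
}
Call ensureMapsLoaded() both when the details page runs standalone and right after the main page has inserted the details markup via AJAX.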