JavaScript linking to other pages

We're running into a problem here. We have a website with a header, a content area and a footer. When a link is clicked we want only the content part of the site to be refreshed, to save some traffic; the header and footer should stay as they are. We tried to use the following script for that:
http://www.dynamicdrive.com/dynamicindex17/ajaxcontent.htm
Unfortunately the script is not doing great on our site: no other JavaScript code gets executed at all, and the GET parameters are no longer shown in the URL. Everything else works with that script.
Do you know of any alternatives to this script?
Regards,

I agree, AJAX is the way.
You can include jQuery.
Have a look at these links:
http://api.jquery.com/jQuery.ajax/
In particular, the jQuery load() function:
$('#result').load('ajax/test.html');
A little jsFiddle:
http://jsfiddle.net/james_nicholson/c6dpn/5/
Please note: $("#content").load("a/html/page/that/should/be/on/your/server");
.load() needs a page on your own server (same origin); PHP or HTML will do.
http://api.jquery.com/load/
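As a rough sketch of how that could be wired up for the header/content/footer layout described above (the selector, class name and file names here are placeholders, not actual code from the site):
$(document).ready(function () {
  // assumed markup: a <div id="content"> plus navigation links such as <a class="nav" href="about.php">
  $('a.nav').on('click', function (e) {
    e.preventDefault();              // stop the full page reload
    $('#content').load(this.href);   // refresh only the content area; header and footer stay untouched
  });
});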

When you say 'Any other JS-Code does not get executed at all', do you mean at any time from when the page loads initially, or only when the content part is refreshed?

For dynamic updates in JavaScript I would recommend http://angularjs.org/ or http://knockoutjs.com/. Try them out and work through the tutorials, and you will see how easy it is to change everything without refreshing.
I'd say you only need AJAX if you fetch data from a database. Otherwise it's enough to work with AngularJS or KnockoutJS. Have a look at them!
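If you go the AngularJS route, here is only a minimal sketch of using ng-include to pull a partial into the page without a full refresh; the file name is a placeholder, and the partial has to live on the same origin:
<!-- assumes angular.js is already included on the page -->
<div ng-app="">
  <!-- header markup stays as it is -->
  <div ng-include="'partials/about.html'"></div>
  <!-- footer markup stays as it is -->
</div>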
Regards

Related

Elements of the jQuery scripts work when going back to a page on the browser but not on refresh

I am using the Bootstrap script, MixItUp filtering and Owl Carousel on a SharePoint page. For example, on the Owl Carousel the data shows up, but the actual carousel functionality does not work. If I leave the page and then navigate back, everything works; that is the only time it works. When I refresh or reload, it goes back to the original state.
With the MixItUp filtering a similar issue happens: the data does not load initially like it should, but when I leave the page and go back, it loads and functions properly.
Any ideas? Is it something with caching and the loading order of the scripts? Why would it work only when going back?
Try to load the scripts in this order at the end of your body:
jQuery,
Bootstrap,
Owl Carousel,
then your custom script that initialises your own functions, roughly as sketched below.
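For illustration only (the file names are placeholders for whatever builds you actually include):
<script src="js/jquery.min.js"></script>
<script src="js/bootstrap.min.js"></script>
<script src="js/owl.carousel.min.js"></script>
<script src="js/my-init.js"></script> <!-- your own initialisation code goes last -->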
After weeks of searching, and after deciding to post the question, I found my answer, so I wanted to make sure I shared it.
I found a post from Chris Coyier on how to run JavaScript only after the entire page has loaded... THAT WAS IT! Once I placed my script inside it, my issue was fixed.
$(window).bind("load", function() {
// code here
});

Recall external JS files after page transition finishes

I have tried to implement this kind of script (page transition): here
Everything works fine, as in the demo provided, but there is one problem I can't figure out:
I have two HTML files, index.html and index2.html. On index.html I put the link with the page transition effect; after it is clicked it goes to index2.html. On index2.html I put an alert script using the body onload method.
Normally the alert would appear, as we usually see when debugging, but nothing appears at all. It seems that no script is loaded after the page transition is done.
Can somebody give me a clue how to solve this? What I have tried is:
location.reload(); window.location.reload(); etc., until I ran out of ideas to fix this :(
*location.reload() works in desktop browsers but doesn't work on mobile, and my priority target is mobile browsers.
Please help, and many thanks.
For demo purposes: here
It wasn't executed because the page never really loaded. The way that page transition script works is by loading the content of the target page via AJAX and replacing the entire content of the current page with it.
From the page you provided, it seems the script accepts a callback function that is called when the new page finishes loading; you can put your 'loaded' script there. But keep in mind that what is being executed is still the script on the first page.
I don't know what you are trying to build, but I guess it would be better for you to look into a proper single page app with URL routing. Frameworks like Backbone.js can help you with this.
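Since the exact callback option depends on that particular transition plugin, here is only a generic sketch of the idea using jQuery's load() with a completion callback; the selector and file name are placeholders:
// run code only once the new content has actually been pulled in via AJAX
$('#page').load('index2.html', function () {
  // whatever index2.html used to do in its body onload goes here instead
  alert('index2 content loaded');
});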

Script tags contained in jQuery AJAX response not evaluated when inserted into the DOM?

According to the jQuery documentation for $.ajax, "included script tags are evaluated when inserted in the DOM." When I use AJAX to grab the content of a dialog box, which in turn contains a script tag for displaying a reCAPTCHA box, the reCAPTCHA box does not appear when the content is added to the DOM. According to Firebug, the script tag is also missing from the added content. Navigating directly to the dialog content displays the reCAPTCHA just fine.
Does anyone know why this may be occurring and/or know a workaround? Any help would be greatly appreciated.
Code in action can be viewed here:
https://dustinhendricks.com/
Then click "Register Now".
Does the script try to do things like document.write()? That won't work when it is loaded dynamically, only during the initial page load.
In general, scripts contained in HTML that is inserted via innerHTML are not evaluated consistently across browsers. To work around that, jQuery actually looks for script tags and executes them manually.
But that doesn't really matter here. Even if jQuery didn't do that, the script would be running in a different kind of context than it normally is while a page is loading for the first time. It's not really an 'inline script' anymore, and a lot of third-party scripts were written with being 'inline' as an assumption. You'll have to figure out what the script is doing and find a way to call it with dynamic content.
Take a look at the Google reCAPTCHA AJAX API. Using the CAPTCHA this way should fix your problem.
I hope it helps!
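The reCAPTCHA API has gone through several versions, so purely as a sketch of the idea using the current explicit-render call; the container id and site key below are placeholders:
// assumes the reCAPTCHA api.js script was loaded with render=explicit
// and that the dialog HTML you inserted contains <div id="recaptcha-box"></div>
grecaptcha.render('recaptcha-box', {
  sitekey: 'YOUR_SITE_KEY'  // placeholder
});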

jQuery + AJAX + Help Hiding This Content

First of all: I hope the following question is not too generic.
I have a small problem and I can't think of a good solution, so I was hoping someone here can help me.
This is my situation:
I am using AJAX to dynamically load pages. My main site is index.php, and once I click on a navigation link, the AJAX script replaces the content of index.php with new content and adds a hash tag to the URL. For instance:
I click on the link to about.php, the script adds #about.php to the URL and loads the content from about.php into index.php. It works great :) However, there is a small issue that I would like to resolve:
Let's say we start by navigating to index.php#about.php directly. This means the content of index.php is visible for 2-3 seconds and then gets replaced with the content from about.php, and I would like to avoid that.
I came up with a few ideas, but none of them are really great:
1) Hide the content -> then make the AJAX call -> when the AJAX call completes, show the content again.
Downside: the content is still visible for a second.
2) Hide the content with CSS and show it after the AJAX call.
Downside: this would work perfectly, but users without JavaScript (and Googlebot) will only see an empty page.
3) Use an empty index.php, put its content in main.php and automatically load main.php via AJAX on page load.
Downside: would work too, but again, users without JS and Googlebot will just see an empty page when they visit index.php.
That's all I can think of, and none of the three solutions are good, because I am worried the SEO value will decrease dramatically when I have an empty index.php (I could accept that users with no JS get nothing to see).
P.S. I read somewhere that when you have display:none in an external CSS file and block that file with robots.txt, Googlebot won't know the difference, but I am worried that's maybe not the case? Has anyone got some experience with this?
Edit: I guess my whole question comes down to this:
Do you think hiding the whole content of index.php with CSS (and then showing it with JS) will be a huge no-go for SEO, or will it be okay with Googlebot (after all, the content is still in the source, just not visible to the user)?
If you used query strings instead of the hash, you could have index.php load the correct content at the server level.
A plugin like history.js can help you push URLs to the browser so that you still get your AJAX browsing.
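As a rough sketch of that idea with the native History API (history.js wraps the same thing for older browsers); the selectors and URLs are placeholders:
// load the target page into the content area and give the browser a real URL
$('a.nav').on('click', function (e) {
  e.preventDefault();
  var url = this.href;                      // e.g. /about.php
  $('#content').load(url);
  history.pushState({ url: url }, '', url); // address bar shows /about.php, no hash needed
});

// keep the back/forward buttons working
window.onpopstate = function (e) {
  if (e.state && e.state.url) {
    $('#content').load(e.state.url);
  }
};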
Wow, where to start... First of all, the page 'blink', as I'll call it, is 2-3 seconds for you, but it is completely dependent on the user's computer, how fast it executes the JavaScript, and how fast the AJAX call returns, so the delay could be much larger.
Second, I wouldn't worry about Googlebot seeing any of the AJAX content. While it's true Googlebot does try to fiddle with some JavaScript, it won't make the AJAX call like a normal browser would. I'd be very surprised if Googlebot ever saw any of your AJAX-loaded data.
Googlebot does a fantastic job of figuring out what content is delivered via HTML/CSS to a user when they visit your page. It also figures out whether something is displayed or not, and does a good job of deciding whether that content is just stuffing or something that really matters.
You're worried about what someone without JavaScript will see, while the entire navigation of your site is based on JavaScript. That doesn't seem to reconcile.
You've got PHP available. My suggestion is to forgo the AJAX stuff you're trying to do and do it in PHP; you can script the same behaviour just as easily in PHP as you can with AJAX.
SEO NOTES:
If you're looking for solid SEO results, I suggest making the static (non-JavaScript) version of the page as SEO-friendly as possible. I like to 'pick the low-hanging fruit', like making sure the page has one and only one H1 and that it contains the most important keywords. SEOmoz is one of the best sites I've found for SEO information.

AJAX-loaded content to be crawlable by Google

Here is my case: I use an external source to load HTML data onto my page, and then I put the content of that HTML into a div. So, as soon as the page has loaded and the AJAX call has finished, I see the results.
It works OK... but now I have realised that this dynamically loaded content is not crawlable by Googlebot, and that is something that I don't like :)
Is there any way to tell Googlebot that the page actually contains the content of the loaded page?
For example, if I load a page from http://external.com/test.htm into a div, can I use something like
<div id="dynamic"></div>
?
I hope you understand my question; if not, please leave a comment!
Thanks!
You might be interested in checking out the following document directly from Google, for a few concrete tips:
Google Code: Making AJAX Applications Crawlable
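A very rough client-side sketch of the hash-bang convention that document describes; the id and path are placeholders, and the server-side part (serving an HTML snapshot when the crawler requests the URL with _escaped_fragment_) is covered in the document itself:
// links look like <a href="#!test">; per the scheme, the crawler requests ?_escaped_fragment_=test instead
function loadFromHash() {
  var page = location.hash.replace('#!', '');           // e.g. "test"
  if (page) {
    $('#dynamic').load('/fragments/' + page + '.htm');  // placeholder path on your own server
  }
}
$(window).on('hashchange', loadFromHash);
$(loadFromHash);  // also run once on the initial page load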
