I know that this question has been asked many times (but in different variations), but I got totally confused...
I have a website with a right panel displaying some content - a set of links to news/inner articles.
This is the same list on all of the site's pages.
Thus, every time there is a new link/line in the news section, I have to manually go over all of the site's pages and update them. This is really annoying and time-consuming.
The approach I thought about is:
Adding a div where the news/links are supposed to go:
<div id="news"></div>
and loading a JS file on all pages. On window.onload, the JS writes the news:
document.getElementById('news').innerHTML = ...
Thus, every time there is a new link/news item, I will add it to the JS file, and it will be written on all of the site's pages (because they will all load this script).
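A minimal sketch of that idea (the file name news.js and the example links are hypothetical):
// news.js - included on every page with <script src="news.js"></script>
window.onload = function () {
    // Predefined list of news links; editing this array updates every page that loads the script.
    var news = [
        '<a href="/articles/article-1.html">First article</a>',
        '<a href="/articles/article-2.html">Second article</a>'
    ];
    document.getElementById('news').innerHTML = news.join('<br>');
};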
My question is:
Is this approach OK? Will the news/links that are generated this way be seen by Google?
P.S.
I have read many articles, e.g.
https://stackoverflow.com/questions/11989908/can-google-crawl-links-generated-on-the-clientside-e-g-knockout
Googlebot doesn't see jquery generated content
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=174992
Does google crawl javascript?
https://developers.google.com/webmasters/ajax-crawling/docs/getting-started
and more... but I got really confused...
I am not using any AJAX or server-side programming.
I just want to do a simple .innerHTML = "..." with a predefined list of news items, to save a lot of time. Will this content be indexed and seen by Google?
Thanks a lot!
Generally, no. Google (and other bots) will not see content that you add dynamically.
Search engines are of course being taught to understand more and more, and Google probably already recognises some specific ways of adding content to a page, but to see arbitrary dynamic content it would have to actually execute the script in a sandboxed environment, or emulate the script execution, to find out what it does.
Even if it does see some dynamic content, it is likely to give that content lower priority, so you are always better off putting your important content in actual elements on the page.
A search engine crawler only crawls the HTML code you see when you view the page source in your browser. Content that is fetched using JavaScript will not appear in that source (although it depends on how you fetch the content).
To test crawler visibility for your page, Google recommends the Lynx tool in its webmaster guidelines:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769#2
Hope it will help !!
Never mind. I've found the correct answer
I think you already have all the pieces in front of you, but don't know where to start.
I suggest that you
find an exciting Bootstrap blog template (like the free ones at Start Bootstrap)
understand the code, which is usually a good mixture of HTML5, CSS3, JavaScript with a modest use of jQuery
tweak it with your own ideas
if you have some special needs not covered by the template (for example collapsible menu items), you can browse the Bootply Snippet Library to see how others coded it.
Nowadays no web programmer starts from scratch. They choose a template with the basic structure of a one-pager, multi-pager, blog, e-commerce, etc. and start from there.
By the way: Alaboudi mentioned in his answer that you need to learn MySQL, too. This is indeed needed for dynamic content like e-commerce sites and blogs, but not for static content like business websites that don't change very often and instead put an emphasis on individual page layouts.
But to get your first website up fast, I would start with static websites and later extend your knowledge to MySQL.
Everything you have learnt is great, but you must also learn a database querying language (SQL). May I suggest you start learning MySQL; it's very friendly for beginners. Now let me give you an example of how to code a dynamic website.
Let's consider Facebook profile pages as an example. First, you must realize that there aren't a billion uniquely saved profile pages, one for each user on the server. Rather, there is only one HTML/CSS template that is filled in with the appropriate information depending on the person loading the page. When a visitor comes to his profile page, his information must be queried from the database using a backend language (PHP in your case). Once the result of the query is retrieved, you fill in the appropriate information in your HTML (name, age, friends, blah, blah) and send it over to the user. So technically you are constructing the complete page on every request, and you never actually have the complete page saved on the server.
Long story short, you should look into using a database.
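As a rough illustration of that flow (in JavaScript/Node rather than the PHP mentioned above, with a stubbed-out database call standing in for a real MySQL query):
// Hypothetical stand-in for a real MySQL query; a real app would call a MySQL client here.
function queryDatabase(sql, params) {
    return Promise.resolve({ name: 'Alice', age: 30 }); // stubbed row
}

const http = require('http');

// One template, filled in per request with whatever the database returns.
http.createServer(function (req, res) {
    const userId = new URL(req.url, 'http://localhost').searchParams.get('id');
    queryDatabase('SELECT name, age FROM users WHERE id = ?', [userId]).then(function (user) {
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end('<html><body><h1>' + user.name + '</h1><p>Age: ' + user.age + '</p></body></html>');
    });
}).listen(3000);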
This is not really a question. I would suggest that you go and code something.
You want to build a blog? OK, try to do it with what you've learned so far.
When you start building it, you'll have specific questions about specific problems; you can then search Google for your specific problem, or come back to Stack Overflow and ask about it.
Any resource is good and lucky you, there are plenty of resources on the internet ;)
I want to know how I can let others show my website's content on their websites by embedding it, either with a script tag or an iframe. I have seen websites showing some content, for example a tournament fixture, and offering visitors the option to embed the same content on their own websites by copying and pasting a script tag or an iframe.
YouTube also offers embedding of its videos in other websites. So the content may be anything: a div, a table, or a video. How can I do this?
I think this question is related to what I am looking for, but it doesn't explain the solution in detail. I don't know how the external script file will show the content. Note that I am using PHP as the server-side language.
So you want to let other websites embed an iframe showing some content from your website, but not the entire website. What you could do (though it is probably not the only solution) is take the content you want to share, say a blog post, and turn it into a standalone HTML page.
So create a page with the URL foobar.com/blog/post1 that contains only the post, not your entire website layout, navbar, footer, etc. (this will already be the case if you are using MVC or building an SPA). Make sure to include the necessary styles and scripts with it too.
This can then be included in an iframe: <iframe src="http://foobar.com/blog/post1"></iframe>
Another alternative is to write an API from which other websites can request your blog posts (or pictures, content, or whatever) as JSON objects, possibly containing the HTML as a string.
There are really quite a few options, depending on whether the content is static or dynamic. For dynamic content I would suggest an iframe containing the mini app, which gets updated data through an API, or maybe even WebSockets if it has to be live.
And don't forget to deal with CORS on your server.
EDIT
So you want to offer a JS file for others to include in their code, similar to requesting a JS library from a CDN:
https://code.jquery.com/jquery-git2.js
In this case you load a JS file from somebody else's server. Similarly, you offer a JS file for the other websites to include. This JS will load content from your server via an HTTP request. There are many libraries which make this easy; try learning about jQuery Ajax (tutorial link).
Then, as @halfer suggested, you ask the website owner to add a div with a unique id, say 'your_website_name', which your JS script searches for and populates with the data received from your server.
<div id="your_website_name"> </div>
Note:
Your question was very vague, so don't hate on the SO community for trying to help. In the comments they added ideas to spur on other users' answers; otherwise they would have posted an answer.
I think what you are actually looking for is how to populate HTML via JS and pull data over the network, as you seem to know the rest. Read up on these terms, and maybe look at some JS libraries such as jQuery or MooTools, or even MVC frameworks if your "applet" is quite complex.
If you want more help, post a new and more specific question about your use case; any code you have written so far helps too.
I'm working on a page that is more or less a search tool: it basically consists of an input field, and it shows the user a list of entries according to the input. In this situation, does it make any difference for SEO whether the page uses client-side or server-side rendering (using AJAX), and why?
I'm just concerned about whether it's a disadvantage to use client-side rendering in this particular scenario.
I understand that client-side rendering is a disadvantage for SEO compared to server-side rendering, where the HTML is complete from the start, so to say. But in a dynamic case, where the results have to be loaded asynchronously anyway, is it still a disadvantage? Does it depend on whether the current content can be mapped to a URL?
AJAX loading of content has no impact on SEO.
The updating of the DOM via JavaScript will not result in any noticeable changes in what is indexed by a search bot. Almost all legitimate search engines archive the non-dynamic version of a webpage.
In order to enable SEO you have to maintain embedded links to non-dynamic versions of that content.
For example (using jQuery):
<div class="next-page"><a class="ajax-me" href="/page-2.html">Page 2</a></div>
$(document).ready(function () {
    $(".ajax-me").click(function (e) {
        e.preventDefault(); // stop the normal navigation
        // load the linked page into the #ajax-target container instead
        $('#ajax-target').load($(this).attr("href"));
    });
});
That will enable AJAX for links, but still make that link visible to the web crawler for search engines.
Your server will have to know whether to respond with a full webpage or with an AJAX fragment, based on the request headers.
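A sketch of how the server side might branch on that, assuming a Node.js/Express app (the route and markup are illustrative):
const express = require('express');
const app = express();

app.get('/page-2.html', function (req, res) {
    if (req.xhr) {
        // jQuery's .load() sends X-Requested-With: XMLHttpRequest, so return just the fragment
        res.send('<div>Page 2 content</div>');
    } else {
        // a crawler or a direct visit gets the full page
        res.send('<html><body><div id="ajax-target"><div>Page 2 content</div></div></body></html>');
    }
});

app.listen(3000);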
Since you don't seem to be much concerned with UI/UX and want to know more about SEO, I'd suggest going with the client side. Anything that's dynamically loaded after the user's input won't be visible to web crawlers.
However, another approach would be to make it work both ways: by visiting a specific URL (site.com/search?q=something) you get the page fully rendered on the server side, while you're still able to make another search that happens on the client side. You'd still have a little trouble indexing all the relevant searches, but perhaps you could track the last x searches and show them somewhere on the page, with links to fully server-side-rendered search pages like the one mentioned above. You can even make those dynamic calls change not only the content of the page but also the URL hash in the browser's address bar (see here).
That way you'd provide users with a nice user interface/experience, while still doing a very nice SEO job since the crawlers would be able to index the links from the list of last searches.
So, to directly answer your question: client-side vs. server-side page rendering - huge SEO difference
I have a method I'm testing right now for hiding javascript so that the user can't go around searching for it in the source files.
The method is this:
You have a bunch of javascript files included to make your application work. Libraries like jQuery, dojo, and your own code. This is pretty standard.
There is one critical piece of JavaScript code without which the app will not function, and without which no curious user will be able to make heads or tails of the app. This critical piece does not get loaded by script tags. Instead, a small unobtrusive script calls the server, which pulls the JavaScript out of a database and returns it as one big long string.
This string gets eval()-ed to turn it into live code. But since the code was dynamically generated, it won't show up if the user looks through the source code or saves the website. Furthermore, you can add some kind of salt or timestamp to prevent users from tricking the database into revealing your JavaScript kernel.
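In code, the loader side of that scheme would look roughly like this (the endpoint and timestamp parameter are hypothetical):
// Small loader that stays visible in the page; the critical code never appears in any .js file.
// '/get-kernel' and the t= timestamp are hypothetical.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/get-kernel?t=' + Date.now());
xhr.onload = function () {
    eval(xhr.responseText); // turn the returned string into live code
};
xhr.send();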
I'm trying to get feedback on this from the community, and most of the examples I've found for hiding JavaScript with server-side code have just been people wanting to include a .php file in their script tags instead of a .js file. This is totally different.
So there you have it. Is this a good idea? What are the weaknesses?
eval() is generally frowned upon, but regardless, the big weakness is that I can simply sniff the HTTP requests and get your script. Obfuscation can make this more inconvenient, but with a good debugger it's not that hard to follow a stack trace and get a good idea of what is occurring.
Even if the resource is transferred over SSL, it can be perused/manipulated once it has been loaded by the browser. To test this, I went to a secure website and examined a raw TCP response (both synchronous and asynchronous, using XMLHttpRequest) with SmartSniff. As expected, it's encrypted and unreadable. However, the same requests are all visible as plain text in Chrome's network activity inspector.
It's trivial to make JavaScript code unreadable by humans (and even highly resistant to reverse engineering), and you don't need to hide it inside a lot of other code. But why? Generically, the name given to this kind of code is malware.
From what I know, if you use AJAX or JavaScript links on your website, it hurts SEO, as the Google bot has trouble crawling your site; basically, it likes anchor tags.
So the general advice is to avoid AJAX or Flash on your website, though some say that the Google bot knows how to read JavaScript links.
Now I believe it's possible to manipulate the Google bot somehow, and by manipulate I DON'T mean anything illegal or black-hat; I just want to Ajaxise my website.
My question has 2 parts:
Is it possible to "present" the Google bot with one version of the site, and users with another? I've read here that base64-encoding your content may confuse the bot, but that seems like a bad solution to me.
Likewise, as far as I know, the only things you can add to a robots.txt file are noindex and nofollow.
Is it possible to output the HTML as a regular, un-Ajaxised website with anchor links, and then, after the window finishes loading, edit the anchor tags to perform dynamic content loading? I know that this is possible, but will the Google bot scan the page after that event or before? On the same note, is it possible to hide those parts of the JavaScript code from the Google bot?
You can't manipulate search engine bots into doing things they don't normally do. You either work within their capabilities or you don't. Although search engines are getting better at handling JavaScript, as a general rule dynamic content is not something they're going to handle well, or at all, in most circumstances.
As far as getting search engines to read dynamic content created by JavaScript you have two options:
Build the site the right way from the beginning and use progressive enhancement. Your site should work without JavaScript enabled. In fact, it should be built that way first. Then you can go back and add JavaScript that enhances the experience for users who have JavaScript enabled. That way your content is accessible to everybody.
Use Google's Crawlable Ajax standard. This will allow Google to crawl content generated via Ajax. Keep in mind this only works for Google and leaves out other search engines and users without JavaScript enabled, so it is a bad idea.
Build your site using Progressive Enhancement and Unobtrusive JavaScript.
When you do significant Ajax stuff, use the history API.
Then you have real URLs for everything and Google won't be a problem.
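A minimal sketch of that pattern, with plain links that keep working without JavaScript (the data-ajax attribute, the #content container, and the loadContent() helper are hypothetical):
// Progressive enhancement: the <a> links work on their own; JavaScript intercepts them when available.
document.addEventListener('click', function (e) {
    var link = e.target.closest('a[data-ajax]');
    if (!link) return;
    e.preventDefault();
    loadContent(link.href);                               // swap in the new content via Ajax
    history.pushState({ url: link.href }, '', link.href); // keep a real, crawlable URL in the address bar
});

// Handle the back/forward buttons.
window.addEventListener('popstate', function (e) {
    if (e.state && e.state.url) loadContent(e.state.url);
});

// Hypothetical helper: fetch the page and put its markup into the main content container.
function loadContent(url) {
    fetch(url)
        .then(function (response) { return response.text(); })
        .then(function (html) { document.getElementById('content').innerHTML = html; });
}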