Just a quick question for you all...
I've got a jsFiddle here, and I'm just wondering: if we load content this way from an external API on page load, will Google be able to see this information?
Here's the code of the fiddle...
<p>Property name <span id="property_name"></span></p>
function propInfo(propertyName) {
  document.getElementById("property_name").innerHTML = propertyName;
}
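For a bit more context, on page load the fiddle calls that function with whatever the API returns - something along these lines, where the endpoint is just a placeholder rather than the real one:

var xhr = new XMLHttpRequest();
xhr.open("GET", "https://api.example.com/property"); // placeholder endpoint, not the real API
xhr.onload = function () {
  var data = JSON.parse(xhr.responseText);
  propInfo(data.name); // fills the span once the response arrives
};
xhr.send();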
Accessing this API is out of my hands - I'm told the only way to do it currently is via JavaScript.
We obviously want our pages to appear correctly in Google, so any help with this would be great!
You need to have a look at these guidelines to make sure that Google can access whatever content you load via AJAX - https://developers.google.com/webmasters/ajax-crawling/
As long as you're compliant with those, you shouldn't have a problem.
The only people who can definitively answer this for you are Google.
Matt Cutts did confirm that Google's crawler does run some content-generating JavaScript and index the result, but I don't think the details of the limits of that have ever been publicly disclosed.
Hey, so I'm currently working on my first personal project, so bear with the questions!
I'm trying to create a JavaScript program that will parse info from Google Forms to produce slides displaying that info. So far, from my research, the best way I've found to facilitate this process is Google's Apps Script editor. However, I was wondering if I can run this code by requesting it from a different JavaScript (or maybe even Java) program that I'll write in WebStorm. If I can't do this, what is the best way to use the Google Apps Script editor?
Thanks!
Google Apps Script is just JavaScript with extra built-in APIs (like SpreadsheetApp, FormApp, etc.).
It also has a UrlFetchApp API.
So you can run code like this:
// The code below logs the HTML code of the Google home page.
var response = UrlFetchApp.fetch("http://www.google.com/");
Logger.log(response.getContentText());
As such, if you want to provide JavaScript from elsewhere, you could fetch it and then eval it on the Google Apps Script side (but we all know how tricky eval can get).
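Roughly something like this (the URL is just a placeholder; keep in mind eval runs the fetched code with your script's authorization, so only use code you control):

function runRemoteCode() {
  // Placeholder URL - point this at a JavaScript file you host somewhere.
  var code = UrlFetchApp.fetch("https://example.com/my-script.js").getContentText();
  eval(code); // evaluates whatever the remote file defines, inside Apps Script
}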
One other option is to have your own server side written using Google App Engine (or any other framework), use Google's OAuth, and authorize your app to fetch the data from the form.
Slides and Google Apps Script
You might like to take a look at the add-on "Slides Merge" by Bruce McPherson. I've never used it, but it sounds like it might work for you; you can find it in the add-on store.
Getting information from Google Forms is a snap with Google Apps Script, since you can link the form right up to a spreadsheet. The Google Apps Script documentation is really quite good these days. Here's the documentation link. Google Apps Script is loosely based on JavaScript 1.6. If you're already a programmer, my guess is that you'll have few problems learning to use it. In my experience the most difficult thing was dealing with the arrays of arrays produced by the getValues() method of ranges in Google Apps Script, and I made a short video that might be of some help to you.
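As a rough example (the sheet name and column positions below are just the usual defaults, so adjust them for your form), reading responses out of the linked spreadsheet looks something like this:

function readFormResponses() {
  // "Form Responses 1" is the usual name of the linked sheet - adjust if yours differs.
  var sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Form Responses 1");
  var rows = sheet.getDataRange().getValues(); // a 2-D array: one inner array per row
  for (var i = 1; i < rows.length; i++) {      // start at 1 to skip the header row
    Logger.log("Timestamp: " + rows[i][0] + " / first answer: " + rows[i][1]);
  }
}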
I also have a script that I wrote in Google Apps Script that produces a "sheet show", which is a slide show inside of a spreadsheet.
I've found that using the Script Editor is pretty easy. There's some coverage of it in the support section of the documentation. It can be a bit buggy at times, but overall I think it's a pretty good tool.
A few weeks ago I started learning JavaScript and the Google Apps Script API, specifically in regard to spreadsheets. I have been trying to make a spreadsheet that fetches web pages and pulls stats about my friends for the game League of Legends. However, I have been running into a problem with the site I want to use, which is basically the only free LoL stats site that updates frequently. I'm not familiar at all with web development, but it seems that when I try to access a page on lolking.net, for example http://www.lolking.net/summoner/na/60783, with Google's UrlFetchApp.fetch(), it does not load the dynamic page. So instead of the final source, I get this, which doesn't help me. Is there an easy way around this, or would I simply have to use another website?
Thanks for the info! Although it turns out I was mistaken. The UrlFetchApp was indeed returning the full source code, but I was using GAS's Logger to view the text. It seems the Logger has a length limit, so when I searched for the stats I wanted, they weren't there simply because the source code got truncated. So, due to an oversight on my part, I never had a problem in the first place. For other people reading this question: in the end I have no idea how UrlFetchApp works with dynamic pages using client-side JS (you'd probably want to talk to the poster below or post a new question).
You are getting the raw HTML page with the client-side JS included. That won't work from any system, not just GAS. You need to debug that page's JS and find where it does an AJAX call to get the data you want.
Then do the same from your GAS. It might not work if the call is authenticated, etc.
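Roughly like this, once you've found the endpoint in the browser's network tab (the URL and response shape below are made up):

function fetchStats() {
  // Made-up endpoint - replace it with the one the page actually calls.
  var response = UrlFetchApp.fetch("https://www.example.com/api/summoner/60783");
  var data = JSON.parse(response.getContentText());
  Logger.log(data);
}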
I've embedded Google Calendar into an HTML page. It has a large number of calendars in it, so, to keep things tidy, I'd like all the calendars to be turned off by default.
It seems that jQuery can't help here, due to the same-origin policy.
Is there a simple, straightforward "just-make-it-work" solution?
EDIT: in the question mentioned above there was a mention of using a local proxy for this task. What is that, and how do I set it up?
I think that your best solution is to pull in the calendar feeds and present the calendar yourself.
Same-origin policy will always bite you. jQuery is just JavaScript; it isn't magic, and if it cannot access something, neither can any other client-side script you write.
Proxying something as complex as the Google Calendar isn't a good idea. Even if you get it working, they may change it in the future.
It's trivial to pull in their XML. You can find the feed URL right on the same panel where you got the embed HTML for the shared calendar. Then you can load all of those server-side and present them on your page however you'd like.
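On the server that could look roughly like this (Node.js as one example; the feed URL is just a placeholder for the one shown on that panel):

const https = require("https");

// Placeholder - use the feed URL from the calendar's sharing/settings panel.
const FEED_URL = "https://www.google.com/calendar/feeds/YOUR_CALENDAR_ID/public/basic";

https.get(FEED_URL, (res) => {
  let body = "";
  res.on("data", (chunk) => { body += chunk; });
  res.on("end", () => {
    console.log(body); // raw XML; parse it and render the events into your own markup
  });
});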
There is a website that I visit often... let's call it www.example.com. I am able to interact with parts of this website, and the interactions send XMLHttpRequests and get responses back through JavaScript (jQuery, I believe).
I'm not sure what technology will let me achieve what I want to do, or where to start. Basically, I want to add additional options/shortcuts that the site does not provide. I thought about maybe using a macro, but trying to use macro-recording software is just a pain in the butt.
I inspected (using Google Chrome's Developer Tools) the XMLHttpRequests being sent back and forth and noticed that they are simple JSON messages. I figured the best way to add enhancements to the site, without waiting for the actual owners of the site to do so, would be to simulate the website sending/receiving these XMLHttpRequests/responses and make additional adjustments to the DOM to provide extra shortcuts.
I don't want to interfere with the original site's functionality, though... i.e. if I send a request and receive a response, I want both the original script and my script to process the response. So here is where I'm stuck: I'm not sure whether to go down the path of creating a C# application or a Google Chrome extension (I use Google Chrome) or something else altogether. Any pointers on what dev tools/languages will give me the ability to do what I want would be great. Thanks!
Chrome has built-in support for user scripts. You can use these to modify the page as you see fit and also to make requests. Without more details regarding what exactly you want to do with these AJAX requests, it's hard to advise further.
I'm not 100% sure what your question is, but as I understand it, you want to be able to make changes to a certain website. If those changes can be done with JS, I would recommend Greasemonkey for Firefox. It basically lets you run a custom script when you visit a certain webpage/domain, and you can be as specific as you want about which pages use the script. Once your script loads jQuery, it is really easy to add any functionality (there's a small sketch after the links below).
https://addons.mozilla.org/en-US/firefox/addon/greasemonkey/
You can find pre-written scripts for tons of sites here:
http://userscripts.org/
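To give a rough idea of what such a script looks like (the site, button, and endpoint here are all hypothetical; it's the structure that matters):

// ==UserScript==
// @name     example.com shortcuts
// @match    https://www.example.com/*
// ==/UserScript==
(function () {
  // Add a shortcut button that calls the same kind of JSON endpoint the page uses,
  // without touching the page's own handlers.
  var button = document.createElement("button");
  button.textContent = "My shortcut";
  button.addEventListener("click", function () {
    var xhr = new XMLHttpRequest();
    xhr.open("GET", "/api/some-endpoint"); // hypothetical endpoint spotted in DevTools
    xhr.onload = function () {
      console.log(JSON.parse(xhr.responseText));
    };
    xhr.send();
  });
  document.body.appendChild(button);
})();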
A couple of Google questions:
1 - is there ANY chance that Google will "see" text retrieved using AJAX?
The user selects from a chain of select boxes, and some text from the DB is displayed.
2 - if I change the page title using JavaScript, outside the HEAD area, will Google index the modified title?
Sorry if these are trivial questions, and thanks for any reply.
Have a nice day :-)
What Google sees is what you see when you disable JavaScript in your browser. So the answer to both your questions is no.
The correct way to have all of your site's data indexed is to degrade gracefully inside <noscript> tags. For example, you could offer an interface to browse all the content of your database, using lists and sublists of links that point to proper result pages and are well integrated into your site.
Warning note: your content must really be a noscript version of your site. If you create a separate site just for crawlers, it becomes cloaking, which is forbidden.
Update: since 2014, Google seems to support everything you can think of (including JavaScript and AJAX).
Try using seo-browser.com or the Lynx browser to see how Google sees your site.
Also see this answer on Googlebot not seeing jQuery-generated content, and/or this document by Google on ways you can have your AJAX content spidered.