On a search result page I would like to get the count of Google+ shares for the URL of each search result item via Ajax. I already managed to set up counting functions for Facebook and Twitter:
// Twitter share count
$.getJSON('http://urls.api.twitter.com/1/urls/count.json?url=' + url + '&callback=?', function (data) {
    tweets = data.count;
});
// Facebook share count
$.getJSON('https://api.facebook.com/method/links.getStats?urls=' + url + '&format=json', function (data) {
    fblikes = data[url].shares;
});
For Google+ I also found a solution, but it requires an API key for each URL. Is there any way to retrieve the Google+ count without such a key? Since I am using this for tons of dynamically loaded search results, I obviously cannot create an API key for each search result URL.
Yes, it's possible to get Google +1 counts. They are retrieved via a JSON-RPC POST call.
POST URL:
https://clients6.google.com/rpc?key=AIzaSyCKSbrvQasunBoV16zDH9R33D88CeLr9gQ
POST Body:
[{"method":"pos.plusones.get","id":"p","params":{"nolog":true,"id":"%%URL%%","source":"widget","userId":"#viewer","groupId":"#self"},"jsonrpc":"2.0","key":"p","apiVersion":"v1"}]
where %%URL%% is the desired URL.
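For instance, here is a minimal jQuery sketch of that call. The response path used to read the count (result.metadata.globalCounts.count) reflects what this undocumented endpoint has been observed to return, so treat it as an assumption that may change:
// Hedged sketch: POST the JSON-RPC body above for a single URL and read back the +1 count.
function getPlusOnes(url, callback) {
    $.ajax({
        url: 'https://clients6.google.com/rpc?key=AIzaSyCKSbrvQasunBoV16zDH9R33D88CeLr9gQ',
        type: 'POST',
        contentType: 'application/json',
        data: JSON.stringify([{
            method: 'pos.plusones.get',
            id: 'p',
            params: { nolog: true, id: url, source: 'widget', userId: '#viewer', groupId: '#self' },
            jsonrpc: '2.0',
            key: 'p',
            apiVersion: 'v1'
        }]),
        success: function (data) {
            // The count location in the response is undocumented and may change.
            callback(data[0].result.metadata.globalCounts.count);
        }
    });
}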
Have a look at the following:
Getting google +1 Page shares via AJAX (hidden Api)
http://www.sharedcount.com/documentation.php
I want to use YQL to retrieve all 10-Q & 10-K filings from the SEC EDGAR database.
After referring to the discussions [1] & [2], I ran into a problem.
It seems that YQL cannot get search results from the search engine.
However, I can directly access the filing detail page.
Here is a jsfiddle that shows the problem. Although both queries return a success message, the query to the search engine returns an empty results array.
Is there any other way to get the HTML addresses of all the filing detail pages without querying the EDGAR search engine? Thanks.
Example code using YQL is shown below:
// Results page from the EDGAR search engine:
// fails to return data
var queryURL = "http://www.sec.gov/cgi-bin/browse-edgar?" +
    "action=getcompany&CIK=0001326801&type=10-K&dateb=&owner=exclude&count=100";

// EDGAR 10-K filing detail page:
// fetched successfully by YQL
var filingURL = "http://www.sec.gov/Archives/edgar/data/1326801/" +
    "000132680114000007/0001326801-14-000007-index.htm";

$.get(queryURL)
    .then(function (data) {
        // success message, but the results are an empty array
    })
    .then(function () {
        $.get(filingURL).then(function (data) {
            // success message, and the results contain the page content
        });
    });
The /cgi-bin URL is restricted by robots.txt, so YQL will honour that and not crawl the page.
You can see this happening by enabling diagnostics for the YQL query.
Add diagnostics=true to the YQL URL, like /v1/public/yql?diagnostics=true&callback=?
Look for the diagnostics field in the results. This contains information about the query and any URLs it visited.
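A minimal sketch of such a diagnostics call with jQuery (the select statement is just a placeholder for whatever query you are running):
// Hedged sketch: run a YQL query with diagnostics enabled and inspect which URLs YQL visited.
var yql = "select * from html where url='http://www.sec.gov/cgi-bin/browse-edgar?action=getcompany&CIK=0001326801&type=10-K&dateb=&owner=exclude&count=100'";
$.getJSON(
    "https://query.yahooapis.com/v1/public/yql?diagnostics=true&format=json&callback=?",
    { q: yql },
    function (data) {
        // data.query.diagnostics lists the URLs visited and why a fetch was skipped (e.g. robots.txt)
        console.log(data.query.diagnostics);
    }
);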
I'm working on a GPS/mapping app using Node.js, express, Socket.IO, and MongoDB.
The tracker page uses HTML5 geolocation and Socket.IO to send coords to the server and store them into MongoDB.
I'm now working on creating user pages so a user can view previously recorded data. I currently don't have a login page, so index.html just populates a list of user names and I can select one.
My problem is that I am unsure of how to best pass data such as the username of the selected user to the next page. I have been trying to avoid opening a socket connection to the server just to pass one username but if that is the best way then no problem. I am also aware of localStorage as an option.
Here are some code snippets to give you an idea of what I'm working on:
index.html:
<html>
<head>
    <script src="http://code.jquery.com/jquery-latest.min.js"></script>
    <script>
        $(document).ready(function() {
            $.ajax({
                dataType: "json",
                url: "/getusers"
            }).done(function(data) {
                console.log("--->", data);
                for (var i = 1; i < data.length; i++) {
                    $('#list').append("<li><a href='http://localhost:3000/mapPage.html'>" + data[i].name + "</a></li>");
                }
                $("a").click(function(e) {
                    alert("--->", $(this).html());
                    //alert("--->", $(e).html());
                    //alert("--->", $(e).getVal());
                    //window.localStorage.setItem("username", )
                });
            });
        });
    </script>
</head>
<body>
    <ul id='list'>
    </ul>
</body>
</html>
The Ajax call is just hitting an express route that asks MongoDB for collection names (each user has a collection). I'm also having trouble getting the data from the specific anchor tag I clicked on. So I need to get the user name I clicked on and send it to the "userpage", which for now would be almost identical to index.html; it would select the specific user collection and show a list of available objects (arrays of map coords).
tl;dr: get data from anchor tag clicked, make data available to next page.
That's exactly what sessionStorage was designed for: passing data from one page to another. localStorage can also be used for that, but it is more for persisting data between sessions.
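A minimal sketch of that approach, assuming a hypothetical userPage.html as the destination page:
// index.html: store the clicked user name before the browser navigates away.
// Using a delegated handler because the list items are appended dynamically.
$('#list').on('click', 'a', function () {
    sessionStorage.setItem('username', $(this).text());
});
// userPage.html (hypothetical destination page): read the value back.
var username = sessionStorage.getItem('username');
console.log('Selected user:', username);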
If you have to pass data like that between pages the simplest way that comes to my mind is just passing it as a query string.
http://localhost:3000/userPage.html?username=jim&user=jdawg149
Then you can read those out into variables on the destination page.
You can get and set the query string pretty simply with some methods on the window, as discussed here:
http://css-tricks.com/snippets/javascript/get-url-and-url-parts-in-javascript/
You can also get a little more complex with it by using a function with RegEx to parse the URL, as answered here:
How can I get query string values in JavaScript?
Keep in mind that you should limit your URLs to ~2000 characters.
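A minimal sketch of that approach, again assuming a hypothetical userPage.html; the getParam helper is only an illustration, not a built-in function:
// index.html, inside the loop from the question: put the user name in the link's query string.
$('#list').append(
    "<li><a href='http://localhost:3000/userPage.html?username=" +
    encodeURIComponent(data[i].name) + "'>" + data[i].name + "</a></li>"
);
// userPage.html: read the parameter back out of location.search.
function getParam(name) {
    var match = new RegExp('[?&]' + name + '=([^&]*)').exec(window.location.search);
    return match ? decodeURIComponent(match[1].replace(/\+/g, ' ')) : null;
}
var username = getParam('username');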
I want the followers count from the Twitter site. I already used this query and it worked well, but it does not work now because the Twitter API has changed. I used this script to put the count into a span with id="spnTwitterFolowersCount":
<script>
    $.getJSON("https://twitter.com/users/Obama.json?callback=?", function (data) {
        document.getElementById("spnTwitterFolowersCount").innerHTML = data.followers_count;
        //alert('Obama has ' + data.followers_count + ' Followers');
    });
</script>
As the followers count is visible to anybody on a Twitter profile, you can scrape it with YQL:
var ttid = "twitterUsername";
var response = $.getJSON("https://query.yahooapis.com/v1/public/yql?q=select%20content%20from%20html%20where%20url%3D%22https%3A%2F%2Ftwitter.com%2F"+ttid+"%22%20and%20xpath%3D'%2F%2Fli%5Bcontains(%40class%2C%22ProfileNav-item--followers%22)%5D%2Fa%2Fspan%5Bcontains(%40class%2C%22ProfileNav-value%22)%5D'&format=json&callback=");
response.success(function (data) {
    // the scraped follower count is returned under data.query.results
});
A lot has changed with the Twitter API 1.1; to start with, you cannot make calls directly to get data. You need some form of authentication to make those calls.
Please read this documentation, which states the need for authentication. You can use OAuth authentication or app-only authentication, depending on your needs.
Once you are done with the authentication, you can get the list of followers using this API:
https://api.twitter.com/1.1/followers/ids.json?cursor=-1&screen_name=sitestreams&count=5000
Read here for more information and complete set of parameters
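As an illustration only, here is a hedged sketch of calling that endpoint with an app-only bearer token. BEARER_TOKEN is a placeholder obtained server-side (POST to https://api.twitter.com/oauth2/token with grant_type=client_credentials); in practice you would proxy this request through your own server, since the token must not be exposed in browser code and the endpoint does not allow cross-origin browser requests:
// Hedged sketch: list follower IDs with app-only authentication.
$.ajax({
    url: "https://api.twitter.com/1.1/followers/ids.json",
    data: { cursor: -1, screen_name: "sitestreams", count: 5000 },
    headers: { Authorization: "Bearer " + BEARER_TOKEN }, // placeholder token
    dataType: "json",
    success: function (data) {
        console.log("Follower IDs returned:", data.ids.length);
    }
});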
I was using the Google Weather API to fetch weather info, but apparently Google has stopped that service, so I am trying to switch to the Yahoo Weather API now.
var WOEID = 2502265; //random WOEID
$.ajax({
    url: "http://weather.yahooapis.com/forecastjson?w=" + WOEID + "&u=c",
    dataType: 'json',
    success: function(data) {
        console.log(data);
    }
});
However, is there a way that I can get the WOEID with JavaScript only? Because back then I could just do
http://www.google.com/ig/api?hl=en&weather=NYC
and that's it.
It says on the Yahoo weather API page,
To find your WOEID, browse or search for your city from the Weather home page. The WOEID is in the URL for the forecast page for that city. You can also get the WOEID by entering your zip code on the home page.
But I want to get it with JavaScript, not by manually going to weather.yahoo.com to find out the WOEID.
Don't worry about the cross-origin policy; I am using this in a Chrome extension, so it does not apply.
Okay, I gathered from your comments what exactly you want:
You have a place name and you want to get the WOEID for that place using JavaScript Ajax calls.
There is no single fixed URL for that; you have to use the GeoPlanet service to resolve a place name to a WOEID:
http://where.yahooapis.com/v1/places.q('Place name')?appid=[yourappidhere]
Or you can use Direct YQL, something like this (percent-encode your city name in the URL appropriately), and make an Ajax call to it:
http://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20geo.places%20where%20text%3D%22Place%20name%22&format=xml
You can also read about it from Yahoo at http://developer.yahoo.com/geo/geoplanet/guide/concepts.html
API Reference
DECEMBER 2018 UPDATE:
Definitely use the Direct YQL technique mentioned above by #aravind.udayashankara. I messed around with the yboss API for a while, only to find it has been discontinued (https://developer.yahoo.com/boss/search/), even though Yahoo still has plenty of documentation for it online.
Try the following instead (the line runs off the page, but the query is inside the URL).
var yourLocation = "location"; // zip code, city name, etc.
var urlQuery = "https://query.yahooapis.com/v1/public/yql?q=select+*+from+geo.places+where+text%3D%22" + yourLocation + "%22&format=json";
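A minimal sketch of actually issuing that request and pulling the WOEID out of the response; the results path (query.results.place) follows the usual YQL geo.places response shape, but verify it against a live response:
// Hedged sketch: fetch the WOEID for yourLocation via YQL geo.places.
$.getJSON(urlQuery, function (data) {
    var place = data.query.results.place;
    // geo.places may return a single object or an array when several places match.
    var woeid = $.isArray(place) ? place[0].woeid : place.woeid;
    console.log("WOEID:", woeid);
});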
To get the WOEID by city name in C#:
using (WebClient wc = new WebClient())
{
    // Query YQL geo.places for the city; the XML response contains the WOEID
    string results = wc.DownloadString("http://query.yahooapis.com/v1/public/yql?q=select%20*%20from%20geo.places%20where%20text%3D%22" + CityName + "%22&format=xml");
}
See this article for more details
I am using the HTML5 version of Facebook Comments on my website. I have my own Facebook App ID.
Using the Graph API and FQL (I think this is how to do it), I want to list all the comments posted on my website.
Example -
Page Title1
--Comment1
--Comment2
--Comment3
Page Title2
--Comment1
--Comment2
--Comment3
Page Title3
--Comment1
--Comment2
--Comment3
etc.
Please help me out.
It is possible, in two different ways, as long as you have a fixed set of sub-pages you want to fetch comments from.
If you have a large or variable number of sub-pages, then there is no good scalable solution, and many have been looking for one:
Facebook fb:comments Graph API
How to display recent comments from Facebook Comments social plugin?
Facebook FQL query to return all comments against an application
Retrieve all comments with FQL by application ID
fql query to get comment count no longer working
http://facebook.stackoverflow.com/questions/10023179/retrieve-all-the-comments-posted-using-fql
For a Fixed set of sub-pages in your website, you can either use a batch request, or an FQL query.
Batch Request
First, you need your app access token. Just enter the following as a URL in a browser (credit to this website):
https://graph.facebook.com/oauth/access_token?type=client_cred&client_id=APP_ID&client_secret=APP_SECRET
And this is the JavaScript jQuery code to make a batch request that fetches comments from several URLs at once:
$.ajax({
    url: 'https://graph.facebook.com/',
    type: "POST",
    data: {
        access_token: 'YOUR_APP_ACCESS_TOKEN',
        batch: '[ \
            {"method":"GET","relative_url":"URL1"}, \
            {"method":"GET","relative_url":"URL2"} \
        ]'
    },
    success: function(data) {
        var jdata = JSON.parse(data);
        $.each(jdata, function(index, value) {
            jdata[index].body = JSON.parse(value.body);
            console.log(value.body);
        });
        // Do whatever you want with jdata
    }
});
FQL
Inspired by this post:
FB.api({
    method: 'fql.query',
    query: 'select text from comment where object_id in (select comments_fbid from link_stat where url="URL1" or url="URL2")'
}, function(response) {
    // Do something with results
});
Conclusion
Because of this limitation of Facebook, I plan to switch to disqus.com, which apparently supports this feature, as you can see from this blog, for example (search for 'recent comments').
Rather than list all the comments on your site, Facebook wants you to implement code to get notified when a new comment is posted anywhere on your site.
To make this happen, you have to put some Javascript into the page where the comment is posted to also notify yourself:
window.fbAsyncInit = function() {
    console.log("subscribing to comment create");
    FB.Event.subscribe('comment.create', function(response) {
        console.log("facebook comment created: " + JSON.stringify(response));
        var commentQuery = FB.Data.query('SELECT fromid, text FROM comment WHERE post_fbid=\'' + response.commentID + '\' AND object_id IN (SELECT comments_fbid FROM link_stat WHERE url=\'' + response.href + '\')');
        FB.Data.waitOn([commentQuery], function () {
            console.log("Facebook comment: " + JSON.stringify(commentQuery));
        });
    });
};
Rather than just logging the comment to the console, you would need to implement some AJAX to send the comment back to your site, where you could store it in your database or send yourself an email notifying you that the comment has been posted.
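For example, here is a hedged sketch of that AJAX call, replacing the console.log inside the waitOn callback above; the /api/comments route and its payload are hypothetical and need a matching handler on your server:
// Hedged sketch: POST the captured comment to a hypothetical endpoint on your own site.
FB.Data.waitOn([commentQuery], function () {
    // commentQuery.value holds the rows returned by FB.Data.query in the old JS SDK.
    var comment = commentQuery.value[0];
    $.post('/api/comments', {
        fromid: comment.fromid,
        text: comment.text,
        href: response.href
    }).done(function () {
        console.log('Comment stored on the server');
    });
});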
Reference: Facebook Comments Plugin
Say your website is http://mywebsite.com/blog.php?id=3 and you have a Facebook Comments plugin on it.
You can access its comments this way:
https://graph.facebook.com/comments/?ids={YOUR_URL}.
{YOUR_URL} becomes http://mywebsite.com/blog.php?id=3
Example 1 (Comments plugin installed on the Facebook developers documentation website):
website: http://developers.facebook.com/docs/reference/plugins/comments
fetch comments: https://graph.facebook.com/comments/?ids=http://developers.facebook.com/docs/reference/plugins/comments
Example 2:
website: http://techcrunch.com/2011/04/08/the-seven-most-interesting-startups-at-500-startups-demo-day/
fetch comments: https://graph.facebook.com/comments/?ids=http://techcrunch.com/2011/04/08/the-seven-most-interesting-startups-at-500-startups-demo-day/
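A minimal jQuery sketch of consuming that endpoint; the data[pageUrl].comments.data path follows the usual Graph API response shape for this call, but verify it against a live response:
// Hedged sketch: fetch the comments for one URL from the Graph API and list their messages.
var pageUrl = "http://mywebsite.com/blog.php?id=3";
$.getJSON("https://graph.facebook.com/comments/", { ids: pageUrl }, function (data) {
    $.each(data[pageUrl].comments.data, function (i, comment) {
        console.log(comment.from.name + ": " + comment.message);
    });
});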
Check this too
Sample code for pulling comments can be found on this blog post