If I am logged into facebook.com, I expect a call to FB.getLoginStatus will return a status='not_authorized'. Instead it returns status='unknown', even if I pass true for the 'force' parameter.
If I call FB.login, and then call FB.getLoginStatus, I get status='connected'. Makes sense.
If I call FB.login, reload the page, and then call FB.getLoginStatus, I get status='unknown'. Does not make sense. If I add 'true' as the second parameter (i.e. force), I still get status='unknown'. (I expect status='not_authorized' in this case.)
There doesn't seem to be a way to get status='not_authorized' in practice.
Am I doing something wrong? Is this a bug in FB.getLoginStatus or its documentation?
Here is a minimal test page:
http://pastebin.com/NqiBXni2
Context:
I am writing a website widget that displays posts for a public Facebook page. (This can be accessed without prompting the user via an app access token.) Each post has a "Like" / "Unlike" link. In order to determine whether to display "Like" or "Unlike", I need to know what the browsing user's Facebook ID is so that I can check whether it is in the post's list of likes.
Maybe you have third party cookies disabled.
And to elaborate slightly on David's answer, Facebook describes this as a "by design" result when 3rd party cookies are disabled.
Unfortunately, I'm not sure of the most graceful way of handling this -- at least on my site, I need an OAuth token to load some data.
I had a similar issue, even with third-party cookies enabled. My workaround: if I get an 'unknown' response, I call FB.login, and inside that callback I call FB.getLoginStatus again. This time, FB returns the correct value.
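Roughly, that workaround looks like this (a minimal sketch, assuming the FB JS SDK is already initialized on the page):

FB.getLoginStatus(function (response) {
    if (response.status === 'unknown') {
        // calling FB.login and then checking again inside its callback
        // reports the real status
        FB.login(function () {
            FB.getLoginStatus(function (inner) {
                console.log(inner.status);   // 'connected' or 'not_authorized'
            });
        });
    }
});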
Are you using http or https?
Do NOT use http as your dev host!
Just use https; it works for me.
We currently have an order confirmation page on our website, shown after the customer pays, where information is sent to our analytics tools.
If the customer refreshes the page, the information about their order is sent again to our analytics tool, and we get duplicate entries in our order statistics.
Is there an easy way to prevent this?
The goal here would be to not trigger the custom HTML tag when the customer reloads the page.
It should only be fired once.
Thanks!
Best,
Victor
You can use PerformanceNavigation.type or PerformanceNavigationTiming.type in a JavaScript variable to find out whether the page has been reloaded (the former is already deprecated while the latter is still marked experimental, so to be on the safe side you should probably check which one the browser supports and use that).
Then use the return value to detect reloads and block the tag depending on that.
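A small sketch of that check, falling back to the deprecated API when the newer one isn't available:

function wasReloaded() {
    var entries = performance.getEntriesByType ? performance.getEntriesByType("navigation") : [];
    if (entries.length > 0) {
        return entries[0].type === "reload";   // PerformanceNavigationTiming
    }
    // PerformanceNavigation.TYPE_RELOAD === 1 (deprecated API)
    return performance.navigation && performance.navigation.type === 1;
}

if (!wasReloaded()) {
    // fire the custom HTML tag / send the purchase data here
}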
An alternative would be to set a cookie or local storage entry with the transaction and block based on that.
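As a sketch of that variant (orderId stands in for whatever transaction ID is already available on the confirmation page):

var key = "purchase_tracked_" + orderId;   // orderId is assumed to exist on the page
if (!localStorage.getItem(key)) {
    localStorage.setItem(key, "1");
    // fire the custom HTML tag / send the purchase data here
}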
We finally found another solution that seems to work: we used an additional trigger in GTM that prevents the info from being sent twice.
When the page is called for the first time, the condition is set to "false", and on each following request to that page it is set to "true".
We made some test orders and it seems to work correctly. Thanks!
I'm creating a Chrome extension and want to be able to get a profile picture from a username. Unfortunately, there isn't an API for the website, and there is no correlation between the profile URL and the profile name. I figured I could hijack the search AJAX and use it to achieve my goal. Unfortunately, it doesn't seem to be working. I've added the permissions, and it isn't running in a content script, but it's still not working.
$.getJSON("http://www.website.com/user/search",{user:name},function(a){
alert(1);
alert(a.data);
});
It's failing silently; nothing is happening, and I can't figure out why.
The json I should be receiving from the call (browsed to the site manually) is:
{"error":false,"action":null,"one":true,"data":143217}
Unfortunately it's not working. JSONP isn't an option, as the site obviously has no need of supporting it, so help me please. I don't see what I'm doing wrong.
EDIT: I see the problem. When I try to do the request, I'm being redirected to user/search (no ?user), which fails.
The site you're trying to reach requires authentication before allowing a search to occur. You won't get a response from your ajax request, because the server is looking for a session with login credentials to allow the search to happen. Because your ajax is not authenticated, and does not have the session established, your request doesn't "fail", it is simply getting the server's 302 redirect response.
You would need to 'sign in' to the page you are trying to query in order to establish the session and any necessary variables before you would be able to proceed with your user search.
It's entirely possible that the website's search API isn't returning the Content-Type header as text/json, which is required for $.getJSON to function correctly.
Try simply using $.get and calling JSON.parse on the returned data.
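Something along these lines (the same request as in the question, just parsing manually in case jQuery hands the response back as a string):

$.get("http://www.website.com/user/search", { user: name }, function (raw) {
    var result = (typeof raw === "string") ? JSON.parse(raw) : raw;
    alert(result.data);
});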
Our system uses HttpContext.Current.Session("Client") to store the current user's info.
One property in the session is a RoleId, i.e. CType(HttpContext.Current.Session("Client"), Client).RoleId
By checking the value of RoleId, the system can identify whether the user can access a couple of pages.
I've validated it on the server side, but I think the easiest way to present the notice message is with JavaScript.
So is it possible to get the session value in JavaScript (even in an external JavaScript file)?
How about a cookie? What are the drawbacks of adding cookies to an existing system?
Any other suggestions you have are welcome too.
Thx
Yes, I did the validation on the server side. Later on, I'll add restrictions in the database as well.
Result:
I used a WebMethod inside a web service, because it is a master page.
Thanks for your answer.
But another issue came up:
Trigger/Prevent page event by using asynchronous webmethod return value in JavaScript
Please give me some advice on that question as well, thanks.
You could do it as a cookie, but it would slow down your round trip for every resource. Hence, I don't recommend this approach.
One option is to have a dynamic page that returns a JavaScript object in the global scope with the appropriate variables printed out. You could then include it as a standard script tag.
Another approach is to make an AJAX call.
Keep in mind, you should still always validate the base request and never trust the client.
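For example, a minimal sketch of the dynamic-include idea (ClientContext.ashx and the global variable name are made up; the handler would write the script body out server-side):

<!-- hypothetical handler that responds with: window.clientContext = { roleId: 2 }; -->
<script src="/ClientContext.ashx" type="text/javascript"></script>
<script type="text/javascript">
    if (window.clientContext && window.clientContext.roleId !== 1) {
        // show the notice message
    }
</script>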
Sending roles to the client and using JavaScript for business logic based upon these roles is a security risk. Users (hackers) know how to manipulate client-side code to gain access to things they're not supposed to.
I recommend sending down only the content the user has access to or use AJAX to retrieve the content dynamically from the client.
But to answer your question, no, you cannot retrieve session data directly from the client.
You can make an ASHX page or a WCF service and call it with JavaScript. But don't return the RoleId and check that ID on the client; instead just return true/false indicating whether the user has access. Use a jQuery AJAX call to the ASHX page or WCF service; you should find tons of examples on Google.
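A rough sketch of that call (CheckAccess.ashx is a made-up handler name; it should return only true or false):

$.ajax({
    url: "/CheckAccess.ashx",
    data: { page: window.location.pathname },
    dataType: "json",
    success: function (hasAccess) {
        if (!hasAccess) {
            // show the notice message here
        }
    }
});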
Could anyone clarify how the GA actions _gaq.push(['_link', <href>]); and _gaq.push(['_linkByPost', <form>]); work?
I'm not interested in how to use them as presented in the documentation. I understand those scenarios. I want to know more about what they do when called.
Edit:
I suspect how this works, but I need some confirmation from someone who has fiddled with this longer than I have. I want to know what the process is in each case, in small steps. I know that it changes the sent data in order to overwrite the cookie on the target site, but I need to know exactly which actions happen (in terms of JavaScript on the sending page) after you do the push.
I would also like to know if I could use _gaq.push(['_link', <href>]); from anywhere in my code to change the page.
Thank you,
Alin
We will assume _gaq.push(['_setAllowLinker', true]); is used on every page that needs it.
What _gaq.push(['_link', <href>]); does:
Appends the __utm<x> cookies to <href>. You need to return false in the onclick of the anchor so that the original link does not follow through.
Changes the browser location to the newly formed URL.
What _gaq.push(['_linkByPost', <form>]); does:
Changes the action attribute of <form> so that it includes the __utm<x> cookies.
What happens on the target page:
The GA script on the target page checks the received parameters, and if the __utm<x> values are sent it overwrites its own cookies with them. This results in identifying the user as the same one that left your original page.
As a bonus, _gaq.push(['_link', <href>]); can be used in (almost) any situation where window.open(<href>); can be used.
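For reference, a sketch of the markup this usually lives in (the classic ga.js pattern; destination.example.com is a placeholder):

<!-- cross-domain link: decorate the URL, then cancel the default navigation -->
<a href="http://destination.example.com/page.html"
   onclick="_gaq.push(['_link', this.href]); return false;">Go to the other site</a>

<!-- cross-domain form post: decorate the action just before the form submits -->
<form action="http://destination.example.com/process.cgi" method="post"
      onsubmit="_gaq.push(['_linkByPost', this]);">
    <input type="submit" value="Send" />
</form>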
They pass the cookie information from one domain to another. With _link, it does this by appending a query string to the URL of the next page; with _linkByPost, it sends the cookie information as GET parameters on the form action along with your POST data.
If _setAllowLinker is set to true on the target page, the cookie information sent will overwrite the default Google Analytics cookies on the target page, and will allow for linked, consistent session information between the two, as the cookies will ensure that consistent data is shared.
EDIT:
No, you can't call it from anywhere in your page, unless you bind it to an onclick of where you'd like it called.
I have a JSON web service to return home markers to be displayed on my Google Map.
Essentially, http://example.com calls the web service to find out the location of all map markers to display like so:
http://example.com/json/?zipcode=12345
And it returns a JSON string such as:
{"address": "321 Main St, Mountain View, CA, USA", ...}
So on my index.html page, I take that JSON string and place the map markers.
However, what I don't want to have happen is people calling out to my JSON web service directly.
I only want http://example.com/index.html to be able to call my http://example.com/json/ web service ... and not some random dude calling the /json/ directly.
Question: how do I prevent direct calling/access to my http://example.com/json/ web service?
UPDATE:
To give more clarity, http://example.com/index.html calls http://example.com/json/?zipcode=12345 ... and the JSON service
- returns semi-sensitive data,
- returns a JSON array,
- responds to GET requests,
- the browser making the request has JavaScript enabled
Again, what I don't want is people simply looking at my index.html source code and then calling the JSON service directly.
There are a few good ways to authenticate clients.
By IP address. In Apache, use the Allow / Deny directives.
By HTTP auth: basic or digest. This is nice and standardized, and uses usernames/passwords to authenticate.
By cookie. You'll have to come up with the cookie.
By a custom HTTP header that you invent (a rough sketch follows below).
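As a rough illustration of the custom-header option (the header name and value are invented; the server would refuse requests that don't carry them):

$.ajax({
    url: "/json/",
    data: { zipcode: "12345" },
    dataType: "json",
    headers: { "X-App-Token": "some-shared-secret" },   // invented header name/value
    success: function (markers) {
        // place the map markers
    }
});

Of course, anyone reading the page source sees the header too, which is exactly the limitation mentioned in the edit below.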
Edit:
I didn't catch at first that your web service is being called by client-side code. It is literally NOT POSSIBLE to prevent people from calling your web service directly, if you let client-side Javascript do it. Someone could just read the source code.
Some more specific answers here, but I'd like to make the following general point:
Anything done over AJAX is being loaded by the user's browser. You could make a hacker's life hard if you wanted to, but, ultimately, there is no way of stopping me from getting data that you already freely make available to me. Any service that is publicly available is publicly available, plain and simple.
If you are using Apache you can set allow/deny on locations.
http://www.apachesecurity.net/
or here is a link to the apache docs on the Deny directive
http://httpd.apache.org/docs/2.0/mod/mod_access.html#deny
EDITS (responding to the new info).
The Deny directive also works with environment variables. You can restrict access based on browser string (not really secure, but discourages casual browsing) which would still allow XHR calls.
I would suggest the best way to accomplish this is to have a token of some kind that validates the request is a 'good' request. You can do that with a cookie, a session store of some kind, or a parameter (or some combination).
What I would suggest for something like this is to generate a unique url for the service that expires after a short period of time. You could do something like this pretty easily with Memcache. This strategy could also be used to obfuscate the service url (which would not provide any actual security, but would raise the bar for someone wanting to make direct calls).
Lastly, you could also use public key crypto to do this, but that would be very heavy. You would need to generate a new pub/priv key pair for each request and return the pubkey to the js client (here is a link to an implementation in javascript) http://www.cs.pitt.edu/~kirk/cs1501/notes/rsademo/
You can add a random number as a flag to determine whether the request is coming from the page you just served:
1) When generating index.html, add a random number to the JSON request URL:
Old: http://example.com/json/?zipcode=12345
New: http://example.com/json/?zipcode=12345&f=234234234234234234
Add this number to the Session Context as well.
2) The client browser renders index.html and requests the JSON data via the new URL.
3) Your server gets the JSON request and checks the flag number against the session context. If it matches, respond with the data. Otherwise, return an error message.
4) Clear the session context at the end of the response, or when a timeout triggers.
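Purely as an illustration of steps 1)-4), here is a sketch using Node/Express with express-session (your server stack will differ; the route names and the f parameter just mirror the steps above):

const crypto = require("crypto");
const express = require("express");
const session = require("express-session");

const app = express();
app.use(session({ secret: "change-me", resave: false, saveUninitialized: true }));

// 1) When generating index.html, create the flag and embed it in the JSON URL.
app.get("/", (req, res) => {
    req.session.jsonFlag = crypto.randomBytes(16).toString("hex");
    res.send('<script>fetch("/json/?zipcode=12345&f=' + req.session.jsonFlag + '")' +
             '.then(function (r) { return r.json(); })' +
             '.then(function (markers) { console.log(markers); });</script>');
});

// 3) Check the flag against the session; 4) clear it so it is single-use.
app.get("/json/", (req, res) => {
    if (req.query.f && req.query.f === req.session.jsonFlag) {
        delete req.session.jsonFlag;
        res.json({ address: "321 Main St, Mountain View, CA, USA" });
    } else {
        res.status(403).json({ error: "invalid or missing flag" });
    }
});

app.listen(3000);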
Accept only POST requests to the JSON-yielding URL. That won't prevent determined people from getting to it, but it will prevent casual browsing.
I know this is old, but for anyone getting here later, this is the easiest way to do this. You need to protect the AJAX sub-page with a password that you can set on the container page before calling the include.
The easiest way to do this is to require HTTPS on the AJAX call and pass a POST variable. HTTPS + POST ensures the password is always encrypted.
So on the AJAX/sub-page do something like
if ($_POST["access"] == "makeupapassword")
{
...
}
else
{
echo "You can't access this directly";
}
When you call the AJAX make sure to include the POST variable and password in your payload. Since it is in POST it will be encrypted, and since it is random (hopefully) nobody will be able to guess it.
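For example, the call might look like this (the URL and the access value just mirror the PHP above):

$.post("https://example.com/path/to/the/ajax/file.php",
    { access: "makeupapassword", id: 42 },   // 'id' is just an example payload field
    function (response) {
        // use the response
    });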
If you want to include or require the PHP directly on another page, just set the POST variable to the password before including it.
$_POST["access"] = "makeupapassword";
require("path/to/the/ajax/file.php");
This is a lot better than maintaining a global variable, session variable, or cookie because some of those are persistent across page loads so you have to make sure to reset the state after checking so users can't get accidental access.
Also, I think it is better than page headers because it can't be sniffed, since it is secured by HTTPS.
You'll probably have to have some kind of cookie-based authentication. In addition, Ignacio has a good point about using POST. This can help prevent JSON hijacking if you have untrusted scripts running on your domain. However, I don't think using POST is strictly necessary unless the outermost JSON type is an array. In your example it is an object.