WebAuthn credentials.get bug after discoverable credentials test - javascript

Having got platform authenticator and multi-device authentication working, I am trying to expand my FIDO2 knowledge by reading through WebAuthn issues on GitHub. To this end I was testing discoverable credentials (i.e. specifying allowCredentials as an empty []).
I couldn't get the signatures to match, so I reverted to allowing only the credential id I had just CREATEd, but now I still keep getting prompted to choose a device when I call GET, and the signatures still don't match. This is the JS code:
var allowCredentials = [{
    type: "public-key",
    id: Uint8Array.from(atob(credentialId), x => x.charCodeAt(0)).buffer
}];

var getAssertionOptions = {
    timeout: 60000,
    challenge: Uint8Array.from(serverChallenge.Token, c => c.charCodeAt(0)).buffer,
    allowCredentials: allowCredentials,
    userVerification: "required"
};

return navigator.credentials.get({
    publicKey: getAssertionOptions
}).then(rawAssertion => {
    var assertion = {
        id: base64encode(rawAssertion.rawId),
        clientDataJSON: utf8Decoder.decode(rawAssertion.response.clientDataJSON),
        userHandle: base64encode(rawAssertion.response.userHandle),
        signature: base64encode(rawAssertion.response.signature),
        authenticatorData: base64encode(rawAssertion.response.authenticatorData)
    };
    // ...assertion is then sent to the server for verification
});
and this is the C# signature check:
using (ECDsa dsa = ECDsa.Create(ecparams))
{
    if (dsa.VerifyData(data, ECDsaSig, HashAlgorithmName.SHA256))
    {
        Console.WriteLine("The signature is valid.");
    }
    else
    {
        Console.WriteLine("The signature is not valid.");
        return FAIL_STATUS;
    }
}
This code "used to work" with my Samsung phone, but back then (IIRC) I wasn't being re-prompted for a device for verification. UPDATE: sometimes the first GET after a CREATE does work and returns a correctly signed payload, but now I can't reproduce that :-(
This clearly sounds like developer/pilot error on my part, but I just want to see if it rings any bells. I have cleared all caches, rebooted, can't find any "credentials" in any password history, and am at a loss. I thought there might be some signature timeout, but I've extended everything I could.
Q1. Was I always prompted to select a device even though I said to allow only this Samsung credential?
NB: If I use the platform authenticator on my phone, the same thumbprint works. The keys are EC.
Chrome: Version 103.0.5060.134 (Official Build) (64-bit)

I'm assuming, because you are testing on a Samsung device, that you are running Android. Sadly, at the moment Android does not support discoverable credentials / resident keys. Your previous flows worked because you were able to invoke the WebAuthn ceremony with credentials populating the allowList.
I tested on a WebAuthn environment of mine and confirmed that I am getting an error that reads "Use of an empty 'allowCredentials' list is not supported on this device" (I'm using Chrome on a Pixel 5 device).
Google has indicated that discoverable credential support is coming to Android soon to help support their passkey implementation.
For now I would recommend that you test your discoverable credential flow on another device with a platform authenticator to see if it works.
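In the meantime, here is a rough sketch of how you could keep a single code path while you test; discoverableOptions / allowListOptions are placeholders for the option objects you already build, and the error text is the one I saw on the Pixel 5:
async function getAssertionWithFallback(discoverableOptions, allowListOptions) {
    try {
        // Discoverable-credential flow: allowCredentials left empty
        return await navigator.credentials.get({ publicKey: discoverableOptions });
    } catch (err) {
        // e.g. Chrome on Android: "Use of an empty 'allowCredentials' list is not
        // supported on this device" - fall back to the allowList flow instead
        console.warn('Discoverable flow failed, retrying with allowList:', err);
        return navigator.credentials.get({ publicKey: allowListOptions });
    }
}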
As for some of your other errors, I may need more information to help identify the issue.
Hope this helps

Related

Google one tap sign up always returns noCredentialsAvailable

I have set up a bare-bones test for the new Google One Tap sign-in / sign-up.
<!DOCTYPE html>
<html lang="en">
<head></head>
<body>
    <script src="https://smartlock.google.com/client"></script>
    <script>
        window.onGoogleYoloLoad = (googleyolo) => {
            googleyolo.hint({
                supportedAuthMethods: [
                    "https://accounts.google.com"
                ],
                supportedIdTokenProviders: [{
                    uri: "https://accounts.google.com",
                    clientId: "xxxxx-xxxxx.googleusercontent.com"
                }],
                context: "signUp"
            }).then((credential) => {
                console.log(credential);
            }, (error) => {
                console.log(error.type);
            });
        };
    </script>
</body>
</html>
I expected the above to either spit out the credential object to the console, or to prompt me to choose a google account to log in with. Instead a "noCredentialsAvailable" error type is thrown. This also happens when I use the googleyolo.retrieve promise.
I currently:
have a google account signed in and other "google sign in" oauth flows work
am loading the above at localhost:3000 (which is also listed as an authorised origin on my google api credentials page)
What am I missing?
We just published some troubleshooting guidance: https://developers.google.com/identity/one-tap/web/troubleshooting
The most important things to check are the following:
ensure that you have an active Google Account and that the Smart Lock feature is enabled. We are working on removing these requirements ASAP, but for now, try with a regular Gmail account with default settings
make sure that you supply a Google client ID in any requests and that the domain you are running the code on is an authorized origin, including the port. See the documentation for details
check that you are using a supported user-agent. Importantly, the iOS emulation modes in Chrome Dev Tools are out of date (fix pending)
If you still cannot get it working, or have any feedback at all, we'd love to hear from you: reach out to sso@google.com

Geolocation API doesn't work on mobile

I'm writing my web application with React/Redux, and I need to get the user's location with the help of the Geolocation API. On desktop browsers everything works fine, but on mobile phones (checked on a Xiaomi Redmi Note 3 and an iPhone 5s) it throws error code 1 (permission denied), and it doesn't request any permission to get the location.
Here's a test sample which I ran on my site:
componentDidMount() {
    if (window.navigator.geolocation) {
        window.navigator.geolocation.getCurrentPosition(position => {
            alert(position.coords.latitude + ' ' + position.coords.longitude);
        }, err => {
            alert('ERROR: ' + err.code);
        });
    } else {
        alert('Geolocation API is not supported!');
    }
}
What's the solution to this problem?
Got the same problem... Solved:
Check your phone permissions for sharing your location.
On iPhone:
Settings -> Location Services -> [your Browser]
https://support.apple.com/en-ca/HT203033
Added:
Chrome requires https for geolocation usage:
https://developers.google.com/web/updates/2016/04/geolocation-on-secure-contexts-only
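If it is unclear whether the browser ever granted the permission, a quick check with the standard Permissions API (supported in Chrome) shows the current state before you call getCurrentPosition; this is a minimal sketch:
if (navigator.permissions && navigator.permissions.query) {
    navigator.permissions.query({ name: 'geolocation' }).then(result => {
        // result.state is 'granted', 'prompt' or 'denied'
        console.log('geolocation permission state:', result.state);
    });
}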
I've got the solution. I'm using the Web Application Manifest, and it needed the permission to use the Geolocation API to be set.
We just need to set a "required_features" option in the manifest.webapp file:
{
"required_features": ["geolocation"]
}
Hope it will be useful for somebody ;)
As of 2021, this still does not work.
This is the link in that error message. In case you're wondering, it talks about "prefer secure origins for powerful new features", and location is considered one of those powerful features.
To see the full error (code and message), update the error handler as follows:
if (navigator.geolocation) {
    navigator.geolocation.getCurrentPosition(position => {
        // success handler (unchanged)
    },
    err => {
        // include the "code" and "message" parts
        alert(`ERROR(${err.code}): ${err.message}`);
    });
}
On desktop during development it works because, as the link above notes, localhost is considered a secure origin.
In fact, the Chrome link shared in @chrisheyn's answer above has a section "Does this affect local development?" that explains why this works on localhost.
So what about mobile during development? Notice that React serves the app over your network, e.g. http://192.168.0.134:3000, and that is definitely not considered a "secure origin" at all.
The question "Can I detect at runtime if the geolocation was blocked because of not being on a secure context?" mentions that errors due to this secure-context issue return a code of 1, which is a "Permission Denied" error.
What's the solution?
Until the React team updates how your mobile device picks up the app during development, there is absolutely nothing you can do to solve this issue.
To use the HTML5 Geolocation API, you will need to run the app over HTTPS. This means pushing your app to a cloud host (in order to test this feature), or somehow managing to serve that network URL http://192.168.0.134:3000 over HTTPS. The latter option, I believe, is much harder, but I'd be interested to know if someone pulls it off.

How to handle basic authentication with protractor?

I'm trying Protractor to write a few tests for a non-Angular application. I have to log in to a page through basic authentication in Google Chrome, but I have no idea how.
I already tried baseUrl: 'https://username:password@url' and
capabilities: {
    'browserName': 'chrome',
    'chromeOptions': {
        args: ['--login-user=foo', '--login-password=bar']
    }
}
But none of these worked for me. Does anyone know how to do it? I'm having a hard time with it.
You can set the URL as http://username:password@yourdomain.example. Chrome will handle it!
The short answer is there is no easy way of doing it on Chrome because they do not support modifying request headers -- see https://code.google.com/p/selenium/issues/detail?id=141 (the title says response headers, but if you read it, it's for all headers).
That being said, there are ways to do it, albeit difficult.
1) Find a Chrome extension/plugin that allows you to modify headers. A simple search brings up many of them: https://chrome.google.com/webstore/search/modify%20header. You'll need to add the plugin to webdriver: see Is it possible to add a plugin to chromedriver under a protractor test?.
2) You can use browsermob-proxy (https://github.com/lightbody/browsermob-proxy); this way you route your traffic through the proxy, which would add the headers for you.
From the docs:
POST /proxy/[port]/auth/basic/[domain] - Sets automatic basic authentication for the specified domain
Payload data should be JSON-encoded username and password name/value pairs (ex: {"username": "myUsername", "password": "myPassword"})
There's a node project that may help you, https://github.com/zzo/browsermob-node, but you would still need to set up your proxy server yourself.
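As a rough sketch of that REST call from Node (the management port 8080, proxy port 9090 and domain are placeholders for your own setup, and I'm assuming a fetch implementation such as node-fetch is available):
// Register basic-auth credentials with a running browsermob-proxy instance
// before the test navigates to the protected domain.
const fetch = require('node-fetch');

function setProxyBasicAuth(proxyPort, domain, username, password) {
    return fetch(`http://localhost:8080/proxy/${proxyPort}/auth/basic/${domain}`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ username: username, password: password })
    });
}

// setProxyBasicAuth(9090, 'yourdomain.example', 'myUsername', 'myPassword');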
Both ways for chrome are complex, but would get you what you want. (or you can stick with firefox and follow Robert's answer)
As of version 59 Chrome no longer supports URLs with embedded credentials.
To work around this I wrote the authenticator-browser-extension Node module, which might be useful if you're using Protractor, WebDriver.io or similar test runners.
To use the module install it from npm:
npm install --save-dev authenticator-browser-extension
And import it in protractor.conf.js:
const { Authenticator } = require('authenticator-browser-extension');

exports.config = {
    capabilities: {
        browserName: 'chrome',
        chromeOptions: {
            extensions: [
                Authenticator.for('username', 'password').asBase64()
            ]
        }
    },
};
Pro tip: remember not to commit your credentials with your code, consider using env variables instead.
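For example (the variable names BASIC_AUTH_USERNAME / BASIC_AUTH_PASSWORD are just placeholders):
const { Authenticator } = require('authenticator-browser-extension');

exports.config = {
    capabilities: {
        browserName: 'chrome',
        chromeOptions: {
            extensions: [
                // Credentials come from the environment rather than source control
                Authenticator.for(process.env.BASIC_AUTH_USERNAME, process.env.BASIC_AUTH_PASSWORD).asBase64()
            ]
        }
    },
};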
Hope this helps!
Jan
It's because Firefox doesn't trust any site by default with sending the Windows auth info over. Even if you change it in the configurations manually, it won't affect protractor because it opens Firefox with an isolated configuration each time you run your end to end tests.
You'll need to programmatically set up a Firefox profile and set its preferences such that it would trust localhost (or some other website, depending on where the pages are loaded from).
First, check out this example. It shows how you can set up the profile and how you can set preferences.
https://github.com/juliemr/protractor-demo/tree/master/howtos/setFirefoxProfile
What it does is modify the homepage for each new tab. In the same manner (with the firefoxProfile.setPreference method) you can change the preferences responsible for trusting websites. They're called "network.automatic-ntlm-auth.trusted-uris" and "network.negotiate-auth.delegation-uris". You'll need to set them both to "localhost". (Again, if the pages are served from some other place, it's obviously that URL.)
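Roughly, the setup looks like the sketch below. It is based on the linked example; the firefox-profile package and the exact encoded() callback signature are assumptions about that how-to, so check it against the repo:
const FirefoxProfile = require('firefox-profile');

function getFirefoxCapabilities() {
    return new Promise(resolve => {
        const profile = new FirefoxProfile();
        // Trust localhost for integrated Windows authentication
        profile.setPreference('network.automatic-ntlm-auth.trusted-uris', 'localhost');
        profile.setPreference('network.negotiate-auth.delegation-uris', 'localhost');
        profile.encoded((err, encodedProfile) => {
            resolve([{ browserName: 'firefox', firefox_profile: encodedProfile }]);
        });
    });
}

exports.config = {
    getMultiCapabilities: getFirefoxCapabilities,
    // ...the rest of your Protractor config
};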
hankduan's browsermob-proxy solution worked for me on Chrome, but the latest revisions of browsermob use a component called LittleProxy which does not support auth headers. Thus I had to run browsermob-proxy -port 9090 --use-littleproxy false, which got things working.
You may use Windows Credentials Manager to avoid this pop-up being constantly shown on every attempt to log in.
Add your credentials to the 'Generic' category there, restart browser (including background apps running).
Some explanation I currently have: this pop-up is not 'browser' specific; it sits 'in the middle', between the browser and domain credentials verification. Thus browser features (save password, autofill) do not work completely. For the same reason, Protractor / Selenium etc. do not have complete control over that pop-up; it is by design of the domain authentication.
As I'm not completely sure this is the only reason, here are some other hints:
- you may also need to add your site to the IE (IE, not Chrome) list of trusted sites (Chrome grabs information from there);
- check "Automatic logon with current user name and password" in IE (not Chrome); this may not work if the credentials you are using for the site are different from those you use to log in to the machine.
If you're reading this in 2019, with Angular 7/8, consider this:
https://www.npmjs.com/package/authenticator-browser-extension
I find it much easier than the solutions suggested above.

FB.UI does not show thumbnail when secure browsing is enabled

I am using the FB.ui API to allow users to post to their wall; here is the code for that:
FB.ui(
    {
        method: 'feed',
        name: name,
        link: linkPath,
        picture: thumbnailPath,
        caption: iconName,
        description: 'Come check out my awesome post'
    },
    function(response) {
        if (response && response.post_id) {
            alert('Post was published!');
        } else {
            alert('Post was not published!');
        }
    }
);
Normally this works fine and the Facebook dialog pops up showing the picture linked from "thumbnailPath", but when I use a test account that has secure browsing enabled, the thumbnail does not show up, and when it is posted to the wall there is no picture. I am running this off of a MAMP Pro server and I created the certificate using MAMP's "Generate self signed certificate" feature, so it is not a valid certificate. I am wondering if this is the reason that my thumbnail won't show up. When I go to the path linked by the variable "thumbnailPath", it shows up just fine. I have tested this in Chrome, Safari and Firefox and I get the same behavior in all browsers.
I am wondering if my invalid certificate is likely to be the cause of this issue, or does that not make sense? I have been unable to find any other links online that describe similar problems so I am unsure if this has anything to do with my certificates.
If you don't have a valid certificate, then FB cannot/won't scrape the object over https, including metadata like the image. Get a real cert, then make sure FB can scrape your URL using the Debugger.
Also, if this is a brand new object and nothing has been published on the object, then FB doesn't know it exists (and won't have a thumbnail to show). You can initiate a pre-emptive scrape using the Debugger or programmatically with a GET/curl (see "Updating Objects", https://developers.facebook.com/docs/opengraph/objects/ ).
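A sketch of the programmatic version follows; the scrape parameter and app access token reflect how the Graph API exposed this at the time, so treat the exact fields as something to verify against the "Updating Objects" docs:
const APP_ACCESS_TOKEN = '<your-app-access-token>'; // placeholder

const params = new URLSearchParams({
    id: 'https://example.com/my-post', // the linkPath you pass to FB.ui
    scrape: 'true',
    access_token: APP_ACCESS_TOKEN
});

fetch('https://graph.facebook.com/', { method: 'POST', body: params })
    .then(response => response.json())
    .then(object => console.log('Re-scraped object:', object));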
When publishing an app on apps.facebook.com (Canvas), you also need a valid cert for secure browsing.
If you are just testing the app, then you can put your app in Sandbox mode using the App Dashboard, which will let you, admins, testers, and other people you define in the Roles section use the app on Canvas with http (not requiring secure browsing).
I solved the problem by simply making sure to always link to the thumbnail with an http address instead of an https address; once I did this the thumbnail would always show up.

localStorage and 'file:' protocol not persistent, SQLite gives SECURITY_ERR

Introduction
I work with RapidWeaver, a Mac OS X CMS application, and it uses no server environment. It has an editor and a preview mode. The preview mode is a WebKit-based renderer, and I can use 'Inspect Element', like you normally could in Safari.
I want to store some settings for a toolbar, either using localStorage or SQLite. I have read some information about IndexedDB, though I have found no concrete examples of how to use it.
Problems with localStorage
localStorage works fine while I stay in preview mode, but when I switch between the editor and preview mode the URL (location.href) is slightly altered:
file:///private/var/folders/s7/x8y2s0sd27z6kdt2jjdw7c_c0000gn/T/TemporaryItems/RapidWeaver/98970/document-143873968-28/RWDocumentPagePreview/code/styled/index.html
file:///private/var/folders/s7/x8y2s0sd27z6kdt2jjdw7c_c0000gn/T/TemporaryItems/RapidWeaver/98970/document-143873968-29/RWDocumentPagePreview/code/styled/index.html
document-143873968-28 changes into
document-143873968-29
From what I have read about localStorage, it's basically globalStorage[location.hostname] in Firefox. As far as I know globalStorage is not supported in Safari, so I can't try that.
Problems with SQLite
When I try to open a database:
var shortName = 'mydatabase';
var version = '1.0';
var displayName = 'My Important Database';
var maxSize = 65536; // in bytes
var db = openDatabase(shortName, version, displayName, maxSize);
I get this in my console:
SECURITY_ERR: DOM Exception 18: An attempt was made to break through the security policy of the user agent.
That basically wraps up my question, I will appreciate any answers or comments sincerely.
Using the following solution, Implementing a WebView database quota delegate, with a few modifications I was able to get it to work.
The following delegate method worked for me (place in your webViewDelegate):
- (void)webView:(WebView *)sender frame:(WebFrame *)frame exceededDatabaseQuotaForSecurityOrigin:(id)origin database:(NSString *)databaseIdentifier
{
    static const unsigned long long defaultQuota = 5 * 1024 * 1024;
    if ([origin respondsToSelector:@selector(setQuota:)]) {
        [origin performSelector:@selector(setQuota:) withObject:[NSNumber numberWithLongLong:defaultQuota]];
    } else {
        NSLog(@"could not increase quota to %llu", defaultQuota);
    }
}
By default the database is given 0 bytes, which results in the vague error message you get above. The above method is called after an attempt is made to create a database when there is not enough space. Note that this method is defined in WebUIDelegatePrivate.h ( http://opensource.apple.com/source/WebKit/WebKit-7533.16/mac/WebView/WebUIDelegatePrivate.h ) and using it may preclude you from submitting your app to the Mac App Store.
localStorage is an HTML5 mechanism to give scripts a bit more space than cookies. Safari supports it: https://developer.apple.com/library/archive/documentation/iPhone/Conceptual/SafariJSDatabaseGuide/Name-ValueStorage/Name-ValueStorage.html
I don't know offhand what, if any, path restrictions it should have for file:/// based apps.
Edit: looking into the path restrictions further, I see that what you have should work in Safari; Firefox recently fixed a bug that would keep it from working there: https://bugzilla.mozilla.org/show%5Fbug.cgi?id=507361
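For reference, the name/value usage that guide describes is just the following (the key name is illustrative):
// Persist the toolbar settings and read them back on the next load
localStorage.setItem('toolbarSettings', JSON.stringify({ collapsed: false }));
var settings = JSON.parse(localStorage.getItem('toolbarSettings') || '{}');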
