Unable to authenticate with CouchDB using $.couch.login - javascript

In my CouchDB setup I have the following configuration:
CORS is enabled in the configuration (it worked before I locked down the database).
A basic admin with name admin and password admin exists.
The local site is http://localhost/mysite and CouchDB is located at http://localhost:5984/.
I have avoided using any server-side scripting and just serve the static files; the rest is handled client-side, so if possible, please do not base your entire answer on server-side PHP or Node.js.
When I try to log in with $.couch.login it returns
{"ok":true,"name":null,"roles":["_admin","admin"]}
Then I request $.couch.session and, instead of a populated userCtx, it just returns
{"ok":true,"userCtx":{"name":null,"roles":[]},"info":{"authentication_db":"_users","authentication_handlers":["oauth","cookie","default"]}}
When I tried with a REST tool, the result was
{"ok":true,"userCtx":{"name":"admin","roles":["_admin","admin"]},"info":{"authentication_db":"_users","authentication_handlers":["oauth","cookie","default"],"authenticated":"cookie"}}
When I worked through the REST tool, it allowed me to continue with adding documents, deleting them, and so on.
What exactly am I missing here?

The following code allowed the session cookie to be sent in the request headers:
$.ajaxSetup({
    crossDomain: true,
    xhrFields: { withCredentials: true }
});
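For completeness, here is a minimal client-side sketch of the whole flow, assuming jquery.couch.js is loaded and that CouchDB's [cors] section also has credentials = true; the admin credentials are the ones from the question:
// Send and accept the AuthSession cookie on every cross-origin request
$.ajaxSetup({
    crossDomain: true,
    xhrFields: { withCredentials: true }
});

// Point the jQuery CouchDB plugin at the CouchDB host
$.couch.urlPrefix = "http://localhost:5984";

$.couch.login({
    name: "admin",
    password: "admin",
    success: function(loginResp) {
        // With the cookie now being sent, the session should reflect the logged-in user
        $.couch.session({
            success: function(session) {
                console.log(session.userCtx); // expect name "admin" and roles ["_admin", ...]
            }
        });
    },
    error: function(status, error, reason) {
        console.error("Login failed:", status, error, reason);
    }
});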

Related

Is it possible to restrict a PaaS HTTP trigger, called via JavaScript's fetch, on a per-domain basis?

I've developed a JavaScript file that helps users enter their info in a form by fetching public data.
Now I'm thinking of deploying it as a kind of API service.
Is it possible, and safe enough, for an HTTP trigger of a PaaS like GCF or AWS Lambda to be invoked only from specific domains I allow? For example, by reading the Origin header of the JS fetch and checking its domain.
I've considered generating a passcode per customer and placing it in a key.js in the user's directory or in an environment variable, exposing my JS file at a URL, and letting the user's website load it with the key passed as a query parameter so its validity can be checked.
But forms can be anywhere in a customer's site tree, and placing the key in an environment variable for each customer can be bothersome at scale.
You can use reCAPTCHA v3: add the allowed domains that can access your function endpoint, and verify that the token is valid in the function implementation.
This isn't a native GCF feature, but you could try:
Adding a filter in your GCF code (e.g. with Express) to check the requesting domain (see the sketch after this list)
Making your GCF private and having IAM ensure that callers are authorized (GCP callers)
Running it on Cloud Run, App Engine, or another service behind Identity-Aware Proxy and screening out callers that way
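As a rough illustration of the first option, here is a minimal sketch of an HTTP-triggered function that checks the Origin header against a whitelist. The function name and domains are placeholders, and note that Origin/Referer headers can be forged by non-browser clients, so treat this as a convenience filter rather than real authentication:
// List of customer domains allowed to call this endpoint (placeholders)
const ALLOWED_ORIGINS = [
    'https://customer-a.example.com',
    'https://customer-b.example.com'
];

// HTTP-triggered function with an Express-style (req, res) signature
exports.formHelper = (req, res) => {
    const origin = req.get('Origin') || '';

    if (!ALLOWED_ORIGINS.includes(origin)) {
        res.status(403).send('Forbidden');
        return;
    }

    // Echo the allowed origin back so the browser's CORS check passes
    res.set('Access-Control-Allow-Origin', origin);
    res.send('OK');
};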

How to get around the auth level when testing an API on a Laravel project?

I am using the basic auth scaffolding from Laravel that you get from running the following command:
php artisan make:auth
I have an API written so that the backend on the server can update/create services and statuses. The issue I'm running into is that the admin also has a UI on the web app and can create a service, or update its status, manually. Therefore, I have an auth level on those methods where you have to be logged in to use them.
Now when I call the method in Postman it redirects me to the login page. I was wondering if there is a way around this auth level strictly for the API?
I was told of a way to do pre-request scripts directly in Postman, but I'm fairly lost when it comes to the JavaScript part of that and feel like there is an easier way to do it. I also already tried 'basic auth' with the username and password; it didn't seem to work, though.
Thank you for the help in advance!
Edit: Here is the screenshot from my header.
I am presuming that if you have the API in place, you have an api_token set for the specific user. You can use that inside Postman in one of two ways.
Go to the Headers tab and add:
Key: Authorization
Value: Bearer API_TOKEN_VALUE
Edit: added the screenshot of Postman.
You can amend the URL for the request and add the token:
url_to_api_endpoint?api_token=API_TOKEN_VALUE
On the API routes, if you have ->middleware('auth:api'), Laravel will read the authorization token from the header or from the query parameter and check it against the database value.
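For reference, the same two options look roughly like this when called from client-side JavaScript instead of Postman (the endpoint URL and API_TOKEN_VALUE are placeholders):
const token = 'API_TOKEN_VALUE';

// Option 1: send the token in the Authorization header
fetch('https://example.com/api/services', {
    headers: { Authorization: 'Bearer ' + token }
});

// Option 2: append the token as a query parameter
fetch('https://example.com/api/services?api_token=' + encodeURIComponent(token));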
Adding the api_token to the user table
If you don't have an api_token field in your users table then add one. It is not the same as remember_token; they are different. So add the following to your users migration:
$table->string('api_token', 60)->unique();
You will need to update the user's api_token using something like the following:
$user = User::find(1);
$user->update(['api_token' => str_random(60)]);
That 60-character string is what you will use where I put API_TOKEN_VALUE above.

AngularJS $http.get does not return expected data from API

I am attempting to create a mobile phone application with a JavaScript / AngularJS frontend that communicates with a Node.js / Express.js backend.
I believe that I have properly enabled CORS but am not completely certain that it has been done in the correct manner. None of the frontend files are hosted on a server (not even a local one). The Node.js server is hosted online, as is the MongoDB server it interacts with.
So far I am able to make POSTs to my API that create a new user and reflect this in the database. I also have a login that POSTs to an authentication function, which returns a JSON Web Token (JWT). From here I should be able to put the JWT in the header of requests with the key "Authorization" to get access to the other parts of the API (e.g. GET /currentUser).
Attempting to GET /currentUser with the JWT in the header using Postman returns all of the expected data. When I attempt to perform the same GET from my frontend (with the JWT in the header), I get the following OPTIONS response via Firebug: "Reload the page to get source for: MyHostedApi/api/users"
I'm wondering if this is some kind of CORS issue, an incorrectly set Authorization header, bad formatting of the $http.get, etc. Any help is greatly appreciated! I'd be glad to provide any parts of the source that are relevant.
This is what my GET looks like:
$http.get("MyHostedApi/api/users/currentUser")
    .success(function(response) {
        $scope.userData = response.data.firstName;
    });
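One way to attach the token explicitly to this request is sketched below, assuming the JWT from the login step is held in a jwt variable (whether the value needs a "Bearer " prefix depends on the backend). Note that a custom Authorization header makes the browser send an OPTIONS preflight first, so the Express CORS setup also has to answer that preflight and allow the header.
// Minimal sketch: attach the JWT to this one request and use the promise API
$http.get("MyHostedApi/api/users/currentUser", {
    headers: { Authorization: jwt } // or 'Bearer ' + jwt, depending on the API
}).then(function(response) {
    $scope.userData = response.data.firstName;
}, function(error) {
    console.error("Request failed with status", error.status);
});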

What is the best/proper configuration? (javascript SOAP)

I need to retrieve data from a web service (via SOAP) during a nightly maintenance process on a LAMP server. This data then gets applied to a database. My research has returned many options and I think I have lost sight of the forest for the trees; partially because of the mix of client and server terms and perspectives of the articles I have read.
Initially I installed Node.js and node-soap. I wrote a simple script to test functionality:
var soap = require('/usr/local/lib/node_modules/npm/node_modules/soap');
var url = "https://api.authorize.net/soap/v1/Service.asmx?WSDL";

soap.createClient(url, function(err, client) {
    if (typeof client == 'undefined') {
        console.log(err);
        return;
    }
    console.log('created');
});
This uses a demo SOAP source and it works just fine. But when I use the actual URL I get a 503 error:
[Error: Invalid WSDL URL: https://*****.*****.com:999/SeniorSystemsWS/DataExportService.asmx?WSDL
Code: 503
Response Body: <html><body><b>Http/1.1 Service Unavailable</b></body> </html>]
Accessing this URL from a browser returns a proper WSDL definition. I am told by the provider that the 503 is due to a same-origin policy violation. Next, I researched adding CORS to Node.js. This prompted me to step back and ask: am I in the right forest? I'm not sure. So I am looking for a command-line, SOAP-capable, CORS-aware app (or equivalent) configuration. I am a web developer primarily using PHP and JavaScript, so JavaScript is where I turned first, but that is not a requirement. Ideas? Or is there a solution to the current script error? (The best approach I think I have found is using jQuery inside Node.js, which includes CORS.)
Most likely, this error comes from the server hosting the web service.
Please go through this link; it might be helpful.
http://pcsupport.about.com/od/findbyerrormessage/a/503error.htm
Also, you can open your WSDL in a web browser and search for the soap:address location tag under the services section to figure out the correct URL you are trying to invoke from your script. Access this URL directly in the browser and see what you get.
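If the soap:address in the WSDL points at a host or port that is not reachable from your server, node-soap lets you override the endpoint on the created client. A minimal sketch, assuming the soap package is installed locally via npm and using placeholder URLs (setEndpoint is assumed to be available in the installed node-soap version):
var soap = require('soap');

var wsdlUrl = 'https://example.com/SeniorSystemsWS/DataExportService.asmx?WSDL';

soap.createClient(wsdlUrl, function(err, client) {
    if (err) {
        console.log(err);
        return;
    }
    // Override the endpoint taken from soap:address if it is not directly reachable
    client.setEndpoint('https://example.com/SeniorSystemsWS/DataExportService.asmx');
    console.log('created');
});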
I think I have a better approach to the task. I found over the weekend that PHP has a full SOAP client. I wrote the same basic login script in PHP and it runs just fine. I get a valid authentication code in the response to loginExt (which is required in further requests), so it looks like things are working. I will comment here after verifying that I can actually use the web service.

Google OAuth Wildcard Domains

I am using Google auth but keep getting an origin mismatch. The project I am working on has subdomains that are generated by the user. So, for example, there can be:
john.example.com
henry.example.com
larry.example.com
In my app settings I have one of my origins set to http://*.example.com, but I get an origin mismatch. Is there a way to solve this? By the way, my code looks like this:
gapi.auth.authorize({
    client_id: 'xxxxx.apps.googleusercontent.com',
    scope: ['https://www.googleapis.com/auth/plus.me',
            'https://www.googleapis.com/auth/userinfo.email',
            'https://www.googleapis.com/auth/userinfo.profile'],
    state: 'http://henry.example.com',
    immediate: false
}, function(result) {
    if (result != null) {
        gapi.client.load('oauth2', 'v2', function() {
            console.log(gapi.client);
            gapi.client.oauth2.userinfo.get().execute(function(resp) {
                console.log(resp);
            });
        });
    }
});
Hooray for useful yet unnecessary workarounds (thanks for complicating yourself into a corner, Google)...
I was using Google Drive through the JavaScript API to open the file picker, retrieve the file info/URL, and then download it using curl to my server. Once I finally realized that all my wildcard domains would have to be registered, I about had a stroke.
What I do now is the following (this is my use case; adapt it to yours as needed):
On the page that you are on, create an onclick event to open a new window on a specific domain (https://googledrive.example.com/oauth/index.php?unique_token={some unique token}).
In the new popup I did all my Google Drive authentication, had a button to click which opened the file picker, then retrieved at least the metadata I needed from the file. Then I stored the token (primary key), access_token, downloadurl and filename in my database (MySQL).
Back on step one's page, I created a setTimeout() loop that runs an AJAX call every second with that same unique_token to check whether it has been entered in the database. Once it finds it, I kill the loop, retrieve the contents, and do with them as I will (in this case I uploaded them through a separate upload script that uses curl to fetch the file).
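A rough sketch of that polling step, assuming jQuery is available and a hypothetical check_token.php endpoint that returns JSON with a found flag once the popup has written the record:
function pollForToken(uniqueToken) {
    // Ask the server whether the popup has stored the record yet (endpoint is hypothetical)
    $.getJSON('/oauth/check_token.php', { token: uniqueToken }, function(resp) {
        if (resp && resp.found) {
            // Record exists: stop polling and continue with the stored file info
            handleDriveFile(resp); // hypothetical handler for the downloadurl / filename
        } else {
            // Not there yet: try again in one second
            setTimeout(function() { pollForToken(uniqueToken); }, 1000);
        }
    });
}

pollForToken('some-unique-token');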
This is obviously not the best method for handling this, but it's better than entering each and every subdomain into Google's cloud console. I bet you can probably do this with Google's server-side OAuth libraries, but my use case was a little complicated and I was cranky because I was frustrated at the past 4 days I'd spent on a silly little integration with Google.
Wildcard origins are not supported, same for redirect URIs.
The fact that you can register a wildcard origin is a bug.
You can use the state parameter, but be very careful with it: make sure you don't create an open redirector (an endpoint that can redirect to any arbitrary URL).
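As a rough sketch of that precaution, a redirect target carried in state could be checked against the known subdomain pattern before being used; the helper and the pattern below are assumptions, not part of Google's API:
// Only allow redirects to example.com or one of its single-level subdomains
function isAllowedRedirect(url) {
    try {
        var host = new URL(url).hostname;
        return host === 'example.com' || /^[a-z0-9-]+\.example\.com$/i.test(host);
    } catch (e) {
        return false; // not a valid absolute URL
    }
}

var target = 'http://henry.example.com'; // e.g. the value carried in state
if (isAllowedRedirect(target)) {
    window.location.href = target;
}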
