Access-Control-Allow-Origin using ShareJS - javascript

I have my website on one domain/server: www.mysite.com and I'm running ShareJS on another server: www.my-other-server.com:8000.
www.mysite.com/index.html
<script src="http://www.my-other-server.com:8000/bcsocket.js"></script>
<script src="http://www.my-other-server.com:8000/share.js"></script>
<script src="http://www.my-other-server.com:8000/textarea.js"></script>
...
<textarea id='sharetext' ></textarea>
<script>
  // get the textarea element
  var elem = document.getElementById("sharetext");
  // connect to the server
  var options = {
    origin: "www.my-other-server.com:8000",
    browserChannel: {cors: "*"}
  };
  var connection = sharejs.open('test', 'text', options, function(error, doc) {
    doc.attach_textarea(elem);
  });
</script>
I get the following error in the JS console:
XMLHttpRequest cannot load http://www.my-other-server.com:8000/test?VER=8&MODE=init&zx=v44znr3caqea&t=1. Origin http://www.mysite.com is not allowed by Access-Control-Allow-Origin.
This ShareJS GitHub Issue (https://github.com/share/ShareJS/issues/77) suggests adding browserChannel:{cors:"*"} to the share options, as I did above, but it did not seem to have any effect...
What do I do here? It's important that my sharejs traffic is on a separate server from my static/dynamic web server.

On the server side in Node.js, if you are using Express you need to add extra headers that allow cross-domain traffic:
app.configure(function() {
  app.use(function(req, res, next) {
    res.header('Access-Control-Allow-Credentials', true);
    res.header('Access-Control-Allow-Origin', req.headers.origin);
    res.header('Access-Control-Allow-Methods', 'GET,PUT,POST,DELETE');
    res.header('Access-Control-Allow-Headers', 'X-Requested-With, X-HTTP-Method-Override, Content-Type, Accept');
    next();
  });
  app.set('jsonp callback', true);
});
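Note that app.configure() was removed in Express 4; on a newer Express the same middleware is registered directly. A minimal sketch with the same headers as above:
// Express 4+: no app.configure(), just register the middleware directly
app.use(function(req, res, next) {
  res.header('Access-Control-Allow-Credentials', true);
  res.header('Access-Control-Allow-Origin', req.headers.origin);
  res.header('Access-Control-Allow-Methods', 'GET,PUT,POST,DELETE');
  res.header('Access-Control-Allow-Headers', 'X-Requested-With, X-HTTP-Method-Override, Content-Type, Accept');
  next();
});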
On the client side you might still run into security issues, so it can be even better to use JSONP. From the server side, respond like this:
res.jsonp({ hello: 'world' });
And on the client side, make the AJAX call like this:
$.ajax({
  url: "http://www.my-other-server.com:8000",
  type: 'GET',
  dataType: 'jsonp',
  success: function(data) {
    console.log(data);
  },
  error: function(xhr, status, error) {
    console.log('error[' + status + '] jsonp');
  }
});

Try adding browserChannel: { cors:"*" } in bin/options.js. It should work.
The final options.js may look like this:
// ShareJS options
module.exports = {
  // Port to listen on
  port: 8000,

  // Database options
  db: {
    // DB type. Options are 'redis', 'couchdb' or 'none'. 'redis' requires the
    // redis npm package.
    //
    // If you don't want a database, you can also say db: null. With no database,
    // all documents are deleted when the server restarts.

    // By default, sharejs tries to use the redis DB backend.
    type: 'redis',

    // The prefix for database entries
    prefix: 'ShareJS:',

    // The hostname, port and options to pass to redis.
    // null lets the database decide - redis by default connects to localhost port 6379.
    //hostname: null,
    //port: null,
    //redisOptions: null

    // To use CouchDB uncomment this section then run bin/setup_couch.
    // Database URI Defaults to http://localhost:5984/sharejs .
    //type: 'couchdb',
    //uri: "http://admin:admin@localhost:5984/ot",

    // To use postgresql uncomment this section then run bin/setup_pg
    //type: 'pg',
    //uri: 'tcp://josephg:@localhost/postgres',

    // By default, sharejs will create its tables in a schema called 'sharejs'.
    //schema: 'sharejs',
    //operations_table: 'ops',
    //snapshot_table: 'snapshots',

    // sharejs will automatically try and create the DB tables if they don't exist. You
    // can create the database tables manually using bin/setup_pg.
    //create_tables_automatically: true,
  },

  // The server will statically host webclient/ directory at /share/*.
  // (Eg, the web client can be found at /share/share.js).
  // Set staticpath: null to disable.
  staticpath: '/share',

  // REST frontend options. Set rest: null to disable REST frontend.
  rest: {
  },

  // SocketIO frontend options. Set socketio: null to disable socketIO frontend.
  socketio: {
    // Specify tuples for io.configure:
    // 'transports': ['xhr-polling', 'flashsocket']
  },

  // Browserchannel server options. Set browserChannel:null to disable browserchannel.
  browserChannel: {cors: "*"},

  // Authentication code to test if clients are allowed to perform different actions.
  // See documentation for details.
  //auth: function(client, action) {
  //  action.allow();
  //}
}

Related

Meteor JS (Iron Router) - Restricting access to server routes

I have a download route in my Meteor JS app which I want to restrict access to. The route code is as follows:
Router.route("/download-data", function() {
var data = Meteor.users.find({ "profile.user_type": "employee" }).fetch();
var fields = [...fields];
var title = "Employee - Users";
var file = Excel.export(title, fields, data);
var headers = {
"Content-type": "application/vnd.openxmlformats",
"Content-Disposition": "attachment; filename=" + title + ".xlsx"
};
this.response.writeHead(200, headers);
this.response.end(file, "binary");
},
{ where: "server" }
);
The route automatically downloads a file. This is currently working but I want to restrict access to the route. I only want admins to be able to download it.
I have created an onBeforeAction hook as below:
Router.onBeforeAction(
  function() {
    // using alanning:roles
    if (Roles.userIsInRole(this.userId, "admin")) {
      console.log('message'); // testing
    }
  },
  {
    only: ["downloadData"]
  }
);
and renamed my route as below
  // code above
    this.response.writeHead(200, headers);
    this.response.end(file, "binary");
  },
  { where: "server", name: "downloadData" }
);
The onBeforeAction hook does not take any effect.
Also, I noticed that neither this.userId nor Meteor.userId works in the route.
For the server-side hook, I am pretty sure you need the onBeforeAction to include the { where: "server" } option, as you do for your route.
Also, I don't think iron:router ever implemented server-side user auth in its routing. You may want to look into a package with broader server-routing features, such as mhagmajer:server-router, which has access to authenticated routes.
https://github.com/mhagmajer/server-router
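A rough, untested sketch of that suggestion, assuming the route keeps the downloadData name from the question, alanning:roles is installed, and a usable user id can actually be resolved on the server (which, as noted above, this.userId may not give you here):
// Sketch only: onBeforeAction with { where: "server" }, mirroring the route's options
Router.onBeforeAction(
  function() {
    // this.userId may be undefined on server routes; treat this check as illustrative
    if (Roles.userIsInRole(this.userId, "admin")) {
      this.next(); // admin: continue on to the download action
    } else {
      this.response.writeHead(403);
      this.response.end("Forbidden");
    }
  },
  { where: "server", only: ["downloadData"] }
);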

How can I access my req.body data/json in Express?

I am new to Node.js and I am trying to access JSON on my Node.js server from a POST request, so I can send it to an API and feed it back to my front-end JS file. I can see the JSON object, but I can't seem to access it (e.g. req.body.name), even after reading some documentation and Stack Overflow posts.
Here is my POST route from my server.js file, along with the packages:
var prettyjson = require('prettyjson');
var express = require('express');
var http = require('http');
var cors = require('cors');
var bodyParser = require('body-parser');
var app = express();
// create application/json parser
app.use(bodyParser.json());
// create application/x-www-form-urlencoded parser
app.use(bodyParser.urlencoded({
  extended: true
}));
app.post('/', function(req, res) {
  var test = req.body; // if I use req.body.name here, it returns undefined
  console.log(test);
});
Here is the POST call and data from my front-end map.js file:
var locations = [
{name:'Le Thai', coords:{lat:36.168743, lng:-115.139866}},
{name:'Atomic Liquors', coords:{lat:36.166782, lng:-115.13551}},
{name:'The Griffin', coords:{lat:36.168785, lng:-115.140329}},
{name:'Pizza Rock', coords:{lat:36.17182, lng:-115.142304}},
{name:'Mob Museum', coords:{lat:36.172815,lng:-115.141242}},
{name:'Joe Vicari’s Andiamo Italian Steakhouse', coords:{lat:36.169437, lng:-115.142903}},
{name:'eat', coords:{lat:36.166535, lng:-115.139067}},
{name:'Hugo’s Cellar', coords:{lat:36.169915, lng:-115.143861}},
{name:'Therapy', coords:{lat:36.169041, lng:-115.139829}},
{name:'Vegenation', coords:{lat:36.167401, lng:-115.139453}}
];
//convert array to JSON
var jsonStr = JSON.stringify(locations);
$.post('http://localhost:3000/', jsonStr, function(data) {
  // empty for now
}, 'json');
End goal: I want to be able to access my data like req.body.name. I tried using typeof on req.body and it returns an object, yet I can't seem to access it. I also tried using JSON.parse, but realized req.body is already an object. I would like to serve this data to the Yelp API eventually.
Current output (per request) from console.log(req.body):
{ '{"name":"Le Thai","coords":{"lat":36.168743,"lng":-115.139866}},
{"name":"Atomic Liquors","coords":{"lat":36.166782,"lng":-115.13551}},
{"name":"The Griffin","coords":{"lat":36.168785,"lng":-115.140329}},
{"name":"Pizza Rock","coords":{"lat":36.17182,"lng":-115.142304}},
{"name":"Mob Museum","coords":{"lat":36.172815,"lng":-115.141242}},
{"name":"Joe Vicari’s Andiamo Italian Steakhouse","coords":
{"lat":36.169437,"lng":-115.142903}},{"name":"eat","coords":
{"lat":36.166535,"lng":-115.139067}},{"name":"Hugo’s Cellar","coords":
{"lat":36.169915,"lng":-115.143861}},{"name":"Therapy","coords":
{"lat":36.169041,"lng":-115.139829}},{"name":"Vegenation","coords":
{"lat":36.167401,"lng":-115.139453}}': '' }
You're using an array, so it will not be:
req.body.name
but e.g.
req.body[0].name
You probably want to iterate over the array that you get with .forEach or a for loop etc.
The problem is you're not telling the server you're sending it JSON, so it's not getting parsed. Also, as rsp pointed out, to access the first name, you'd want req.body[0].name, not req.body.name.
The dataType parameter on $.post isn't to tell the server what you're sending it, it's to tell jQuery what you're expecting back from the server. To tell the server what you're sending it, use $.ajax and the contentType option:
$.ajax({
  url: 'http://localhost:3000/',
  type: "POST",
  contentType: "application/json", // <====
  data: jsonStr,
  success: function(data) {
    // empty for now
  }
});
Now, the body-parser module sees the content type on the request, and parses it for you. So for instance, if I change your server file to do this:
app.post('/', function(req, res) {
  req.body.forEach(function(entry, index) {
    console.log(index, entry.name);
  });
});
...then with the change above to the client code, I get this on the server console:
0 'Le Thai'
1 'Atomic Liquors'
2 'The Griffin'
3 'Pizza Rock'
4 'Mob Museum'
5 'Joe Vicari’s Andiamo Italian Steakhouse'
6 'eat'
7 'Hugo’s Cellar'
8 'Therapy'
9 'Vegenation'
For those getting an empty object in req.body: I had forgotten to set headers: {"Content-Type": "application/json"} in the request. Adding it solved the problem.
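For example, with fetch the header can be set per request (a minimal sketch; the URL and the locations array mirror the ones above):
// Declare the payload as JSON so body-parser populates req.body with the parsed array
fetch('http://localhost:3000/', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(locations)
});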

Seem to Have the Wrong Content Type When POSTing with Chai-HTTP

I am looking to make use of Chai-HTTP for some testing. Naturally I want to test more than my GETs; however, I seem to be hitting a major roadblock when attempting to make POSTs.
In an attempt to figure out why my POSTs weren't working I began hitting them against a POST test server.
Here is a POST attempt formatted using an entirely different toolchain (Jasmine-Node and Frisby) for testing (that works just fine):
frisby.create('LOGIN')
  .post('http://posttestserver.com/post.php', {
    grant_type: 'password',
    username: 'hello@world.com',
    password: 'password'
  })
  .addHeader("Token", "text/plain")
  .expectStatus(200)
  .toss();
Which results in:
Time: Mon, 27 Jun 16 13:40:54 -0700
Source ip: 204.191.154.66
Headers (Some may be inserted by server)
REQUEST_URI = /post.php
QUERY_STRING =
REQUEST_METHOD = POST
GATEWAY_INTERFACE = CGI/1.1
REMOTE_PORT = 19216
REMOTE_ADDR = 204.191.154.66
HTTP_CONNECTION = close
CONTENT_LENGTH = 64
HTTP_HOST = posttestserver.com
HTTP_TOKEN = text/plain
CONTENT_TYPE = application/x-www-form-urlencoded
UNIQUE_ID = V3GPVkBaMGUAAB1Uf04AAAAc
REQUEST_TIME_FLOAT = 1467060054.9575
REQUEST_TIME = 1467060054
Post Params:
key: 'grant_type' value: 'password'
key: 'username' value: 'hello@world.com'
key: 'password' value: 'password'
Empty post body.
Upload contains PUT data:
grant_type=password&username=hello%40world.com&password=password
And here is a POST attempt using Chai and Chai-HTTP. I would expect this to work the same as the above example using Jasmine and Frisby; however, you'll see the actual request differs in several ways.
describe('/post.php', function() {
  var endPointUnderTest = '/post.php';
  it('should return an auth token', function(done) {
    chai.request('http://posttestserver.com')
      .post(endPointUnderTest)
      .set('Token', 'text/plain')
      .send({
        grant_type: 'password',
        username: 'hello@world.com',
        password: 'password'
      })
      .end(function(err, res) {
        console.log(res);
        res.should.have.status(200);
        done();
      });
  });
});
Which results in:
Time: Tue, 28 Jun 16 06:55:50 -0700
Source ip: 204.191.154.66
Headers (Some may be inserted by server)
REQUEST_URI = /post.php
QUERY_STRING =
REQUEST_METHOD = POST
GATEWAY_INTERFACE = CGI/1.1
REMOTE_PORT = 1409
REMOTE_ADDR = 204.191.154.66
HTTP_CONNECTION = close
CONTENT_LENGTH = 76
CONTENT_TYPE = application/json
HTTP_TOKEN = text/plain
HTTP_USER_AGENT = node-superagent/2.0.0
HTTP_ACCEPT_ENCODING = gzip, deflate
HTTP_HOST = posttestserver.com
UNIQUE_ID = V3KB5kBaMGUAAErPF6IAAAAF
REQUEST_TIME_FLOAT = 1467122150.9125
REQUEST_TIME = 1467122150
No Post Params.
== Begin post body ==
{"grant_type":"password","username":"hello#world.com","password":"password"}
== End post body ==
Upload contains PUT data:
{"grant_type":"password","username":"hello#world.com","password":"password"}
Notice the difference in CONTENT_TYPE, Post Params and PUT data in particular (I think this is the source of my problem).
Where Jasmine/Frisby would submit the POST using the 'application/x-www-form-urlencoded' format, Chai-HTTP seems to be using the 'application/json' format.
Am I somehow misusing Chai-HTTP's POST capabilities? Or does Chai-HTTP not allow for 'application/x-www-form-urlencoded' POST requests? I do not seem to be able to resolve this and it is the final hurdle for me to jump to make the transition to using a Mocha/Chai toolchain for my testing (which is the goal, I would prefer to not use a different library unless it's absolutely necessary).
Having discussed this further on Chai-HTTP's GitHub page, I was able to find out that this is expected behaviour of SuperAgent, the HTTP request library under the hood of Chai-HTTP, which auto-detects the content type based on what kind of data is contained in the .send() call.
I stumbled across this particular question as well which helped clarify what the difference between content-types actually was.
If anyone else runs into this problem, I've learned that Chai-HTTP's POST requests can be altered quite easily (kudos to meeber's help here) using calls like this:
//Override auto-detection by specifying the header explicitly
.set('content-type', 'application/x-www-form-urlencoded')
//Select the type 'form'
.type('form')
//Pass multiple strings in send instead of using an object
.send('grant_type=password')
.send('username=hello@world.com')
.send('password=password')
Creating a request that looks like this:
describe('/post.php', function() {
  var endPointUnderTest = '/post.php';
  it('should return an auth token', function(done) {
    chai.request('http://posttestserver.com')
      .post(endPointUnderTest)
      .set('Token', 'text/plain')
      .set('content-type', 'application/x-www-form-urlencoded')
      .type('form')
      .send('grant_type=password')
      .send('username=hello@world.com')
      .send('password=password')
      .end(function(err, res) {
        console.log(res);
        res.should.have.status(200);
        done();
      });
  });
});

How to monitor HTTP calls using browsermob-proxy and nightwatch.js?

I am writing test cases using the Nightwatch.js framework for an SPA application. A requirement came in that we have to monitor HTTP calls and get performance results for the site, something that could easily be achieved using JMeter.
With an automation testing tool, this can be done using browsermob-proxy and Selenium.
Is it possible to do the same using Nightwatch.js and browsermob-proxy?
Also, what are the steps to do the same?
For using Nightwatchjs and browsermob-proxy together, check out this repo, which includes info on the NodeJS bindings for browsermob-proxy and programmatically generating HAR (HTTP Archive) files.
If you're content with just using Nightwatchjs, this repo has code in the tests directory for the following:
Custom command to get the requests made so far
Custom assertion for checking if a request, given a filter and query string params, exists.
You might have to brush up on how to add custom commands and assertions to your Nightwatch project, but after that you should be set to go!
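If it helps, a Nightwatch custom command is just a module that exports a command function; a generic, hypothetical skeleton (not the code from that repo) looks like this:
// commands/logTitle.js - hypothetical example; put it in your custom commands path
exports.command = function (callback) {
  var self = this;
  this.getTitle(function (title) {
    console.log('Current page title:', title);
    if (typeof callback === 'function') {
      callback.call(self, title);
    }
  });
  return this; // keep the command chainable
};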
You can use browsermob-proxy-api.
Simply download the browsermob-proxy server, then
install the client via npm: npm install browsermob-proxy-api --save-dev
Configure your Nightwatch test settings like this (note the proxy entry in desiredCapabilities):
'test_settings': {
  'default': {
    'launch_url': 'http://localhost:3000',
    'screenshots': {
      'enabled': true, // if you want to keep screenshots
      'path': './screenshots' // save screenshots here
    },
    'globals': {
      'waitForConditionTimeout': 30000 // sometimes the internet is slow, so wait.
    },
    'desiredCapabilities': { // use Chrome as the default browser for tests
      'browserName': 'chrome',
      'proxy': {
        'proxyType': 'manual',
        'httpProxy': 'localhost:10800'
      },
      'acceptSslCerts': true,
      'javascriptEnabled': true // turn off to test progressive enhancement
    }
  }
},
Then download index.js from here:
https://github.com/jmangs/node-browsermob-proxy-api
and add code from the example to your step definitions (if you use Gherkin) or to your describe steps.
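For instance, a step definition (or a plain test hook) might drive it roughly like this; a sketch that reuses the startPort/createHAR/getHAR calls shown in the next answer, with the ports being assumptions:
// Sketch: start a proxy port, record a HAR while the test navigates, then read it back
var MobProxy = require('browsermob-proxy-api');
var proxy = new MobProxy({ 'host': 'localhost', 'port': '8080' }); // browsermob REST API port

proxy.startPort(10800, function (err) {
  if (err) { return console.log(err); }
  proxy.createHAR(10800, { 'captureHeaders': 'true', 'captureContent': 'true' }, function (err) {
    if (err) { return console.log(err); }
    // ...drive the browser (configured to proxy via localhost:10800) through your pages, then:
    proxy.getHAR(10800, function (err, har) {
      if (err) { return console.log(err); }
      console.log(JSON.parse(har).log.entries.length + ' requests captured');
    });
  });
});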
A bit late to the dance, but I managed to integrate browsermob with Nightwatch. Here are the detailed steps:
Download browsermob-proxy: https://bmp.lightbody.net/
Open a terminal, go to the bin folder and start browsermob by running "browsermob-proxy".
I am assuming you have a basic Nightwatch setup. You also need the proxy client; install it with "npm i browsermob-proxy-api".
Create a global hooks file in Nightwatch, say 'globalmodule.js', and set its path as globals_path in nightwatch.json.
In globalmodule, create global hooks as described in http://nightwatchjs.org/guide#external-globals
In the beforeEach hook, add the code below (this assumes you are not behind a corporate proxy and don't need to chain to an upstream proxy):
var MobProxy = require('browsermob-proxy-api');
var proxyObj = new MobProxy({'host': 'localhost', 'port': '8080'});
// assuming you started browsermob on port 8080 (step 2)
// if you are working behind a corporate proxy, you might have to chain your requests; this needs editing in the browsermob-proxy-api package. Follow the steps given at the end of this answer.
Start the proxy on a new port:
proxyObj.startPort(port, function (err, data) {
  if (err) {
    console.log(err);
  } else {
    console.log('New port started');
  }
});
Once we have the new port, we have to start our Chrome browser with that port as its proxy, so that all browser requests go through browsermob.
proxyObj.startPort(port, function (err, data) {
  if (err) {
    console.log(err);
  } else {
    console.log('New port started');
    var dataInJson = JSON.parse(data);
    // Step 8:
    this.test_settings.desiredCapabilities = {
      "browserName": "chrome",
      "proxyObj": proxyObj,         // for future use
      "proxyport": dataInJson.port, // for future use
      "proxy": {
        "proxyType": "manual",
        "httpProxy": "127.0.0.1:" + dataInJson.port,
        "sslProxy": "127.0.0.1:" + dataInJson.port // important if you have an https site
      },
      "javascriptEnabled": true,
      "acceptSslCerts": true,
      "loggingPrefs": {
        "browser": "ALL"
      }
    };
  }
});
Try a run with the above settings; you can check the terminal opened in step 2 to confirm requests are going through the new port (there will be some activity).
For creating a HAR and fetching it afterwards, browsermob-proxy-api provides a nice API.
Add createHAR.js in any path and reference that path in the custom commands section of nightwatch.json:
exports.command = function (callback) {
  var self = this;
  if (!self.options.desiredCapabilities.proxyObj) {
    console.error('No proxy setup - did you call setupProxy() ?');
  }
  this.options.desiredCapabilities.proxyObj.createHAR(this.options.desiredCapabilities.proxyport, {
    'captureHeaders': 'true',
    'captureContent': 'true',
    'captureBinaryContent': 'true',
    'initialPageRef': 'homepage'
  }, function (err, result) {
    if (err) {
      console.log(err);
    } else {
      console.log(result);
      if (typeof callback === "function") {
        console.log(this.options.desiredCapabilities.proxyObj);
        console.log(this.options.desiredCapabilities.proxyport);
        // console.log(result);
        callback.call(self, result);
      }
    }
  });
  return this;
};
Then, to get the HAR, add getHAR.js with the code below:
var parsedData;
exports.command = function(callback) {
  var self = this;
  if (!self.options.desiredCapabilities.proxy) {
    console.error('No proxy setup - did you call setupProxy() ?');
  }
  self.options.desiredCapabilities.proxyObj.getHAR(self.options.desiredCapabilities.proxyport, function (err, data) {
    console.log(self.options.desiredCapabilities.proxyObj);
    console.log(self.options.desiredCapabilities.proxyport);
    // console.log(result);
    if (err) {
      console.log(err);
    } else {
      parsedData = JSON.parse(data);
      console.log(parsedData.log.entries);
    }
    if (typeof callback === "function") {
      console.log(self.options.desiredCapabilities.proxyObj);
      console.log(self.options.desiredCapabilities.proxyport);
      callback.call(self, parsedData);
    }
  });
  return this;
};
At the start of the test, createHAR will not yet have proxyObj, so this step should be executed in sequence. Wrap it with browser.perform():
browser.perform(function() {
  browser.createHAR();
});
// ...some navigation
browser.perform(function() {
  browser.getHAR();
});
Note: if you are working behind a corporate proxy, you might have to use the chained-proxy support that browsermob offers.
According to the browsermob-proxy documentation (see the API section), the /proxy endpoint can take the request parameters "proxyUsername" and "proxyPassword".
In node_modules/browsermob-proxy-api/index.js,
add the lines below after line 22:
this.proxyUsername = cfg.proxyUsername || '';
this.proxyPassword = cfg.proxyPassword || '';
this.queryString = cfg.queryString || 'httpProxy=yourupstreamProxy:8080'; // you will get this from your PAC file
Then at line 177, where the package makes the '/proxy' request, replace
path: url
with
path: url + '?proxyUsername=' + this.proxyUsername + '&proxyPassword=' + this.proxyPassword + '&' + this.queryString

POST request treated as OPTIONS on beego framework

I'm using beego framework as my API framework and AngularJS on the client.
I have set all the CORS settings correctly. I can do GET requests, but when I try to POST, beego treats it as an OPTIONS request. It also throws a warning: multiple response.WriteHeader calls. What could possibly be wrong?
My beego CORS setting:
func init() {
    orm.RegisterDataBase("default", "mysql", "root:@tcp(127.0.0.1:3306)/fakeapi")
    beego.InsertFilter("*", beego.BeforeRouter, cors.Allow(&cors.Options{
        AllowOrigins:     []string{"*"},
        AllowMethods:     []string{"GET", "DELETE", "PUT", "PATCH", "POST"},
        AllowHeaders:     []string{"Origin"},
        ExposeHeaders:    []string{"Content-Length"},
        AllowCredentials: true,
    }))
}
My AngularJS request:
var transaction = $http.post(BASE_URL + "transaction", transactionData);
return $q.all([transaction]).then(function(response) {
  console.log(response);
});
my system:
Ubuntu 14.04
beego: 1.4.2
bee: 1.2.4
angularJS: 1.3.12
That might be because of an issue/pull request currently pending to be merged into master: issue 912
Without this line everything is fine: router.go#L861
That seems to be in line with commit 3bb4d6f, which shows:
// Write status code if it has been set manually
// Set it to 0 afterwards to prevent "multiple response.WriteHeader calls"
(and router.go does set a status, hence the error message)
Commit f962457 is supposed to solve this issue, but isn't merged yet.
The other issue 904 mentions something about being unable to retrieve the Session data previously registered in the Session Engine.
Maybe Session.on flag can help.
I handle it like this (the OPTIONS request you are seeing is the browser's CORS preflight; allowing the Content-Type header in AllowHeaders lets the actual POST go through). I hope it helps:
import (
    _ "info_apoyo/routers"

    "github.com/astaxie/beego"
    "github.com/astaxie/beego/plugins/cors"
)

func main() {
    if beego.BConfig.RunMode == "dev" {
        beego.BConfig.WebConfig.DirectoryIndex = true
        beego.BConfig.WebConfig.StaticDir["/swagger"] = "swagger"
    }
    beego.InsertFilter("*", beego.BeforeRouter, cors.Allow(&cors.Options{
        AllowOrigins:     []string{"*"},
        AllowMethods:     []string{"GET", "POST", "DELETE", "PUT", "PATCH"},
        AllowHeaders:     []string{"Origin", "content-type", "Access-Control-Allow-Origin"},
        ExposeHeaders:    []string{"Content-Length", "Access-Control-Allow-Origin"},
        AllowCredentials: true,
    }))
    beego.Run()
}
