Using Breeze.js EntityManager from within Node

I'm interested in using the Breeze.js EntityManager and its query capabilities from within a Node console service to access a remote data service that exposes a BreezeJS/OData-compliant RESTful endpoint.
We currently have a Data Service implemented using Node.js, MongoDB and the Breeze.js breeze-mongodb module.
We have web-browser-hosted clients that access MongoDB using the Breeze.js client API (EntityManager) and the data service described above.
I need to create another Node.js service that accesses the same MongoDB database as the web-browser-hosted clients, and for consistency/simplicity I would like to use the same data access API as I use in the browser.
Has anyone experimented with this configuration?
I experimented with loading Breeze and its dependencies using the Node.js module infrastructure, but I get errors when Breeze tries to initialize Angular as an ajax handler. Angular is installed and configured as a node module dependency, but the following error is thrown:
Error: [$injector:nomod] http://errors.angularjs.org/1.2.2/$injector/nomod?p0=ngLocale
In theory I shouldn't need Angular, but I get additional errors if Angular is not present.
I may be able to debug this particular issue, but it would require stepping through the Breeze.js code in detail and possibly modifying it. I was curious whether anyone else has gotten this working.

I'm running Breeze in Node at the moment. It used to work just fine without any modification, but a few versions ago they added a check that it's running in the browser... so now I manually remove that check :-)
My use-case is a little bit different: I'm running breeze on the server so that I can use the same business logic as in the client, and just have a really really thin layer between breezejs and the DB.
The only thing I needed to change to get it running in Node was to add a fake ajax handler that delegates to my skinny DB wrapper - you could equally delegate to anything else, including your existing API.
// Assumes the breeze library has already been loaded into `breezejs`
// and that the query module wraps the database (or any other backend).
var query = require('../../../../server/db/query');

// Minimal "ajax" adapter that routes Breeze's HTTP calls to local functions.
var ctor = function () {
    this.name = 'node';
    this.defaultSettings = {};
};

ctor.prototype.initialize = function () {
};

ctor.prototype.ajax = function (config) {
    if (config.url === '/api/all') {
        query.get()
            .then(function (result) {
                var httpResponse = {
                    data: result,
                    status: '200', // success status expected by the caller
                    getHeaders: undefined,
                    config: config
                };
                config.success(httpResponse);
            })
            .otherwise(function (error) {
                var httpResponse = {
                    data: '',
                    status: '500',
                    getHeaders: undefined,
                    error: error,
                    config: config
                };
                config.error(httpResponse);
            });
    } else if (config.url === '/api/SaveChanges') {
        query.save(JSON.parse(config.data))
            .then(function (result) {
                var httpResponse = {
                    data: result,
                    status: '200', // success status expected by the caller
                    getHeaders: undefined,
                    config: config
                };
                config.success(httpResponse);
            })
            .otherwise(function (error) {
                var httpResponse = {
                    data: '',
                    status: '500',
                    getHeaders: undefined,
                    error: error,
                    config: config
                };
                config.error(httpResponse);
            });
    }
};

breezejs.config.registerAdapter('ajax', ctor);
breezejs.config.initializeAdapterInstance('ajax', 'node', true);
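Once an ajax adapter that can reach the remote service is registered, the EntityManager itself is used the same way as in the browser. A minimal sketch, continuing with the breezejs reference from the snippet above (the service URL and the Products resource are placeholders, not names from the question):

// Point the manager at the remote Breeze data service (placeholder URL).
var manager = new breezejs.EntityManager('http://localhost:3000/breeze/mydb');

// Build a query exactly as you would in the browser client.
var entityQuery = breezejs.EntityQuery.from('Products').where('price', 'lt', 100);

manager.executeQuery(entityQuery)
    .then(function (data) {
        console.log('Fetched', data.results.length, 'entities');
    }, function (err) {
        console.error('Query failed:', err);
    });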

It's a good question. We haven't actually tried running Breeze within Node, but your use case is interesting. This sounds like a perfect item for the Breeze User Voice. We take these suggestions seriously.

Related

Custom auth (OAuth2) for Parse-server on docker container

So, as the title suggests, I'm looking for a way to integrate my own custom authentication service into Parse Server, which is installed inside a Docker container. The authentication is basically an OpenID implementation using Keycloak.
The point is that I don't (and it would be best for my architecture not to) have Parse Server served with Express on my local machine.
What I've tried so far is searching the internet, reading the issues, and reading the Parse Server JavaScript docs and guide to figure out how to achieve this.
It seems that no matter what I do, at the end of each test I get a 252 This authentication method is unsupported error (this happens even if I use facebook, oauth, oauth2, etc.).
So right now, the docker-compose service looks like this:
parse-server:
  image: parseplatform/parse-server
  ports:
    - "${SERVER_PORT}:1337"
  restart: "always"
  volumes:
    - ./server/parse/custom-auth:/parse-server/custom-auth
  depends_on:
    - mongodb
  links:
    - mongodb:mongo
  environment:
    - PARSE_SERVER_APPLICATION_ID=${APP_ID}
    - PARSE_SERVER_MASTER_KEY=${MASTER_KEY}
    - PARSE_SERVER_DATABASE_URI=mongodb://mongo:${MONGO_PORT}/dev
    - PARSE_SERVER_START_LIVE_QUERY_SERVER=1
    - PARSE_SERVER_LIVE_QUERY={"classNames":${LIVE_QUERY_CLASSES}}
    - PARSE_SERVER_MOUNT_GRAPHQL=${GQL_API}
    - PARSE_SERVER_MOUNT_PLAYGROUND=${GQL_PLAYGROUND}
    - PARSE_SERVER_AUTH_PROVIDERS={"swwwan-mail-auth":{"module":"/parse-server/custom-auth/swwwan-mail-auth/index.js"}}
and the login/signup part:
export const loginWithParse = async (account: IUserColumnTypes) => {
  if (account.username === null || account.password === null) {
    throw "validation failed";
  }

  // @ts-ignore
  const loggedIn = await Parse.User.logInWith("swwwan.mail-auth", {
    authData: {
      id: "",
      access_token: "",
    },
  });
  console.log({ loggedIn });
  //return await Parse.User.logIn(account.username, account.password);
};
another alternative for login/signup:
export const loginWithParse = async (account: IUserColumnTypes) => {
  if (account.username === null || account.password === null) {
    throw "validation failed";
  }

  const u = new Parse.User();
  u._linkWith("swwwan-mail-auth", {
    authData: {
      id: "tester",
      access_token: "sample_access_token",
    },
  })
    .then(res => console.log(res))
    .catch(e => console.log(e));
  //return await Parse.User.logIn(account.username, account.password);
};
UPDATE: by using the second alternative, I actually get the error:
error: Parse error: Invalid key name: authData.swwwan-mail-auth.id {"code":105,"stack":"Error: Invalid key name: authData.swwwan-mail-auth.id
Is there a way to make it work? I'm probably missing something here.
Thanks :)
Note that the 'dangle' (the leading underscore) in the link functions will be deprecated in the forthcoming 2.9 release of the Parse JS SDK.
Sorry that the documentation isn't better yet; it will be getting some more work.
What you're attempting is doable!
Your final error gives one big clue: the name of your adapter can't contain any characters that aren't valid in a JavaScript identifier. In this case the - is causing the problem, since the adapter name is used as a key when we save it in the database.
The unit tests are often the best documentation and you may find them helpful in this case.
See:
the declaration of a custom adapter
Configuring the server to load the adapter (you're doing this right)
using the adapter
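For reference, a rough sketch of what the adapter module could look like once the provider name is a valid identifier. swwwanMailAuth and the module path are placeholders mirroring the volume mount above, and the actual Keycloak/OpenID token check is left as a stub you would implement:

// /parse-server/custom-auth/swwwanMailAuth/index.js (hypothetical path)
var Parse = require('parse/node');

// Called with the authData the client sends; resolve to accept, throw/reject to refuse.
function validateAuthData(authData, options) {
  if (!authData.id || !authData.access_token) {
    throw new Parse.Error(Parse.Error.OBJECT_NOT_FOUND, 'Missing id or access_token');
  }
  // TODO: replace with a real call to your Keycloak userinfo/introspection endpoint
  // that verifies access_token actually belongs to authData.id.
  return Promise.resolve();
}

// No per-app id validation needed for this provider.
function validateAppId(appIds, authData, options) {
  return Promise.resolve();
}

module.exports = {
  validateAuthData: validateAuthData,
  validateAppId: validateAppId
};

The provider key in PARSE_SERVER_AUTH_PROVIDERS and the name passed to logInWith/_linkWith would then both be "swwwanMailAuth".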

Read/write and store data internally in a local app (no server) with JavaScript

So I am making a local app using JavaScript, React and Electron, and I want it to work just fine without internet.
I can't use 'localStorage' because the data might get deleted if the user deletes the cache.
I tried reading/writing using different modules; none of them worked, mostly because of CORS. Using XMLHttpRequest and Ajax doesn't work either, and I am running out of time.
When I use them on the test server, they return the index.html for the main page (they can at least access that... and still they can't read the data), but when I try it on the build I get the CORS error.
My idea for now is to enable CORS on my webpage since I have no worries about security: the app will run ONLY offline, so there is no danger.
But after many hours... I haven't found a way to do it on the client side.
If anyone has an idea or suggestion I would be grateful.
I tried: fs, FileReader, FileSaver, $.ajax, XMLHttpRequest
//using $.ajax
var test = $.ajax({
    crossDomain: true,
    type: 'GET',
    url: '../data/DefaultCategorie.txt',
    contentType: 'text/plain',
    success: function (data) {
        console.log(data);
    },
    error: function () {
        alert('failed');
    }
});
//using fs
var fs = require('fs');

fs.readFile('../data/DefaultCategorie.txt', 'utf8', (err, data) => {
    if (err) {
        console.log("Failed");
        throw err;
    }
    console.log(data);
    // no fs.close needed: fs.readFile manages the file descriptor itself
});
This article covers the 3 most common ways to store user data: How to store user data in Electron
The Electron API for appData does what you want. It is very easy to use.
From the above article:
// requires used by the article's store module
const electron = require('electron');
const path = require('path');
const fs = require('fs');

// inside the store's constructor: resolve a per-user file under userData
const userDataPath = (electron.app || electron.remote.app).getPath('userData');
this.path = path.join(userDataPath, opts.configName + '.json');
this.data = parseDataFile(this.path, opts.defaults);

function parseDataFile(filePath, defaults) {
    try {
        return JSON.parse(fs.readFileSync(filePath));
    } catch (error) {
        // if there was some kind of error, return the passed-in defaults instead.
        return defaults;
    }
}
Docs
app.getPath(name)
name String
Returns String - A path to a special directory or file associated with
name. On failure, an Error is thrown.
You can request the following paths by the name:
appData - Per-user application data directory, which by default points to:
%APPDATA% on Windows
$XDG_CONFIG_HOME or ~/.config on Linux
~/Library/Application Support on macOS
userData - The directory for storing your app's configuration files,
which by default is the appData directory appended with your app's
name.
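To persist changes back to disk from the same app, the userData path can simply be written with fs. A minimal sketch following the article's pattern (the file name and data shape are placeholders):

const electron = require('electron');
const path = require('path');
const fs = require('fs');

const userDataPath = (electron.app || electron.remote.app).getPath('userData');
const filePath = path.join(userDataPath, 'user-preferences.json');

// Synchronous write is fine for small config files; the data survives cache clearing.
function saveData(data) {
    fs.writeFileSync(filePath, JSON.stringify(data));
}

saveData({ windowBounds: { width: 800, height: 600 } });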

How can you create a PushTopic with SOAP/REST API in Node (jsforce)

According to https://salesforce.stackexchange.com/questions/48901/create-streaming-pushtopic-using-rest-api I should be able to use the standard sobject API to create a PushTopic. However, when I do so, I get an error of:
The requested resource does not exist.
In fact, I can't even describe the object.
I am using Node and jsforce to test this. I have successfully created a topic with Execute Anonymous Apex from the Developer Console, but I need it to be done from within my own server.
My code looks like:
var jsforce = require('jsforce');
var config = {...};

var conn = new jsforce.Connection({
    oauth2: {
        clientId: config.oauthClientId,
        clientSecret: config.oauthSecret,
        redirectUri: config.oauthCallbackUrl
    },
    instanceUrl: config.instanceUrl,
    accessToken: config.accessToken,
    refreshToken: config.refreshToken
});

conn
    .sobject('PushTopic')
    .describe()
    .then(function (ret) {
        console.log('Description:', ret);
    }, function (err) {
        console.log('Error:', err);
    });
I Get:
Error { [NOT_FOUND: The requested resource does not exist] name: 'NOT_FOUND', errorCode: 'NOT_FOUND' }
If I use 'Account' instead of 'PushTopic' I get:
Description { actionOverrides: [],
activateable: false,
childRelationships:
...
Is this a problem with jsforce? Any ideas appreciated!
OK, the problem was that my user did not have permissions for the PushTopic object. Even though I can subscribe to the streaming channels that the topics produce (Streaming API enabled), you need additional permissions to create a topic.
Setup -> Manage Users -> Permission Sets -> New -> Save -> Object Settings -> Push Topics -> Edit
Then
Manage Assignments -> Add Assignments
I created a permission set called 'PushTopic_Creator' and only added that permission, and then applied it to my user. I can now describe and create PushTopics!
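For completeness, creating the topic through jsforce is then just a normal sobject insert. A rough sketch using the conn from the question (the topic name and SOQL query are placeholders):

conn.sobject('PushTopic')
    .create({
        Name: 'AccountUpdates',                 // placeholder topic name
        Query: 'SELECT Id, Name FROM Account',  // placeholder SOQL query
        ApiVersion: 36.0,                       // use your org's API version
        NotifyForOperationCreate: true,
        NotifyForOperationUpdate: true,
        NotifyForFields: 'Referenced'
    })
    .then(function (ret) {
        console.log('Created PushTopic with id:', ret.id);
    }, function (err) {
        console.error('Create failed:', err);
    });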
According to this document, the supported calls for REST are:
DELETE, GET, PATCH, POST (query requests are specified in the URI)
Together with this, if you check the samples in Workbench, the Streaming API configuration does not require a Salesforce describe() call.

Meteor.call not working

I had this code working before I decided to move the client-side collection find/insert methods to the server. I removed insecure and autopublish from my Meteor project and changed my code to what's below.
My Angular code in client/controllers/item-controller.js:
angular.module('prototype').controller('ItemController', ['Config', '$window', '$meteor', function (Config, $window, $meteor) {
    this.items = function () {
        Meteor.call('getAllItems', function (err, res) {
            alert("error: " + err + " res: " + res);
            return res;
        });
    };
}]);
My item-collection code in server/item-collection-methods.js:
Meteor.methods({
    getAllItems: function () {
        console.log("i got here");
        return Items.find();
    }
});
My main file in lib/app.js:
Items = new Mongo.Collection("Items");
Before I had 15 items showing; now none of them show.
When I copy my Meteor.call function into the Chrome console, all I get back is undefined.
I have a feeling it either has to do with the project structure, or with the fact that autopublish and insecure are removed. Any advice would be helpful.
EDIT:
I did get something in my server console
I20150629-00:54:54.402(-4)? Internal exception while processing message { msg: 'method', method: 'getAllItems', params: [], id: '2' } Maximum call stack size exceeded undefined
Meteor data transmission works with a publish/subscribe system. This system replicates part or all of the data stored in your MongoDB (server) to an in-memory DB on the client (Minimongo). Autopublish was publishing everything to the client; since you removed it, there is nothing in your client-side Items collection anymore.
In order to publish some data to the client you have to declare a publication on the server side:
Meteor.publish('allItems', function () {
//collection to publish
return Items.find({});
});
And subscribe on the client (either in the router or in a template):
Meteor.subscribe('allItems');
To learn more about this system you can read the official docs.
Concerning your method getAllItems: you cannot directly send a cursor (Items.find()) over the wire, which is why you are getting the error message "Maximum call stack size exceeded".
But you can send an array of the data by returning Items.find().fetch(). Also, the call to a Meteor method is asynchronous, so you have to use the callback (more on Meteor methods); see the sketch after the note below.
Please note that by sending data over a method (which is perfectly acceptable) you lose the reactivity offered by the publish/subscribe system.
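Putting the two fixes together, a minimal sketch of the corrected method and the client-side call (names match the question's code; the Angular controller wiring is omitted):

// server/item-collection-methods.js
Meteor.methods({
    getAllItems: function () {
        // fetch() turns the cursor into a plain array that can be serialized over DDP
        return Items.find().fetch();
    }
});

// client: Meteor.call is asynchronous, so read the result in the callback
Meteor.call('getAllItems', function (err, items) {
    if (err) {
        console.error(err);
        return;
    }
    console.log('Got', items.length, 'items');
});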

Meteor integration testing: REST API endpoint in Velocity's mirror with Jasmine

I'm trying to create a test for an API endpoint written with Meteor. I'm using Jasmine and Velocity; the test is intended to run within the same project, which is why I'm using them.
The problem comes when I try to run the test and check for data in the endpoint. I have a bootstrapped dataset in the MongoDB replica, and when I POST to it, it doesn't match the one that's bootstrapped in the local app.
Here's the example code:
Jasmine.onTest(function () {
    describe('RestApi.MyMethod', function () {
        it('Expects to fail because it lacks of valid parameters', function () {
            // ...but it fails because the user can't be found in the real app
            var response = "";
            var userId = Meteor.users.findOne({"username": "MyUser"})._id;
            try {
                response = Meteor.http.call(
                    "POST",
                    "http://localhost:3000/api/myMethod",
                    {
                        data: {
                            "userId": userId
                        },
                        timeout: 1000
                    }
                );
            } catch (error) {
                expect(error.message.indexOf("failed [400]")).toBeGreaterThan(-1);
                expect(error.message.indexOf("Invalid parameters provided")).toBeGreaterThan(-1);
            }
            expect(response).toBe('');
        });
    });
});
I think it should point to the mirror's REST API. Is there a way to do that? I changed localhost:3000 to localhost:5000 and it didn't work. How can I check the mirror's port?
Thanks in advance!
Use Meteor.absoluteUrl rather than hard coding the port.
In your code, do this:
response = Meteor.http.call(
    "POST",
    Meteor.absoluteUrl("api/myMethod"), // this bit has changed
    {
        data: {
            "userId": userId
        },
        timeout: 1000
    }
);
When the test runs, your test mirror will dynamically generate an absolute url.
