I am using the Inbox module for node to process incoming mail with the following function call:
client.listMessages(-1, function (err, messages) {
    messages.forEach(function (message) {
        client.createMessageStream(message.UID)
            .pipe(process.stdout, { end: false });
    });
});
This logs the mail to the console via process.stdout; however, I want to save the result to Mongo, or do other JavaScript processing with it. How can I do that?
It would appear that createMessageStream returns a stream interface object. The methods to access the data are shown in that link.
As for saving the data into MongoDB, there is the basic driver, or modules such as Mongoose, that provide methods to do this.
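For example, a minimal sketch of saving each message into MongoDB (not tested; it assumes the 2.x mongodb driver, a local instance, and placeholder db/collection names) could buffer the message stream and insert it when the stream ends:
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/mail', function (err, db) {
    if (err) throw err;
    var collection = db.collection('messages');
    client.listMessages(-1, function (err, messages) {
        if (err) throw err;
        messages.forEach(function (message) {
            var chunks = [];
            var stream = client.createMessageStream(message.UID);
            stream.on('data', function (chunk) { chunks.push(chunk); });
            stream.on('end', function () {
                // Store the raw message source; you could parse it first instead
                collection.insert(
                    { uid: message.UID, raw: Buffer.concat(chunks).toString('utf8') },
                    function (err) { if (err) console.error(err); }
                );
            });
        });
    });
});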
I'm currently attempting to setup an XMPP client using Stanza.js
https://github.com/legastero/stanza
I have a working server that accepts connections from a Gajim client; however, when I attempt to connect using the Stanza.js client.connect method, the server opens a websocket connection but no events for authentication or session start ever fire.
The server logs do not show any plaintext password authentication attempts.
How can I actually see any of the stanza logs to debug this issue?
import * as XMPP from 'stanza';

const config = {
    credentials: { jid: '[jid]', password: '[password]' },
    transports: { websocket: '[socketurl]', bosh: false }
};

const client = XMPP.createClient(config);

client.on('raw:*', (data) => {
    console.log('data', data);
});

client.connect();
The connect event does fire, but it is the only event that fires.
Is there a way to manually trigger authentication that isn't expressed in the documentation?
The raw event handler should be able to give you the logging you want, but in your code sample you are handling its arguments incorrectly. Try the following:
client.on('raw:*', (direction, data) => {
    console.log(direction, data);
});
For reference, the docs state that the callback for the raw data event handler is
(direction: incoming | outgoing, data: string) => void
So the data you are looking for is in the second argument, but your callback only has one parameter, which receives just the direction string ("incoming" or "outgoing") even though you have named it "data".
Once you fix the logging, I expect you will see the stream immediately terminate with a stream error, because your config is incorrect: the jid and password should be top-level fields. Review the Stanza sample code; the options to createClient do not include a credentials object. Try the following:
const config = { jid: '[jid]', password: '[password]', transports: {websocket: '[socketurl]', bosh: false} };
Since your username and password are hidden inside an incorrect credentials object, stanza.io does not see them; you are effectively trying to connect with no username and password, so no authentication is even attempted.
This issue turned out to be caused by a configuration problem: the Jabber server was using plain authentication. Adding an additional line to the client definition file helped:
client.sasl.disable('X-OAUTH2')
Also, adding
client.on('*', console.log)
offered more complete server logs.
How can I actually see any of the stanza logs to debug this issue?
If the connection is not encrypted, you can sniff the XMPP traffic with tools like
sudo tcpflow -i lo -Cg port 5222
You can force ejabberd to not allow encryption, so your clients don't use that, and you can read the network traffic.
Alternatively, in ejabberd.yml you can set this, though it will probably generate a lot of log messages:
loglevel: debug
I want to send asynchronous data to the node on configuration. I want to perform a SQL request to list some data in a <select>.
On node creation, a server-side function is performed.
When it's done, a callback sends the data to the node configuration.
On node configuration, when the data is received, the list is created.
Alternatively, the back-end process could query the database every x minutes and build a cache that each node uses on creation; this would remove the asynchronous part of the code, even if it is no longer "live updated".
In fact, I'm stuck because I created the query and added it as below:
module.exports = function(RED) {
    "use strict";
    var db = require("../bin/database")(RED);

    function testNode(n) {
        // Create a RED node
        RED.nodes.createNode(this, n);
        // Store local copies of the node configuration (as defined in the .html)
        var node = this;
        var context = this.context();
        this.on('input', function (msg) {
            node.send({payload: true});
        });
    }

    RED.nodes.registerType("SQLTEST", testNode);
}
But I don't know how to pass data to the configuration node. I thought of using Socket.IO for it, but is that a good idea, and is it even available here? Do you know of any solution?
The standard model used in Node-RED is for the node to register its own admin http endpoint that can be used to query the information it needs. You can see this in action with the Serial node.
The Serial node edit dialog lists the currently connected serial devices for you to pick from.
The node registers the admin endpoint here: https://github.com/node-red/node-red-nodes/blob/83ea35d0ddd70803d97ccf488d675d6837beeceb/io/serialport/25-serial.js#L283
RED.httpAdmin.get("/serialports", RED.auth.needsPermission('serial.read'), function(req, res) {
    serialp.list(function (err, ports) {
        res.json(ports);
    });
});
Key points:
- pick a url that is namespaced to your node type - this avoids clashes
- the needsPermission middleware is there to ensure only authenticated users can access the endpoint. The permission should be of the form <node-type>.read.
Its edit dialog then queries that endpoint from here: https://github.com/node-red/node-red-nodes/blob/83ea35d0ddd70803d97ccf488d675d6837beeceb/io/serialport/25-serial.html#L240
$.getJSON('serialports', function(data) {
    //... does stuff with data
});
Key points:
- here the url must not begin with a /. That ensures the request is made relative to wherever the editor is being served from - you cannot assume it is being served from /.
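Applied to your SQL example, a rough sketch (the endpoint path, permission name, and db.query call are placeholders for whatever your ../bin/database module actually exposes) might look like this:
// In the node's .js file: register an admin endpoint that runs the query
module.exports = function(RED) {
    "use strict";
    var db = require("../bin/database")(RED);

    RED.httpAdmin.get("/sqltest/options", RED.auth.needsPermission('sqltest.read'), function(req, res) {
        db.query("SELECT id, label FROM my_table", function(err, rows) {
            if (err) { return res.status(500).json({error: err.message}); }
            res.json(rows);
        });
    });

    // ... createNode / registerType as in your existing code
};
And in the node's .html file, the edit dialog can populate the <select> from that endpoint inside oneditprepare:
// Note the relative url, as explained above
$.getJSON('sqltest/options', function(rows) {
    rows.forEach(function(row) {
        $('#node-input-myfield').append($('<option>').val(row.id).text(row.label));
    });
});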
I have a question regarding express (connect) middleware.
What I'm trying to do is download DoubleClick Bid Manager reports, then parse and process them into my own MongoDB database.
My route looks as follows:
app.route('/v1/spends/')
    .get(dbmPolicy.isAllowed, buckets.read, buckets.check, reports.create, buckets.process, reports.update);
Where buckets.read reads files from Google Cloud Storage, buckets.check checks whether the report has already been processed into MongoDB, reports.create creates the report that holds the metadata of the csv, buckets.process processes the data inside the csv, and reports.update updates the previously created report if everything went successfully.
As I find it very difficult to test the above process, I'm starting to doubt whether this is the correct way to implement the chain of processes. If it is, how do I test each middleware function individually on its behaviour?
You may want to look into the Async package and especially the waterfall method. That way you can run something like:
app.get('/v1/spends', function(req, res) {
    async.waterfall([
        dbmPolicy.isAllowed,
        buckets.read,
        buckets.check,
        reports.create,
        buckets.process,
        reports.update
    ], function (err, result) {
        if (err) return res.status(500).send(err);
        res.status(200).send(result);
    });
});
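As for testing each function individually: since each one follows the (req, res, next) middleware signature, you can call it in isolation with stubbed objects. A rough sketch with mocha, where the req shape and the assertion are only illustrative:
describe('buckets.check', function() {
    it('calls next() when the report has not been processed yet', function(done) {
        var req = { query: {}, report: { id: 'fake-id' } }; // shape depends on your middleware
        var res = { status: function() { return this; }, send: function() {} };
        buckets.check(req, res, function next(err) {
            done(err); // the test passes if next() is called without an error
        });
    });
});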
I had this code working before I decided to change the client-side collection find/insert methods to the server side. I removed insecure and autopublish from my Meteor project and changed my code to what is below.
My Angular code in client/controllers/item-controller.js
angular.module('prototype').controller('ItemController', ['Config', '$window', '$meteor', function(Config, $window, $meteor) {
    this.items = function() {
        Meteor.call('getAllItems', function(err, res) {
            alert("error: " + err + " res: " + res);
            return res;
        });
    };
}]);
My item-collection code in server/item-collection-methods.js
Meteor.methods({
    getAllItems: function() {
        console.log("i got here");
        return Items.find();
    }
});
My main file in lib/app.js
Items = new Mongo.Collection("Items");
Before, I had 15 items showing; now none of them show.
When I copy my Meteor.call function into the Chrome console, all I get back is undefined.
I have a feeling it either has to do with the project structure, or the fact that autopublish and insecure are removed. Any advice would be helpful.
EDIT:
I did get something in my server console:
I20150629-00:54:54.402(-4)? Internal exception while processing message { msg: 'method', method: 'getAllItems', params: [], id: '2' } Maximum call stack size exceeded undefined
Meteor data transmission works with a publish/subscribe system. This system replicates part or all of the data stored in your MongoDB (server) to an in-memory DB on the client (MiniMongo). Autopublish was publishing everything to the client; as you removed it, there is nothing in your Items collection anymore.
In order to publish some data to the client you have to declare a publication on the server side:
Meteor.publish('allItems', function () {
    // collection to publish
    return Items.find({});
});
And subscribe on the client (either in the router or in a template):
Meteor.subscribe('allItems');
To learn more about this system you can read the official docs.
Concerning your method "getAllItems": you cannot directly send a cursor (Items.find()) over the wire, which is why you are getting the error message "Maximum call stack size exceeded".
But you can send an array of the data by returning Items.find().fetch(). Also, the call to a Meteor method is asynchronous, so you have to use the callback (more on Meteor methods in the docs).
Please note that by sending data over a method (which is perfectly acceptable) you lose the reactivity offered by the publish/subscribe system.
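Putting that together, a minimal sketch of the method-based approach (where exactly you assign the result in your controller is only illustrative):
// server/item-collection-methods.js
Meteor.methods({
    getAllItems: function () {
        // fetch() turns the cursor into a plain array that can be serialized
        return Items.find().fetch();
    }
});

// client: the call is asynchronous, so use the callback
Meteor.call('getAllItems', function (err, res) {
    if (err) { console.error(err); return; }
    // res is the array of items; assign it to whatever your controller/scope uses,
    // e.g. self.items = res; (with var self = this; captured in the controller)
});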
Please help, I'm inexperienced in programming. I created a route router.get('/statistics', routesCtrl.statistics) for a page where I want to display some statistics in charts, using Angular.
I realized that I need to send some query results, like how many registrations are in the db, how many of them have "this property", how many of them have "those properties", and so on; this is something new for me. Until now, the responses of my routes contained only one statistic from the list above.
How should I send this info to Angular? Should I create a literal object containing those statistics, pack it into the response, and send it to Angular? Or should I send the entire JSON from the db and parse it in Angular? That also seems like a wrong idea, because I can't then use Mongoose queries from Angular (if that's true :) ).
I don't have any other ideas, and there are probably more professional ways. Thank you!
It is very simple: whenever you need data from the server side (node.js in your case, but it could just as well be PHP, ASP.NET, etc.), you send an HTTP request from Angular using $http (GET/POST), along with the required parameters as a JSON object or query string. At the server end you write a REST-style handler for this, use the sent parameters to query Mongo, and then pass the collected data back to Angular as a JSON object. In Angular you can parse this JSON; Angular can also use a JSON object/array directly in the view with directives such as ng-repeat, depending on your exact requirement.
Request in angular.js
var sendConfig = {
    method: "POST",
    url: "YOUR_URL",
    data: { action: 'Test_Service' },
    headers: {}
};

$http(sendConfig).success(function(response) {
    console.log(response);
}).error(function(error) {
    console.log(error);
});
Response in node.js
var server = http.createServer(function(req, res) {
    var webservice_actions_instance = new webservice_actions(req, res);
    var service_response = { data: 'RESPONSE_HERE_AFTER_CERTAIN_ALGOS' };
    res.writeHead(200, {
        'Content-Type': 'application/json',
        'Access-Control-Allow-Origin': '*',
        'Access-Control-Allow-Headers': 'X-Requested-With'
    });
    // Serialize the object; otherwise it would be sent as "[object Object]"
    res.write(JSON.stringify(service_response));
    res.end();
}).listen(8080);
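In your case, since you already have an Express route (router.get('/statistics', routesCtrl.statistics)), the controller can gather the counts and send them back as a single JSON object. A rough sketch assuming Mongoose, where the Registration model and the field names are only placeholders:
// routesCtrl.statistics: gather several counts and send them as one object
exports.statistics = function (req, res) {
    Registration.count({}, function (err, total) {
        if (err) return res.status(500).json({ error: err.message });
        Registration.count({ someProperty: true }, function (err, withProperty) {
            if (err) return res.status(500).json({ error: err.message });
            // Angular receives this object directly, e.g. via $http.get('/statistics')
            res.json({ total: total, withProperty: withProperty });
        });
    });
};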