Accessing MySQL database in node.js - internal assertion fails

I'm trying to connect to a MySQL database from node.js using the db-mysql library. Everything works fine with the example code, but now I'm trying to write a simple ORM.
This is my code:
require.paths.unshift('/usr/lib/node_modules'); // default require.paths doesn't include this

function Model() {
  this.database = null;
}

Model.prototype.getDB = function (callback) {
  if (this.database != null)
    callback(this.database);
  else {
    console.log("Connecting to database...");
    this.database = require("db-mysql").Database({
      "hostname": "localhost",
      "user": "root",
      "password": "pass",
      "database": "miku",
    }).on("ready", function () {
      console.log("Database ready, saving for further use.");
      this.database = this;
      callback(this);
    });
  }
}

exports.Model = Model;
And the simple code I use for testing:
orm = require('./orm');
model = new orm.Model();
model.getDB(function (db) { console.log("callback'd"); });
It looks fine (at least to me), but node.js fails with an internal assertion:
Connecting to database...
node: /usr/include/node/node_object_wrap.h:61: void node::ObjectWrap::Wrap(v8::Handle<v8::Object>): Assertion `handle->InternalFieldCount() > 0' failed.
Aborted (core dumped)
After further investigation, it fails before/during creation of the Database object. I have no idea why this happens. I first thought it was because I was using the git version of node.js, but it fails with the current release (v0.4.7) as well.
I'm trying to track down the issue in the node.js source, with no success so far.

Oh, it was my own hard fail. It seems that having a Python background is a liability in the JavaScript world ;).
Database initialization should look like:
db = require("db-mysql");
new db.Database({
  "hostname": "localhost",
  "user": "root",
  "password": "pass",
  "database": "miku",
}).on("ready", function () {
  console.log("Database ready, saving for further use.");
  this.database = this;
  callback(this);
}).connect();
So, I basically forgot to call new and connect().
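For completeness, here is a sketch of how getDB could look with that fix applied (the self variable is my own addition; inside the "ready" handler this refers to the Database object rather than the Model, so the Model instance has to be captured beforehand):

var db = require("db-mysql");

Model.prototype.getDB = function (callback) {
  if (this.database != null) {
    callback(this.database);
    return;
  }
  console.log("Connecting to database...");
  var self = this; // keep a reference to the Model instance
  new db.Database({
    "hostname": "localhost",
    "user": "root",
    "password": "pass",
    "database": "miku"
  }).on("ready", function () {
    console.log("Database ready, saving for further use.");
    self.database = this; // here "this" is the Database object
    callback(this);
  }).connect();
};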

Related

Query MariaDB from NodeJS without Caching from MariaDB

I'm trying to query MariaDB from my NodeJS app without using MariaDB's caching. I use Sequelize to query the database.
Everything works fine, but I noticed there is definitely some issue with caching: the select query keeps reporting for a long time that a new user does not exist, even though an earlier MariaDB insert already created the record.
To check the user I use:
isUserExists: async function (inUser, inChat, inLevel) {
  const iskunde = await kunden.findAll({
    where: {
      kunde: inUser,
      chat: inChat,
      level: inLevel,
      created_at: {
        [Op.gte]: Sequelize.literal("DATE_SUB(NOW(), INTERVAL 15 MINUTE)"),
      },
    },
    raw: true,
  });
  //var currentdate = new Date();
  //console.log(currentdate);
  //console.log(iskunde);
  if (Array.isArray(iskunde) && iskunde.length) {
    return true;
  } else {
    return false;
  }
},
This is becoming a big problem for my app, so I'm trying to find a way to get "realtime" data from the database. From what I've read, a SELECT with SQL_NO_CACHE is possible, but I can't find any information on how to do this with Sequelize. Any idea?
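One thing worth experimenting with (a sketch only, not a verified fix; it assumes access to the underlying sequelize instance and that the table behind the kunden model is actually named kunden) is a raw query, so that the SQL_NO_CACHE hint reaches MariaDB verbatim:

const { QueryTypes } = require("sequelize");

async function isUserExistsNoCache(sequelize, inUser, inChat, inLevel) {
  // Raw SQL keeps the SQL_NO_CACHE hint intact
  const rows = await sequelize.query(
    "SELECT SQL_NO_CACHE kunde FROM kunden " +
    "WHERE kunde = :kunde AND chat = :chat AND level = :level " +
    "AND created_at >= DATE_SUB(NOW(), INTERVAL 15 MINUTE)",
    {
      replacements: { kunde: inUser, chat: inChat, level: inLevel },
      type: QueryTypes.SELECT,
    }
  );
  return rows.length > 0;
}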

Node.Js using tmi.js and cleverbot.io trouble

I am currently having trouble with a Twitch bot that I am trying to make. I decided to try out Node.js, but I am running into a couple of errors. I am using the tmi.js and cleverbot.io libraries, installed via npm. My code so far is shown below:
var tmi = require('tmi.js');
var cleverbot = require('cleverbot.io');

var options = {
  options: {
    debug: true
  },
  connections: {
    cluster: "aws",
    reconnect: true
  },
  identity: {
    username: "TwitchCleverBot",
    password: "APIKEY"
  },
  channels: ["klausmana"]
};

var client = new tmi.client(options);
var smartbot = new cleverbot('APIUSERNAME', 'APIKEY');

client.connect();

client.on("chat", function (channel, userstate, message, self) {
  if (self) {
    return;
  }
  if (message.toLowerCase().includes("cleverbot")) {
    var lowmessage = message.toLowerCase();
    var newmessage = lowmessage.replace("cleverbot", " ");
    smartbot.create(function (err, session) {
      smartbot.ask(newmessage, function (err, response) {
        client.action(channel, response);
      });
    });
  }
});
This is all the code I have in my app.js so far. The error occurs when I try to make a request to cleverbot.io, so the tmi.js part works (as far as I can tell). It gives me the following error:
Apparently, a JSON parse is being attempted on an HTML file, but I really do not understand where and how that happens. If anyone is able to help, I would really appreciate it.
P.S.: The project is indeed a Twitch bot, but my problem is with Node.js and JavaScript, so that is why I decided to turn to StackOverflow.
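A structural side note (a sketch only, reusing just the calls already shown above): the cleverbot.io session can be created once at startup instead of on every chat message, and checking err before using response makes it easier to see what the API actually returned:

var smartbot = new cleverbot('APIUSERNAME', 'APIKEY');

// Create the session once, then answer chat messages with the same session
smartbot.create(function (err, session) {
  if (err) {
    console.error("cleverbot.io session error:", err);
    return;
  }

  client.on("chat", function (channel, userstate, message, self) {
    if (self || !message.toLowerCase().includes("cleverbot")) {
      return;
    }
    var question = message.toLowerCase().replace("cleverbot", " ");
    smartbot.ask(question, function (err, response) {
      if (err) {
        console.error("cleverbot.io ask error:", err);
        return;
      }
      client.action(channel, response);
    });
  });
});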

Meteor CollectionFS Collection is Undefined?

I am trying to use CollectionFS and GridFS to upload some images to my app and serve them back.
I have the following definitions:
ImageStore.js:
var imageStore = new FS.Store.GridFS("images", {
  mongoUrl: 'mongodb://127.0.0.1:27017/test/',
  transformWrite: myTransformWriteFunction,
  transformRead: myTransformReadFunction,
  maxTries: 1,
  chunkSize: 1024 * 1024
});

EventImages = new FS.Collection("images", {
  stores: [imageStore]
});
ImageStorePub.js:
Meteor.publish("EventImages", function () {
  return EventImages.find();
});
ImageUploadHandler.js:
if (Meteor.isServer) {
  EventImages.allow({
    'insert': function () {
      // add custom authentication code here
      return true;
    }
  });
}
After typing all of this, I tried wrapping everything in an if (Meteor.isServer) { ... } block, despite the fact that the files are already in my server folder, but my app still crashes with ReferenceError: EventImages is not defined
at server/route handlers/ImageUploadHandler.js:2:1
My mistake was that I did not define the collection variable on both the client and the server.
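A minimal sketch of that fix (the lib/ placement is my own choice; any directory that loads on both client and server works):

// lib/ImageStore.js -- loads on both client and server
var imageStore = new FS.Store.GridFS("images", {
  mongoUrl: 'mongodb://127.0.0.1:27017/test/'
});

// No "var": EventImages becomes a global the other files can see
EventImages = new FS.Collection("images", {
  stores: [imageStore]
});

The publish function and the server-only allow rules can then stay in the server folder as they are.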

Meteor: Not able to upload image to S3 using CollectionFS

I am trying to test the upload functionality using this guide, with the only exception of using the cfs-s3 package. It is very basic, simple code, but I am getting an error in the client console: Error: Access denied. No allow validators set on restricted collection for method 'insert'. [403]
I get this error even though I have set the allow insert rule in every possible way.
Here is my client code:
// client/images.js
var imageStore = new FS.Store.S3("images");

Images = new FS.Collection("images", {
  stores: [imageStore],
  filter: {
    allow: {
      contentTypes: ['image/*']
    }
  }
});

Images.deny({
  insert: function () {
    return false;
  },
  update: function () {
    return false;
  },
  remove: function () {
    return false;
  },
  download: function () {
    return false;
  }
});

Images.allow({
  insert: function () {
    return true;
  },
  update: function () {
    return true;
  },
  remove: function () {
    return true;
  },
  download: function () {
    return true;
  }
});
And there is a simple file input button on the homepage -
// client/home.js
// (this handler sits inside the template's event map, e.g. Template.home.events({ ... }))
'change .myFileInput': function (e, t) {
  FS.Utility.eachFile(e, function (file) {
    Images.insert(file, function (err, fileObj) {
      if (err) {
        console.log(err); // --- THIS is the error
      } else {
        // handle success depending on what you need to do
        console.log("fileObj id: " + fileObj._id);
        //Meteor.users.update(userId, {$set: imagesURL});
      }
    });
  });
}
I have set the proper policies and everything on S3, but I don't think this error is related to S3 at all.
// server/images.js
var imageStore = new FS.Store.S3("images", {
  accessKeyId: "xxxx",
  secretAccessKey: "xxxx",
  bucket: "www.mybucket.com"
});

Images = new FS.Collection("images", {
  stores: [imageStore],
  filter: {
    allow: {
      contentTypes: ['image/*']
    }
  }
});
I have also published and subscribed to the collections appropriately. I have been digging around for hours but can't seem to figure out what is happening.
EDIT: I just re-added the insecure package and everything works now. So basically the problem is with the allow/deny rules, even though I am setting them. I am not sure why they are not being acknowledged.
You need to define the FS.Collection's allow/deny rules in server-only code. These are server-side rules applied to the underlying Mongo.Collection that FS.Collection creates.
The best approach is to export the AWS keys as the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY, remove the accessKeyId and secretAccessKey options from the FS.Store, and then move the FS.Collection constructor calls to run on both the client and server. The convenience of using env vars is mentioned on the cfs:s3 page.
In addition, you can control the bucket name using Meteor.settings.public, which is handy when you want to use different buckets per environment.
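A minimal sketch of that layout (the file names, the lib/ placement, and the settings key imageBucket are my own choices for illustration):

// lib/images.js -- runs on both client and server
// AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY come from the environment
var imageStore = new FS.Store.S3("images", {
  bucket: Meteor.settings.public.imageBucket
});

Images = new FS.Collection("images", {
  stores: [imageStore],
  filter: {
    allow: {
      contentTypes: ['image/*']
    }
  }
});

// server/images-allow.js -- allow/deny rules live in server-only code
Images.allow({
  insert: function () { return true; },
  update: function () { return true; },
  remove: function () { return true; },
  download: function () { return true; }
});

Anything placed under Meteor.settings.public is available on the client as well, which is why the bucket name can be configured there.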

How do I use persist with node.js?

I am attempting to get familiar with this persist package for node. Can someone tell me whether the connection is being made here with persist.connect, or are these just configuration properties?
var persist = require("persist");
var type = persist.type;

// define some model objects
Phone = persist.define("Phone", {
  "number": type.STRING
});

Person = persist.define("Person", {
  "name": type.STRING
}).hasMany(this.Phone);

persist.connect({
  driver: 'sqlite3',
  filename: 'test.db',
  trace: true
}, function (err, connection) {
  Person.using(connection).all(function (err, people) {
    // people contains all the people
  });
});
The code above runs without error on my system if I just change this.Phone to Phone.
persist.connect "connects" to the test.db file, which contains the database; the object you pass it holds the connection properties (driver, filename, and so on), and the opened connection is handed to the callback.
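For reference, a sketch of the corrected snippet (the only changes from the original are referencing Phone directly, declaring the models with var, and logging the result):

var persist = require("persist");
var type = persist.type;

var Phone = persist.define("Phone", {
  "number": type.STRING
});

var Person = persist.define("Person", {
  "name": type.STRING
}).hasMany(Phone); // reference Phone directly, not this.Phone

persist.connect({
  driver: 'sqlite3',
  filename: 'test.db',
  trace: true
}, function (err, connection) {
  if (err) throw err;
  // the open connection is what actually talks to test.db
  Person.using(connection).all(function (err, people) {
    console.log(people); // all the people rows
  });
});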
