Basically, the JSON object that's returned from a callback in my gRPC server is empty no matter what I do.
For the most part I'm following this tutorial, except I'm using the sqlite3 package instead of Knex, and I've worked up to the listProducts method. I haven't tried the other product methods yet.
In server.js I get some data from a SQLite3 database, and try to return it in a callback (at the bottom of the method). I also print out the data from the DB to confirm I'm actually getting valid data.
gRPC server.js
function listProducts(call, callback) {
    console.log("******** Listed the products *********");
    var data = "";
    let db = new sqlite3.Database('../data/testDB.db', sqlite3.OPEN_READONLY, (err) => {
        if (err) {
            console.error(err.message);
        }
        console.log("connected to DB");
    });
    db.serialize(() => {
        db.get('SELECT NAME as name FROM PEEPS', (err, row) => {
            if (err) {
                console.error(err.message);
            }
            console.log(row.name);
            data.name = row.name;
        });
    });
    db.close((err) => {
        if (err) {
            console.error(err.message);
        }
        console.log('closed db');
    });
    callback(null, { products: data.name });
}
Output from gRPC server.js
******** Listed the products *********
connected to DB
Jeff // Correct data from DB.
closed db
The callback returns to client.js, where it was called. However, the object is always empty.
If I uncomment res.json({ name: "jessie" }); and comment res.json(result);, the code works as expected; name: jessie is sent to the browser as a JSON object.
So that tells me the data is being handled correctly from the client to the browser. Therefore the problem is in how the data is passed from server.js to client.js.
gRPC client.js
// requirements
const path = require('path');
const protoLoader = require('@grpc/proto-loader');
const grpc = require('grpc');

// gRPC client
const productProtoPath = path.join(__dirname, '..', '..', 'protos', 'product.proto');
const productProtoDefinition = protoLoader.loadSync(productProtoPath);
const productPackageDefinition = grpc.loadPackageDefinition(productProtoDefinition).product;
const client = new productPackageDefinition.ProductService('localhost:50051', grpc.credentials.createInsecure());

// handlers
const listProducts = (req, res) => {
    client.listProducts({}, (err, result) => {
        console.log(result);
        console.log(typeof result);
        // console.log(res.json(result));
        res.json(result);
        // res.json({ name: "jessie" });
        console.log("*******************");
    });
};
Output from gRPC client.js
Server listing on port 3000
{} //Oh no! An empty JSON object!
object
*******************
Edit
Here is a link to my repository: https://github.com/burke212/grpc-node
The main problem here is that in your server code, your db methods are asynchronous but you are trying to access the result synchronously. You need to call the main callback for listProducts in the callback for db.get to ensure that you have the result of that database request before trying to use it. After making this change your listProducts method implementation should look more like this:
function listProducts(call, callback) {
    let db = new sqlite3.Database('../data/testDB.db', sqlite3.OPEN_READONLY);
    db.serialize(() => {
        db.get('SELECT NAME as name FROM PEEPS', (err, row) => {
            if (err) {
                console.error(err.message);
            }
            // Call the callback here to use the result of db.get
            callback(null, { products: row.name });
        });
    });
    db.close();
}
For simplicity I omitted the logging. Also, the sqlite3.Database constructor and db.close do not have callbacks in the example in the sqlite3 README. I suggest checking again whether those functions actually take callbacks.
In addition to that, now that you have shared the product.proto file that defines your service, there is another problem. The listProducts method in the ProductService service is declared as returning a ProductList object. In that message type, the products field must be an array of Product objects. All of the code in your method implementation is directed towards returning a string in that field, and that does not result in a compatible object.
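For reference, here is a minimal sketch of what a compatible implementation could look like, assuming the Product message has a single name field (adjust the mapping to whatever fields product.proto actually declares):

function listProducts(call, callback) {
    let db = new sqlite3.Database('../data/testDB.db', sqlite3.OPEN_READONLY);
    // db.all returns every matching row, so an array of Product objects can be built from it
    db.all('SELECT NAME as name FROM PEEPS', (err, rows) => {
        if (err) {
            return callback(err);
        }
        // products is an array of Product-shaped objects, not a string
        callback(null, { products: rows.map(row => ({ name: row.name })) });
        db.close();
    });
}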
Related
Some context
I've created a service worker to send notifications to registered users.
It worked well until I tried to attach a sort of id to each person who registers with the service worker (to send notifications).
I do that because I have to delete old registrations from my database, so I decided to allow each user three registrations (one for a mobile device and two others for different browsers on a computer); if there are more, I want to remove the oldest one from the database.
Tools
I'm using Node.js, Express and MySQL for the database.
The issue
When I launch a subscription I get this error:
SyntaxError: Unexpected token o in JSON at position 1
at JSON.parse (<anonymous>)
I saw in another post that this happens when you try to JSON.parse something that is already an object.
But in my case, I can't find where I parse. Here is the relevant part:
// service.js (service worker file)
// saveSubscription saves the subscription to the backend
const saveSubscription = async (subscription, usrCode) => {
    const SERVER_URL = 'https://mywebsite:4000/save-subscription'
    subscription = JSON.stringify(subscription);
    console.log(subscription); // I got here what I expect
    console.log(usrCode); // <-------------------------------- HERE I GET UNDEFINED
    const response = await fetch(SERVER_URL, {
        method: 'post',
        headers: {
            'Content-Type' : 'application/json',
        },
        body : {
            subscription: subscription,
            usrCode: usrCode
        }
    })
    return response
}
But when I console.log(usrCode) in my inspector, I get the correct value.
So how can I get the value in service.js?
Maybe the problem comes from:
const bodyParser = require('body-parser')
app.use(bodyParser.json())
At the beginning I thought the issue was in the backend (because I'm not really good with async functions).
Here is the backend, in case I got something wrong there.
// index.js (backend)
// Insert into database
const saveToDatabase = async (subscription, usrCode) => {
    // make the connection to the database.
    pool.getConnection(function (err, connection) {
        if (err) throw err; // not connected!
        console.log(usrCode);
        console.log(subscription);
        connection.query(`INSERT INTO webpushsub (webpushsub_info, webpushsub_code) VALUES ('${subscription}', '${usrCode}')`, function (err, result, fields) {
            // if any error while executing above query, throw error
            if (err) throw err;
            // if there is no error, you have the result
            console.log(result);
            connection.release();
        });
    });
}

// The new /save-subscription endpoint
app.post('/save-subscription', async (req, res) => {
    const usrCode = req.body.usrCode; // <------------------ I'm not sure about this part
    const subscription = req.body.subscription
    await saveToDatabase(JSON.stringify(subscription, usrCode)) // Method to save the subscription to the database
    res.json({ message: 'success' })
})
By searching on Google, I found this tutorial. The reason usrCode is undefined is that the service worker doesn't have access to data stored in the front end.
First you have to pass it in the URL, as follows:
// swinstaller.js (front)
// SERVICE WORKER INITIALIZATION
const registerServiceWorker = async (usrCode) => {
    const swRegistration = await navigator.serviceWorker.register('service.js?config=' + usrCode); // notice the file name
    return swRegistration;
}
And then get it in the service worker:
// service.js (service worker file)
// get the usrCode
const usrCode = new URL(location).searchParams.get('config');
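With the code available in the service worker, saveSubscription no longer needs usrCode as a parameter. A minimal sketch of how it might use the value, assuming the ?config= approach above (the whole body is passed through JSON.stringify so the server's JSON body parser can read it):

// service.js (service worker file), a sketch assuming the ?config= approach above
const saveSubscription = async (subscription) => {
    const SERVER_URL = 'https://mywebsite:4000/save-subscription';
    const response = await fetch(SERVER_URL, {
        method: 'post',
        headers: {
            'Content-Type': 'application/json',
        },
        // stringify the whole body once so express/body-parser can parse it as JSON
        body: JSON.stringify({ subscription: subscription, usrCode: usrCode })
    });
    return response;
};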
I'm trying to get the temperature data from my Node.js backend sent to React, but I keep getting "res.send is not a function".
Sample code here:
app.get("/gettemperature", (req, res) => {
const email = req.query.email;
let stmt = `SELECT * FROM users WHERE email=?`;
let todo = [email];
db.query(stmt, todo, (err, results, fields) => {
if (err) {
console.error(err.message);
}
if(results.length > 0 ){
let id = results[0].id;
let getID = `SELECT * FROM controlModules WHERE deviceowner=?`;
let getidData = [id];
db.query(getID, getidData, (err, resulta, fields) => {
if (err) {
console.error(err.message);
}
if(resulta.length > 0){
let lanip = resulta[0].ipaddress;
let url = "http://"+lanip+"/data";
http.get(url,(res) => {
let body = "";
res.on("data", (chunk) => {
body += chunk;
});
res.on("end", () => {
try {
let json = JSON.parse(body);
const temp_actual = json.temperature.value;
console.log(temp_actual);
res.setHeader('Content-Type', 'application/json');
res.end(
JSON.stringify({
value: temp_actual
})
);
} catch (error) {
console.error(error.message);
};
});
}).on("error", (error) => {
console.error(error.message);
});
}
});
}
});
});
I really need to return/send the temperature data to my front end, but I'm getting the error above. Is there a different way to return the data?
It looks like you are mixing up an HTTP server you wrote in Node (although you haven't shown any relevant code) and an HTTP client you also wrote in Node.
res is an argument received by the callback you pass to http.get and contains data about the response received by your HTTP client.
Meanwhile, somewhere else (not shown) you have a different variable also called res which is the object your HTTP server uses to send its response to the browser running your React code.
You are calling res.send and wanting res to be the latter but it is really the former.
Since you haven't shown us the HTTP server code, it is hard to say where that res is, but there is a good chance you have shadowed it and can solve your problem by using different names (e.g. client_res and server_res).
That said, I strongly recommend avoiding the http module directly, as its API follows out-of-date design patterns and isn't very friendly. Consider using fetch or axios for making HTTP requests and Express.js for writing HTTP servers.
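To illustrate the renaming idea, here is a sketch of the route from the question with the two response objects kept separate (the database lookups are omitted for brevity, and the 502 status choices are an assumption added for illustration):

app.get("/gettemperature", (req, server_res) => {
    // ... the two db.query lookups from the question go here and produce lanip ...
    const url = "http://" + lanip + "/data";
    http.get(url, (client_res) => {
        let body = "";
        client_res.on("data", (chunk) => {
            body += chunk;
        });
        client_res.on("end", () => {
            try {
                const json = JSON.parse(body);
                // server_res is the Express response, so .json()/.send() exist on it
                server_res.json({ value: json.temperature.value });
            } catch (error) {
                server_res.status(502).json({ error: error.message });
            }
        });
    }).on("error", (error) => {
        server_res.status(502).json({ error: error.message });
    });
});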
I am working on a small backend project. I send GET requests via Postman to an Express.js app. Express sends the request to Mongoose and returns data.
I am trying to make it shorter by writing req.query.data instead of the model name.
req.query.data is the name of a model that is imported into the Node file, but Mongoose's find function reads it as the literal string "req.query.data" instead of the actual model.
I tried wrapping req.query.data in parentheses, but it still didn't read the value. I have no idea how to make it work.
Code:
const Daily = require("./DailyStats/DailySchema")

module.exports.GetData = async (req, res) => {
    await Daily.find({"Date.month": 3}, function (err, data) {
        if (err) {
            console.error(err)
        }
        res.send(data)
    })
}
What I want is
await (req.query.data).find({"Date.month": 3}, function (err, data) {
    if (err) {
        console.error(err)
    }
    res.send(data)
})
While using the second snippet I get the error "Cannot use method find on req.query.data".
find should be called on a mongoose.Model.
You may use mongoose.model(req.query.data), assuming req.query.data is your model name.
That said, you should:
- check that the provided data is nothing but a valid model name
- name data better, e.g. modelName
const mongoose = require('mongoose')

mongoose.connect('mongodb://localhost:27017/dummy')
const NameModel = mongoose.model('Name', { name: String }, 'names')

;(async () => {
    try {
        console.log(mongoose.model('Name') === NameModel) // true
    } finally {
        mongoose.disconnect()
    }
})()
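Applied to the handler from the question, a minimal sketch might look like the following (the whitelist, the 400/500 responses, and the assumption that the schema file registers the model under the name Daily are all additions for illustration):

const mongoose = require('mongoose');
require('./DailyStats/DailySchema'); // assumed to register the Daily model, as in the question

// only expose model names you intend to allow
const allowedModels = ['Daily'];

module.exports.GetData = async (req, res) => {
    const modelName = req.query.data;
    if (!allowedModels.includes(modelName)) {
        return res.status(400).send('Unknown model name');
    }
    try {
        // mongoose.model(name) returns the model registered under that name
        const data = await mongoose.model(modelName).find({ 'Date.month': 3 });
        res.send(data);
    } catch (err) {
        console.error(err);
        res.status(500).send('Query failed');
    }
};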
I do not quite understand how to properly split the logic between controllers and models in Node.js when working on a backend application. Suppose I have this example.
This code is in the model of my application. Logically I understand that the model should only be responsible for fetching from the database and everything else should be done by the controller, but I don't quite understand how to do this. I tried to move part of the code to the controller and export it, but I did not succeed. Please help, at least with this example! The main thing for me is to understand the principle of working with MVC in Node.
exports.currentPostPage = function(req, res){
    db.query('SELECT * FROM `posts`', function (err, result) {
        if (err) {
            console.log(err);
        }
        var post = result.filter(item => {return (item.id == req.params.id)? item: false})[0];
        if (post === undefined) {
            res.render('pages/404');
        } else {
            res.render('pages/post-page', {postId: req.params.id, item: post});
        }
    });
};
So, you're on the right track. There are a lot of different ways to do it depending on preferences, but one pattern I've seen pretty commonly is to use the callback as the way to integrate the two. For example, let's say you have your model file:
exports.getPostById = (id, cb) => {
    db.query('SELECT * FROM `posts` WHERE id=?', [id], function (err, result) {
        if (err) {
            return cb(err); // or, alternatively, wrap this error in a custom error
        }
        // here, your logic is just returning whatever was returned
        return cb(null, result);
    });
};
Note I'm also letting the DB handle the ID lookup, as it's probably more efficient at doing so for larger data sets. You didn't say what DB module you're using, but all the good ones have some way of doing parameterized queries, so use whatever works with your DB driver.
Anyway, the Model file therefore handles just the data interaction; the controller then handles the web interaction:
// postController.js
const model = require('../models/postModel.js'); // or whatever you named it

exports.populatePost = (req, res, next, id) => {
    model.getPostById(id, (err, post) => {
        if (err) return next(err); // centralized error handler
        req.post = post;
        next();
    });
}

exports.getOnePost = (req, res, next) => {
    if (req.post) {
        return res.render('pages/post-page', req.post);
    }
    // again, central error handling
    return next({ status: 404, message: 'Post not found' });
}
I have mentioned central error handling; I vastly prefer it to scattering error handling logic all over the place. So I either make custom errors to represent things, or just do like above where I attach the status and message to an anonymous object. Either will work for our purposes. Then, in a middleware file you can have one or more handlers, the simplest of which looks like this:
// middleware/errors.js
module.exports = (err, req, res, next) => {
console.error(err); // log it
if (err.status) {
return res.status(err.status).render(`errors/${err.status}`, err.message);
}
return res.status(500).render('errors/500', err.message);
}
Finally, in your routing setup you can do things like this:
const postController = require('../controllers/postController');
const errorHandler = require('../middleware/errors.js');

const postRouter = express.Router();
postRouter.param('postId', postController.populatePost);
postRouter.get('/:postId', postController.getOnePost);
// other methods and routes

app.use('/posts', postRouter)

// later
app.use(errorHandler);
As was pointed out in the comments, some folks prefer using the Promise syntax to callbacks. I don't personally find them that much cleaner, unless you also use the async/await syntax. As an example, if your db library supports promises, you can change the model code to look like so:
exports.getPostById = async (id) => {
    // again, this assumes db.query returns a Promise
    return await db.query('SELECT * FROM `posts` WHERE id=?', [id]);
}
Then your controller code would likewise need to change to handle that as well:
// postController.js
const model = require('../models/postModel.js'); // or whatever you named it

exports.populatePost = async (req, res, next, id) => {
    try {
        const post = await model.getPostById(id)
        req.post = post
        return next()
    } catch (err) {
        return next(err)
    }
}
I've got this error when trying to POST
> process.nextTick(function() { throw err; });
> ^
>
> TypeError: first argument must be a string or Buffer
> at ServerResponse.OutgoingMessage.end (_http_outgoing.js:524:11)
The errors show that something is wrong with utils and cursor, both from the mongodb module, but what are they?
Everything works fine on GET but breaks on POST (Postman, passing {"name":"Computer","price":2500} as text). I cannot trace which module or instance is breaking the code.
This is my connection to the db:
// Our primary interface for the MongoDB instance
var MongoClient = require('mongodb').MongoClient;
// Used in order to verify correct return values
var assert = require('assert');
var connect = function (databaseName, callBack) {
    var url = 'mongodb://localhost:27017/' + databaseName;
    MongoClient.connect(url,
        function (error, database) {
            assert.equal(null, error);
            console.log("Succesfully connected to MongoDB instance!");
            callBack(database);
        });
};

exports.find = function (databaseName, collectionName, query, callback) {
    connect(databaseName, function (database) {
        var collection = database.collection(collectionName);
        collection.find(query).toArray(
            // Callback method
            function (err, documents) {
                // Make sure nothing went wrong
                assert.equal(err, null);
                // Print all the documents which we found, if any
                console.log("MongoDB returned the following documents:");
                console.dir(documents)
                callback(err, documents);
                // Close the database connection to free resources
                database.close();
            })
    })
};

exports.insert = function (databaseName, collectionName, object, callback) {
    connect(databaseName, function (database) {
        var collection = database.collection(collectionName);
        collection.insert(document, {w: 1}, function (err, documents) {
            console.log("Added a new document");
            console.log(documents[0]);
            callback(err, documents[0]);
        });
    })
};

exports.remove = function (databaseName, collectionName, object, callback) {
    connect(databaseName, function (database) {
        var collection = database.collection(collectionName);
        collection.remove(object, function (err, result) {
            callback(err, result);
            database.close();
        });
    })
};
The issue is actually pretty straightforward, so I'm surprised that you're not getting a better error message.
In your code:
collection.insert(document, {w: 1}, function (err, documents) {
    console.log("Added a new document");
    console.log(documents[0]); // I expect this to log undefined
    callback(err, documents[0]);
});
The second argument passed into the collection.insert callback is actually a results object, not the documents that were inserted. So, documents[0] ends up being undefined because it's not an array of documents. Thus, when you try to send undefined as a response, it fails.
If your intention is to pass back the newly created document, you're going to have to use the result object to get the _id and attach it to the document you inserted.
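As a rough sketch of that idea, assuming a driver version whose insertOne result exposes insertedId:

exports.insert = function (databaseName, collectionName, object, callback) {
    connect(databaseName, function (database) {
        var collection = database.collection(collectionName);
        collection.insertOne(object, function (err, result) {
            if (err) {
                return callback(err);
            }
            // the result describes the operation; attach the generated _id
            // to the document that was inserted and hand that back instead
            object._id = result.insertedId;
            callback(null, object);
            database.close();
        });
    });
};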
As a side note, I would consider keeping a connection open to your database rather than creating a new connection every time you want to talk with Mongo.
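A sketch of that as well, assuming the rest of the module is happy to share a single connection opened once at startup:

var MongoClient = require('mongodb').MongoClient;

var db = null;

// call this once when the app starts, then reuse the handle for every request
exports.connect = function (databaseName, callback) {
    MongoClient.connect('mongodb://localhost:27017/' + databaseName, function (err, database) {
        if (err) {
            return callback(err);
        }
        db = database;
        callback(null, db);
    });
};

exports.find = function (collectionName, query, callback) {
    // no per-call connect/close; the driver's connection pool is reused
    db.collection(collectionName).find(query).toArray(callback);
};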