I'm trying to extract data from a request (in this case a POST) and am having trouble. I'm doing so using the body-parser module. Below is a portion of my code (note I am using ES6 syntax):
let bodyParser = require('body-parser')
var urlEncodedParser = bodyParser.urlencoded({extended: true})
app.post('*', setFileMeta, setDirDetails, urlEncodedParser, (req, res, next) => {
(async () => {
if (!req.stat) return res.send(405, 'File does not exist')
if (req.isDir) return res.send(405, 'Path is a directory') // This is an advanced case
await fs.promise.truncate(req.filePath, 0)
req.pipe(fs.createWriteStream(req.filePath)) // Filepath is a file
// This below line is where I need the body
sendToClients('update', req.url, 'file', req.body, Date.now())
res.end()
})().catch(next)
})
For the actual extraction of the data using body-parser, urlencoded is the only way I was able to do it successfully (the data is just a string for now), and it gives me the data in the format {content: ''}, where content is the actual string I'm sending. This isn't ideal, but it works for this simple case. However, it breaks the createWriteStream(req.filePath) call seen above - the file is created, but it has no content.
There must be something obvious that I'm doing incorrectly, as I'm new to Node and Express. I wrote the majority of this with the help of an instructional video, so my gut tells me the problem is the body extraction part, since that's what I did on my own.
body-parser exhausts (fully reads) the request stream in order to parse the incoming parameters, so there's no data left in the request stream to write to your file.
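One way to see the consequence of that: if you keep the urlencoded parser in place, the data only exists in req.body, so you have to write the parsed value to disk rather than piping the (now empty) request stream. A rough sketch reusing the names from the question (it assumes the string lives in req.body.content as described, and uses Node's built-in fs.promises API; adapt to whatever promisified fs helper you already use):
app.post('*', setFileMeta, setDirDetails, urlEncodedParser, (req, res, next) => {
  (async () => {
    if (!req.stat) return res.send(405, 'File does not exist')
    if (req.isDir) return res.send(405, 'Path is a directory')
    // The request stream has already been consumed by body-parser,
    // so persist the parsed field instead of piping req.
    await fs.promises.writeFile(req.filePath, req.body.content || '')
    sendToClients('update', req.url, 'file', req.body, Date.now())
    res.end()
  })().catch(next)
})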
It seems to me that you're trying to implement file uploads. In that case, you probably want to use a module like multer instead of body-parser.
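A minimal multer sketch, assuming the client sends a multipart/form-data upload with the file in a field named "file" (the field name and destination directory are illustrative, not from the question):
const multer = require('multer')
const upload = multer({ dest: 'uploads/' })

app.post('*', setFileMeta, setDirDetails, upload.single('file'), (req, res, next) => {
  // multer has already streamed the upload to disk;
  // req.file describes it and req.body holds any other text fields.
  console.log(req.file.path, req.file.originalname)
  res.end()
})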
I am creating a MERN app that adds meta tags to React pages without SSR. So, I need to read the query inside the main file of the server and pass the appropriate metadata content to each page.
I am using this in the server.js file:
const indexPath = path.resolve(__dirname, 'build', 'index.html');
// static resources should just be served as they are
app.use(express.static(
path.resolve(__dirname, 'build'),
{ maxAge: '30d' },
));
// here we serve the index.html page
app.get('/*', (req, res, next) => {
fs.readFile(indexPath, 'utf8', (err, htmlData) => {
if (err) {
console.error('Error during file reading', err);
return res.status(404).end()
}
// get post info
const postId = req.query.id;
const post = getPostById(postId);
if(!post) return res.status(404).send("Post not found");
// inject meta tags
htmlData = htmlData.replace(
"<title>React App</title>",
`<title>${post.title}</title>`
)
.replace('__META_OG_TITLE__', post.title)
.replace('__META_OG_DESCRIPTION__', post.description)
.replace('__META_DESCRIPTION__', post.description)
.replace('__META_OG_IMAGE__', post.thumbnail)
return res.send(htmlData);
});
});
Here getPostById is statically defined in a file, but I want to fetch the post from my DB.
My file structure is:
server.js
controllers
- posts.js
routes
- posts.js
I've separated the logic from the routes. So my routes/posts.js file looks like:
import express from 'express';
import { getPost, createPost } from '../controllers/posts.js';
const router = express.Router();
router.get('/', getPost);
router.post('/', createPost);
export default router;
So, in order to dynamically pass the meta content, I need to read from the API endpoint for each request and pass the appropriate data. For this, I need to call the endpoints directly inside my Node project. How do I do that?
I'd appreciate any help. Thank you.
If you really want to call your own http endpoints, you would use http.get() or some higher level http library (that is a little easier to use) such as got(). And, then you can make an http request to your own server and get the results back.
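If you do go that route, here's a small sketch using the built-in http module (the port and the /posts mount path are assumptions here, not taken from your code):
const http = require('http');

function fetchOwnEndpoint(path) {
  return new Promise((resolve, reject) => {
    http.get({ host: 'localhost', port: 3000, path }, (res) => {
      let body = '';
      res.on('data', (chunk) => { body += chunk; });
      res.on('end', () => resolve(JSON.parse(body)));
    }).on('error', reject);
  });
}

// e.g. const post = await fetchOwnEndpoint('/posts?id=' + postId);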
But ... usually, you do not make http requests to your own server. Instead, you encapsulate the functionality that gets you the data you want in a function and you use that function both in the route and in your own code that wants the same data as the route. This is a ton more efficient than packaging up an http request, sending that request to the TCP stack, having that request come back to your server, parsing that request, getting the data, forming it as an http response, sending that response back to the requester, parsing that response, then using the data.
Instead, if you have a common, shared function, you just call the function, get the result from it (probably via a promise) and you're done. You don't need all that intermediate packaging into the http request/response, parsing, loopback network, etc...
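A rough sketch of that shared-function approach, reusing your existing getPostById name (the Post model import is an assumption, not from your code; adjust to however you access your DB):
// controllers/posts.js
import Post from '../models/post.js'; // assumed Mongoose model

// Shared data-access function: fetches the post from the DB.
export const getPostById = (id) => Post.findById(id);

// The API route handler reuses it...
export const getPost = async (req, res) => {
  const post = await getPostById(req.query.id);
  if (!post) return res.status(404).json({ error: 'Post not found' });
  res.json(post);
};

// ...and so does server.js, with no HTTP round trip:
// import { getPostById } from './controllers/posts.js';
// const post = await getPostById(req.query.id);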
I am writing my first very simple express server for a data collection purpose. This seems like a beginner question but I failed to find an answer so far. The data is very small (less than 500 integers) and will never grow, but it should be able to be changed through POST requests.
I essentially (slightly simplified) want to:
Have the data in a .json file that is loaded when the server starts.
On a POST request, modify the data and update the .json file.
On a GET request, simply send the .json containing the data.
I don't want to use a database for this, as the data is just a single small array that will never grow in size. My main uncertainties are how to handle modifying the global data and reading/writing the file safely, i.e. concurrency, and how exactly Node runs the code.
I have the following:
const express = require('express');
const fs = require('fs');
let data = JSON.parse(fs.readFileSync('./data.json'));
const app = express();
app.listen(3000);
app.use(express.json());
app.get("/", (req, res) => {
res.sendFile('./data.json', { root: __dirname });
});
app.post("/", (req, res) => {
const client_data = req.body;
// modify global data
fs.writeFileSync("./data.json", JSON.stringify(data), "utf8");
});
Now I have no idea if or why this is safe to do. For example, modifying the global data variable and writing to file. I first assumed that requests cannot run concurrently without explicitly using async functions, but that seems to not be the case: I inserted this:
const t = new Date(new Date().getTime() + 5000);
while(t > new Date()){}
into the app.post(.. call to try and understand how this works. I then made simultaneous POST requests and they finished at the same time, which I did not expect.
Clearly, the callback I pass to app.post(.. is not executed all at once before other POST requests are handled. But then I have a callback running concurrently for all POST requests, and modifying the global data and writing to file is unsafe / a race condition. Yet all code I could find online did it in this manner.
Am I correct here? If so, how do I safely modify the data and write it to file? If not, I don't understand how this code is safe at all?
Code like that actually opens up your system to race conditions. Node runs that code in a single-threaded way, but when you start opening files and doing other I/O, that work can be handled by multiple threads behind the scenes (file operations are not run by the JavaScript thread; they are delegated to the OS).
If you really, really want to use files as your global data, then I guess you can use an operating system concept called mutual exclusion. Basically, it's a 'lock' used to prevent race conditions by forcing processes to wait while something else is currently accessing the shared resource (or while the shared resource is busy). In Node, this can be implemented in many ways, but one recommendation is to use the async-mutex library to handle concurrent connections and concurrent data modifications. You can do something like:
const express = require('express');
const fs = require('fs');
const Mutex = require('async-mutex').Mutex;
// Initializes shared mutual exclusion instance.
const mutex = new Mutex()
let data = JSON.parse(fs.readFileSync('./data.json'));
const app = express();
app.listen(3000);
app.use(express.json());
app.get("/", (req, res) => {
res.sendFile('./data.json', { root: __dirname });
});
// Turn this into an asynchronous function.
app.post("/", async (req, res) => {
  const client_data = req.body;
  const release = await mutex.acquire();
  try {
    fs.writeFileSync('./data.json', JSON.stringify(data), 'utf8');
    res.status(200).json({ status: 'success' });
  } catch (err) {
    res.status(500).json({ err });
  } finally {
    release();
  }
});
You can also achieve a result similar to the async-mutex library by chaining your writes onto a single shared promise (starting from Promise.resolve()), so that each write waits for the one before it.
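A rough sketch of that hand-rolled approach (treat it as an illustration, not drop-in code; it reuses the fs, data and app variables from the example above):
// One shared promise acts as the queue; every write is chained onto it.
let writeQueue = Promise.resolve();

function enqueueWrite(task) {
  const result = writeQueue.then(task);
  // Keep the chain alive even if a task rejects.
  writeQueue = result.catch(() => {});
  return result;
}

app.post('/', (req, res) => {
  enqueueWrite(async () => {
    // modify the global data here, then persist it
    await fs.promises.writeFile('./data.json', JSON.stringify(data), 'utf8');
  })
    .then(() => res.status(200).json({ status: 'success' }))
    .catch((err) => res.status(500).json({ err }));
});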
Note that I still recommend you use a database instead, as it is much better suited to this and abstracts a lot of these concerns for you.
References:
Node.js Race Conditions
EDIT:
The title has been changed so that anyone can suggest an alternative solution that achieves the same result with similar technology and platform. It doesn't necessarily have to use res.attachment.
I was trying to force a download of a PDF file from a cross-origin URL. The code seems to work as expected, but the downloaded file is ZERO BYTES - why?
On server:
app.get("/download", (req, res) => {
res.type("application/octet-stream");
// Note that the PDF URL is Cross-Origin.
res.attachment("https://cross-origin-domain-name.com/downloads/fiename.pdf");
res.send();
});
On HTML:
<a class="a-very-big-button" href="/download">Download PDF</a>
Am I missing anything? I did try many other options, like the res.download() and readStream.pipe(res) methods, but most of them require the files to be on the same server. For my app, I need to help my clients offer a download-PDF button based on the URL submitted by them, which could be on their own web server. Any advice would be appreciated! Thank you.
res.attachment does take a string as its only argument, but that string is used as a hint to the browser what the filename should be if the user decides to save the file. It does not allow you to specify a URL or filename to fetch.
Because you're not sending any data (res.send() without a Buffer or .write() calls), just a suggestion as to what the filename should be, the download is 0 bytes.
What you could do is pipe an HTTP request to res, which will have your server download and forward the file. The file will not be cached on your server and will 'cost' both upload and download capacity (but no storage).
Here's an example of how to pipe an HTTPS request to a response.
Instead of Node's built-in https.request you could use many other libraries. Most of them support streaming files. These libraries can make it easier to handle errors.
const express = require('express');
const https = require('https');
const app = express();
const url = 'https://full-url-to-your/remote-file.pdf';
const headerAllowList = [
'content-type', 'content-length', 'last-modified', 'etag'
];
app.use('/', async (req, res, next) => {
// Create a HTTPS request
const externalRequest = https.request(url, {
headers: {
// You can add headers like authorization or user agent strings here.
// Accept: '*/*',
// 'User-Agent': '',
},
}, (externalResponse) => {
// This callback won't start until `.end()` is called.
// To make life easier on ourselves, we can copy some of the headers
// that the server sent your Node app and pass them along to the user.
headerAllowList
.filter(header => header in externalResponse.headers)
.forEach(header => res.set(header, externalResponse.headers[header]));
// If we didn't have content-type in the above headerAllowList,
// you could manually tell browser to expect a PDF file.
// res.set('Content-Type', 'application/pdf');
// Suggest a filename
res.attachment('some-file.pdf');
// Start piping the ReadableStream to Express' res.
externalResponse.pipe(res);
});
externalRequest.on('error', (err) => {
next(err);
});
// Start executing the HTTPS request
externalRequest.end();
});
app.listen(8000);
If you visit localhost:8000 you'll be served a PDF with a save-file dialog with the suggested filename, served from the specified URL.
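For comparison, a higher-level client such as got shortens the proxy considerably. This is only a sketch: it assumes a CommonJS-compatible version of got (v11 or earlier) and adds a dependency that is not part of the original example.
const express = require('express');
const got = require('got'); // assumed dependency

const app = express();
const url = 'https://full-url-to-your/remote-file.pdf';

app.use('/', (req, res, next) => {
  res.attachment('some-file.pdf');
  const stream = got.stream(url); // readable stream of the remote response body
  stream.on('error', next);       // surface network errors to Express
  stream.pipe(res);
});

app.listen(8000);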
I am creating a PUT method to update a given set of data; however, in my update function, req.body is undefined.
controller.js
async function reviewExists(req, res, next) {
const message = { error: `Review cannot be found.` };
const { reviewId } = req.params;
if (!reviewId) return next(message);
let review = await ReviewsService.getReviewById(reviewId);
if (!review) {
return res.status(404).json(message);
}
res.locals.review = review;
next();
}
async function update(req, res, next) {
console.log(req.body);
const knexInstance = req.app.get('db');
const {
review: { review_id: reviewId, ...review },
} = res.locals;
const updatedReview = { ...review, ...req.body.data };
const newReview = await ReviewsService.updateReview(
reviewId,
updatedReview,
knexInstance
);
res.json({ data: newReview });
}
service.js
const getReviewById = (reviewId) =>
knex('reviews').select('*').where({ review_id: reviewId }).first();
const updateReview = (reviewId, updatedReview) =>
knex('reviews')
.select('*')
.where({ review_id: reviewId })
.update(updatedReview, '*');
How it should look:
"data": {
"review_id": 1,
"content": "New content...",
"score": 3,
"created_at": "2021-02-23T20:48:13.315Z",
"updated_at": "2021-02-23T20:48:13.315Z",
"critic_id": 1,
"movie_id": 1,
"critic": {
"critic_id": 1,
"preferred_name": "Chana",
"surname": "Gibson",
"organization_name": "Film Frenzy",
"created_at": "2021-02-23T20:48:13.308Z",
"updated_at": "2021-02-23T20:48:13.308Z"
}
}
The first function, where I check if a review exists, works with my delete method. Am I missing something in one of these functions that would make req.body undefined?
Nobody in the other answers is really explaining the "why" here. By default req.body is undefined or empty and the Express engine does not read the body of the incoming request.
So, if you want to know what's in the body of the request, and further, if you want it read and parsed into req.body so you can access it there directly, then you need to install the appropriate middleware. That middleware checks what type of request it is and whether it's the kind that carries a body (like a POST or a PUT), then looks at the incoming content-type to see if it's one it knows how to parse. If it is, the middleware reads the body of the request, parses it, and puts the parsed result into req.body so it's there when your request handler gets called. If you don't have this type of middleware installed, then req.body will be undefined or empty.
Express has middleware like this built in for several content-types. You can read about them here in the Express doc. There is middleware for the following content types:
express.json(...) for "application/json"
express.raw(...) reads the body into a Buffer for you to parse yourself
express.text(...) for "text/plain" - reads the body into a string
express.urlencoded(...) for "application/x-www-form-urlencoded"
So, you will need some middleware to parse your specific content so you can access it in req.body. You don't say exactly what data type you're using, but from your data expectations, I would guess that it's probably JSON. For that, you would place this:
app.use(express.json());
Somewhere before your PUT request handler. If your data format is something else, then you could use one of the other built-in middleware options.
Note that for other data types, such as file uploads, there are entire modules such as multer for reading and parsing those data types into properties that your request handler can access.
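A minimal sketch of the wiring for your PUT route (the /reviews/:reviewId path is assumed from your params; adjust to your actual router setup):
const express = require('express');
const app = express();

// Must be registered before any routes that need req.body.
app.use(express.json());

// reviewExists and update come from your controller.js
app.put('/reviews/:reviewId', reviewExists, update);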
You should use a body parser. Basically, Express can't know how to parse the incoming data users throw at it, so you need to parse the data for it. Luckily, there are a few body parsers available, and for quite some time now, parsers for JSON and URL-encoded bodies (not sure if there are any others) have been built into Express itself. The way you do that is by adding middleware to your Express app like this:
const app = express();
...
app.use(express.json({
type: "*/*" // optional, only if you want to be sure that everything is parsed as JSON. Wouldn't recommend
}));
...
// the rest of the express app
Make sure that you have the body-parser middleware installed, which extracts the body of the request stream and exposes it on req.body (source).
npm install body-parser --save
Here's what the Express server definitions would look like:
var express = require('express')
var bodyParser = require('body-parser')
var app = express()
// parse application/x-www-form-urlencoded
app.use(bodyParser.urlencoded({ extended: false }))
// parse application/json
app.use(bodyParser.json())
To get access to the request body, you need to use express.json(). Make sure to wire up app.use(express.json()) before using your router; that should make the body available on the request object.
I would like to do the following inside a client-side JavaScript file hosted using Node and Express:
var rootURL = <%= "someurlfromserverconfig" %>;
I simply host a web directory from my Node app. I have no need for template engines. I just want to access some simple server properties, for example an API URL. ASP and PHP have a similar feature.
Simple things like that are easy to handle with toString and replace:
var url = 'example.com'
app.get('/', function(req, res, next) {
fs.readFile('index.html', function(err, data) {
if (err) return res.sendStatus(500)
res.set('Content-Type', 'text/html')
res.send(data.toString().replace('<%= "someurlfromserverconfig" %>', '"' + url + '"'))
})
})
This would yield: var rootURL = "example.com";
For caching purposes you might want to read the file into memory and run your replace beforehand instead of on each request, but that's your choice.
To elaborate on the workflow: fs.readFile returns a Buffer, which you can call toString() on, which then allows you to run replace().
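A sketch of that caching variant: read and substitute once at startup, then serve the prepared string on every request (app is the Express instance from the snippet above).
const fs = require('fs');
const url = 'example.com';

// Done once when the server starts.
const html = fs.readFileSync('index.html', 'utf8')
  .replace('<%= "someurlfromserverconfig" %>', '"' + url + '"');

app.get('/', (req, res) => {
  res.set('Content-Type', 'text/html');
  res.send(html);
});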
If you are intent on not having to process a template on every request, and if the data you want to include are not going to change on the fly, you might consider ES6 template strings. You could host your code in a file like this:
'use strict';
const config = require('./server-config');
module.exports = `
var rootURL = "${config.rootURL}";
// ...
`;
And then you would require the file in whatever file is handling the routing. The template will only be processed once, even if it is required by multiple files.
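For example (the module path and the /config.js route are illustrative, not fixed names):
const express = require('express');
const clientScript = require('./client-config-script'); // assumed filename for the module shown above

const app = express();

app.get('/config.js', (req, res) => {
  res.type('application/javascript').send(clientScript);
});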
Alternatively, you could just use a lightweight template engine, render it once, and then serve it whenever it is requested. If you want to use exactly that format, I would recommend EJS.
'use strict';
const fs = require('fs');
const ejs = require('ejs');
const config = require('./server-config');
let template = fs.readFileSync('./some-template.js', 'utf8');
let rendered = ejs.render(template, config);
app.get('/', (req, res) => {
res.send(rendered);
});
If the data you are sending are constantly changing, you will have to render the template every time. Even ASP and PHP have to do that under the hood.