I'm studying Node.js right now, using the "Beginning Node.js" textbook.
The example in the book does not execute properly in the command prompt. (I'm using Ubuntu 18.04 with Node v9.4.0, and the book is three years old, so this may be related?)
In the downloadable source code for the book, this is the code that is provided:
var fs = require('fs');
// Create readable stream
var readableStream = fs.createReadStream('./cool.txt');
// Pipe it to stdout
readableStream.pipe(process.stdout);
The file, cool.txt, is next to the script in the parent folder:
--parentFolder
----jsFileToExecute.js
----cool.txt
When I run node jsFileToExecute.js in the command prompt, I get this response:
events.js:137
throw er; // Unhandled 'error' event
^
Error: ENOENT: no such file or directory, open './cool.txt'
As this source code is directly from the textbook publisher's website and it still doesn't run, I'm assuming there's something wrong with it?
Looking for solutions to figure out why this isn't working. I've looked at the documentation at nodejs.org, but didn't find any clues.
Thank you for any help!
To avoid path issues, it's recommended to build the path with path.join, like:
const fs = require('fs');
const path = require('path');

const coolPath = path.join(__dirname, 'cool.txt');
const readableStream = fs.createReadStream(coolPath);
With the example above, you build an absolute path to the file based on the directory the script itself lives in, which Node provides to every CommonJS module as __dirname.
If the file is in the same directory you run the project from (the working directory), you can simply pass the file name without a ./ or / prefix, like here:
var fs = require('fs');
// Create readable stream
var readableStream = fs.createReadStream('cool.txt');
// Pipe it to stdout
readableStream.pipe(process.stdout);
This works because Node resolves the bare file name against the current working directory, which in this case contains the file.
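If you're not sure which directory a relative path will be resolved against, here is a small illustrative check you can drop into the script; it is not from the book, it only prints the two directories involved:
// Relative paths passed to fs are resolved against the process's current
// working directory, not against the directory the script file lives in:
console.log(process.cwd()); // the directory you launched `node` from
console.log(__dirname);     // the directory containing this script
// './cool.txt' is only found when cool.txt exists under process.cwd().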
Related
I'm looking to allow a user to upload a JavaScript file containing a thread and then to spawn that thread, but I'm facing some issues with the device paths.
First I use the DocumentPicker to pick the file:
const pickerResult = await DocumentPicker.pickSingle({
  type: [types.allFiles],
  copyTo: 'documentDirectory'
});
Then I use the RNFS.readDir to get the file and its path comes back as /data/user/0/playground.com/files/my-thread.js.
Now, react-native-threads expects a relative path such as new Thread('./worker.thread.js');. If I try to pass the path returned by RNFS, I get the following 'no such file' error:
Error: ENOENT: no such file or directory, open '/data/user/0/playground.com/files/my-thread.js'.
I feel like this may be possible to avoid by using a Blob, but perhaps there is an easier way?
I am trying to import images I have stored in a folder called resources located under the server section.
When I place the images in the app folder on my local drive it works fine, but since this has to be deployed to our live server, having to copy files back and forth makes things hard.
The error I get:
Error: ENOENT: no such file or directory, open '../../resources/cp_pdf_logo.png'
at Object.openSync (node:fs:585:3)
at Object.readFileSync (node:fs:453:35)
at RouteEmails.orderAuth (C:\Development\kaizen\server\routes\emails\file:\C:\Development\kaizen\server\routes\emails\func.js:187:43)
at RouteEmails.sendMail (C:\Development\kaizen\server\routes\emails\file:\C:\Development\kaizen\server\routes\emails\func.js:430:17)
at RouteOrders.orderAuthorizedMail (C:\Development\kaizen\server\routes\orders\file:\C:\Development\kaizen\server\routes\orders\func.js:840:24)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at RequestRouter._routeRequest (C:\Development\kaizen\server\socketController\file:\C:\Development\kaizen\server\socketController\requestRouter.ts:64:34)
at RequestRouter.processMessage (C:\Development\kaizen\server\socketController\file:\C:\Development\kaizen\server\socketController\requestRouter.ts:43:9)
at Socket.<anonymous> (C:\Development\kaizen\server\socketController\file:\C:\Development\kaizen\server\socketController\socketClient.ts:62:7) {
errno: -4058,
syscall: 'open',
code: 'ENOENT',
path: '../../resources/cp_pdf_logo.png'
}
Here is an example of my code:
let ordercplogo = this.base64_encode("../../resources/cp_pdf_logo.png")
let orderthankyoulogo = this.base64_encode("../../resources/thankyou_logo.png")
// let ordercplogo = this.base64_encode("/app/resources/cp_pdf_logo.png");
// let orderthankyoulogo = this.base64_encode("/app/resources/thankyou_logo.png");
The commented-out section is what works, but from the image below you can see that I have these files stored inside the app.
Can anyone tell me what I'm doing wrong?
PS. I have also tried to import the images above with
const logo1 = require("../../resources/image.png")
but that also doesn't work and throws a different error.
Your function probably tries to find the image relative to the working directory. It makes a difference whether you run cd /; node /app/index.js or cd /app; node index.js, and the working directory may be different in your live environment than it is locally.
There are a number of functions to help you find the path to a file in the built-in path module.
Let's assume you're in a file called routes/companies/a.js:
const { resolve } = require('path');
this.base64_encode(resolve(__dirname, "../../resources/cp_pdf_logo.png"));
// Will find the absolute path to the file.
(see the path.resolve documentation page)
In CommonJS files (files that use require()), the value __filename always refers to the file it's in rather than the main script file (so /app/routes/companies/a.js in this example), and __dirname is the directory it's in: the same value, without the a.js part.
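A minimal illustration of the difference; the paths in the comments assume the hypothetical /app layout above:
// In /app/routes/companies/a.js:
console.log(__filename);    // /app/routes/companies/a.js
console.log(__dirname);     // /app/routes/companies
console.log(process.cwd()); // wherever `node` was started, e.g. / or /app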
I'm attempting to use the "file-type" NPM module (which I have working on the server) client side to validate mime type prior to a file upload to an S3 bucket.
The readme for the module includes an example of using it in the browser:
const FileType = require('file-type/browser');
const url = 'https://upload.wikimedia.org/wikipedia/en/a/a9/Example.jpg';
(async () => {
  const response = await fetch(url);
  const fileType = await FileType.fromStream(response.body);
  console.log(fileType);
  //=> {ext: 'jpg', mime: 'image/jpeg'}
})();
That obviously won't work directly in the browser due to the "require", and if I link to the NPM file directly, I get:
Uncaught ReferenceError: require is not defined
So I've tried using Webpack (with which I'm not well versed at all, but have followed the official Webpack tutorial, and a few other tutorials) to create a "main.js" file and then accessed that via a script tag.
For example, running this through Webpack:
import * as mimeFromBuffer from 'file-type/browser.js';

export function mimeTyping(mimeValidationBytes){
  (async () => {
    const fileType = await mimeFromBuffer(mimeValidationBytes);
    console.log(fileType);
    //=> {ext: 'jpg', mime: 'image/jpeg'}
  })();
  return;
}
Results in this when I call the mimeTyping client side:
Uncaught ReferenceError: mimeTyping is not defined
I've tried Vinay's answer from this question in the browser:
import fileType from 'file-type';
const blob = file.slice(0, fileType.minimumBytes);
const reader = new FileReader();
reader.onloadend = function(e) {
  if (e.target.readyState !== FileReader.DONE) {
    return;
  }
  const bytes = new Uint8Array(e.target.result);
  const { ext, mime } = fileType.fromBuffer(bytes);
  // ext is the desired extension and mime is the mimetype
};
reader.readAsArrayBuffer(blob);
Which gets me this:
Uncaught SyntaxError: Cannot use import statement outside a module
I've researched that error a bit, with a common solution being to add "type":"module" to package.json. This did not help (error was unchanged).
I found a similar question in the module's Github repository, which suggests:
Although this module is primary designed for Node.js, using it in the
browser is possible as well. That is indeed where 'file-type/browser'
intended for. It provides the right dependencies and right functions
to the JavaScript module bundler. Some dependencies, which are
typically present in a Node.js environment, but are missing in a
browser environment, you may need to pass (polyfill) to your module
bundler. Although in some cases it may a bit tricky to configure it,
note that this is a pretty common task handled by a module bundler.
I'm struggling to understand what next steps that suggests.
Another comment at the bottom of that thread indicates the author had success with:
import { fromBuffer } from 'file-type/core';
...
const buffer = await uploadedFile.arrayBuffer();
const types = await fromBuffer(buffer);
I'm not sure how to implement that, and what other code I need (I'm guessing this gets passed to Webpack, and probably requires an export statement, and then an import on the client side?)
I've tried passing this into Webpack:
const fileType = require('file-type/browser');
module.exports = fileType;
But linking to the output file again gets:
Uncaught SyntaxError: Cannot use import statement outside a module
I think I understand conceptually what I need to do: pass the NPM module to Webpack, which in turn parses it and finds any dependencies, gets those dependencies, and creates a JavaScript file I can use client side. It seems I'm doing something in there wrong.
I've spent days trying to understand how to use NPM modules client side (still quite foggy in my mind) and trying all sorts of variants on the above code - would really appreciate some guidance (first time posting a question here - please go easy on me!).
Thank you!
Edit: I don't think this is a duplicate - I did review How to import JS library from node_modules, Meteor Npm-module client-side?, how to use node.js module system on the clientside, How to call a server-side NodeJS function using client-side JavaScript and Node Js application in client machine but no suggestion I tried in any of those seemed to help.
Finally got this working. In case anyone else is stuck on this, here's an explanation (apologies for the lack of brevity - probably this should be a blog post...).
To flesh out the use case a bit further, I'm using Uppy to allow users to upload files to an AWS S3 bucket. The way this works is that, when the user uploads a file, Uppy makes a call to my server where an AWS pre-signed URL is generated and passed back to the client. The client then uses that pre-signed URL to upload the file directly to the S3 bucket, bypassing the server, such that the file doesn't pass through the server at any point.
The problem I was attempting to solve was that files missing an extension ended up uploaded with the content / MIME type set to "application/octet". It seems the browser, Uppy, and S3 all rely on the file extension to decide the file type (rather than parsing the so-called "magic bytes" of the file), and if the extension is missing, AWS defaults to "application/octet". This causes issues when users attempt to open the file, as it is not handled correctly (e.g. a png file without an extension and with an "application/octet" content / MIME type opens a download dialog rather than being previewed). I also want to validate the MIME type / file type even where the extension exists, so that I can exclude certain types of files, and so that files with an incorrect extension still get handled appropriately when they are later downloaded (where the MIME type will again be validated).
I use the "file-type" NPM module to determine the mimetype server side, and that's straight forward enough, but changing the file's content type / MIME type when generating the AWS pre-signed URL is not enough to fix the problem - it still gets uploaded as "application/octet". I wanted to use the same module client side so we get the exact same results on the client as on the server, and needed in any case to determine the MIME type and set it accordingly pre-upload but post-pre-signed URL. I had no idea how to do this (i.e. use "file-type" client side - the meat of my question).
I finally gave up on Webpack - nothing I tried worked. So I switched to Browserify, and the sample browser code at the "file-type" repository worked at once! So then I was left trying to figure out how to pass a function through Browserify to use in the client side code.
This proved impossible for me - I couldn't figure out how to pass the asynchronous IIFE through to my code. So instead, I moved my Uppy code into the code I pass to Browserify:
// Get the dependency, which we've added to node via "npm install file-type":
const FileType = require('file-type/browser');
// When the user adds a file for upload to Uppy...
uppy.on('file-added', (file) => {
  // 1. Create a filereader:
  const filereader = new FileReader();
  filereader.onloadend = function(evt) {
    // 4. Once the filereader has successfully finished reading the file...
    if (evt.target.readyState === FileReader.DONE) {
      // Get the unsigned 8-bit int array (i.e. Uint8Array) of the 600 bytes (this looks like '[119,80,78,71...]'):
      const uint = new Uint8Array(evt.target.result);
      // Determine the mime type using the "file-type" NPM package:
      (async () => {
        // Pass in our 600 bytes ("uint"):
        const fileType = await FileType.fromBuffer(uint);
        console.log(fileType); // outputs => {ext: 'jpg', mime: 'image/jpeg'}
        // Do some validation here...
        //
        // Assign the results to the file for upload - we're done!:
        file.extension = fileType.ext;
        file.meta.type = fileType.mime;
        file.type = fileType.mime;
      })();
    }
  };
  // 2. Grab the first 600 bytes of the file for mime type analysis - most mime
  // types use the first few bytes, but some (looking at you, Microsoft...) start at the
  // 513th byte; ISO CD files start at the 32,770th byte, so we'll ignore those rather than
  // send that much data for each file - users of this system certainly aren't expected to
  // upload ISO CD files! Also, some .zip files may start their identifying bytes at the
  // 29,153rd byte - we ignore those too (mostly, .zip files start at the first, 31st, or 527th byte).
  const blob = file.data.slice(0, 600);
  // 3. Start reading those 600 bytes...continues above at 'filereader.onloadend':
  filereader.readAsArrayBuffer(blob);
});
That all goes into a file I call "index.js" and then, having installed Browserify at the command line via "npm install -g browserify", I use this at the command line to create the file ("main.js") I link to in my client side code:
browserify index.js -o main.js
I've written a script which takes a JSON file and outputs it to an API endpoint using Netlify's Functions feature (https://functions.netlify.com/). For the most part, this works without a hitch, however, one of my endpoints has a lot of text and for ease of editing, I've split the large text blocks into markdown files which I then loaded into the endpoint.
Locally, this works perfectly, but when deployed I get a console error saying Failed to load resource: the server responded with a status of 502 (). I presume this is because I used a node fs method and Netlify doesn't allow that, however, I can't find any information about this.
The code I've used is here:
const marked = require('marked')
const clone = require('lodash').cloneDeep
const fs = require('fs')
const resolve = require('path').resolve
const data = require('../data/json/segments.json')
// Clone the object
const mutatedData = clone(data)
// Mutate the cloned object
mutatedData.map(item => {
  if (item.content) {
    const file = fs.readFileSync(resolve(`./src/data/markdown/${item.content}`), 'utf-8')
    item.content = marked(file)
  }
})
exports.handler = function(event, context, callback) {
  callback(null, {
    statusCode: 200,
    body: JSON.stringify({data: mutatedData})
  });
}
I've also attempted to replace
const file = fs.readFileSync(resolve(`./src/data/markdown/${item.content}`), 'utf-8')
with
const file = require(`../data/markdown/${item.content}`)
but that complains about a loader, and I'd like to avoid adding webpack configs if possible as I'm using create-react-app. Besides, I doubt it would help, since I'd still be accessing the file system after build time.
Has anyone else come across this issue before?
At the time this answer is written (September 2019), Netlify does not seem to upload auxiliary files to AWS Lambda; it appears that only the script where the handler is exported gets uploaded. Even if you have multiple scripts exporting multiple handlers, Netlify seems to upload them into isolated "containers" (different AWS instances), which means the scripts cannot see each other via relative paths. Disclaimer: I only tested with a free account, and there could be settings I'm not aware of.
Workaround:
For auxiliary scripts, make them into NPM packages, add them to package.json, and require them in your main script. They will be installed and made available to use.
For static files, you can host them on Netlify just as you did before you had AWS Lambda, and make HTTP requests from your main script to fetch the files, as sketched below.
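For example, here is a minimal sketch of the second option; the site URL, the file name, and the node-fetch dependency are illustrative assumptions, not details from the question:
// Hypothetical sketch: read a statically hosted markdown file over HTTP
// instead of from the Lambda's file system.
const fetch = require('node-fetch'); // any HTTP client available to the function works
const marked = require('marked');

exports.handler = async function(event, context) {
  // Assumed URL: wherever the deployed site serves its static markdown assets.
  const response = await fetch('https://your-site.netlify.app/markdown/segment-1.md');
  const markdown = await response.text();
  return {
    statusCode: 200,
    body: JSON.stringify({ html: marked(markdown) })
  };
};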
I'm following Heroku's tutorial to create a contact list using the MEAN stack (Heroku's running example here). I'm able to deploy it to Heroku and it works there. But when I run it locally on my machine, the browser (Chrome 67.0.3396.87 on macOS High Sierra) only displays a "Cannot GET /" message.
I believe it's related to how the Angular build directory /dist/ referenced in line 12 of server.js does not exist (as far as I can tell). The beginning of server.js looks like this:
var express = require("express");
var bodyParser = require("body-parser");
var mongodb = require("mongodb");
var ObjectID = mongodb.ObjectID;
var CONTACTS_COLLECTION = "contacts";
var app = express();
app.use(bodyParser.json());
// Create link to Angular build directory
var distDir = __dirname + "/dist/";
app.use(express.static(distDir));
// Create a database variable outside of the database connection callback to reuse the connection pool in your app.
var db;
I looked into it and found that Angular deletes the /dist/ directory upon ng serve. I also found that there is a flag --delete-output-path whose default is true.
I set the --delete-output-path flag to false in .angular-cli.json as recommended by this answer, as well as in /node_modules/@angular/cli/lib/config/schema.json. Despite those changes (trying to set the flag in one file, in the other, or in both at the same time), I'm still getting the "Cannot GET /" message and the /dist/ directory still doesn't appear to be there.
The only way I've been able to even run part of the app is to change server.js's line 12 reference from /dist/ to /src/. This allows /src/index.html to begin loading at localhost:5000/ (the browser displays the text "Loading..." as specified in line 16 of index.html) and gets the contacts API up and running at localhost:5000/api/contacts/. But the Angular components (the list of contacts that is the purpose of the tutorial) don't load. Maybe because I changed the build directory to a totally different location.
Is there something with the /dist/ directory that I'm missing? Or does my issue with getting the app to run locally have nothing to do with /dist/ at all?
Notice that you don't have a way of handling requests to the route '/' since the line:
app.use(express.static(distDir));
only ensures that all bundled files generated in your "dist" folder are accessible when your index.html requests them, but you still have to serve the index.html itself. When using the MEAN stack, one normally does something like this:
app.use('/api', yourApiRouter);
// ...and for everything else, let the client-side routing handle the route:
app.get('*', function(req, res) {
  res.sendFile(distDir + 'index.html');
});
I recommend using the built-in "path" module to join __dirname with your "dist" folder and the index.html location, rather than plain string concatenation.
If you are using ES6, you can also use an arrow function instead of a function expression in app.get. Both suggestions are shown in the sketch below.
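A minimal sketch combining both suggestions; distDir mirrors the variable from the question, and yourApiRouter is a placeholder:
const path = require('path');

// Build paths with path.join instead of string concatenation:
const distDir = path.join(__dirname, 'dist');

app.use(express.static(distDir));
app.use('/api', yourApiRouter); // placeholder for your actual API router

// For everything else, hand the route to the client-side router:
app.get('*', (req, res) => {
  res.sendFile(path.join(distDir, 'index.html'));
});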