Absolute path using fs in nodejs - javascript

I am trying to pipe an http audio stream from my nodejs server:
var streamPath = 'http://127.0.0.1:1485/mystream.mp3';
var stat = fs.statSync(streamPath);
response.writeHead(200, {'Content-Type': 'audio/mpeg','Content-Length': stat.size});
fs.createReadStream(streamPath).pipe(response);
The problem is that fs doesn't like the absolute path and I get the following error:
Error: ENOENT: no such file or directory, stat 'C:\myserver\http:\127.0.0.1:1485\mystream.mp3'
I can't find a way to use the absolute path. Is this even possible using fs?

'http://127.0.0.1:1485/mystream.mp3' is not an absolute path; it's a URL.
Absolute paths are something like /home/x/dir/file.txt on your filesystem.
If you want to get a stream from a URL, you need to use http or request. If you want to get a file from your filesystem without using the network, you need to use a filesystem path, not a URL.
fs.createReadStream can only open local files on your filesystem.
For more details, see:
https://nodejs.org/api/fs.html
To get a file over a network, see:
https://www.npmjs.com/package/request
https://www.npmjs.com/package/request-promise
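For example, here is a minimal sketch using the built-in http module, assuming the upstream stream really is reachable at that URL and that response is the outgoing server response from the question. Content-Length is dropped because there is no local file to stat; the remote stream is simply piped through:
var http = require('http');

// Proxy the remote audio stream instead of trying to read it with fs.
http.get('http://127.0.0.1:1485/mystream.mp3', function (upstream) {
  response.writeHead(200, { 'Content-Type': 'audio/mpeg' });
  upstream.pipe(response);
});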

You are trying to access a remote resource through fs, which is wrong: fs is meant to be used for the local filesystem. If you want to access a remote resource, you have to use http or https. However, if I read this correctly, you are trying to access your own localhost, so that should work over http.
Your application is trying to access the following file: C:\myserver\http:\127.0.0.1:1485\mystream.mp3. If you look closely, that can't work: the path mixes your local directory with a remote source (which is actually localhost). Fix your path and that should solve the problem. Keep in mind that fs will only work on your local system.
You should also reconsider fs.statSync: it blocks everything else until it finishes.
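Here is a minimal sketch of the non-blocking variant, assuming the audio actually lives at a local path (the path below is just a placeholder):
var fs = require('fs');

var filePath = '/path/to/local/mystream.mp3'; // placeholder local path
fs.stat(filePath, function (err, stat) {
  if (err) {
    response.writeHead(404);
    return response.end();
  }
  response.writeHead(200, { 'Content-Type': 'audio/mpeg', 'Content-Length': stat.size });
  fs.createReadStream(filePath).pipe(response);
});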
Docs:
fs: https://nodejs.org/api/fs.html
http: https://nodejs.org/api/http.html
https: https://nodejs.org/api/https.html
Regards,
Megajin

Related

How to obtain a relative path from device filesystem path?

I’m looking to allow a user to upload a javascript file containing a thread and then to spawn that thread, but I’m facing some issues with the device paths.
First I use the DocumentPicker to pick the file:
const pickerResult = await DocumentPicker.pickSingle({
  type: [types.allFiles],
  copyTo: 'documentDirectory'
});
Then I use RNFS.readDir to get the file, and its path comes back as /data/user/0/playground.com/files/my-thread.js.
Now, react-native-threads expects a relative path, such as new Thread('./worker.thread.js');. If I try to pass the path returned by RNFS, I get the following "no such file" error:
Error: ENOENT: no such file or directory, open '/data/user/0/playground.com/files/my-thread.js'.
I feel like this may be possible to avoid by using a Blob, but perhaps there is an easier way?

I have a public folder in my Node.js/Express.js project which contains some images. When I try to access those images through a URL, I get a 404 error

I have a Node.js/Express.js project with the following folder structure:
root
----bin
----controllers
----middleware
----models
----node_modules
----public
--------images
------------test.png
----routes
----views
I'm trying to figure out what URL I need to access in order to be served the test.png image that is inside the public/images folder. I thought it would be the following url:
http://localhost:3000/public/images/test.png
However, I get a "Not Found, 404" error
You should register a middleware to serve static files from disk.
var express = require("express");
var app = express();
app.use(express.static('path/to/static/directory'));
In your case, you can use:
app.use(express.static('public'));
Refer to https://expressjs.com/en/starter/static-files.html for additional details.
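Note that express.static serves files relative to the directory you pass, without that directory's name in the URL, so with app.use(express.static('public')) the image is reachable at http://localhost:3000/images/test.png. If you want to keep /public in the URL, you can mount the middleware at a virtual path, for example:
// Makes the image available at http://localhost:3000/public/images/test.png
app.use('/public', express.static('public'));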

Load local file using Netlify functions

I've written a script which takes a JSON file and outputs it to an API endpoint using Netlify's Functions feature (https://functions.netlify.com/). For the most part, this works without a hitch; however, one of my endpoints has a lot of text, and for ease of editing I've split the large text blocks into markdown files which I then load into the endpoint.
Locally, this works perfectly, but when deployed I get a console error saying Failed to load resource: the server responded with a status of 502 (). I presume this is because I used a Node fs method and Netlify doesn't allow that; however, I can't find any information about this.
The code I've used is here:
const marked = require('marked')
const clone = require('lodash').cloneDeep
const fs = require('fs')
const resolve = require('path').resolve
const data = require('../data/json/segments.json')
// Clone the object
const mutatedData = clone(data)
// Mutate the cloned object
mutatedData.map(item => {
  if (item.content) {
    const file = fs.readFileSync(resolve(`./src/data/markdown/${item.content}`), 'utf-8')
    item.content = marked(file)
  }
})
exports.handler = function(event, context, callback) {
  callback(null, {
    statusCode: 200,
    body: JSON.stringify({data: mutatedData})
  });
}
I've also attempted to replace
const file = fs.readFileSync(resolve(`./src/data/markdown/${item.content}`), 'utf-8')
with
const file = require(`../data/markdown/${item.content}`)
but that complains about a loader, and I'd like to avoid adding webpack configs if possible as I'm using create-react-app. Besides, I doubt it would help, since I'd still be accessing the file system after build time.
Has anyone else come across this issue before?
At the time when this answer is written (September 2019), Netlify does not seem to upload auxiliary files to AWS Lambda, it appears that only the script where the handler is exported will be uploaded. Even if you have multiple scripts exporting multiple handlers, Netlify seems to upload them into isolated "containers" (different AWS instances), which means the scripts will not be able to see each other in relative paths. Disclaimer: I only tested with a free account and there could be settings that I'm not aware of.
Workaround:
For auxiliary scripts, make them into NPM packages, add them to package.json, and require them in your main script. They will be installed and made available to use.
For static files, you can host them on Netlify just as you did before you had AWS Lambda, and make HTTP requests to fetch the files in your main script.
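A rough sketch of that second workaround, assuming the markdown files are published as static assets on the site itself (the site URL and file name below are placeholders):
const https = require('https');

function fetchMarkdown(name) {
  return new Promise((resolve, reject) => {
    // Placeholder site URL; in practice this would be your deployed Netlify site.
    https.get(`https://your-site.netlify.app/markdown/${name}`, (res) => {
      let body = '';
      res.on('data', (chunk) => { body += chunk; });
      res.on('end', () => resolve(body));
    }).on('error', reject);
  });
}

exports.handler = async function (event) {
  const file = await fetchMarkdown('example.md'); // hypothetical file name
  return {
    statusCode: 200,
    body: JSON.stringify({ content: file })
  };
};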

How to download files and store them to s3 by nodejs (through heroku)?

I want to know how to download files to AWS S3 through Heroku, without using a web page.
I'm using wget in Node.js to download audio files, and I want to re-encode them with ffmpeg. I have to store the file first, but Heroku doesn't provide space to store it.
I then found that Heroku can connect with AWS S3, but the sample code is a page for users to upload through, and I just want to do it in code.
This is the code, which isn't connected to S3 yet.
const wget = spawn('wget', ['-O', 'upload_audio/' + sender_psid + '.aac', audioUrl]);
// spit stdout to screen
wget.stdout.on('data', function (data) { process.stdout.write(data.toString()); });
// spit stderr to screen
wget.stderr.on('data', function (data) { process.stdout.write(data.toString()); });
wget.on('close', function (code) {
  console.log(process.env.PATH);
  const ffmpeg = spawn('ffmpeg', ['-i', 'upload_audio/' + sender_psid + '.aac', '-ar', '16000', 'upload_audio/' + sender_psid + '.wav', '-y']);
  ...
});
And it comes the error:
upload_audio/1847432728691358.aac: No such file or directory
...
Error: ENOENT: no such file or directory, open 'upload_audio/1847432728691358.wav'
How can I solve this problem?
Could anyone give some suggestions, please?
Maybe it can be done without using S3?
I don't recommend using S3 until you've tried Express's static files middleware. On the server, that looks like this:
app.use(express.static(path.join(__dirname, 'my_audio_files_folder')));
More on Express's static files middleware is available here.
AWS S3 seems to add unneeded complexity, but if you have your heart set on it, have a look at this thread.
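If you do decide on S3, a rough sketch with the aws-sdk package might look like the following. The bucket name is a placeholder, credentials are assumed to come from Heroku config vars, and the upload should happen right after ffmpeg finishes, since Heroku's dyno filesystem is ephemeral:
const fs = require('fs');
const AWS = require('aws-sdk');

// Reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment.
const s3 = new AWS.S3();

function uploadToS3(localPath, key) {
  return s3.upload({
    Bucket: 'my-audio-bucket',   // placeholder bucket name
    Key: key,                    // e.g. sender_psid + '.wav'
    Body: fs.createReadStream(localPath)
  }).promise();
}

// Inside the ffmpeg 'close' handler from the question:
// uploadToS3('upload_audio/' + sender_psid + '.wav', sender_psid + '.wav')
//   .then(function () { console.log('uploaded'); })
//   .catch(console.error);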

Does restify support image upload at server side?

I am trying to create a REST API server that can accept image uploads with Node.js restify. I looked at the documentation at http://restify.com/ but was unable to ascertain whether restify supports image upload as a server.
Does restify support image upload on the server side? If not, which Node.js module can one use to handle image uploads on the server?
restify comes with a bundled BodyParser which can handle uploads (multipart/form-data only) and allows a custom handler for uploaded files (see the multipartFileHandler option); read the docs on BodyParser for details and a sample.
Notice there is a req.files attribute, which is a hash/object. Each value is also a hash, with a path property indicating the name and location of the uploaded file.
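A rough sketch of a restify server accepting an image upload; the route, the form field name, and the plugin path are assumptions, and older restify versions use restify.bodyParser() instead of restify.plugins.bodyParser():
const restify = require('restify');

const server = restify.createServer();
server.use(restify.plugins.bodyParser({ mapFiles: false }));

server.post('/upload', function (req, res, next) {
  // req.files is a hash of uploaded files; each entry has a `path`
  // pointing at the temporary file written to disk.
  const image = req.files && req.files.image; // 'image' is the assumed form field name
  if (!image) {
    res.send(400, { error: 'no file uploaded' });
    return next();
  }
  res.send(200, { storedAt: image.path });
  return next();
});

server.listen(8080);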
