Send file stream using axios in node js - javascript

I have a Node.js server and client using GraphQL. I need to upload a file from the client, send it to the Node server, and then forward the file to another service, let's name it X.
When I upload a file to the Node.js server I can get a readable stream of the file using the Apollo GraphQL API, and it works fine.
But when I try to send this stream to X using axios, X receives only one chunk of the file.
Here's code from my node.js server:
// `file` is the upload promise provided by the Apollo GraphQL API
const { createReadStream } = await file;
const stream = createReadStream();

// forward the stream to service X
axios.post('http://localhost:7000', stream, {
  maxBodyLength: Infinity,
  maxContentLength: Infinity
});
And here is the receiver (X server) controller code:
const writeStream = fs.createWriteStream('final.csv');
req.pipe(writeStream);
res.send('Hello');
In the final.csv file I can see only a small part of the data I sent. I want to be able to send files of up to 1 GB.
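One thing worth checking on the X side is that the response is sent before the piped stream has finished. A minimal sketch of a receiver that replies only after the write completes (assuming an Express-style controller) might look like this:
const fs = require('fs');

app.post('/', (req, res) => {
  const writeStream = fs.createWriteStream('final.csv');
  // pipe the incoming request body to disk
  req.pipe(writeStream);
  // only respond once the whole file has been flushed to disk
  writeStream.on('finish', () => res.send('Hello'));
  writeStream.on('error', (err) => res.status(500).send(err.message));
});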

Related

Send a file from NodeJS socketio server to Python socketio client

I have a frontend application sending a file over HTTP to a Node.js backend. The Node.js application acts as a Socket.IO server and needs to send the file to a Python Socket.IO client. I have used multer to parse the HTTP request. Here is the Node.js and Python code I have written so far (only the required parts are pasted):
// NodeJS server code I have written so far (could be completely wrong)
// req.file contains the file parsed from multer
var socket = req.app.get('socket');
let readStream = createReadStream(req.file.path);
readStream.on('ready', () => {
  socket.emit('image', readStream, data => console.log(data));
  readStream.pipe(createWriteStream(req.file.originalname));
});
# Python socketio client
@sio.event
async def image(data):
    try:
        print(data)
        # want to save data preferably in a local file or a buffer is also fine
    except:
        return 'NOT OK'
    finally:
        return 'OK'
I am facing issues on both fronts, and for both of them I have almost no idea how to proceed:
sending the image from Node.js (I thought of using fs.createReadStream but don't know how to use the stream over a Socket.IO event)
receiving the image in Python and storing it locally.
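For the Node.js side, one approach (a sketch under assumptions, not a tested answer) is to read the file into a Buffer first, since Socket.IO can transmit binary buffers but not a live stream object. The 'image' event name and the socket lookup mirror the code above; the payload shape is an assumption:
const fs = require('fs');

// read the uploaded file into memory and emit it as a single binary payload
const socket = req.app.get('socket');
fs.readFile(req.file.path, (err, buffer) => {
  if (err) return console.error(err);
  // Socket.IO serializes Buffers as binary; the Python client receives them as bytes
  socket.emit('image', { name: req.file.originalname, data: buffer }, ack => console.log(ack));
});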

Feathers js upload file using raw json in postman

How do we configure Feathers js to support form-data? My current implementation supports raw JSON, but I have a feature where I have to upload a file to an Amazon bucket, and the only way to upload a file with Postman is to use form-data. Thanks.
Or is there a way we can upload a file without using form-data, e.g. using raw in Postman? (edited)
For those kinds of file-uploads you need an additional middleware to handle the multipart/form-data upload - usually multer is used. Here's some sample code that should help you get started:
const multer = require('multer');
const fileUploadHandler = multer();

// Upload Service with multipart support
app.use('/photos',
  // you can define different storage options, the default is to keep the uploaded data in memory
  fileUploadHandler.single('filename'),
  function (req, res, next) {
    // the uploaded file is accessible now under req.file
    next();
  }
);
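If the file then needs to reach a Feathers service (for example to push it on to S3), a common pattern is to copy req.file onto req.feathers inside that middleware so it shows up in the service call's params. The sketch below assumes the memory storage default and a hypothetical in-line service:
// sketch: expose the multer-parsed file to the Feathers service via params
app.use('/photos',
  fileUploadHandler.single('filename'),
  function (req, res, next) {
    // req.feathers is merged into the service call's params by the Express framework integration
    req.feathers.file = req.file;
    next();
  },
  {
    async create(data, params) {
      // params.file.buffer now holds the uploaded bytes (memory storage)
      return { uploaded: params.file ? params.file.originalname : null };
    }
  }
);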

How to download files and store them to s3 by nodejs (through heroku)?

I want to know how to download files to AWS S3 through Heroku without using a web page.
I'm using wget in Node.js to download audio files and want to re-encode them with ffmpeg. I have to store the file first, but Heroku doesn't provide space to store it.
Then I found that Heroku can connect with AWS S3, but the sample code is a page for users to upload, and I just want to do it in code.
This is the code that isn't connected with S3 yet:
// download the audio file with wget
const wget = spawn('wget', ['-O', 'upload_audio/' + sender_psid + '.aac', audioUrl]);

// spit stdout to screen
wget.stdout.on('data', function (data) { process.stdout.write(data.toString()); });

// spit stderr to screen
wget.stderr.on('data', function (data) { process.stdout.write(data.toString()); });

wget.on('close', function (code) {
  console.log(process.env.PATH);
  const ffmpeg = spawn('ffmpeg', ['-i', 'upload_audio/' + sender_psid + '.aac', '-ar', '16000', 'upload_audio/' + sender_psid + '.wav', '-y']);
  ...
});
And it produces this error:
upload_audio/1847432728691358.aac: No such file or directory
...
Error: ENOENT: no such file or directory, open 'upload_audio/1847432728691358.wav'
How can I solve this problem?
Could anyone give some suggestions, please?
Or can I get it working without using S3 at all?
I don't recommend using S3 until you've tried the NodeJS Static Files API. On the server, that looks like this:
app.use(express.static(path.join(__dirname, 'my_audio_files_folder')));
More on NodeJS's Static Files API available here.
AWS S3 seems to be adding unneeded complexity, but if you have your heart set on it, have a look at this thread.
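If S3 does end up being the route, a rough sketch of uploading the converted file once ffmpeg finishes could look like the following (bucket name, region, and credentials are assumptions here, and the ENOENT errors above also suggest the upload_audio directory needs to exist before wget writes into it):
const fs = require('fs');
const AWS = require('aws-sdk');

// the upload_audio directory must exist before wget/ffmpeg can write into it
fs.mkdirSync('upload_audio', { recursive: true });

const s3 = new AWS.S3({ region: 'us-east-1' }); // region is an assumption

function uploadToS3(localPath, key) {
  // stream the local file straight into S3 instead of keeping it on the dyno
  return s3.upload({
    Bucket: 'my-audio-bucket', // assumed bucket name
    Key: key,
    Body: fs.createReadStream(localPath)
  }).promise();
}

// e.g. inside the ffmpeg 'close' handler:
// await uploadToS3('upload_audio/' + sender_psid + '.wav', sender_psid + '.wav');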

How to upload folders to GCS from node js using client library

To upload files to Google Cloud Storage (GCS) from Node there is a documented process like this:
const Storage = require('@google-cloud/storage');
const storage = new Storage();
{
  const bucket = storage.bucket("bt-name");
  const res = await bucket.upload("./output_images/dir/file.png", {
    public: true
  });
}
But I want to upload an entire directory to GCS in the said bucket bt-name; when I try to do so it throws a promise rejection error -- EISDIR: illegal operation on a directory, read
Uploading a whole folder is not a built-in feature of the API or client library. If you would like to upload the whole folder, your code needs to iterate through all of the folder's files in a loop and upload them one at a time, or you can use the recursive upload option of the command-line utility gsutil.
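A rough sketch of that loop (assuming a recent @google-cloud/storage release where Storage is a named export; the directory and bucket names follow the question, and error handling is minimal) could look like this:
const path = require('path');
const fs = require('fs');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const bucket = storage.bucket('bt-name');

// walk the directory tree and upload each regular file, keeping the relative path as the object name
async function uploadDir(localDir, prefix = '') {
  const entries = fs.readdirSync(localDir, { withFileTypes: true });
  for (const entry of entries) {
    const fullPath = path.join(localDir, entry.name);
    if (entry.isDirectory()) {
      await uploadDir(fullPath, path.posix.join(prefix, entry.name));
    } else {
      await bucket.upload(fullPath, { destination: path.posix.join(prefix, entry.name), public: true });
    }
  }
}

uploadDir('./output_images/dir').catch(console.error);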

Nodejs - running external process

I need to build a back-end service using Node.js which does the following:
Accepts a file upload from the client (browser)
Saves the file on disk and updates the DB (MongoDB) with the new filename and a status of "saved"
Starts a long-running process (written in Python) to parse the file, gets feedback (progress, error, success), and updates the DB with the status feedback
How do we call #3 from Node.js - it could be a system call like python parse_file.py filename
app.post('/upload', function (req, res, next) {
  // what should go here to call the process
  res.writeHead(201, { 'Content-type': 'text/plain' });
  res.end("File uploaded.");
});
You can use require('child_process').spawn() or require('child_process').exec() to spawn a child process. The docs for those can be found at https://nodejs.org/dist/latest-v6.x/docs/api/child_process.html#child_process_child_process_spawn_command_args_options
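A minimal sketch of that call using spawn (the script name follows the question; filename and the DB update are placeholders you would fill in) could look like this:
const { spawn } = require('child_process');

app.post('/upload', function (req, res, next) {
  // assume the uploaded file has already been written to `filename` at this point
  const parser = spawn('python', ['parse_file.py', filename]);

  // progress and errors reported by the Python script on stdout/stderr
  parser.stdout.on('data', (data) => {
    // e.g. update the MongoDB record with progress feedback here
    console.log('progress:', data.toString());
  });
  parser.stderr.on('data', (data) => console.error('parser error:', data.toString()));

  parser.on('close', (code) => {
    // update the DB with success/failure based on the exit code
    console.log('parser exited with code', code);
  });

  res.writeHead(201, { 'Content-type': 'text/plain' });
  res.end("File uploaded.");
});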
