Sending an image/video using sockets and node - javascript

My application requires a section where users can communicate with each other. For this, I am using socket.io. For sending text (as strings) I use UTF-8, which works perfectly.
However, when sending an image or a video over a socket, how should I approach this? Would I convert the image or video into a binary format and send that on the socket?

Yes, here is an example of how to send your files with socket.io:
var fileReader = new FileReader(),
    slice = file.slice(0, 100000);

fileReader.readAsArrayBuffer(slice);
fileReader.onload = (evt) => {
  var arrayBuffer = fileReader.result;
  socket.emit('slice upload', {
    name: file.name,
    type: file.type,
    size: file.size,
    data: arrayBuffer
  });
};
There is a full tutorial with an example of sending a file with socket.io and receiving it on the Node.js server; follow this.
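On the receiving end, a minimal sketch of the Node.js side might look like the following (the 'slice upload' event name matches the snippet above; the io server setup and the output handling are assumptions for illustration):

const fs = require('fs');

io.on('connection', (socket) => {
  socket.on('slice upload', (data) => {
    // On the Node side the ArrayBuffer arrives as a Buffer; append it to a
    // file named after the original upload. Error handling is kept minimal.
    fs.appendFile(data.name, Buffer.from(data.data), (err) => {
      if (err) console.error(err);
    });
  });
});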

Related

How to upload an image of File type to Firebase Storage from Node.js with the Admin SDK

I have Angular running on the front end and the Firebase Admin SDK for Node.js on the back end.
What I want to achieve is to allow the user to select an image from their computer, using a simple <input> of type file. When I receive the user's image, which is of type File, on the Angular side, I want to send it to my Node.js server and have it upload the image to Firebase Storage.
Here's how I'm sending the image to Node.js:
method(imageInput): void {
  const image: File = imageInput.files[0];
  const reader = new FileReader();
  reader.addEventListener('load', (event: any) => {
    const imageData = {
      source: event.target.result,
      file: image
    };
    this.myService.uploadImage(imageData.file).subscribe(
      (res) => {
        // image sent successfully
      },
      (err) => {
        // error
      });
  });
  reader.readAsDataURL(image);
}
So on the Node.js side I don't see a way to upload this image.
I'm trying:
admin.storage().bucket().upload(imageFromAngular, { // <-- Here's the problem
  destination: "someDestination/",
  contentType: "image/png",
  metadata: {
    contentType: "image/png"
  }
}).then(() => {
  // send successful response
}).catch(err => {
  // send error response
});
The issue comes from the fact that the upload method only takes the path to the image and the options as parameters. However, in this case I can't pass a path to the image; I can only pass the image itself. I read this - https://googleapis.dev/nodejs/storage/latest/ - but I couldn't find anything that would suit my needs.
What would be the correct way to do this?
Update:
Here's a more detailed explanation of the approach I took:
I'm using the arrayBuffer method of the image File inside my method. This method returns a Promise that resolves to an ArrayBuffer. I get the value and send it to my server.
The server uses Buffer.from(ArrayBuffer, 'base64') to convert the data, and then I can safely use the save API (https://googleapis.dev/nodejs/storage/latest/File.html#save).
To get the image later on I use download - (https://googleapis.dev/nodejs/storage/latest/File.html#download).
You can write a byte stream (or a Buffer) to Cloud Storage.
createWriteStream() API for streaming data to Cloud Storage: https://googleapis.dev/nodejs/storage/latest/File.html#createWriteStream
save() API for writing buffered data to Cloud Storage: https://googleapis.dev/nodejs/storage/latest/File.html#save
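For example, a minimal sketch of the save() approach described in the update (the destination path and variable names are illustrative; arrayBuffer is assumed to be the ArrayBuffer received from Angular):

// Convert the received ArrayBuffer to a Buffer and write it with File#save.
const buffer = Buffer.from(arrayBuffer);
const file = admin.storage().bucket().file('someDestination/image.png');

file.save(buffer, { contentType: 'image/png' })
  .then(() => {
    // send successful response
  })
  .catch((err) => {
    // send error response
  });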

How to convert between MIME-Types for blob/buffer?

The Problem:
I record Audio from within the Browser which gives me a BLOB when the recording is done:
let blob = new Blob(chunks, { 'type' : 'audio/webm;codecs=opus' });
Changing the MIME type here won't help, since the chunks already come with their MIME type, which is audio/webm;codecs=opus for almost all browsers. So nothing can be done here.
Sending this Blob via XHR to a Node.js server results in receiving a buffer from that blob:
Client:
var xhr = new XMLHttpRequest();
xhr.open('POST', 'http://localhost:3000/audio', true);
xhr.send(blob);
Server:
app.post('/audio', (req, res) => {
  req.on('readable', () => {
    let buffer = req.read();
    // sending this buffer to the external API results in an error
    // since it expects the mime-type audio/wav
  });
  res.send({ msg: 'success' });
});
Most solutions out there require you to write the file to disk and convert it there (ffmpeg).
Others use Browser-features which are experimental or not compatible with older browsers...
I also tried using the wavfile npm package, but that creates a corrupted file if I try writing it with the Uint8Array from that webm-formatted buffer (a playable file, but it only contains noise and is much shorter than the actual recording should be).
There must be a simple solution to convert the binary data server side, right? Best I could wish for would be a function convertWebmBufferToWavBuffer.
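One possible sketch of such a helper (not a confirmed answer; it assumes the fluent-ffmpeg package plus an installed ffmpeg binary) streams the buffer through ffmpeg in memory instead of writing the source to disk:

const ffmpeg = require('fluent-ffmpeg');
const { Readable, PassThrough } = require('stream');

// Hypothetical helper: pipe the webm/opus buffer through ffmpeg and collect
// the wav output back into a Buffer. ffmpeg probes the piped input, so no
// explicit input format is given. Note that when wav is written to a pipe,
// ffmpeg cannot backfill the RIFF size header; many consumers accept this.
function convertWebmBufferToWavBuffer(webmBuffer) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    const output = new PassThrough();
    output.on('data', (chunk) => chunks.push(chunk));
    output.on('end', () => resolve(Buffer.concat(chunks)));

    ffmpeg(Readable.from([webmBuffer]))
      .toFormat('wav')
      .on('error', reject)
      .pipe(output);
  });
}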

Decode image from base64 to jpg

I am capturing an image on one client and sending the image to another client via socket.io to be sent out to users as a jpg. On the client capturing the image I am doing:
fs.readFile('./app/image.jpg', function(err, buf) {
  socket.emit('image', { image: true, buffer: buf.toString('base64') });
});
This part is working fine and is encoding the image and emitting it. On the other client I have:
socket.on('image', function(img) {
  console.log(img);
});
This client is receiving the message and can log the encoded image.
I am struggling to convert the image from base64 back to a jpg. What do I need to do to accomplish this?
Something like this:
socket.on('image', function(img) {
  // The payload emitted above is { image: true, buffer: '<base64 string>' },
  // so decode img.buffer; Buffer.from replaces the deprecated new Buffer().
  var buffer = Buffer.from(img.buffer, 'base64');
  // Now you probably want to save it as a file...
});
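For example, a small sketch that saves the decoded buffer to disk (the output filename is illustrative):

var fs = require('fs');

socket.on('image', function(img) {
  var buffer = Buffer.from(img.buffer, 'base64');
  // Write the decoded bytes back out as a JPEG file.
  fs.writeFile('received.jpg', buffer, function(err) {
    if (err) console.error(err);
  });
});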

AWS SDK JS: Multipart upload to S3 resulting in Corrupt data

Trying to upload an mp4 file using the AWS JS SDK by initiating a multipart upload, I keep getting a corrupted-file error when I try to download and play it locally.
Gists of my code:
Initiating the multipart upload with params:
const createMultipartUploadParams = {
  Bucket: bucketname,
  Key: fileHash.file_name,
  ContentType: 'video/mp4' // TODO: Change hardcode
};
Call:
s3Instance.createMultipartUpload(createMultipartUploadParams, function(err, data) {
});
Doing the chunking:
Params:
const s3ChunkingParams = {
  chunkSize,
  noOfIterations,
  lastChunk,
  UploadId: data.UploadId
};
Reading the file:
const reader = new FileReader();
reader.readAsArrayBuffer(file)
Uploading each chunk:
reader.onloadend = function onloadend() {
  console.log('onloadend');
  const partUploadParams = {
    Bucket: bucketname,
    Key: file_name,
    PartNumber: i, // Iterating over all parts
    UploadId: s3ChunkingParams.UploadId,
    Body: reader.result.slice(start, stop) // Chunking up the file
  };
  s3Instance.uploadPart(partUploadParams, function(err, data1) {
  });
};
Finally completing the multipartUpload:
s3Instance.completeMultipartUpload(completeMultipartParams, function(err, data)
I am guessing the problem is how I am reading the file, so I have tried Content Encoding it to base64 but that makes the size unusually huge. Any help is greatly appreciated!
Tried this too
The only thing that could cause corruption is that you may be uploading additionally padded content for your individual parts, which leads to the final object being wrong. I do not believe S3 is doing anything fishy here.
After uploading the file, you can verify the final size of the object; if it doesn't match your local copy, then you know you have a problem somewhere.
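As a hedged sketch of what exact, unpadded parts look like (reusing the variable names from the question; the part size constant is an assumption), slicing the original File/Blob per part avoids sending padded buffers:

// Each part's Body is exactly the byte range it covers; only the last part
// may be smaller than PART_SIZE (S3 requires >= 5MB for all other parts).
const PART_SIZE = 5 * 1024 * 1024;

for (let i = 1; i <= noOfIterations; i++) {
  const start = (i - 1) * PART_SIZE;
  const stop = Math.min(start + PART_SIZE, file.size);
  const partUploadParams = {
    Bucket: bucketname,
    Key: file_name,
    PartNumber: i,
    UploadId: s3ChunkingParams.UploadId,
    Body: file.slice(start, stop) // Blob.slice, no FileReader padding involved
  };
  // s3Instance.uploadPart(partUploadParams, ...) and collect { ETag, PartNumber }
  // from each callback for the final completeMultipartUpload call.
}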
Are you trying to upload from browser?
Alternatively, you can look at https://github.com/minio/minio-js. It has a minimal set of abstracted APIs implementing the most commonly used S3 calls.
Here is a Node.js example of a streaming upload.
$ npm install minio
$ cat >> put-object.js << EOF
var Minio = require('minio')
var fs = require('fs')

// find out your s3 end point here:
// http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region
var s3Client = new Minio({
  url: 'https://<your-s3-endpoint>',
  accessKey: 'YOUR-ACCESSKEYID',
  secretKey: 'YOUR-SECRETACCESSKEY'
})

var file = 'your_localfile.zip'
var fileStream = fs.createReadStream(file)
fs.stat(file, function(e, stat) {
  if (e) {
    return console.log(e)
  }
  s3Client.putObject('mybucket', 'hello/remote_file.zip', 'application/octet-stream', stat.size, fileStream, function(e) {
    return console.log(e) // should be null
  })
})
EOF
putObject() here is a fully managed single function call; for file sizes over 5MB it automatically does multipart internally. You can also resume a failed upload, and it will start from where it left off by verifying previously uploaded parts.
So you don't necessarily have to go through the trouble of writing lower-level multipart calls.
Additionally, this library is isomorphic, so it can be used in browsers as well.

Send a file base64 to a Node.js socket.io server

I'm programming a game with socket.io, drawing in an HTML5 canvas, and when the time is over I send the image to the Node.js server. Is there some way to convert that canvas into an image, send the image in base64 to the Node.js app, and finally save it on the server?
Yes, the canvas element provides a method named toDataURL(). This method returns a data: URL which includes the base64 representation of the image in the specified format, or PNG by default:
var canvas = document.getElementsByTagName('canvas')[0];
var dataUrl = canvas.toDataURL();
Assuming you are using socket.io, you can send this data URI over the socket by emitting an event:
var socket = io.connect('http://localhost');
socket.emit('image', dataUrl);
On the Node side, you can listen to this event on the socket to retrieve the image:
io.sockets.on('connection', function (socket) {
  socket.on('image', function (dataUrl) {
    console.log(dataUrl);
  });
});
You can then trim the prefix data:image/png;base64, if necessary, and save the content on the server.
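For example, a minimal sketch of that server-side step (assuming the default PNG format; the output filename is illustrative):

const fs = require('fs');

socket.on('image', function (dataUrl) {
  // Strip the data URL prefix and decode the remaining base64 payload.
  const base64 = dataUrl.replace(/^data:image\/png;base64,/, '');
  fs.writeFile('canvas.png', Buffer.from(base64, 'base64'), function (err) {
    if (err) console.error(err);
  });
});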
Read more on the MDN documentation for the canvas element.
