JavaScript - Send ArrayBuffer data to backend over websocket [guacamole]

I need to send data (a file) via websocket to my guacamole backend using the guacamole-common-js library.
The scenario is the following:
1. A drag-and-drop area is created.
2. The user puts a file in this area.
3. The file is read.
4. A Guacamole file stream is created and the file is sent to the Guacamole backend.
Steps 1 to 3 are already working, but I do not know how to send the file to the Guacamole backend.
This is my function that runs when a file is dropped (guac is a global variable holding the initialized Guacamole.Client):
function drop(ev) {
    ev.preventDefault();
    if (ev.dataTransfer.items) {
        for (var i = 0; i < ev.dataTransfer.items.length; i++) {
            if (ev.dataTransfer.items[i].kind === 'file') {
                var file = ev.dataTransfer.items[i].getAsFile();
                var reader = new FileReader();
                reader.onloadend = function fileContentsLoaded(e) {
                    const stream = guac.createFileStream(file.type, file.name);
                    const bytes = new Uint8Array(reader.result);
                    stream.sendBlob(bytes.buffer);
                    stream.sendEnd();
                };
                console.log(file);
                reader.readAsArrayBuffer(file);
            }
        }
    } else {
        for (var i = 0; i < ev.dataTransfer.files.length; i++) {
            console.log(ev.dataTransfer.files[i].name);
        }
    }
}
The backend receives the data, and I am able to open the file on the remote server that guacd sends it to, but the file contains only what looks like mangled binary data.
Has anyone already managed this, or does anyone have an idea how I could send the data?

If you use Node.js in the backend, try handling it with Buffer.toString or Buffer.from.
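For example, a minimal sketch of what that could look like in a Node.js message handler (the handler shape, the base64 assumption, and the output path are illustrative, not from this answer):

// Illustrative sketch only: decode incoming base64 text back into raw
// bytes before writing, so the saved file is not the encoded text itself.
const fs = require('fs');

function onFileChunk(message) {
    const bytes = Buffer.from(message.toString(), 'base64'); // undo base64
    fs.appendFileSync('/tmp/upload.bin', bytes);             // path is a placeholder
}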

I already found a solution.
The guacamole-common-js lib already provides a function for sending the buffer to the backend.
My drop function now looks like the following:
function drop(ev) {
    ev.preventDefault();
    if (ev.dataTransfer.items) {
        for (var i = 0; i < ev.dataTransfer.items.length; i++) {
            if (ev.dataTransfer.items[i].kind === 'file') {
                var file = ev.dataTransfer.items[i].getAsFile();
                var reader = new FileReader();
                reader.onloadend = function fileContentsLoaded(e) {
                    const stream = guac.createFileStream(file.type, file.name);
                    // ArrayBufferWriter takes care of encoding the buffer for the stream
                    var bufferWriter = new Guacamole.ArrayBufferWriter(stream);
                    bufferWriter.sendData(reader.result);
                    bufferWriter.sendEnd();
                };
                reader.readAsArrayBuffer(file);
            }
        }
    } else {
        for (var i = 0; i < ev.dataTransfer.files.length; i++) {
            console.log(ev.dataTransfer.files[i].name);
        }
    }
}
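One caveat worth noting as an addition to this answer: file and reader are declared with var inside the loop, so if several files are dropped at once, every onloadend callback may see only the last file. A minimal sketch of the same loop body with per-iteration let/const bindings (illustrative; the Guacamole calls are the ones from the answer above):

// Sketch: block-scoped bindings give each iteration its own file/reader,
// avoiding the shared-closure pitfall when multiple files are dropped.
for (let i = 0; i < ev.dataTransfer.items.length; i++) {
    if (ev.dataTransfer.items[i].kind !== 'file') continue;
    const file = ev.dataTransfer.items[i].getAsFile();
    const reader = new FileReader();
    reader.onloadend = function () {
        const stream = guac.createFileStream(file.type, file.name);
        const bufferWriter = new Guacamole.ArrayBufferWriter(stream);
        bufferWriter.sendData(reader.result);
        bufferWriter.sendEnd();
    };
    reader.readAsArrayBuffer(file);
}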

Related

Can't open zip file created from System.IO.Compression namespace

I'm trying to zip varying numbers of files so that one zip folder can be served to the user instead of them having to click multiple anchor tags. I am using the System.IO.Compression namespace in ASP.NET Core 3.1 to create the zip folder.
Here is the code I'm using to create the Zip folder.
public IActionResult DownloadPartFiles(string[] fileLocations, string[] fileNames)
{
    List<InMemoryFile> files = new List<InMemoryFile>();
    for (int i = 0; i < fileNames.Length; i++)
    {
        InMemoryFile inMemoryFile = GetInMemoryFile(fileLocations[i], fileNames[i]).Result;
        files.Add(inMemoryFile);
    }

    byte[] archiveFile;
    using (MemoryStream archiveStream = new MemoryStream())
    {
        using (ZipArchive archive = new ZipArchive(archiveStream, ZipArchiveMode.Create, true))
        {
            foreach (InMemoryFile file in files)
            {
                ZipArchiveEntry zipArchiveEntry = archive.CreateEntry(file.FileName, CompressionLevel.Fastest);
                using (Stream zipStream = zipArchiveEntry.Open())
                {
                    zipStream.Write(file.Content, 0, file.Content.Length);
                    zipStream.Close();
                }
            }
            archiveStream.Position = 0;
        }
        archiveFile = archiveStream.ToArray();
    }
    return File(archiveFile, "application/octet-stream");
}
The files I am trying to zip are stored remotely so I grab them with this block of code. The InMemoryFile is a class to group the file name and file bytes together.
private async Task<InMemoryFile> GetInMemoryFile(string fileLocation, string fileName)
{
    InMemoryFile file;
    using (HttpClient client = new HttpClient())
    using (HttpResponseMessage response = await client.GetAsync(fileLocation))
    {
        byte[] fileContent = await response.Content.ReadAsByteArrayAsync();
        file = new InMemoryFile(fileName, fileContent);
    }
    return file;
}
The DownloadPartFiles method is called using Ajax. I grab the remote paths to the files and their respective names using JavaScript and pass them into the Ajax call.
function downloadAllFiles() {
    let partTable = document.getElementById("partTable");
    let linkElements = partTable.getElementsByTagName('a');
    let urls = [];
    for (let i = 0; i < linkElements.length; i++) {
        urls.push(linkElements[i].href);
    }
    if (urls.length != 0) {
        var fileNames = [];
        for (let i = 0; i < linkElements.length; i++) {
            fileNames.push(linkElements[i].innerText);
        }
        $.ajax({
            type: "POST",
            url: "/WebOrder/DownloadPartFiles/",
            data: { 'fileLocations': urls, 'fileNames': fileNames },
            success: function (response) {
                var blob = new Blob([response], { type: "application/zip" });
                var link = document.createElement('a');
                link.href = window.URL.createObjectURL(blob);
                link.download = "PartFiles.zip";
                link.click();
                window.URL.revokeObjectURL(blob);
            },
            failure: function (response) {
                alert(response.responseText);
            },
            error: function (response) {
                alert(response.responseText);
            }
        });
    }
}
Now the issue I keep running into is that I can't open the zip folder within Windows 10. Every time I try to open the zip folder using Windows or 7-Zip, I get an error message that the folder can't be opened or is invalid. I've tried looking at various similar issues on Stack Overflow, e.g. Invalid zip file after creating it with System.IO.Compression, but still can't figure out why this is.
Could it be the encoding? I found that Ajax expects its responses to be encoded as UTF-8, and when I view the zip file in Notepad++ as UTF-8 I see � characters indicating corruption.
Any thoughts on this would be helpful. Let me know if more information is needed.
If one of the corrupt zip files is needed I can provide that as well.
Edit:
I have since changed my method of receiving the byte array in JavaScript and am now using an XMLHttpRequest to receive it.
var parameters = {};
parameters.FileLocations = urls;
parameters.FileNames = fileNames;

var xmlhttp = new XMLHttpRequest();
xmlhttp.open("POST", "/WebOrder/DownloadPartFiles/", true);
xmlhttp.setRequestHeader("Content-Type", "application/json");
xmlhttp.responseType = "arraybuffer";
xmlhttp.onload = function (oEvent) {
    var arrayBuffer = xmlhttp.response;
    if (arrayBuffer) {
        var byteArray = new Uint8Array(arrayBuffer);
        var blob = new Blob([byteArray], { type: "application/zip" });
        var link = document.createElement('a');
        link.href = window.URL.createObjectURL(blob);
        link.download = "PartFiles.zip";
        link.click();
        window.URL.revokeObjectURL(blob);
    }
};
xmlhttp.send(JSON.stringify(parameters));
From what I read, Ajax is not the best for receiving byte arrays and binary data. With this method I was able to open one of the zip files with 7-Zip, but not Windows; however, one of the files within the archive showed a size of 0KB and couldn't be opened. The other three files in the archive were fine. Other zip folders with different files could not be opened at all, though.
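As an illustrative aside (not part of the original post): the same binary-safe download can be written with fetch, which also keeps the response as raw bytes instead of UTF-8 text. The endpoint and payload names below are the ones from the question; the rest is a sketch:

// Sketch: response.blob() keeps the zip binary, avoiding the text
// decoding that corrupts the bytes when $.ajax treats it as a string.
fetch("/WebOrder/DownloadPartFiles/", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ FileLocations: urls, FileNames: fileNames })
})
.then(function (response) { return response.blob(); })
.then(function (blob) {
    var link = document.createElement('a');
    link.href = window.URL.createObjectURL(blob);
    link.download = "PartFiles.zip";
    link.click();
    window.URL.revokeObjectURL(link.href);
});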
After some time I found a post that was able to fix my issue, Create zip file from byte[]
From that post this is the revised method I'm using to create a zip folder with files in it.
public IActionResult DownloadPartFiles([FromBody] FileRequestParameters parameters)
{
    List<InMemoryFile> files = new List<InMemoryFile>();
    for (int i = 0; i < parameters.FileNames.Length; i++)
    {
        InMemoryFile inMemoryFile = GetInMemoryFile(parameters.FileLocations[i], parameters.FileNames[i]).Result;
        files.Add(inMemoryFile);
    }

    byte[] archiveFile = null;
    using (MemoryStream archiveStream = new MemoryStream())
    {
        using (ZipArchive archive = new ZipArchive(archiveStream, ZipArchiveMode.Create, true))
        {
            foreach (InMemoryFile file in files)
            {
                ZipArchiveEntry zipArchiveEntry = archive.CreateEntry(file.FileName, CompressionLevel.Optimal);
                using (MemoryStream originalFileStream = new MemoryStream(file.Content))
                using (Stream zipStream = zipArchiveEntry.Open())
                {
                    originalFileStream.CopyTo(zipStream);
                }
            }
        }
        archiveFile = archiveStream.ToArray();
    }
    return File(archiveFile, "application/octet-stream");
}
I still don't know why the previous method was having issues, so if anyone knows the answer to that in the future, I'd love to know.

Loop multiple files from input, save each file readAsDataURL data to array

I need your help with the following problem:
I have an HTML input which supports multiple files;
I upload, let's say, 5 files;
each file is processed: it is read with readAsDataURL by a FileReader, and the file's data is saved to an object (other params will be saved there too, which is why it's an object), which is pushed to an array.
After I run the flow I described, the length of the final array has NOT changed.
I believe the problem is in the async behaviour, but I cannot understand how I should change the code to make it work, which is why I'm asking you for help. Please find the code below:
var controls = document.getElementById('controls');

function processUploadedFilesData(files) {
    if (!files[0]) {
        return;
    }
    var uploads = [];
    for (var i = 0, length = files.length; i < length; i++) {
        (function (file) {
            var reader = new FileReader();
            // I need an object, as other params will be saved too in the future
            var newFile = {};
            reader.readAsDataURL(file);
            reader.onloadend = function (e) {
                newFile.data = e.target.result;
                uploads.push(newFile);
            };
        })(files[i]);
    }
    return uploads;
}

controls.addEventListener('change', function (e) {
    var uploadedFilesOfUser = processUploadedFilesData(e.target.files);
    alert(uploadedFilesOfUser.length);
});
Codepen example - https://codepen.io/yodeco/pen/xWevRy
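Since each FileReader finishes asynchronously, after processUploadedFilesData has already returned, the array is still empty when the alert runs. A minimal sketch of one Promise-based way around this (illustrative, assuming the caller can handle an asynchronous result):

// Sketch: wrap each FileReader in a Promise so the caller can wait
// until every file has been read before using the array.
function processUploadedFilesData(files) {
    var reads = Array.prototype.map.call(files, function (file) {
        return new Promise(function (resolve) {
            var reader = new FileReader();
            reader.onloadend = function (e) {
                // other params can be added to this object later
                resolve({ data: e.target.result });
            };
            reader.readAsDataURL(file);
        });
    });
    return Promise.all(reads);
}

controls.addEventListener('change', function (e) {
    processUploadedFilesData(e.target.files).then(function (uploads) {
        alert(uploads.length); // now reflects all uploaded files
    });
});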

Javascript: FileUri to blob conversion

I want to upload files from local storage, but from local storage I'm getting a file URL like file:///data/user/0/application_package/cache/IMG-199201092.jpg. I want to convert it to a Blob, or any array or object representation of the file, so that it can be sent to the server via XMLHttpRequest and saved there.
I've tried this code:
var reader = new FileReader();
var blob_image = reader.readAsDataURL(imageUri);
but I couldn't fetch blob_image on the server via $_FILES as an image.
I'm getting imageUri from
$cordovaImagePicker.getPictures(options)
    .then(function (results) {
        for (var i = 0; i < results.length; i++) {
            var image_uri = results[i];
            var fd = new FormData();
            // Take the first selected file
            var blobbb = new Blob([new Uint8Array(results[i])], { type: "file" });
            fd.append("file", blobbb);
            objXhr.onreadystatechange = function () {
                if (this.readyState == 4 && this.status == 200) {
                    console.log(objXhr.responseText);
                }
            };
            objXhr.open("POST", "http://someapi.com/api.php");
            objXhr.send(fd);
        }
    });
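A minimal sketch of one way to turn a file:// URI into a real Blob using the Cordova File plugin's resolveLocalFileSystemURL (illustrative; assumes cordova-plugin-file is installed, and serverUrl is a placeholder):

// Sketch: resolve the file:// URI, read the bytes, and upload a real Blob
// via FormData so PHP can see it in $_FILES.
function uploadLocalFile(fileUri, serverUrl) {
    window.resolveLocalFileSystemURL(fileUri, function (fileEntry) {
        fileEntry.file(function (file) {
            var reader = new FileReader();
            reader.onloadend = function () {
                var blob = new Blob([reader.result], { type: file.type });
                var fd = new FormData();
                // PHP will expose this as $_FILES['file']
                fd.append("file", blob, file.name);
                var xhr = new XMLHttpRequest();
                xhr.open("POST", serverUrl);
                xhr.send(fd);
            };
            reader.readAsArrayBuffer(file);
        });
    });
}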

Uploading large files via XHR fails with Chrome, works with Firefox

I am uploading video files to my server. The files are at least 20MB, some over 100MB.
For improved user experience, I upload via JavaScript and XMLHttpRequest; this way I can display upload speed and remaining time.
And to avoid any trouble on the server (such as requests timing out or taking too long to process), I submit the file to the server in little packages and have a PHP script re-assemble the file.
My script works great, with one weird catch, and until just now I thought it was because of my ISP.
Using Google Chrome I can upload files up to 20MB with no problems. But anything larger gets errors: for example, my 100MB file will not send anything to the server; the second package never arrives. On my 50MB file it happens after around 47%, with the 7th package. And another file doesn't even send the first package.
I restarted my computer, and it keeps happening at the same position/package number for each file, though the position has nothing in common with the other failed files.
It doesn't matter if I try to start after one of the failed packages; say, if #7 failed and I start at #8, it will continue to fail. If I ignore errors (rather than trying again), it will just send the rest of the file in empty chunks.
I had already tried from a different internet connection, though I had to use Firefox there, and it worked fine. So I installed Firefox on my machine, and BAM, it works like a charm, correctly sending the 100MB file.
What could be going wrong in Chrome?
$(document).on('click', '#video_upload', function (evt) {
    uploadProcess('vod_video_file');
});

function toBlob(text) {
    var data = new ArrayBuffer(text.length);
    var ui8a = new Uint8Array(data, 0);
    for (var i = 0; i < text.length; i++) ui8a[i] = (text.charCodeAt(i) & 0xff);
    if (typeof window.Blob == "function") {
        var blob = new Blob([data]);
    } else {
        var bb = new (window.MozBlobBuilder || window.WebKitBlobBuilder || window.BlobBuilder)();
        bb.append(data);
        var blob = bb.getBlob();
    }
    return blob;
}

function splitFile(dataArray, size) {
    blobs = new Array();
    for (var i = 0; i < dataArray.size; i += size) {
        var copy = dataArray.slice();
        var partial = copy.slice(i, i + size);
        blobs.push(partial);
    }
    return blobs;
}

function uploadProcess(fileInputId) {
    var file = document.getElementById(fileInputId).files[0];
    var reader = new FileReader();
    reader.readAsBinaryString(file);
    reader.onloadend = function (evt) {
        var fr = evt.target.result;
        fileUpload(fr);
    };
}

function fileUpload(inputDataArray) {
    var since;
    var intervalid;
    var totalBytes = inputDataArray.length;
    var packets = new Array();
    var packetNum = 0;
    var packetCount = 0;
    var packetSize = 0;

    function startUpload() {
        intervalid = setInterval(function () { updateUploadStats(); }, 1000);
        calculatePaketSize();
        createPackets();
        submitPacket();
    }

    function calculatePaketSize() {
        var ideal_size = 3 * 1024 * 1024;
        var packet_count = Math.ceil(totalBytes / ideal_size);
        packetSize = Math.ceil(totalBytes / packet_count);
    }

    function createPackets() {
        packets = splitFile(toBlob(inputDataArray), packetSize);
        packetCount = packets.length;
    }

    function updateUploadStats(e) {
        // displaying upload progress in GUI
    }

    function submitPacket() {
        xhr = new XMLHttpRequest();
        xhr.open("POST", 'index.php?controller=AdminVodVideo&action=VideoUpload&ajax=1&r=' + packetNum + '&token=' + token, true);
        xhr.setRequestHeader("Content-type", "application/octet-stream");
        XMLHttpRequest.prototype.mySendAsBinary = function (text) {
            this.send(text);
        };
        var eventSource = xhr.upload || xhr;
        eventSource.addEventListener("progress", function (e) {
            updateUploadStats(e);
        });
        xhr.onreadystatechange = function () {
            if (xhr.readyState == 4) {
                if (xhr.status == 200) {
                    // server will return the string 'upload failed' if the file to be received was empty.
                    if (xhr.responseText == 'upload failed') {
                        console.log('FAILED , trying again in 3 s');
                        setTimeout(submitPacket, 3000);
                    } else {
                        updateUploadStats();
                        packetNum++;
                        if (packetNum == packetCount) {
                            processOnServer();
                        } else {
                            submitPacket();
                        }
                    }
                } else {
                    // process error
                    console.log('we got a 500 error');
                }
            }
        };
        since = Date.now();
        xhr.mySendAsBinary(packets[packetNum]);
    }

    function processOnServer() {
        // telling the server to piece the file back together.
    }

    startUpload();
}
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js"></script>
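An illustrative note, not from the question: readAsBinaryString pulls the entire file into a JavaScript string, and toBlob then copies it byte by byte, so a 100MB upload holds several large copies in memory at once; that kind of memory pressure is a plausible place for browsers to diverge. A minimal sketch of chunking with File.slice instead, so packets are lazy Blob views and the file is never fully read into memory (token is the question's own variable):

// Sketch: slice the File object directly; Blob slices are read lazily,
// so no readAsBinaryString / toBlob copies of the whole file are needed.
function createPacketsFromFile(file, packetSize) {
    var packets = [];
    for (var offset = 0; offset < file.size; offset += packetSize) {
        packets.push(file.slice(offset, offset + packetSize));
    }
    return packets;
}

function sendPacket(packet, packetNum, onDone) {
    var xhr = new XMLHttpRequest();
    xhr.open("POST", 'index.php?controller=AdminVodVideo&action=VideoUpload&ajax=1&r=' + packetNum + '&token=' + token, true);
    xhr.setRequestHeader("Content-type", "application/octet-stream");
    xhr.onload = onDone;
    xhr.send(packet); // XMLHttpRequest accepts Blob bodies directly
}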

Javascript arraybuffer to file in Django

Hi people!
I have tried to send the content of a file (images, to be more precise), converted to an ArrayBuffer (Uint8), via HTTP POST (using AngularJS) to a Django server, but I cannot manage to save it correctly to a file on the server.
JavaScript code:
var filesToUpload = document.getElementById('files');
var files = filesToUpload.files;
for (var i = 0; i < files.length; i++) {
    var reader = new FileReader();
    reader.onload = (function (fileToBeUploaded) {
        return function (e) {
            var fileContent = e.target.result;
            var bin = new Uint8Array(fileContent);
            console.log(bin);
            var request = $http({
                url: urlAddImagesRestaurant,
                method: "post",
                data: {
                    fileStructure: fileToBeUploaded,
                    datafile: bin
                }
            });
            return request.then(service.handleSuccess, service.handleError);
        };
    })(files[i]);
    reader.readAsArrayBuffer(files[i]);
}
Does anyone have an idea how to save the content of the ArrayBuffer in Django?
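One common approach, sketched under the assumption that the Django view reads uploads from request.FILES: send the file as multipart/form-data instead of JSON-serializing the typed array. The field name datafile below mirrors the question; the $http options are the standard AngularJS FormData pattern:

// Sketch: post the File itself as multipart/form-data so Django sees it
// in request.FILES['datafile'] rather than as serialized JSON numbers.
var fd = new FormData();
fd.append('datafile', files[i], files[i].name);

$http.post(urlAddImagesRestaurant, fd, {
    transformRequest: angular.identity,    // don't JSON-serialize the FormData
    headers: { 'Content-Type': undefined } // let the browser set the multipart boundary
});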
