In an Express app, I have a simple image uploader which was working perfectly using Ajax PUT requests. But now I need to crop the images before sending them to the server.
I plugged in Croppie to help with this. On form submit I get the cropped image blobs from Croppie, convert them to files, append the files to the Ajax FormData object, and send the FormData off to Express.
But the server isn't receiving any files, and examining the PUT request in Chrome's dev tools shows that no file info is being sent.
I've console.logged the blobs and the files, and as far as I can tell they are being created properly.
Getting the blobs and converting them to files:
var fileList = []
if($croppieContainers.length > 0){
$.each($croppieContainers, function(i){
$(this).croppie('result', {
type: 'blob', // Get the blob from the croppie container
}).then(function(blob) {
let file = new File([blob], "image-" + i, {
type: blob.type,
lastModifiedDate: new Date() // Make a file from the blob...
});
fileList.push(file) // And push it to fileList
});
});
}
Appending the files to ajaxData:
if(fileList.length > 0) {
$.each(fileList, function(i, file){
ajaxData.append($input.attr('name'), file);
})
}
Posting to Express:
$.ajax({
url: $form.attr('action'),
type: $form.attr('method'),
data: ajaxData,
// dataType: 'json',
cache: false,
contentType: false,
processData: false,
complete: function() {
$form.removeClass('is-uploading');
},
success: function(data) {
// Handlers...
}
});
console.log(blob) returns:
Blob
size: 288345
type: "image/png"
__proto__: Blob
And the file:
File
lastModified : 1478972364233
lastModifiedDate : Sat Nov 12 2016 17:39:24 GMT+0000 (GMT)
name : "image-0"
size : 288345
type : "image/png"
webkitRelativePath : ""
__proto__ : File
I've tried getting the file a different way, but it gives the same result:
if($croppieContainers.length > 0){
$.each($croppieContainers, function(i){
let file = new File([$(this).croppie('result', {
type: 'blob'}
)],
"image-" + i)
ajaxData.append($input.attr('name'),file)
});
}
I'm not sure how to debug this, apart from console.logging at various stages and checking the XHR in dev tools...
Can anyone see what the problem is, or tell me how I can debug further?
Thank you!
I'm trying to get a blob from a URL (in my case it leads to a .wav file).
The URL is located at my own website, so I only do requests to the same site.
The purpose:
A user has uploaded some .wav files to my website to one schema
The user wants to copy one or more .wav files from one to another schema
The user selects the audiofiles to copy without uploading the files again.
The problem:
Each audiofile is located in its own directory.
https://mywebsite.nl/media/audiofiles/schemaGUID/recording.wav
So when copying to another schema, the file or files need to be re-uploaded to the directory of the other schema.
The code I'm using to upload the selected audiofiles is this:
newwavfiles.forEach(function (item) {
var name = item["name"];
var filename = item["filename"];
var fileblob = fetch('http://mywebsite.nl/media/audiofiles/12345677/recording.wav').then(res => res.blob());
var formData = new FormData();
formData.append('file', fileblob, filename);
formData.append('guid', guid);
// upload file to server
$.ajax({
url: 'index.php?action=uploadAudiofile',
type: 'post',
data: formData,
enctype: 'multipart/form-data',
contentType: false,
processData: false,
success: function (response) {
},
error: function (xhr, status, error) {
console.log(xhr);
var errorMessage = xhr.status + ': ' + xhr.statusText
}
});
});
To get the blob of the file I tried this:
var fileblob = fetch('http://mywebsite.nl/media/audiofiles/12345677/recording.wav').then(res => res.blob());
But then I get this error:
Failed to execute 'append' on 'FormData': parameter 2 is not of type 'Blob'.
Does anyone have a solution to get the .wav file blob from a URL?
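`fetch()` returns a Promise, so the `fileblob` variable above holds a pending Promise, not a Blob; that is exactly why `FormData.append` complains that parameter 2 is not of type 'Blob'. A minimal sketch of one fix is to await the response and its `.blob()` before building the FormData:

```javascript
// Sketch: resolve the fetch promise chain before using the blob.
async function blobFromUrl(url) {
  const res = await fetch(url);
  if (!res.ok) throw new Error('HTTP ' + res.status);
  return res.blob(); // resolves to a real Blob
}
```

Inside the `forEach`, mark the callback `async` (or refactor to a `for...of` loop), then write `const fileblob = await blobFromUrl(...)` before appending it to the FormData.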
I would like to upload a zip file as part of formdata in a Javascript fetch.
I can easily send a zip file which I have read in via a file input control (see first segment of code). But when I try to create a file object based on a blob, I receive an error (see second segment of code).
Basically I am asking how to create a zip file object from a blob. I have tried many variations of code, but cannot nail the right syntax for creating the file object from a blob; it has to be a zip file. Note that the zip file is a shapefile archive - not sure if that makes any difference.
//here we upload the zip file
var fileInput = document.getElementById('avatar');
var filename = fileInput.files[0].name;
//This works fine. I can easily send the zip file in formdata
var headers = new Headers();
headers.set('X-CSRFToken', csrftoken);
var formData = new FormData()
formData.set("time", false)
formData.set("base_file", fileInput.files[0] )
fetch("/upload/", {
"credentials": "include",
"body": formData,
"headers": headers,
"method": "POST",
"mode": "cors"
})
But when I try to upload zip file I created from a blob, then I receive an error
var shpBuffer = await ShapeFileFromGeoJSON(geojson, fileName);
var base64String = Uint8Array.from(window.atob(shpBuffer), (v) => v.charCodeAt(0));
var shapeFileBlob = new Blob([ base64String ], {type:"application/zip"});
saveAs(shapeFileBlob, fileName); //this works and a zip file is saved
var file = new File([shapeFileBlob], 'application.zip', { type: shapeFileBlob.type,});
console.log(file);
//Returns: "File {name: 'application.zip', lastModified: 1656002582746,
//lastModifiedDate: Fri Jun 24 2022 04:43:02 GMT+1200 (New Zealand Standard Time),
//webkitRelativePath: '', size: 6091726, …}"
console.log(file instanceof File); //returns true
var headers = new Headers();
headers.set('X-CSRFToken', csrftoken);
var formData = new FormData()
formData.set("time", false)
formData.set("file", file)
fetch("/upload/", {
"credentials": "include",
"body": formData,
"headers": headers,
"method": "POST",
"mode": "cors"
})
The error returned in the console is
POST https://WEBSITE/upload/ 400 (Bad Request)
{success: false, errors: Array(1)}
Yes, I think I solved the question. The issue was in the way I was creating the zip file, which is a shapefile archive. The zip should unzip into the files directly, without any intermediate folder. GeoServer was rejecting it because, upon unzipping, it could not directly find any shapefile.
I'm now working on a GrapesJS implementation in our application. I have gone through the documentation provided and some of the issues reported regarding the Asset Manager on GitHub, etc. I was not able to keep the list of uploaded images in the Asset Manager: after a hard refresh the list disappears.
I have tried uploading files to Amazon S3; the upload was OK, I got its response, and the image showed in the list. In this case, however, I was not able to edit images due to a CORS issue.
Later, I tried with base64-encoded images. With this method I was able to edit uploaded images, and they appeared in the image listing. But as I said earlier, the image list is only available until a hard refresh. I think it is served from the cache.
I have doubts about a few parts of the code; can you help me out?
Here is the assetManager section code:
assetManager: {
storageType : '',
storeOnChange : true,
storeAfterUpload : true,
embedAsBase64 : true, // Make this false to upload image to AmazonS3
upload : siteURL+'assets/CONTENT/img', // For temporary storage, Upload endpoint, set `false` to disable upload
assets : [],
// headers: {}, // Custom headers to pass with the upload request
// params: {}, // Custom parameters to pass with the upload request, eg. csrf token
credentials: 'include', // The credentials setting for the upload request, eg. 'include', 'omit'
autoAdd : 1, // If true, tries to add automatically uploaded assets.
dropzone : 0, // Enable an upload dropzone on the entire editor (not document) when dragging files over it
openAssetsOnDrop : 1, // Open the asset manager once files are been dropped via the dropzone
multiUpload : true, // Allow uploading multiple files per request. If disabled filename will not have '[]' appended
showUrlInput: true, // Toggles visibility of assets url input
uploadFile: function(e) {
// Ref: https://blog.webnersolutions.com/adding-image-upload-feature-in-grapesjs-editor/
var files = e.dataTransfer ? e.dataTransfer.files : e.target.files;
var formData = new FormData();
for(var i in files){
formData.append('file-'+i, files[i]); // Containing all the selected images from local
}
$.ajax({
url: siteURL + 'uploadImage_base64', // Save image as base64 encoded - Its a function
// url: siteURL + 'uploadImage', // Upload image to AmazonS3 - Its a function
type: 'POST',
data: formData,
contentType: false,
crossDomain: true,
dataType: 'json',
mimeType: "multipart/form-data",
processData: false,
success: function(result) {
var myJSON = [];
if ((typeof(result['data']) != 'undefined') && (result != 'null')) {
$.each(result['data'], function(key, value) {
myJSON[key] = value;
});
console.log(myJSON);
// while using base64 encode => 0: {name: "ipsumImage.png", type: "image", src: "data:image/png;base64,iVBORw0KGgAAVwA…AAAAAAAAAAAAAAAD4Pv4B6rBPej6tvioAAAAASUVORK5CYII=", height: 145, width: 348}
// while using AmazonS3 => 0: {name: "logo_sigclub.png", type: "image", src: "https://amazonaws.com/assets/CONTENT/img/logo_sigclub.png", status: true, message: "Uploaded successfully", …}
editor.AssetManager.add(myJSON); //adding images to asset manager of GrapesJS
}
}
});
}
}
My doubts are:
upload : siteURL+'assets/CONTENT/img': is this the image upload path, or a function which uploads the image?
If I'm using Amazon S3, will I be able to edit images despite the CORS issue?
How can I show the uploaded images in the Asset Manager without them disappearing later?
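On the third doubt: assets added via `editor.AssetManager.add()` live only in the editor's memory (or its local cache), so they vanish on a hard refresh unless the list is also persisted server-side and re-added on every load. A minimal sketch, where the `listImages` endpoint is a hypothetical name (not an existing API) that returns `[{name, src}, ...]`:

```javascript
// Sketch: normalize a saved server-side list into the asset definitions
// the GrapesJS Asset Manager expects ({type, name, src}).
function toAssetDefs(serverItems) {
  return serverItems.map(it => ({ type: 'image', name: it.name, src: it.src }));
}

// After the editor is initialized (sketch; endpoint name is an assumption):
// $.getJSON(siteURL + 'listImages', function (items) {
//   editor.AssetManager.add(toAssetDefs(items));
// });
```

The upload handler would then also record each uploaded image's URL server-side, so the `listImages` response stays in sync with what was uploaded.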
I am calling AWS S3 to retrieve images using an AJAX call in jQuery 3.3.0, and instead of getting a blob object, I am receiving the response as "[object Blob]". Please let me know if I have to provide further details.
Please find the code below:
$.ajax({
url: "my_path/download_image.php",
data: {
name: "my_name"
},
cache: false,
xhrFields: {
responseType: 'blob'
},
success: function (data) {
console.log(data);
var image = new Image();
var url = window.URL || window.webkitURL;
self.DownloadStudentImages(students, index + 1);
image.src = url.createObjectURL(data);
self.SavePic(imageFilename);
},
error: function () {}
});
I have fixed the issue. The AWS response relayed through PHP arrives as a string unless it is encoded explicitly. So I am sending base64 from PHP, and on the JavaScript side I convert it to a byte array and then into a blob, which fixes the issue.
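A sketch of the conversion described, from a base64 string to a Blob on the client:

```javascript
// Sketch: decode a base64 string into bytes and wrap them in a Blob.
function base64ToBlob(base64, mimeType) {
  const byteChars = atob(base64);               // base64 -> binary string
  const bytes = new Uint8Array(byteChars.length);
  for (let i = 0; i < byteChars.length; i++) {
    bytes[i] = byteChars.charCodeAt(i);         // one byte per char code
  }
  return new Blob([bytes], { type: mimeType });
}
```

Usage (sketch): `image.src = URL.createObjectURL(base64ToBlob(responseText, 'image/png'));`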
I am working on a simple Chrome extension that needs to upload files to the user's Dropbox folder. I am using the simple AJAX requests below to upload files. However, it works only for files with extensions such as .txt, .json, .c, etc., i.e. files whose MIME type is text/plain or similar; all other file types, such as PDFs and image files, get corrupted and produce blank contents. What am I missing to upload the files the correct way?
function startUpload()
{
var folderPath = $(this).closest('tr').attr('path')+'/';
var file = $("#upload_file")[0].files[0];
if (!file){
alert ("No file selected to upload.");
return false;
}
var reader = new FileReader();
reader.readAsText(file, "UTF-8");
reader.onload = function (evt) {
uploadFile(folderPath+file.name,evt.target.result,file.size,file.type);
}
}
//function to upload file to folder
function uploadFile(filepath,data,contentLength,contentType){
var url = "https://api-content.dropbox.com/1/files_put/auto"+filepath;
var headers = {
Authorization: 'Bearer ' + getAccessToken(),
contentLength: contentLength,
};
var args = {
url: url,
headers: headers,
crossDomain: true,
crossOrigin: true,
type: 'PUT',
contentType: contentType,
data : data,
dataType: 'json',
success: function(data)
{
getMetadata(filepath.substring(0,filepath.lastIndexOf('/')),createFolderViews);
},
error: function(jqXHR)
{
console.log(jqXHR);
}
};
$.ajax(args);
}
I believe the issue is reader.readAsText(file, "UTF-8"). If the file isn't a text file, this will misinterpret the contents. I think you want reader.readAsBinaryString or reader.readAsArrayBuffer. (I haven't tested it myself.)
EDIT
After testing this myself, I found that readAsArrayBuffer is what you need, but you also need to add processData: false as an option to $.ajax to prevent jQuery from trying to convert the data to fields in a form submission.
Also be sure to use dataType: 'json' to properly parse the response from the server.
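A minimal sketch of that fix, using the promise-based `Blob.arrayBuffer()` (equivalent here to `FileReader.readAsArrayBuffer`): read the raw bytes, then hand them to `$.ajax` with `processData: false` so jQuery does not serialize them as form fields.

```javascript
// Sketch: binary-safe read of a File/Blob; no UTF-8 text decoding involved,
// so PDFs and images survive intact.
async function fileToBuffer(file) {
  return file.arrayBuffer();
}

// Usage in startUpload (sketch):
// const buffer = await fileToBuffer(file);
// uploadFile(folderPath + file.name, buffer, file.size, file.type);
// ...and in uploadFile's $.ajax args, add: processData: false
```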