I'm now working on a GrapesJS implementation in our application. I have gone through the documentation and some of the issues reported about the Asset Manager on GitHub, etc. My problem is that I cannot keep the list of uploaded images in the Asset Manager: after a hard refresh the list disappears.
I tried uploading files to Amazon S3; the upload worked, I got its response, and the image showed up in the list. In this case, though, I was not able to edit the images due to a CORS issue.
Later, I tried with base64 encoded images. With this method I was able to edit the uploaded images, and they also appeared in the image listing. But, as I said earlier, the image list is only available until a hard refresh, so I think it is coming from the cache.
I have doubts about a few parts of the code; can you help me out?
Here is the assetManager section code:
assetManager: {
  storageType: '',
  storeOnChange: true,
  storeAfterUpload: true,
  embedAsBase64: true, // Make this false to upload images to Amazon S3
  upload: siteURL + 'assets/CONTENT/img', // For temporary storage; upload endpoint, set `false` to disable upload
  assets: [],
  // headers: {}, // Custom headers to pass with the upload request
  // params: {}, // Custom parameters to pass with the upload request, eg. csrf token
  credentials: 'include', // The credentials setting for the upload request, eg. 'include', 'omit'
  autoAdd: 1, // If true, tries to add automatically uploaded assets
  dropzone: 0, // Enable an upload dropzone on the entire editor (not document) when dragging files over it
  openAssetsOnDrop: 1, // Open the Asset Manager once files are dropped via the dropzone
  multiUpload: true, // Allow uploading multiple files per request. If disabled, the filename will not have '[]' appended
  showUrlInput: true, // Toggles visibility of the assets URL input
  uploadFile: function(e) {
    // Ref: https://blog.webnersolutions.com/adding-image-upload-feature-in-grapesjs-editor/
    var files = e.dataTransfer ? e.dataTransfer.files : e.target.files;
    var formData = new FormData();
    for (var i in files) {
      formData.append('file-' + i, files[i]); // Containing all the selected images from local
    }
    $.ajax({
      url: siteURL + 'uploadImage_base64', // Save image as base64 encoded - it's a function
      // url: siteURL + 'uploadImage', // Upload image to Amazon S3 - it's a function
      type: 'POST',
      data: formData,
      contentType: false,
      crossDomain: true,
      dataType: 'json',
      mimeType: "multipart/form-data",
      processData: false,
      success: function(result) {
        var myJSON = [];
        if ((typeof(result['data']) != 'undefined') && (result != 'null')) {
          $.each(result['data'], function(key, value) {
            myJSON[key] = value;
          });
          console.log(myJSON);
          // while using base64 encode => 0: {name: "ipsumImage.png", type: "image", src: "data:image/png;base64,iVBORw0KGgAAVwA…AAAAAAAAAAAAAAAD4Pv4B6rBPej6tvioAAAAASUVORK5CYII=", height: 145, width: 348}
          // while using AmazonS3 => 0: {name: "logo_sigclub.png", type: "image", src: "https://amazonaws.com/assets/CONTENT/img/logo_sigclub.png", status: true, message: "Uploaded successfully", …}
          editor.AssetManager.add(myJSON); // adding images to the Asset Manager of GrapesJS
        }
      }
    });
  }
}
My doubts are:
upload: siteURL + 'assets/CONTENT/img' — is this the image upload path, or a function that uploads the image?
If I use Amazon S3, can I still get the editing option despite the CORS issue?
How can I show the uploaded images in the Asset Manager without them disappearing later?
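For that last doubt, the only workaround I can think of so far is re-loading the saved list when the editor initializes. This is just a sketch, assuming a hypothetical siteURL + 'listImages' endpoint on my backend that returns the stored asset JSON:
// Rough sketch: re-load previously uploaded assets on page load so the list
// survives a hard refresh. `siteURL + 'listImages'` is a hypothetical endpoint
// returning an array like [{ name: "logo.png", type: "image", src: "https://..." }, ...]
$.getJSON(siteURL + 'listImages', function(savedAssets) {
  if (savedAssets && savedAssets.length) {
    editor.AssetManager.add(savedAssets); // repopulate the Asset Manager
  }
});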
How to write the output of an octet-stream to a file while the stream is being downloaded?
I receive the stream as a blob and then download it at the end of the stream, but how do I receive the stream and write it to a file as it arrives (something like a FileStream that takes a stream and writes to a file)?
AJAX code below:
$.ajax({
  type: "post",
  url: apiURL,
  contentType: "application/json; charset=utf-8",
  data: JSON.stringify(String(strCustomData)),
  xhrFields: {
    responseType: "blob", // <-- can also be an arrayBuffer
    onprogress: function(e) { // <-- get response body here, read its stream and save in file
      // TODO
    }
  },
  cache: false,
  headers: { "RequestVerificationToken": get_CSRF_TOKEN() },
  success: function (response, textStatus, request) {
    const requestHeaders = request.getResponseHeader("Content-Disposition") || "";
    const requestContentType = request.getResponseHeader("Content-Type") || "application/zip";
    const strFileName = requestHeaders?.split(";")[1]?.split("=")[1] || "File_Not_Found.zip";
    const blob = response;
    const url = window.URL || window.webkitURL;
    const urlLink = url.createObjectURL(blob);
    // create Anchor element that holds the blob and then download it as a file
    const elementAnchor = document.createElement("a");
    elementAnchor.setAttribute("href", urlLink);
    elementAnchor.setAttribute("download", strFileName);
    // download the file
    elementAnchor.click();
    // dispose blob object and delete it from memory
    url.revokeObjectURL(urlLink);
    elementAnchor.remove();
  },
  error: function (xhr, ajaxOptions, thrownError) {
    // More validation code...
    console.error(String(xhr.statusText));
  }
});
I am looking for something like the anchor download tag <a asp-page-handler="apiURL" download>Download</a> where it receives a stream and downloads it without using a blob.
I would have used the anchor tag, but I need some JavaScript code to be executed alongside it (progress bar, some required custom info to display, etc.).
I tried using StreamSaver.js but could not make it work in the onprogress section of the AJAX request. (How do I get the response.body.getReader() in the onprogress event? How do I save the stream to a file without waiting for the download to complete?)
Using jQuery inside Razor Pages with ASP.NET Core.
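In case it helps frame the question, the closest I've gotten is a sketch based on fetch() rather than $.ajax, since fetch exposes response.body as a ReadableStream that StreamSaver.js can write to disk chunk by chunk. This is only a sketch, assuming streamSaver is loaded globally; updateProgressBar() is a placeholder for whatever UI update I actually need:
// Sketch only: stream the response body straight to a file with StreamSaver.js,
// updating progress as chunks arrive.
async function downloadToFile() {
  const response = await fetch(apiURL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json; charset=utf-8",
      "RequestVerificationToken": get_CSRF_TOKEN()
    },
    body: JSON.stringify(String(strCustomData))
  });
  const total = Number(response.headers.get("Content-Length")) || 0;
  const fileStream = streamSaver.createWriteStream("File_Not_Found.zip"); // placeholder filename
  const writer = fileStream.getWriter();
  const reader = response.body.getReader();
  let received = 0;
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    received += value.length;
    if (total) updateProgressBar(received / total); // placeholder progress callback
    await writer.write(value); // write each chunk to the file as it arrives
  }
  await writer.close();
}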
I'm trying to get a blob from a URL (in my case it leads to a .wav file).
The URL is located at my own website, so I only do requests to the same site.
The purpose:
A user has uploaded some .wav files to one schema on my website.
The user wants to copy one or more .wav files from one schema to another.
The user selects the audio files to copy without uploading them again.
The problem:
Each audiofile is located in its own directory.
https://mywebsite.nl/media/audiofiles/schemaGUID/recording.wav
So when copying to another schema, the file or files need to be re-uploaded to the directory of the other schema.
The code I'm using to upload the selected audio files is this:
newwavfiles.forEach(function (item) {
  var name = item["name"];
  var filename = item["filename"];
  var fileblob = fetch('http://mywebsite.nl/media/audiofiles/12345677/recording.wav').then(res => res.blob());
  var formData = new FormData();
  formData.append('file', fileblob, filename);
  formData.append('guid', guid);
  // upload file to server
  $.ajax({
    url: 'index.php?action=uploadAudiofile',
    type: 'post',
    data: formData,
    enctype: 'multipart/form-data',
    contentType: false,
    processData: false,
    success: function (response) {
    },
    error: function (xhr, status, error) {
      console.log(xhr);
      var errorMessage = xhr.status + ': ' + xhr.statusText
    }
  });
});
To get the blob of the file I tried this:
var fileblob = fetch('http://mywebsite.nl/media/audiofiles/12345677/recording.wav').then(res => res.blob());
But then I get this error:
Failed to execute 'append' on 'FormData': parameter 2 is not of type 'Blob'.
Does anyone have a solution to get the .wav file blob from a URL?
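A sketch of what I think the fix might look like, since fetch() returns a Promise rather than a Blob, would be to await the blob before appending it (same variables and upload endpoint as above):
// Sketch: resolve the fetch promise so an actual Blob is appended to FormData.
newwavfiles.forEach(async function (item) {
  var filename = item["filename"];
  var res = await fetch('http://mywebsite.nl/media/audiofiles/12345677/recording.wav');
  var fileblob = await res.blob(); // now a real Blob, not a Promise
  var formData = new FormData();
  formData.append('file', fileblob, filename);
  formData.append('guid', guid);
  $.ajax({
    url: 'index.php?action=uploadAudiofile',
    type: 'post',
    data: formData,
    contentType: false,
    processData: false
  });
});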
In an Express app, I have a simple image uploader which was working perfectly using Ajax PUT requests. But now I need to crop the images before sending them to the server.
I plugged in Croppie to help with this. On form submit I get the cropped image blobs from Croppie, convert them to files, append the files to the Ajax FormData object, and send the FormData off to Express.
But the server isn't receiving any files, and examining the PUT request in Chrome's dev tools shows that no file info is being sent.
I've console.logged the blobs and the files, and as far as I can tell they are being created properly.
Getting the blobs and converting them to files:
var fileList = []
if($croppieContainers.length > 0){
$.each($croppieContainers, function(i){
$(this).croppie('result', {
type: 'blob', // Get the blob from the croppie container
}).then(function(blob) {
let file = new File([blob], "image-" + i, {
type: blob.type,
lastModifiedDate: new Date(); // Make a file from the blob...
});
fileList.push(file) // And push it to fileList
});
});
}
Appending the files to ajaxData:
if(fileList.length > 0) {
$.each('fileList', function(file){
ajaxData.append($input.attr('name'), file);
})
}
Posting to Express:
$.ajax({
url: $form.attr('action'),
type: $form.attr('method'),
data: ajaxData,
// dataType: 'json',
cache: false,
contentType: false,
processData: false,
complete: function() {
$form.removeClass('is-uploading');
},
success: function(data) {
// Handlers...
console.log(blob) returns:
Blob
size:288345
type: "image/png"
__proto__: Blob
And the file:
File
lastModified : 1478972364233
lastModifiedDate : Sat Nov 12 2016 17:39:24 GMT+0000 (GMT)
name : "image-0"
size : 288345
type : "image/png"
webkitRelativePath : ""
__proto__ : File
I've tried getting the file a different way, but it gives the same result:
if ($croppieContainers.length > 0) {
  $.each($croppieContainers, function(i) {
    let file = new File([$(this).croppie('result', { type: 'blob' })], "image-" + i)
    ajaxData.append($input.attr('name'), file)
  });
}
I'm not sure how to debug this, apart from console.logging at various stages and checking the XHR in dev tools...
Can anyone see what the problem is, or tell me how I can debug further?
Thank you!
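Edit: to clarify the direction I'm exploring, here is a sketch of one idea, since croppie('result') returns a promise and the files may not exist yet when the request is sent (same $croppieContainers, ajaxData and $input as above):
// Sketch: wait for every croppie 'result' promise to resolve before appending
// the resulting File objects and sending the request.
var filePromises = $croppieContainers.toArray().map(function(container, i) {
  return $(container).croppie('result', { type: 'blob' }).then(function(blob) {
    return new File([blob], "image-" + i, { type: blob.type });
  });
});

Promise.all(filePromises).then(function(files) {
  files.forEach(function(file) {
    ajaxData.append($input.attr('name'), file); // append real File objects, not promises
  });
  // ...then run the $.ajax call with ajaxData here
});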
I am using the API for blueimp's jQuery File Upload plugin to upload large and small files to S3. I am sending a PUT request to the server, which handles the S3 uploading. My code is below:
// Initialize the file upload so we don't get errors
angular.element(document.querySelector('#add-file-btn')).fileupload({
url: '/test'
});
// Set up options for file upload
angular.element(document.querySelector('#add-file-btn')).fileupload(
'option',
{
url: '/temp', // Temporary, replaced in handleAdd
type: 'PUT',
dataType: 'json',
dropZone: 'dropzone-area',
acceptFileTypes: /(\.|\/)(csv|jpe?g|png)$/i, // Only accept csv files (images enabled for testing)
progressInterval: 2000, // Check progress every 2 seconds
autoUpload: true,
paramName: 'files',
formData: {
fileSize: null
},
beforeSend: function(xhr, data) {
// Set request headers
xhr.setRequestHeader('Authorization', 'bearer ' + storageService.get('token'));
xhr.setRequestHeader('Accept', 'application/json ');
},
multipart: true,
add: handleAdd,
submit: handleSubmit,
progress: handleProgress ,
processdone: handleProcessDone,
processfail: handleProcessFail,
start: handleStart,
stop: handleStop,
done: handleDone
}
);
When I upload a small file (200 KB), it is uploaded successfully. When I upload a large file (2.66 GB) and log the progress in my handleProgress() function, the progress hangs for several minutes, then rapidly logs the rest; right after it reaches 100% I see that a 504 Gateway Timeout error has been returned.
function handleProgress(e, data) {
var progress = parseInt(data.loaded / data.total * 100, 10);
data.files[0].progress = progress;
console.log(progress)
}
The backend guy says his code works for uploading large files via the command line, so we're trying to figure out why it's not also working from the browser. Any ideas? I know the 504 error is supposed to be a server-side problem, but it's confusing that it works from the CLI...
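One thing we haven't tried yet is chunking the upload so that no single request runs long enough to hit the gateway timeout. blueimp's plugin has a maxChunkSize option for this; a sketch only, layered on top of the existing option setup, and assuming the /temp endpoint can reassemble partial uploads:
// Sketch: split large files into 10 MB chunks so each PUT stays well under
// the gateway's timeout. The server must support reassembling chunked uploads.
angular.element(document.querySelector('#add-file-btn')).fileupload(
  'option',
  {
    maxChunkSize: 10 * 1024 * 1024 // 10 MB per request
  }
);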
I am working on a simple Chrome extension that needs to upload files to the user's Dropbox folder. I am using the simple AJAX requests shown below to upload files. It works for files with extensions such as .txt, .json, .c, etc., i.e. files whose MIME type is text/plain or similar, but all other file types such as PDFs and image files get corrupted and end up with blank contents. What am I missing to upload the files correctly?
function startUpload()
{
var folderPath = $(this).closest('tr').attr('path')+'/';
var file = $("#upload_file")[0].files[0];
if (!file){
alert ("No file selected to upload.");
return false;
}
var reader = new FileReader();
reader.readAsText(file, "UTF-8");
reader.onload = function (evt) {
uploadFile(folderPath+file.name,evt.target.result,file.size,file.type);
}
}
//function to upload file to folder
function uploadFile(filepath,data,contentLength,contentType){
var url = "https://api-content.dropbox.com/1/files_put/auto"+filepath;
var headers = {
Authorization: 'Bearer ' + getAccessToken(),
contentLength: contentLength,
};
var args = {
url: url,
headers: headers,
crossDomain: true,
crossOrigin: true,
type: 'PUT',
contentType: contentType,
data : data,
dataType: 'json',
success: function(data)
{
getMetadata(filepath.substring(0,filepath.lastIndexOf('/')),createFolderViews);
},
error: function(jqXHR)
{
console.log(jqXHR);
}
};
$.ajax(args);
}
I believe the issue is reader.readAsText(file, "UTF-8"). If the file isn't a text file, this will misinterpret the contents. I think you want reader.readAsBinaryString or reader.readAsArrayBuffer. (I haven't tested it myself.)
EDIT
After testing this myself, I found that readAsArrayBuffer is what you need, but you also need to add processData: false as an option to $.ajax to prevent jQuery from trying to convert the data into form fields.
Also be sure to use dataType: 'json' to properly parse the response from the server.
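To make that concrete, here is a minimal sketch of just the changed pieces, keeping the same uploadFile signature and helpers from the question:
// Sketch: read the file as an ArrayBuffer so binary types (PDFs, images) survive intact.
var reader = new FileReader();
reader.readAsArrayBuffer(file);
reader.onload = function (evt) {
  uploadFile(folderPath + file.name, evt.target.result, file.size, file.type);
};

// ...and inside uploadFile, tell jQuery not to process the binary payload:
var args = {
  url: url,
  headers: headers,
  type: 'PUT',
  contentType: contentType,
  data: data,          // the ArrayBuffer from the reader
  processData: false,  // prevent jQuery from form-encoding the binary data
  dataType: 'json',    // parse the JSON response from the server
  success: function (data) {
    getMetadata(filepath.substring(0, filepath.lastIndexOf('/')), createFolderViews);
  }
};
$.ajax(args);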