TinyMCE 4.3 only uploading one image - javascript

TinyMCE is not allowing multiple file uploads in a post. You can select a file and it will be inserted into the TinyMCE editor; however, once you submit, only the last inserted image is uploaded.
Below is the code I am working with (the [0] is removed on the second attempt, where I was expecting TinyMCE to handle the files array):
if (meta.filetype == 'image') {
    $('#upload').trigger('click')
    $('#upload').on('change', function() {
        var file = this.files[0]
        var reader = new FileReader()
        var name = file.name.split('.')[0]
        var blobCache = tinymce.activeEditor.editorUpload.blobCache
        var blobInfo = blobCache.create(name, file, reader.result)
        blobCache.add(blobInfo);
        reader.onload = function(e) {
            callback(blobInfo.blobUri(), {
                alt: file.name,
                title: name
            })
        }
        reader.readAsDataURL(file)
    })
}
}
I have tried appending additional files with a for loop, removing the [0] from files and from split, without success.
file_picker_callback: function(callback, value, meta) {
    if (meta.filetype == 'image') {
        $('#upload').trigger('click')
        $('#upload').on('change', function() {
            var file = this.files //[0]
            var reader = new FileReader()
            var name = []
            for (var x = 0; x < file.length; x++) {
                name.push(file[x].name.split('.'))
            }
            var blobCache = tinymce.activeEditor.editorUpload.blobCache
            var blobInfo = blobCache.create(name, file, reader.result)
            blobCache.add(blobInfo);
            reader.onload = function(e) {
                callback(blobInfo.blobUri(), {alt: file.name, title: name})
            }
            reader.readAsDataURL(file)
        })
    }
}
I have also tried enabling automatic uploads, which only works for the first image; the rest fall back to base64 in the database. Lastly, I tried to combine all the files before uploading, but I don't see different file names in console.log. For instance, if I attach one file I see one file in console.log; if I attach another I see two entries in console.log, but both show the most recently attached file, and only the last attached image is uploaded on submit. It seems that TinyMCE overwrites the file with each image attachment.
Is there a different approach so that I can add images to a post with TinyMCE and, upon submit, have them all uploaded instead of only the last attached image?

I changed to the following for a working solution. Using the file name as the first argument to blobCache.create was the cause of the issue; a unique blob id is required instead.
file_picker_callback: function(callback, value, meta) {
    if (meta.filetype == 'image') {
        $('#upload').on('change', function() {
            var file = this.files[0]
            var reader = new FileReader()
            reader.onload = function(e) {
                var name = file.name.split('.')[0] // no longer used as the blob key, only for the title below
                // var base64 = reader.result.split(',')[1]; // for base64
                var id = 'blobid' + (new Date()).getTime(); // unique blob id replaces the name as the cache key
                var blobCache = tinymce.activeEditor.editorUpload.blobCache
                var blobInfo = blobCache.create(id, file, reader.result)
                blobCache.add(blobInfo);
                callback(blobInfo.blobUri(), {alt: file.name, title: name})
            }
            reader.readAsDataURL(file)
        })
        $('#upload').trigger('click')
    }
}
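For completeness, the blobs added to the blob cache still need an upload target so that submitting the post sends real files instead of base64. A minimal sketch of a TinyMCE 4.x setup, assuming a hypothetical /upload endpoint that returns JSON such as {"location": "/images/name.png"} and a form with id postForm:

tinymce.init({
    selector: 'textarea',
    images_upload_url: '/upload',        // assumed backend route
    automatic_uploads: false,            // keep blobs local until uploadImages() runs
    file_picker_types: 'image',
    file_picker_callback: function(callback, value, meta) { /* as above */ }
});

// Push every cached blob to the server before the form is submitted,
// then submit once TinyMCE has rewritten the blob URIs to real URLs.
$('#postForm').on('submit', function(e) {
    e.preventDefault();
    var form = this;
    tinymce.activeEditor.uploadImages(function() {
        form.submit();
    });
});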

Related

Cannot upload image in vue.js laravel

I'm trying to upload an image, which is in base64 format, using the code below:
upload: function(e) {
    const tmpFiles = e.target.files;
    if (tmpFiles.length === 0) {
        return false;
    }
    const file = tmpFiles[0];
    const self = this;
    const reader = new FileReader();
    reader.onload = function(e) {
        self.form.imageData.push(e.target.result);
    }
    reader.readAsDataURL(file); // presumably called in the original; without this the onload never fires
}
The issue is that if I access this.form.imageData outside the onload function I get null, and in my controller I'm not getting the image data either, but when I print it using self.form.imageData inside the onload function I do get the image data in base64 encoded format.
Any help is highly appreciated.
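readAsDataURL is asynchronous, so this.form.imageData is only populated after the load event fires; anything that reads it synchronously after calling upload will still see the old value. A minimal sketch, assuming the same form shape, that defers the rest of the work until the read completes (submitImage is a hypothetical method standing in for whatever posts the form):

upload: function(e) {
    const file = e.target.files[0];
    if (!file) {
        return;
    }
    const self = this;
    const reader = new FileReader();
    reader.onload = function(ev) {
        // The base64 data URL only exists here, after the read has finished.
        self.form.imageData.push(ev.target.result);
        self.submitImage(); // hypothetical: continue the upload from inside onload
    };
    reader.readAsDataURL(file);
}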

Getting an empty array, when uploading a file with vuejs and axios in laravel

I am trying to upload an image, following this link from Yubaraj, but upon submission the image file is an empty object.
According to Yubaraj, that object is actually what you should send to the server.
What he has done is hide a normal file input behind a text field and capture the data with an event.
Here is a snippet on the two events:
pickFile () {
    this.$refs.image.click()
},
onFilePicked (e) {
    const files = e.target.files
    if (files[0] !== undefined) {
        this.form.imageName = files[0].name
        if (this.form.imageName.lastIndexOf('.') <= 0) {
            return
        }
        const fr = new FileReader();
        fr.readAsDataURL(files[0]);
        fr.addEventListener('load', () => {
            this.form.imageUrl = fr.result;
            this.form.imageFile = files[0] // this is an image file that can be sent to server...
        })
    } else {
        this.form.imageName = '';
        this.form.imageFile = '';
        this.form.imageUrl = '';
    }
},
When I console.log the form.image.file and the file, this is what I get from the browser's console:
I am submitting using vform's post, which actually uses axios behind the scenes:
this.form.post(route('api.settings.branch.create'), this.form)
Here is an image of the dd() result from the network tab in the browser;
most of the text there is the data URL of the image.
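If the form is serialized to JSON before posting, the File object is dropped and arrives as an empty object. A sketch of posting the file with plain axios and a FormData payload instead (the route name is reused from the question; whether vform needs this depends on the vform version):

// Build a multipart payload so the File is sent as a real upload.
const fd = new FormData();
fd.append('imageName', this.form.imageName);
fd.append('image', this.form.imageFile); // the File captured in onFilePicked

axios.post(route('api.settings.branch.create'), fd, {
    headers: { 'Content-Type': 'multipart/form-data' }
}).then(response => {
    // handle the response
});

On the Laravel side the file would then be available via $request->file('image').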

JavaScript object to file object

I'm working on adding images to a page, doing something with the collection of added images (preview etc.), and finally saving them. Everything is fine until the files object is used to show or save the photo.
var input = document.getElementById('files');
var files = input.files;
As it is a read-only, array-like collection of objects, it is impossible to manipulate it freely. To work with that array more conveniently I mapped it like this:
var addedFiles = added(files);

function added(from) {
    return $.map(from, function (i) {
        var x = { lastModified: i.lastModified, lastModifiedDate: i.lastModifiedDate, name: i.name, size: i.size, type: i.type, webkitRelativePath: i.webkitRelativePath }
        return x;
    });
}
... then I do something with those files: I want to preview them and then save them, but during the preview, for example, I get an error:
Uncaught TypeError: Failed to execute 'readAsDataURL' on 'FileReader': parameter 1 is not of type 'Blob'.
function readImage(file) {
    var reader = new FileReader();
    reader.addEventListener("load", function () {
        var image = new Image();
        image.addEventListener("load", function () {
            preview.innerHTML += drawHtml(this, file);
            window.URL.revokeObjectURL(image.src); //blob version
        });
        image.src = reader.result; //file version
        image.src = window.URL.createObjectURL(file) //blob version
    });
    reader.readAsDataURL(file); // the error fires here
}
When I pass the original file object to the above code for testing, everything works.
Question:
How can I create a custom object (in my case an array of objects) that can be used as a file object?
P.S. In the project I'm using jQuery and JavaScript.
Rather than mapping the File objects to new, incompatible objects, you could instead wrap them with the additional things you need, but then use the underlying original files when reading them:
const fileSelections = Array.prototype.map.call(input.files, file => ({
    // This will let you get to the underlying file in the wrapper objects
    file,
    // If you want pass-throughs, you can do stuff like this:
    get lastModified() { return file.lastModified },
    // And you can add your own properties/methods as you please
}));
function readImage(fileSelection) {
    // Unwrap the file
    const file = fileSelection.file;
    const reader = new FileReader();
    reader.addEventListener("load", function () {
        const image = new Image();
        image.addEventListener("load", function () {
            preview.innerHTML += drawHtml(this, file);
            window.URL.revokeObjectURL(image.src); //blob version
        });
        image.src = reader.result; //file version
        image.src = window.URL.createObjectURL(file) //blob version
    });
    reader.readAsDataURL(file);
}
The correct answer is blob; it was something amazing for me.
// from is the array of file objects
function added(from) {
    var out = [];
    for (var i = 0; i < from.length; i++) {
        (function (obj) {
            var readerBase64 = new FileReader();
            readerBase64.addEventListener("load", function () {
                var fileBase64 = readerBase64.result;
                var row = { name: obj.name, size: obj.size, type: obj.type, base64: fileBase64 }
                out.push(row);
            });
            readerBase64.readAsDataURL(obj);
        })(from[i]);
    }
    return out;
}
'out' is an array of my own objects with base64 data, so I can create images for the preview and the 'do something' functions; in the end I'm going to use the base64 data to create files.
Here is a link to the question related to my next step: creating an image from a blob (where I'm using the additional library b64toBlob).
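For reference, turning a base64 data URL back into a Blob does not strictly require an extra library; a minimal sketch of the usual atob-based conversion (roughly what b64toBlob does), using a helper name of my own choosing:

// Convert a data URL (as produced by readAsDataURL) into a Blob.
function dataUrlToBlob(dataUrl) {
    var parts = dataUrl.split(',');
    var mime = parts[0].match(/:(.*?);/)[1]; // e.g. "image/png"
    var binary = atob(parts[1]);             // decode the base64 payload
    var bytes = new Uint8Array(binary.length);
    for (var i = 0; i < binary.length; i++) {
        bytes[i] = binary.charCodeAt(i);
    }
    return new Blob([bytes], { type: mime });
}

// Usage: turn a stored base64 row back into an object URL for an <img>.
// var blob = dataUrlToBlob(row.base64);
// someImg.src = window.URL.createObjectURL(blob);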

Why am I missing elements in my array from array.from()

I am currently building a large database from an imported XML document, about 13k items or so. When I do
window.onload = function() {
    var fileInput = document.getElementById('fileInput');
    var fileDisplayArea = document.getElementById('fileDisplayArea');
    fileInput.addEventListener('change', function(e) {
        var file = fileInput.files[0];
        var textType = /text.*/;
        if (file.type.match(textType)) {
            var reader = new FileReader();
            reader.onload = function(e) {
                let text = reader.result;
                let parser = new DOMParser();
                xmlDoc = parser.parseFromString(text, "application/xml");
                getReportList(xmlDoc);
            }
            reader.readAsText(file);
        } else {
            fileDisplayArea.innerText = "File not supported!";
        }
    });
}

function isGroupTag(arr) {
    return arr.nodeName === "Group";
}

function getReportList(xml) {
    // Creates an array containing all reports
    let reportList = Array.from(xmlDoc.children[0].children).filter(isGroupTag);
    allReports = reportList.map((currElement, index) => {
        return getReport(xml, index);
    });
}
I only get back about 2100 items. I checked my program using a much smaller XML document and everything loaded fine. Any ideas as to why I would be getting such a small return?
My first thought was that the XML document wasn't loaded completely yet, so when I did Array.from() it was building the array with what it had... not sure though. Any help would be great.
EDIT:
I just finished testing loading in a few more XML documents. Instead of one massive file I loaded in the yearly ones and everything shows up. Still not sure why I can't just do the massive one.
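One thing worth ruling out (an assumption about the cause, not a confirmed diagnosis): when DOMParser meets malformed markup partway through a large file it does not throw, it returns a document containing a parsererror node, and the resulting tree can be incomplete. A quick diagnostic sketch:

reader.onload = function(e) {
    let parser = new DOMParser();
    let doc = parser.parseFromString(reader.result, "application/xml");
    let errors = doc.getElementsByTagName("parsererror");
    if (errors.length > 0) {
        // Malformed XML is reported here rather than thrown.
        console.error("XML parse problem:", errors[0].textContent);
    } else {
        console.log("Top-level Group nodes:",
            Array.from(doc.children[0].children)
                .filter(n => n.nodeName === "Group").length);
    }
}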

Loop multiple files from input, save each file readAsDataURL data to array

I need your help with the following problem:
I have an HTML input which supports multiple files;
I upload, let's say, 5 files;
Each file is processed: it is read with readAsDataURL by FileReader and the file data is saved to an object (other params will be saved there too, which is why it is an object), which is pushed to an array.
After I run the flow I described, the length of the final array is NOT changed.
I believe the problem is in the async behaviour, but I cannot understand how I should change the code to make it work, which is why I ask for your help. Please find the code below:
var controls = document.getElementById('controls');

function processUploadedFilesData(files) {
    if (!files[0]) {
        return;
    }
    var uploads = [];
    for (var i = 0, length = files.length; i < length; i++) {
        (function(file) {
            var reader = new FileReader();
            // I need an object, as other params will be saved too in future;
            var newFile = {};
            reader.readAsDataURL(file);
            reader.onloadend = function(e) {
                newFile.data = e.target.result;
                uploads.push(newFile);
            }
        })(files[i]);
    }
    return uploads;
}

controls.addEventListener('change', function(e) {
    var uploadedFilesOfUser = processUploadedFilesData(e.target.files);
    alert(uploadedFilesOfUser.length);
});
Codepen example - https://codepen.io/yodeco/pen/xWevRy
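Since readAsDataURL is asynchronous, processUploadedFilesData returns before any onloadend handler has run, so the array is still empty when the alert fires. A minimal sketch, reusing the same element id, that wraps each read in a Promise and waits for all of them:

var controls = document.getElementById('controls');

function readFileAsDataURL(file) {
    return new Promise(function(resolve, reject) {
        var reader = new FileReader();
        reader.onload = function(e) {
            // Other params can be added to this object later, as in the question.
            resolve({ data: e.target.result, name: file.name });
        };
        reader.onerror = reject;
        reader.readAsDataURL(file);
    });
}

controls.addEventListener('change', function(e) {
    var files = Array.prototype.slice.call(e.target.files);
    Promise.all(files.map(readFileAsDataURL)).then(function(uploads) {
        // Only here is the array complete.
        alert(uploads.length);
    });
});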
