I want to build a multiple-image upload system with a progress bar, using simple code (jQuery or plain JS). When the user selects images in the browser, I want to show those images on the page, and then an upload button should send them via AJAX to a folder on the server.
So my questions are:
1.) Is it possible to show the selected images (without any complicated code)?
2.) Do I get a variable or array where the selected images are stored as base64 (data:/img:dfd5d/d54fs... something like this) or otherwise encoded?
3.) How do I add a progress bar?
I haven't written any code yet because I don't know how to start; I am new to computer science.
But I found this code on this site:
function previewFile() {
  var preview = document.querySelector('img');
  var file = document.querySelector('input[type=file]').files[0];
  var reader = new FileReader();

  reader.onloadend = function () {
    // reader.result is a data URL (base64) that can be used directly as an image source
    preview.src = reader.result;
  };

  if (file) {
    reader.readAsDataURL(file);
  } else {
    preview.src = "";
  }
}
This code is easy and I understand it, but one thing is not clear: what does the line var reader = new FileReader(); mean? Why use new, and what is it?
Thanks in advance. Please don't explain it in a complicated way; I am not very good at English, so please explain in simple words if possible.
Assuming that you have this field:
<input type="file" onchange="showImage(this)"/>
you can create a script that reads the file data and shows it:
function showImage(input) {
  var reader = new FileReader();

  // validating...
  var fileType = input.files[0].type;
  var fileSize = input.files[0].size;

  // file type (this validates the MIME type; only png, jpeg and gif are allowed)
  var fileTypes = ["image/png", "image/jpeg", "image/gif"];
  if (fileTypes.indexOf(fileType) < 0) {
    // return error, invalid mimetype
    return false;
  }

  // file cannot be more than 500 KB
  if (fileSize > 500 * 1024) {
    // return error, image too big
    return false;
  }

  reader.onload = function (e) {
    // e.target.result contains the image as a data URL
    jQuery('#myimagetopreview').attr('src', e.target.result);
  };

  reader.readAsDataURL(input.files[0]);
}
This should work; if you have problems, tell me.
Edit: FileReader is not supported by all browsers; check the documentation for more: https://developer.mozilla.org/en/docs/Web/API/FileReader
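For the progress-bar part of the original question, here is a minimal sketch using XMLHttpRequest and FormData. The upload.php endpoint and the #files / #progress element ids are placeholders you would adapt to your own page, not something taken from this thread.

function uploadImages() {
  var input = document.querySelector('#files');       // <input type="file" id="files" multiple>
  var progress = document.querySelector('#progress'); // <progress id="progress" value="0" max="100"></progress>
  var formData = new FormData();

  // append every selected file to the form data
  for (var i = 0; i < input.files.length; i++) {
    formData.append('images[]', input.files[i]);
  }

  var xhr = new XMLHttpRequest();
  xhr.open('POST', 'upload.php'); // placeholder: your server-side upload script

  // update the progress bar while the upload is running
  xhr.upload.onprogress = function (e) {
    if (e.lengthComputable) {
      progress.value = (e.loaded / e.total) * 100;
    }
  };

  xhr.onload = function () {
    console.log('Upload finished with status ' + xhr.status);
  };

  xhr.send(formData);
}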
FileReader is part of the W3C File API, which still has "Working Draft" status and is not part of the core JavaScript language. I think you either have to wait until browsers support this new API or enable experimental APIs in your browser.
I'm using the following approach in order to preview images before uploading them:
$("#file").change(function() {
var reader = new FileReader();
reader.readAsArrayBuffer(this.files[0]);
var fileName = this.files[0].name;
var fileType = this.files[0].type;
alert(fileType)
reader.onloadend = function() {
var base64Image = btoa(String.fromCharCode.apply(null, new Uint8Array(this.result)));
// I show the image now and convert the data to base 64
}
}
I have noticed that when the image is large, the method fails and I cannot preview the image.
I am unsure if the problem is due to base64 conversion or the FileReader.
Is there any setting to increase the max size, or is there any workaround?
Here is the error message thrown in the console:
Uncaught RangeError: Maximum call stack size exceeded
at FileReader.reader.onloadend
Your problem is that you use Function.prototype.apply, which converts your typed array's items into individual arguments to String.fromCharCode.
Functions have a limit on the number of arguments they accept.
To avoid this when dealing with large files, the best approach is to not process the file at all.
If you need to send the file to your server, simply send the Blob directly; this is easily done with the FormData API.
If you need to display the file, e.g. in an HTML media element, use the URL.createObjectURL(yourFile) method.
And if you really need a data URI version of the file, then use the reader.readAsDataURL(yourFile) method.
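A minimal sketch of those three options; the /upload endpoint and the #file / #preview element ids are placeholders, not something from the original answer.

var fileInput = document.querySelector('#file');
var file = fileInput.files[0];

// 1) Send the Blob directly to the server with FormData (no reading, no base64)
var formData = new FormData();
formData.append('image', file);

var xhr = new XMLHttpRequest();
xhr.open('POST', '/upload'); // placeholder endpoint
xhr.send(formData);

// 2) Display the file in an <img> without loading it all into a string
var img = document.querySelector('#preview');
img.src = URL.createObjectURL(file);
img.onload = function () {
  URL.revokeObjectURL(img.src); // free the object URL once the image has loaded
};

// 3) Only if a data URI is really needed
var reader = new FileReader();
reader.onload = function (e) {
  console.log(e.target.result); // "data:image/...;base64,...."
};
reader.readAsDataURL(file);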
Works for me:
var reader = new FileReader();

reader.onload = function (evt) {
  // build the binary string one byte at a time, which avoids the
  // argument-length limit of String.fromCharCode.apply
  var binary = '';
  var bytes = new Uint8Array(reader.result);
  var len = bytes.byteLength;
  for (var i = 0; i < len; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  console.log(btoa(binary));
};

reader.readAsArrayBuffer(file);
If you read the file with FileReader, the whole file is loaded into memory. If you try to handle large files this way, the browser tab can simply crash. If you really want to pass the file around as a base64 string, I recommend adding a file-size constraint to prevent problems. In short, none of the FileReader methods are suitable for this purpose unless you are dealing with small files, say no larger than 100 MB or so; otherwise you will run into problems.
After playing around, here's the solution:
$("#file").change(function () {
  var reader = new FileReader();
  reader.readAsBinaryString(this.files[0]);

  var fileName = this.files[0].name;
  var fileType = this.files[0].type;
  alert(fileType);

  reader.onloadend = function () {
    // this.result is a binary string, so btoa() can encode it directly
    var base64Image = btoa(this.result);
  };
});
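As a side note, not part of the original answer: readAsDataURL produces a base64 data URI directly and avoids the manual btoa() step. A small sketch:

var reader = new FileReader();
reader.onloadend = function () {
  // reader.result looks like "data:image/png;base64,...."
  var base64Image = reader.result.split(',')[1]; // strip the "data:...;base64," prefix if only the raw base64 is needed
};
reader.readAsDataURL(document.querySelector('#file').files[0]);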
For example, if I have an image and I download it, how does the computer know that all the bytes are supposed to be in that sequential order?
When I upload an image, is it possible to upload it in "chunks" so I can access/use whatever has been uploaded so far (e.g. only the top half of the image)? How would I access it?
The same goes for video, PDF, etc.
You can read the file in chunks and upload it in chunks.
See the example below.
<input type="file" name="filebrowsefileid">
var filePicker = document.getElementById('filebrowsefileid');
var file = filePicker.files[0];
var chunck = file.slice(a,b);
var reader = new FileReader();
reader.readAsBinaryString(chunck );
....
reader["onloadend"] = function () {
reader.result; // This is what you want
}
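A sketch of how this could be extended into a loop that uploads every chunk with FormData; the /upload-chunk endpoint, the field names and the 1 MB chunk size are assumptions, not something from the original answer.

function uploadInChunks(file) {
  var chunkSize = 1024 * 1024; // 1 MB per chunk (assumed)
  var totalChunks = Math.ceil(file.size / chunkSize);

  for (var i = 0; i < totalChunks; i++) {
    var start = i * chunkSize;
    var chunk = file.slice(start, start + chunkSize);

    var formData = new FormData();
    formData.append('chunk', chunk);
    formData.append('index', i);
    formData.append('total', totalChunks);
    formData.append('name', file.name);

    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/upload-chunk'); // placeholder endpoint; the server must reassemble the chunks
    xhr.send(formData);
  }
}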
After the user selects a zipped file, I want to remove the images folder from it before sending it over the network. I am using Kendo for uploading, and the existing functionality works fine; I just want to add the image-removal part. This is what I have so far:
function onSelect(e) {
  var file = e.files[0];
  if (endsWith(file.name, '.eds')) {
    var contents = e.target.result;
    var jszip = new JSZip(contents);
    jszip.remove("apldbio/sds/images_barcode");
    fileToSend = jszip.generate({type: "base64", compression: "DEFLATE"});
  }
  e.files[0] = fileToSend;
  openProgressDialog(e.files.length); //this is existing code, works fine
}
target.result doesn't seem to exist on the event e, and nothing works properly from that point on. e should probably be used inside a FileReader object's onload() (as seen here and here), but I have no idea how to use a FileReader for my purpose with Kendo Upload.
EDIT: I did some more reading and now I am using FileReader like this:
var reader = new FileReader();
reader.onload = function (e) {
  // do the jszip stuff here with e.target.result
};
reader.onerror = function (e) {
  console.error(e);
};
reader.readAsArrayBuffer(file);
Note: file = e.files[0], as in the first code block.
With this, though, I get the error:
Failed to execute 'readAsArrayBuffer' on 'FileReader': parameter 1 is not of type 'Blob'.
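A possible direction, which is an assumption on my part rather than something confirmed in this thread: the objects in Kendo Upload's e.files array are wrappers rather than native File objects, which is why FileReader rejects them. Recent Kendo versions expose the native file on a rawFile property; if your version has it, a sketch along these lines might work:

function onSelect(e) {
  var fileInfo = e.files[0];
  var rawFile = fileInfo.rawFile; // assumed property: the native File/Blob behind Kendo's wrapper

  var reader = new FileReader();
  reader.onload = function (ev) {
    var contents = ev.target.result;  // ArrayBuffer with the zip's bytes
    var jszip = new JSZip(contents);  // same JSZip calls as in the question
    jszip.remove("apldbio/sds/images_barcode");
    var fileToSend = jszip.generate({type: "base64", compression: "DEFLATE"});
    // ...hand fileToSend to the upload from here
  };
  reader.readAsArrayBuffer(rawFile);
}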
I got the original code from here: Using Javascript FileReader with huge files
But my purpose is different: the author only wants part of the file, while I want all of it.
I'm trying to modify it with a loop, mixed with this technique: slice large file into chunks and upload using ajax and html5 FileReader
Everything fails. Is there any way I can get what I want?
var getSource = function (file) {
  var reader = new FileReader();
  reader.onload = function (e) {
    if (e.target.readyState == FileReader.DONE) {
      process(e.target.result);
    }
  };
  // only the first 1 MB of the file is read here
  var part = file.slice(0, 1024 * 1024);
  reader.readAsBinaryString(part);
};

function process(data) {
  // data processes here
}
Thank you,
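A sketch of one way to read the whole file sequentially, chunk by chunk, starting each read only after the previous one has finished. The 1 MB chunk size and the process() callback mirror the code above; this is an illustration, not a tested answer.

function readInChunks(file, chunkSize, process) {
  var offset = 0;

  function readNext() {
    if (offset >= file.size) {
      return; // the whole file has been read
    }
    var reader = new FileReader();
    reader.onload = function (e) {
      process(e.target.result); // handle this chunk
      offset += chunkSize;
      readNext();               // then move on to the next one
    };
    reader.readAsBinaryString(file.slice(offset, offset + chunkSize));
  }

  readNext();
}

// usage, matching the original code's 1 MB slice:
// readInChunks(file, 1024 * 1024, process);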
I have an issue with JavaScript when rendering an image before upload with the correct rotation. It seems that when an image's correct rotation is stored only in its EXIF data, the browser doesn't use it when rendering the image.
Users see a different rotation between how the image looks on their system and how it is displayed on the website by JavaScript.
The code is very basic:
Do you know a simple way to correct this rotation bug?
LbEmeraude.handleImage = function (f) {
  if (f.type.match('image.*')) {
    var reader = new FileReader();
    reader.onload = (function (file) {
      return function (e) {
        var image = {};
        image.dataAsUrl = e.target.result;
        LbEmeraude.renderImage(image);
      };
    })(f);
    reader.readAsDataURL(f);
  }
};

LbEmeraude.renderImage = function (image) {
  var eImage = LbEmeraude.createImgElement(image.dataAsUrl);
  $('someElement').append(eImage);
};

LbEmeraude.createImgElement = function (src) {
  var image = document.createElement("img");
  image.src = src;
  return image;
};
Thanks for your attention.
What you are asking for is nothing new... check this out: https://bugzilla.mozilla.org/show_bug.cgi?id=298619
That sucker was opened in 2005 and has not been resolved yet. This article is old but really robust: http://www.daveperrett.com/articles/2012/07/28/exif-orientation-handling-is-a-ghetto/
But the key part is fairly far down, where he notes that browsers do not usually apply EXIF rotation to an image shown in an HTML img tag, but may honor it when the image is opened in its own tab.
So right now no browser will do it by default; the web apps that seem to handle it mostly read that value on the server and serve down different assets.
But it looks like there is hope if you want to hack it in: Accessing JPEG EXIF rotation data in JavaScript on the client side
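If you go that route, here is a rough sketch of the idea, assuming a library such as exif-js to read the Orientation tag (its EXIF.getData / EXIF.getTag calls are shown below); the canvas math only handles the common orientation values 3, 6 and 8.

function drawWithOrientation(img, canvas) {
  // img must already be loaded; exif-js reads the Orientation tag from it
  EXIF.getData(img, function () {
    var orientation = EXIF.getTag(this, 'Orientation') || 1;
    var ctx = canvas.getContext('2d');
    var w = img.width, h = img.height;

    if (orientation === 6 || orientation === 8) {
      canvas.width = h;  // 90-degree rotations swap width and height
      canvas.height = w;
    } else {
      canvas.width = w;
      canvas.height = h;
    }

    switch (orientation) {
      case 3: ctx.transform(-1, 0, 0, -1, w, h); break; // 180 degrees
      case 6: ctx.transform(0, 1, -1, 0, h, 0); break;  // 90 degrees clockwise
      case 8: ctx.transform(0, -1, 1, 0, 0, w); break;  // 90 degrees counter-clockwise
      default: break;                                   // 1 = already upright
    }

    ctx.drawImage(img, 0, 0);
  });
}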