So, this is my problem: I am trying to save all the images to the local drive from JavaScript, and the code that does that looks like this:
window.onload = function () {
  console.log("IN ON LOAD FUNCTION");
  for (let filename = 1; filename <= 80; filename++) {
    console.log(filename);
    html2canvas(document.querySelector(`.lead${filename}`)).then(canvas => {
      canvas.toBlob(function (blob) {
        saveAs(blob, `${filename}.png`);
      });
    });
  }
};
So the code is quite clear: I have about 80 divs with different class names, I render each one to a canvas with html2canvas, and download that canvas in PNG format. The problem is that sometimes it downloads about 10 images and sometimes it doesn't download at all. Any help regarding this issue? Any other way around it?
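One likely cause is that the loop fires all 80 renders and downloads at once, and browsers throttle or block rapid successive downloads. A sketch of a sequential version, assuming html2canvas and saveAs (FileSaver.js) are loaded as in the snippet above:

```javascript
// Promise wrapper around the callback-based canvas.toBlob API
function canvasToBlob(canvas) {
  return new Promise(resolve => canvas.toBlob(resolve));
}

// Download the 80 canvases one at a time instead of all at once,
// so each saveAs call finishes before the next render starts
async function downloadAll() {
  for (let i = 1; i <= 80; i++) {
    const canvas = await html2canvas(document.querySelector(`.lead${i}`));
    saveAs(await canvasToBlob(canvas), `${i}.png`);
  }
}

// run after the page has rendered:
// window.onload = downloadAll;
```

Even serialized, some browsers prompt before allowing many downloads from one page; zipping the images into a single file is the other common workaround.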
Holla, fellow developers!
I made this e-commerce app using the MERN stack, where I create a product as an admin, upload an image, save it in the database as binary data and then display it on the SHOP page using the following code:
src={`data:image/jpg;base64,${img.data}`}
And although this works perfectly in my PC browser and all the images are displayed accordingly, when I open the app on my mobile device some of the images are displayed and some are not, which makes no sense, since they are all JPEGs of roughly the same size.
Is there some kind of rule about using base64-encoded images on mobile devices, or do they not work at all? And if there's a way to fix it, how can I do it?
Also, what are the best and most reliable practices for saving images in the database and displaying them when working with the MERN stack?
Thanks in advance!
Technically this should not be the case. I have something similar and it is working perfectly fine:
I upload an image,
convert it to base64, &
make an API call to upload it.
Then I make one more API call to fetch it and use it as the image src.
So here is my base64 conversion.
// base64 conversion
onChange = event => {
  const selectedFile = event.target.files[0];
  // object URL for an instant local preview
  this.setState({
    file: URL.createObjectURL(selectedFile)
  });
  // file size in KB
  const projectFileSize = Math.round(selectedFile.size / 1024);
  this.setState({
    fileSize: projectFileSize
  });
  const reader = new FileReader();
  reader.readAsDataURL(selectedFile);
  reader.onload = () => {
    // strip the "data:<mime>;base64," prefix, keeping only the payload
    const base64File = reader.result.replace(/^data:[a-z]+\/[a-z\-]+;base64,/, "");
    this.setState({
      convertedFile: base64File
    });
  };
  reader.onerror = function (error) {
    console.log('Error: ', error);
  };
}
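On the display side, the fetched payload goes back into a data: URL. A small sketch (toDataUrl is a hypothetical helper, not part of any library); note the MIME type "image/jpeg", since some mobile browsers are stricter than desktop ones about the non-standard "image/jpg" variant used in the question:

```javascript
// build a data: URL from a raw base64 payload;
// "image/jpeg" is the registered MIME type for JPEG
function toDataUrl(base64, mime = 'image/jpeg') {
  return `data:${mime};base64,${base64}`;
}

// In JSX, hypothetically:
// <img src={toDataUrl(this.state.convertedFile)} alt="product" />
```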
I'm making a website on which I want to offer the user a download of the whole website (CSS and images included) for them to modify. I know I can download an individual resource with
<a href="..." download>Click Me</a>
but like I said, this only downloads one file, whereas I would like to download the entire website.
If it helps you visualise what I mean: in Chrome, IE and Firefox you can press Ctrl+S to download the entire website (make sure you save it as "Web page, Complete").
Edit: I know I can create a .zip file for it to download, however doing so requires me to update it every time I make a change, which is something I'd rather not do, as I could potentially be making a lot of changes.
As I mentioned, it is better to have a cron job or something similar that creates a zip file of all the desired static content once in a while.
If you insist on doing it in JavaScript on the client side, have a look at JSZip.
You still have to find a way to get the list of static files from the server to save.
For instance, you can create a txt file in which each line is a link to a static file of the website.
You will have to iterate over this file and use $.get to fetch each file's content.
Something like this:
// Get list of files to save (either by GET request or hardcoded);
// each entry is "<name in zip> <url>"
var filesList = ["f1.json /echo/jsonp?name=1", "inner/f2.json /echo/jsonp?name=2"];

function createZip() {
  var zip = new JSZip();
  // make a bunch of requests to get the files' content
  var requests = [];
  // for scoping the fileName inside each .then callback
  var _then = (fname) => data => ({ fileName: fname, data });
  for (var file of filesList) {
    var [fileName, fileUrl] = file.split(" ");
    requests.push($.get(fileUrl).then(_then(fileName)));
  }
  // When all requests have finished
  $.when(...requests).then(function () {
    // Add each result to the zip
    for (var arg of arguments) {
      zip.file(arg.fileName, JSON.stringify(arg.data));
    }
    // Save
    zip.generateAsync({ type: "blob" })
      .then(function (blob) {
        saveAs(blob, "site.zip");
      });
  });
}

$("#saver").click(() => {
  createZip();
});
Personally, I don't like this approach. But do as you prefer.
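The same idea works without jQuery using fetch and Promise.all; a sketch, assuming the same "name url" filesList format and that JSZip and saveAs are loaded as above:

```javascript
// split "name url" entries into { fileName, fileUrl } objects
function parseFileList(filesList) {
  return filesList.map(entry => {
    const [fileName, fileUrl] = entry.split(' ');
    return { fileName, fileUrl };
  });
}

// fetch every file in parallel, add each to the archive, then save it
async function createZipWithFetch(filesList) {
  const zip = new JSZip();
  await Promise.all(parseFileList(filesList).map(async ({ fileName, fileUrl }) => {
    const response = await fetch(fileUrl);
    zip.file(fileName, await response.blob());
  }));
  saveAs(await zip.generateAsync({ type: 'blob' }), 'site.zip');
}
```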
I am making a node.js app that will be primarily used to download images from the web. I have created this function that successfully downloads images. I want the function to also show a live preview of the image as it is downloading, without impacting download speed. Is it possible to "tap the pipe" and draw the image as it downloads to an HTML canvas or img element? I am using Electron, so I am looking for a Chromium/node.js based solution.
Edit:
I've also found out you can chain pipes (r.pipe(file).pipe(canvas);) but I'm not sure if that would download the file first and then show up on the canvas or if it would update them both as the file downloads.
I've also thought of creating two separate pipes from the request (var r = request(url); r.pipe(file); r.pipe(canvas);), but I'm not sure if that would try to download the image twice.
I'm also not particularly familiar with the html canvas and haven't been able to test these ideas because I don't know how to pipe a image to a canvas or an img element for display in the application.
const fs = require('fs-extra');
const request = require('request');

function downloadFile(url, filename) {
  return new Promise(resolve => {
    var path = filename.substring(0, filename.lastIndexOf('\\'));
    if (!fs.existsSync(path)) fs.ensureDirSync(path);
    var file = fs.createWriteStream(filename);
    var r = request(url).pipe(file);
    // How would I also pipe this to a canvas or img element?
    r.on('error', function (err) { console.log(err); throw new Error('Error downloading file'); });
    r.on('finish', function () { file.close(); resolve('done'); });
  });
}
I ended up asking another similar question that provides the answer to this one. It turns out the solution is to pipe the image into memory, encode it with base64 and display it using a data:image url.
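A sketch of that approach: buffer the chunks in memory as they arrive and refresh the img from a data: URL on each chunk. Here imgElement is a hypothetical img element in the Electron renderer, and the stream is the request(url) readable from the question; whether a partially downloaded JPEG renders mid-stream depends on the browser's decoder.

```javascript
// Update imgElement.src with a data: URL built from everything
// received so far, every time a new chunk arrives
function previewWhileDownloading(stream, imgElement, mime = 'image/jpeg') {
  const chunks = [];
  stream.on('data', chunk => {
    chunks.push(chunk);
    imgElement.src = `data:${mime};base64,${Buffer.concat(chunks).toString('base64')}`;
  });
  return stream; // still pipeable to the file WriteStream
}
```

Re-encoding the whole buffer on every chunk is O(n²) in the image size, so for large files you may want to update only every N chunks.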
This question already has answers here:
filereader api on big files
(2 answers)
Closed 5 years ago.
I have always fully loaded binary files and then gone on to scan the content blocks within. This was fine for anything up to, say, 500 KB. I'm now looking at scanning files that are 1 GB or larger. (Client side)
Fully loading files greater than 5 MB (and as large as 1 GB) is not great, and I would like to move to a process where I can grab blocks of the file and process them as it loads. The file format is made up of blocks that each carry their size, so I would be able to grab the header and the first block and start parsing while the file loads.
If anyone knows where I can find some good examples of code like this working, or useful texts I can read, I would be very grateful.
My current code for loading is as follows: a jQuery change handler on an input box calls another function to load the file into memory, which then processes it. scanMyFile(buffer) is the function the ArrayBuffer is then sent to, which does the work of identifying everything in the file.
$("#myfile").change(function (e) {
  try {
    readMyFile(e);
  } catch (error) {
    alert("Found this error : " + error);
  }
});

function readMyFile(evt) {
  var f = evt.target.files[0];
  if (f) {
    var r = new FileReader();
    r.onload = function (e) {
      var buffer = r.result;
      scanMyFile(buffer);
    };
    r.readAsArrayBuffer(f);
  } else {
    alert("Failed to load file");
  }
}
A File object is also an instance of Blob, which offers the .slice method to create a smaller view of the file.
Might want to check out this question:
filereader api on big files
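A sketch of that slicing approach, reading the file in fixed-size chunks so the whole 1 GB never sits in memory at once; processChunk stands in for whatever block parser scanMyFile would become (chunkRanges, readChunk and scanInChunks are hypothetical helper names):

```javascript
// compute the [start, end) byte range of each chunk
function chunkRanges(totalSize, chunkSize) {
  const ranges = [];
  for (let offset = 0; offset < totalSize; offset += chunkSize) {
    ranges.push([offset, Math.min(offset + chunkSize, totalSize)]);
  }
  return ranges;
}

// promise wrapper around FileReader for one slice of the File
function readChunk(file, start, end) {
  return new Promise((resolve, reject) => {
    const r = new FileReader();
    r.onload = () => resolve(r.result);
    r.onerror = () => reject(r.error);
    r.readAsArrayBuffer(file.slice(start, end));
  });
}

// read and process the file chunk by chunk, in order
async function scanInChunks(file, chunkSize, processChunk) {
  for (const [start, end] of chunkRanges(file.size, chunkSize)) {
    processChunk(await readChunk(file, start, end), start);
  }
}
```

Since the format's blocks carry their own size, a refinement would be to read just the header first, then slice exactly one block at a time instead of fixed-size chunks.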
I'm kinda new to programming in general. My problem is that I want to download a file and then do something after it finishes.
Danbooru.search(tag, function (err, data) { // search for a random image with the given tag
  data.random() // select a random image with the given tag
    .getLarge() // get a link to the image
    .pipe(require('fs').createWriteStream('random.jpg')); // download the image
});
Now I want to do a console.log after the file has been downloaded. I don't want to work with setTimeout, since each file will take a different amount of time to download.
Thanks for the help.
See if this works for you: save the stream returned by .pipe() to a variable and listen for the finish event on it.
Danbooru.search(tag, function (err, data) {
  var stream = data.random()
    .getLarge()
    .pipe(require('fs').createWriteStream('random.jpg'));
  stream.on('finish', function () { console.log('file downloaded'); });
});