Convert base64 to png in meteor app - javascript

I have a Meteor application in which I receive a base64 image. I want to save the image on a DigitalOcean instance, so I would like to convert it to a PNG (or another image format) and send it to the server to get a URL for the image.
But I haven't found a Meteor package that does this.
Do you know how I can do that?

I was running into a similar issue.
Run the following:
meteor npm install --save file-api
This will allow the following code on the server, for example:
import FileAPI from 'file-api';
const { File } = FileAPI;

const getFile = function(name, image) {
  // Skip everything up to and including the 'base64,' marker, then decode the rest
  const i = image.indexOf('base64,');
  const buffer = Buffer.from(image.slice(i + 7), 'base64');
  const file = new File({ buffer: buffer, name, type: 'image/jpeg' });
  return file;
};
Simply call it with any name of file you prefer, and the base64 string as the image parameter.
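For example, a hypothetical call on the server might look like this (the file name and the base64Image value are illustrative):
const base64Image = 'data:image/jpeg;base64,/9j/4AAQSkZJRg...'; // any base64 data-URL string
const file = getFile('avatar.jpg', base64Image);
console.log(file.name); // 'avatar.jpg'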
I hope this helps. I have tested this and it works on the server. I have not tested it on the client but I don't see why it wouldn't work.

I solved my problem using fs.writeFile from Node's fs (File System) module.
This is my JavaScript code on the client side. I get a base64 image (img) from a plugin, and when I click my save button I do this:
$("#saveImage").click(function() {
var img = $image.cropper("getDataURL")
preview.setAttribute('src', img);
insertionImage(img);
});
var insertionImage = function(img){
//some things...
Meteor.call('saveTileImage', img);
//some things...
}
And on the server side, I have:
Meteor.methods({
  saveTileImage: function(fileData) {
    var fs = Npm.require('fs');
    var path = process.env.PWD + '/var/uploads/';
    // Strip the data-URL prefix and decode the remaining base64 payload
    var base64Data = fileData.replace(/^data:image\/png;base64,/, "");
    var binaryData = Buffer.from(base64Data, 'base64');
    var imageName = "tileImg_" + currentTileId + ".png";
    fs.writeFile(path + imageName, binaryData, Meteor.bindEnvironment(function (err) {
      if (err) {
        throw (new Meteor.Error(500, 'Failed to save file.', err));
      } else {
        insertionTileImage(imageName);
      }
    }));
  }
});
var insertionTileImage = function(fileName) {
  tiles.update({ _id: currentTileId }, { $set: { image: "upload/" + fileName } });
};
So, the Meteor method saveTileImage transforms the base64 image into a PNG file on the server, and insertionTileImage stores its path in the collection.

Would a blob URL be a better option for you?
Save the images to the server however you like (base64 or otherwise), and when you are viewing an image on a page, generate a blob URL for it. The URL only exists while that page uses it, which prevents others from hotlinking your URL on other websites and keeps your image server from being overloaded...
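For example, a minimal client-side sketch (assuming the server returns the image as a plain base64 string; the endpoint and element IDs are illustrative):
fetch('/api/tile-image')
  .then(res => res.text())
  .then(base64 => {
    // Decode the base64 payload into bytes and wrap them in a Blob
    const bytes = atob(base64);
    const arr = new Uint8Array(bytes.length);
    for (let i = 0; i < bytes.length; i++) {
      arr[i] = bytes.charCodeAt(i);
    }
    const blob = new Blob([arr], { type: 'image/png' });
    const blobUrl = URL.createObjectURL(blob); // e.g. blob:http://localhost/1234-...
    document.querySelector('#preview').src = blobUrl;
    // Call URL.revokeObjectURL(blobUrl) once the image is no longer needed
  });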

Related

Accepting different image formats and uploading to Firebase

I am currently trying to send images to Firebase via putString with the base64-encoded images. So far I am using FileSystem with Expo, and it produces a base64 string that I can paste into an online decoder and it shows the correct image, but when I try to upload it to Firebase Storage it doesn't accept it as an image, just as some data (I'm not sure). I assume the image is supposed to be a PNG as per the Expo documentation, even though the image is stored as a WebP on the simulator. (Expo on readAsStringAsync: "Read the entire contents of a file as a string. Binary will be returned in raw format, you will need to append data:image/png;base64, to use it as Base64.")
let uri = FileSystem.cacheDirectory + Date.now();
await FileSystem.makeDirectoryAsync(uri, { intermediates: true });
await FileSystem.copyAsync({ to: uri, from: image });
let base64Img = await FileSystem.readAsStringAsync(uri, { encoding: FileSystem.EncodingType.Base64 });

let storageRef = firebase.storage().ref();
let imageName = `${this.props.user.id}_ProfilePhoto_${Date.now()}_${i}`;
let imageRef = storageRef.child("Profile Photos/" + imageName);

try {
  let base64 = "data:image/png;base64," + base64Img;
  console.log("Base 64: " + base64);
  const snapshot = await imageRef.putString(base64, "data_url", {
    contentType: "image/png"
  });
  const remoteURL = await snapshot.ref.getDownloadURL();
} catch (err) {
  alert(err);
}

How to read remote image to a base64 data url

Actually, there are many answers to this question, but my problem is:
I want to generate a PDF dynamically with 5 external (URL) images. I'm using the PDFmake node module.
It supports only two ways, local files and base64 format, but I don't want to store images locally.
So my requirement is one function which takes a URL as a parameter and returns base64,
so that I can store it in a global variable and create PDFs.
Thanks in advance.
function urlToBase(URL) {
  return base64;
}

var img = urlToBase('https://unsplash.com/photos/MVx3Y17umaE');

var dd = {
  content: [
    {
      text: 'fjfajhal'
    },
    {
      image: img,
    }
  ]
};

var writeStream = fs.createWriteStream('myPdf.pdf');
var pdfDoc = printer.createPdfKitDocument(dd);
pdfDoc.pipe(writeStream);
pdfDoc.end();
I'm using the PDFmake module from npm.
The contents of the remote image can first be fetched with an HTTP request, for example using the ubiquitous request npm module. The image's string contents can then be transformed into a buffer and finally converted to a base64 string. To complete the transformation, add the proper data-URL prefix (for example data:image/png;base64,) to the beginning of the base64 string.
Here is a rough example for a PNG image:
const request = require('request-promise-native');

let pngDataUrlPrefix = 'data:image/png;base64,';
let imageUrl = 'https://www.google.com/images/branding/googlelogo/1x/googlelogo_color_272x92dp.png';

request({
  url: imageUrl,
  method: 'GET',
  encoding: null // This is important, or the image bytes will be decoded with the default string encoding
})
.then(result => {
  let imageBuffer = Buffer.from(result);
  let imageBase64 = imageBuffer.toString('base64');
  let imageDataUrl = pngDataUrlPrefix + imageBase64;
  console.log(imageDataUrl);
});
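If you need the result as a value to drop straight into PDFmake's dd object, as the question asks, one option is to wrap the same request in an async function. A rough sketch reusing the names above:
async function urlToBase(url) {
  // encoding: null keeps the response as a raw Buffer instead of a string
  const result = await request({ url: url, method: 'GET', encoding: null });
  return 'data:image/png;base64,' + Buffer.from(result).toString('base64');
}

// Usage (inside an async context):
// var img = await urlToBase('https://www.google.com/images/branding/googlelogo/1x/googlelogo_color_272x92dp.png');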

Save PNG image from Backend to Frontend to local Angular Project folder

I want to save a PNG image received from the Backend (a Java project) to a folder inside my Angular project. So far I can only save the image under the PC's Downloads/ folder, and you can see the file being downloaded. What I want is to silently save the image into my project folder (so that when I check the folder I see the new image stored).
Backend:
@GET
@Produces(MediaType.TEXT_PLAIN)
@Path("getImage")
public Response getImage() {
    File dir = new File(Utilities.IMAGE_DIRECTORY);
    File[] directoryListing = dir.listFiles();
    String encodedImages = null;
    // Get the first image stored in the Backend project folder
    try {
        if (directoryListing != null) {
            // Encode the image in Base64 and save it in a string
            encodedImages = Base64
                .getEncoder()
                .withoutPadding()
                .encodeToString(
                    Files.readAllBytes(directoryListing[0].toPath()));
        }
    } catch (IOException e) {
        e.printStackTrace();
        ...
    }
    // Send the base64 string to the Frontend
    return Response
        .status(Response.Status.OK)
        .entity(encodedImages)
        .build();
}
Frontend:
/* Extract Image */
getImage() {
  this.http
    .get(this.baseUrl + "getImage", { responseType: 'text' })
    .subscribe((res) => {
      console.log("I received the image: \n" + res);
      // Decode from base64 to PNG
      var decodedImage = atob(res);
      var blob = new Blob([decodedImage], { type: 'image/png' });
      // This method saves the image in Downloads/ and is not silent
      saveAs(blob, 'imageFileName.png');
    });
}
Since your user will not need the image itself, why download it? You don't need to download the image just to use it.
Only if your user needs to download the image should you go with a "Save File" approach.
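For example, instead of saving the file, the base64 response could be bound straight to an <img> tag. A minimal sketch, where imageSrc is an illustrative component property:
getImage() {
  this.http
    .get(this.baseUrl + "getImage", { responseType: 'text' })
    .subscribe((res) => {
      // The browser renders the data URL directly; nothing is written to disk
      this.imageSrc = 'data:image/png;base64,' + res;
    });
}

// Template: <img [src]="imageSrc">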

Upload a photo to Firebase Storage with Image URI

I am currently attempting to upload a photo to my Firebase app's storage in my Apache Cordova app. I currently get the photo's URI with the following code:
function getPhotoFromAlbum() {
  navigator.camera.getPicture(onPhotoURISuccess, onFail, {
    quality: 50,
    sourceType: navigator.camera.PictureSourceType.SAVEDPHOTOALBUM,
    destinationType: navigator.camera.DestinationType.FILE_URI
  });
}

function onPhotoURISuccess(imageURI) {
  var image = document.getElementById('image');
  image.style.display = 'block';
  image.src = imageURI;
  getFileEntry(imageURI);
}
And then I am attempting to convert the image into a file and push it to my Firebase storage with the following function:
function getFileEntry(imgUri) {
  window.resolveLocalFileSystemURL(imgUri, function success(fileEntry) {
    console.log("got file: " + fileEntry.fullPath);
    var filename = "test.jpg";
    var storageRef = firebase.storage().ref('/images/' + filename);
    var uploadTask = storageRef.put(fileEntry);
  }, function () {
    // If we don't get the FileEntry (which may happen when testing
    // on some emulators), copy to a new FileEntry.
    createNewFileEntry(imgUri);
  });
}
I have both the file and the camera Cordova plugins installed. The only error I get when I attempt to do this is:
Error in Success callbackId: File1733312835 : [object Object]
which is just an error message from cordova.js.
I also know I have my Firebase storage set up correctly, because I have tested it through an emulator by adding a file input and successfully uploading whatever file the user added to Firebase storage.
Is it possible to upload a file to Firebase storage with this method of converting an image to a file through its URI and then uploading it? If so, what is the correct way to do so, and what is wrong with the way I'm doing it?
I was able to accomplish uploading an image by using a data url. Below is my code:
var filename = "test.jpg";
var storageRef = firebase.storage().ref('/images/' + filename);
var message = 'data:image/jpg;base64,' + imageUri;
storageRef.putString(message, 'data_url').then(function (snapshot) {
console.log('Uploaded a data_url string!');
});
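Note that this relies on imageUri actually holding base64 data rather than a file path; with the Cordova camera plugin that means requesting DATA_URL instead of FILE_URI (the success-callback name below is illustrative):
navigator.camera.getPicture(onPhotoDataSuccess, onFail, {
  quality: 50,
  sourceType: navigator.camera.PictureSourceType.SAVEDPHOTOALBUM,
  destinationType: navigator.camera.DestinationType.DATA_URL // returns a base64-encoded string
});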
Is it possible to upload a file to Firebase storage using this method of converting an image to a file through its URI, and then uploading it? If so, what is the correct way to do so / what is wrong with the way i'm doing it?
Yes, it is possible to upload a file to Firebase through its URI. However, you have to do it the correct way.
1. You have to store the data in Firebase after the file reading operation is completed. You can use FileReader.onloadend for this.
2. You can then store it to Firebase as a data_url.
Here is the snippet for more clarity:
function getFileEntry(imgUri) {
  window.resolveLocalFileSystemURL(imgUri, function onSuccess(fileEntry) {
    fileEntry.file(function(file) {
      var reader = new FileReader();
      reader.onloadend = function() {
        var filename = "test.jpg";
        var storageRef = firebase.storage().ref('/images/' + filename);
        // reader.result is a data URL such as 'data:image/jpeg;base64,...'
        storageRef.putString(reader.result, 'data_url').then(function (snapshot) {
          console.log('Image is uploaded in base64 format...');
        });
      };
      reader.readAsDataURL(file);
    });
  },
  function onError(err) {
    console.log(err);
    createNewFileEntry(imgUri);
  });
}

Batch upload using CSV to Azure Storage

I've come across a problem when uploading a large CSV file to Azure Table Storage: it appears to stream the data so fast that it doesn't upload properly or throws a lot of timeout errors.
This is my current code:
var fs = require('fs');
var csv = require('csv');
var azure = require('azure');

var AZURE_STORAGE_ACCOUNT = "my storage account";
var AZURE_STORAGE_ACCESS_KEY = "my access key";
var tableService = azure.createTableService(AZURE_STORAGE_ACCOUNT, AZURE_STORAGE_ACCESS_KEY);

var count = 150000;
var uploadCount = 1;
var counterror = 1;

tableService.createTableIfNotExists('newallactorstable', function(error){
  if (!error) {
    console.log("Table created / located");
  } else {
    console.log("error");
  }
});

csv()
  .from.path(__dirname + '/actorsb-c.csv', {delimiter: '\t'})
  .transform(function(row){
    row.unshift(row.pop());
    return row;
  })
  .on('record', function(row, index){
    // Output plane carrier, arrival delay and departure delay
    //console.log('Actor:' + row[0]);
    var actorsUpload = {
      PartitionKey: 'actors',
      RowKey: count.toString(),
      Actors: row[0]
    };
    tableService.insertEntity('newallactorstable', actorsUpload, function(error){
      if (!error) {
        console.log("Added: " + uploadCount);
      } else {
        console.log(error);
      }
    });
    count++;
  })
  .on('close', function(count){
    console.log('Number of lines: ' + count);
  })
  .on('error', function(error){
    console.log(error.message);
  });
The CSV file is roughly 800 MB.
I know that to fix this I probably need to send the data in batches, but I have literally no idea how to do that.
I have no knowledge of the azure package or the CSV package, but I would suggest you upload the file using a stream. If you have the file saved to your drive, you can create a read stream from it and then use that stream to upload to Azure with createBlockBlobFromStream. That question redirects me here; I suggest you take a look at it, as it handles the encoding. The code there provides a way to convert the file to a base64 string, but I have the idea that this can be done more efficiently with Node. I will have to look into that, though.
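A rough sketch of that stream approach, assuming the azure-storage package's blob API (the container and blob names are illustrative, and the container is assumed to already exist):
var fs = require('fs');
var azure = require('azure-storage');

var blobService = azure.createBlobService(AZURE_STORAGE_ACCOUNT, AZURE_STORAGE_ACCESS_KEY);
var filePath = __dirname + '/actorsb-c.csv';
var stream = fs.createReadStream(filePath);
var streamLength = fs.statSync(filePath).size;

// Upload the whole CSV as a single block blob; the SDK chunks the stream internally
blobService.createBlockBlobFromStream('csv-uploads', 'actorsb-c.csv', stream, streamLength, function(error) {
  if (!error) {
    console.log('CSV uploaded to blob storage');
  } else {
    console.log(error);
  }
});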
Hmm, what I would suggest is to upload your file to blob storage and keep a reference to the blob URI in your table storage. The block blob option gives you an easy way to batch upload.
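For the table-side batching the question asks about, the newer azure-storage package also exposes a TableBatch (at most 100 operations per batch, all with the same PartitionKey). A rough sketch, not tied to the exact code above, where rows stands in for the parsed CSV rows:
var azure = require('azure-storage');
var entGen = azure.TableUtilities.entityGenerator;

var tableService = azure.createTableService(AZURE_STORAGE_ACCOUNT, AZURE_STORAGE_ACCESS_KEY);
var batch = new azure.TableBatch();
var opsInBatch = 0;

rows.forEach(function(row, i) {
  batch.insertEntity({
    PartitionKey: entGen.String('actors'),
    RowKey: entGen.String(String(150000 + i)),
    Actors: entGen.String(row[0])
  }, { echoContent: false });
  opsInBatch++;

  // A batch may hold at most 100 operations, so flush it when full
  if (opsInBatch === 100) {
    tableService.executeBatch('newallactorstable', batch, function(error) {
      if (error) { console.log(error); }
    });
    batch = new azure.TableBatch();
    opsInBatch = 0;
  }
});

// Flush whatever is left after the loop
if (opsInBatch > 0) {
  tableService.executeBatch('newallactorstable', batch, function(error) {
    if (error) { console.log(error); }
  });
}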
