I have a NodeJS backend which uses the official Blob Storage library (@azure/storage-blob) from Microsoft to manage my Blob Storage:
https://www.npmjs.com/package/@azure/storage-blob
I need to move a blob from one folder to another, but unfortunately I can't find any documentation for that.
What I have done so far is:
const { BlobServiceClient } = require("@azure/storage-blob");
const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.storageconnection);
const containerClient = blobServiceClient.getContainerClient('import');
const blobClient = containerClient.getBlobClient('toImport/' + req.body.file);
const downloadBlockBlobResponse = await blobClient.download();
... do some stuff with the contents of the file
As you can see in the code, I read a file from the folder "toImport". After that I want to move the file to another folder, "finished". Is that possible? Maybe I need to create a copy of the file and delete the old one?
A move operation as such is not supported in Azure Blob Storage. What you have to do is copy the blob from source to destination, monitor the copy progress (because the copy operation is asynchronous), and delete the source blob once the copy is complete.
For copying, the method you would want to use is beginCopyFromURL(string, BlobBeginCopyFromURLOptions).
Please see this code:
const { BlobServiceClient } = require("@azure/storage-blob");
const connectionString = "DefaultEndpointsProtocol=https;AccountName=account-name;AccountKey=account-key;EndpointSuffix=core.windows.net";
const container = "container-name";
const sourceFolder = "source";
const targetFolder = "target";
const blobName = "blob.png";
async function moveBlob() {
  const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
  const containerClient = blobServiceClient.getContainerClient(container);
  const sourceBlobClient = containerClient.getBlobClient(`${sourceFolder}/${blobName}`);
  const targetBlobClient = containerClient.getBlobClient(`${targetFolder}/${blobName}`);
  console.log('Copying source blob to target blob...');
  const copyPoller = await targetBlobClient.beginCopyFromURL(sourceBlobClient.url);
  console.log('Blob copy operation started successfully...');
  console.log(copyPoller);
  do {
    console.log('Checking copy status...');
    const blobCopiedSuccessfully = await checkIfBlobCopiedSuccessfully(targetBlobClient);
    if (blobCopiedSuccessfully) {
      break;
    }
    // Wait a moment between polls so we don't hammer the service
    await new Promise((resolve) => setTimeout(resolve, 1000));
  } while (true);
  console.log('Now deleting source blob...');
  await sourceBlobClient.delete();
  console.log('Source blob deleted successfully.');
  console.log('Move operation complete.');
}

async function checkIfBlobCopiedSuccessfully(targetBlobClient) {
  const blobPropertiesResult = await targetBlobClient.getProperties();
  return blobPropertiesResult.copyStatus === 'success';
}

moveBlob();
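As a side note, the object returned by beginCopyFromURL is a poller, so the SDK can do the status checking itself via pollUntilDone(). A sketch assuming the same sourceBlobClient/targetBlobClient as above (just the shape of the call, not a tested drop-in):

```javascript
// Sketch: let the SDK poll for copy completion instead of a manual loop.
// sourceBlobClient and targetBlobClient are assumed to be BlobClients as above.
async function moveBlobWithPoller(sourceBlobClient, targetBlobClient) {
  const poller = await targetBlobClient.beginCopyFromURL(sourceBlobClient.url);
  await poller.pollUntilDone(); // resolves when the copy finishes, throws on failure
  await sourceBlobClient.delete();
}
```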
The previous solution seems to work, but I don't like using an infinite loop, so here is an alternative way to move a blob:
const move = async (
  fileName: string,
  src: string,
  dest: string
) => {
  try {
    const blobServiceClient = BlobServiceClient.fromConnectionString(
      process.env.storageconnection
    );
    logger.info(`Move storage file [ ${src} -> ${dest} | ${fileName} ]`);
    const srcContainerClient = blobServiceClient.getContainerClient(src);
    const destContainerClient = blobServiceClient.getContainerClient(dest);
    const blobClient = srcContainerClient.getBlobClient(fileName);
    const downloadBlockBlobResponse = await blobClient.download();
    const buffer = await streamToBuffer(
      downloadBlockBlobResponse.readableStreamBody!
    );
    // Upload to the destination first, then delete the source,
    // so the blob is never lost if the upload fails
    const blockBlobClient = destContainerClient.getBlockBlobClient(fileName);
    await blockBlobClient.upload(buffer, buffer.length);
    await blobClient.delete();
    return `${this.storageUrl}/${destContainerClient.containerName}/${fileName}`;
  } catch (e) {
    throw new Error(
      `Fail to move storage file [ ${src} -> ${dest} | ${fileName} ]`
    );
  }
};
const streamToBuffer = async (readableStream: NodeJS.ReadableStream): Promise<Buffer> => {
return new Promise((resolve, reject) => {
const chunks: Buffer[] = [];
readableStream.on("data", (data) => {
chunks.push(data instanceof Buffer ? data : Buffer.from(data));
});
readableStream.on("end", () => {
resolve(Buffer.concat(chunks));
});
readableStream.on("error", reject);
});
};
Related
I'm going to delete old images using a scheduled function. Before deleting these images I create thumbnails, and I want to delete only the original images, not the thumbnails.
The following is part of my code.
Schedule function:
let totalCount = 0;
let disposedCount = 0;

exports.scheduledDeleteFile = functions
  .region("asia-northeast3")
  .pubsub.schedule("every 5 minutes")
  .onRun(async (context) => {
    try {
      const bucket = firebase.storage().bucket();
      // get storage files
      const [filesArray] = await bucket.getFiles({
        prefix: "chat/files",
      });
      totalCount = filesArray.length;
      // variables with our settings to be reused below
      const time_ago = Date.now() - 180000; // 3 min, for testing
      const TIMESTAMP_AGO = new Date(time_ago); // cut-off datetime
      const DELETE_OPTIONS = { ignoreNotFound: true }; // don't throw if already gone
      functions.logger.log(
        `Found ${totalCount} files that need to be checked.`
      );
      const deleteOldFileResults = await Promise.all(
        filesArray.map(async (file) => {
          let metadata;
          try {
            // fetch the metadata for the file
            [metadata] = await file.getMetadata();
            const { temporaryHold, eventBasedHold, timeCreated } = metadata;
            const activeHold = temporaryHold || eventBasedHold;
            const TIME_CREATED = new Date(timeCreated);
            const dispose = !activeHold && TIME_CREATED < TIMESTAMP_AGO;
            // delete
            if (dispose) {
              await file.delete(DELETE_OPTIONS);
              functions.logger.log("delete");
              disposedCount++;
              // ===================
              // the Firestore chat document is updated
              // by a separate trigger function
            }
            return { file, metadata, disposed: dispose, skipped: activeHold };
          } catch (error) {
            functions.logger.error(error);
          }
        })
      );
    } catch (error) {
      functions.logger.error(error);
    }
  });
My thumbnail images are in the same path as the original files. Is there an option to exclude certain files when listing them (for example, files whose names start with "thumb_")?
await bucket.getFiles({
prefix: "chat/files",
});
The following is the thumbnail-creation function. I referred to the example provided by Firebase:
https://github.com/firebase/functions-samples/tree/main/2nd-gen/thumbnails
// Thumbnail dimensions
const THUMB_MAX_HEIGHT = 200;
const THUMB_MAX_WIDTH = 200;
// Thumbnail filename prefix
const THUMB_PREFIX = "thumb_";
exports.generateThumbnail = functions
  .region("asia-northeast3")
  .storage.object()
  .onFinalize(async (object) => {
    // custom metadata
    const userKey = object.metadata["userKey"];
    const chatroomKey = object.metadata["chatroomKey"];
    const type = object.metadata["type"];
    // File and directory paths.
    const filePath = object.name;
    const contentType = object.contentType; // This is the image MIME type
    const fileDir = path.dirname(filePath);
    const fileName = path.basename(filePath);
    const thumbFilePath = path.normalize(
      // NOTE: changing this path causes an error
      path.join(fileDir, `${THUMB_PREFIX}${fileName}`)
    );
    const tempLocalFile = path.join(os.tmpdir(), filePath);
    const tempLocalDir = path.dirname(tempLocalFile);
    const tempLocalThumbFile = path.join(os.tmpdir(), thumbFilePath);
    if (!contentType.startsWith("image/")) {
      return functions.logger.log("This is not an image.");
    }
    if (fileName.startsWith(THUMB_PREFIX)) {
      return functions.logger.log("Already a Thumbnail.");
    }
    // Cloud Storage files.
    const bucket = admin.storage().bucket(object.bucket);
    const file = bucket.file(filePath);
    const thumbFile = bucket.file(thumbFilePath);
    const metadata = {
      contentType: contentType,
    };
    await mkdirp(tempLocalDir);
    // Download file from bucket.
    await file.download({ destination: tempLocalFile });
    functions.logger.log("The file has been downloaded to", tempLocalFile);
    // Generate a thumbnail using ImageMagick.
    await spawn(
      "convert",
      [
        tempLocalFile,
        "-thumbnail",
        `${THUMB_MAX_WIDTH}x${THUMB_MAX_HEIGHT}>`,
        tempLocalThumbFile,
      ],
      { capture: ["stdout", "stderr"] }
    );
    functions.logger.log("Thumbnail created at", tempLocalThumbFile);
    // Upload the thumbnail.
    await bucket.upload(tempLocalThumbFile, {
      destination: thumbFilePath,
      metadata: metadata,
    });
    functions.logger.log("Thumbnail uploaded to Storage at", thumbFilePath);
    fs.unlinkSync(tempLocalFile);
    fs.unlinkSync(tempLocalThumbFile);
    const results = await Promise.all([
      thumbFile.getSignedUrl({
        action: "read",
        expires: "03-01-2500",
      }),
      file.getSignedUrl({
        action: "read",
        expires: "03-01-2500",
      }),
    ]);
    functions.logger.log("Got Signed URLs.");
    const thumbResult = results[0];
    const originalResult = results[1];
    const thumbFileUrl = thumbResult[0];
    const fileUrl = originalResult[0];
    await file.delete().then(() => {
      functions.logger.log("Original file deleted");
    });
    // Add the URLs to the Database
    await admin
      .database()
      .ref("images")
      .push({ path: fileUrl, thumbnail: thumbFileUrl });
    return functions.logger.log("Thumbnail URLs saved to database.");
  });
Is there an option to exclude certain files when listing them (for example, files whose names start with "thumb_")?
No. You can filter out the objects you don't want in the code that iterates the results.
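For example, a minimal sketch (isOriginal is a hypothetical helper; the object names are made up for illustration) that keeps only the originals, assuming thumbnails live next to them with the "thumb_" prefix:

```javascript
// Keep only files whose base name does not start with the thumbnail prefix.
const THUMB_PREFIX = "thumb_";

const isOriginal = (name) => !name.split("/").pop().startsWith(THUMB_PREFIX);

// Hypothetical object names, as bucket.getFiles() might return them:
const names = ["chat/files/photo.png", "chat/files/thumb_photo.png"];
console.log(names.filter(isOriginal)); // -> [ 'chat/files/photo.png' ]
```

In the scheduled function above, this would mean filtering with `filesArray.filter((file) => isOriginal(file.name))` before mapping over the files.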
I want to check a folder for files whose names are present in an array, select only those that match (with an if condition), and return the matching values from fileArray.
let extensionArray = [".html", ".htm", ".aspx"];
let fileArray = [
"index.html",
"index.htm",
"index.aspx",
"Index.html",
"Index.htm",
"Index.aspx",
"default.html",
"default.htm",
"default.aspx",
"Default.html",
"Default.htm",
"Default.aspx",
];
if (!extensionArray.includes(true)) {
  if (!fileArray.includes(true)) {
    // return the output
  }
}
I have checked one of the posts showing how files can be collected from all folders and subfolders, but I don't know where to apply my condition to check the extension and file name and then return it.
The code is as follows:
const fs = require('fs').promises; // the promises API is needed for await
const path = require('path');

async function getFile(dir) {
  let files = await fs.readdir(dir);
  files = await Promise.all(
    files.map(async (file) => {
      const filePath = path.join(dir, file);
      const stats = await fs.stat(filePath);
      if (stats.isDirectory()) return getFile(filePath);
      else if (stats.isFile()) return filePath;
    })
  );
  return files.reduce((all, folderContents) => all.concat(folderContents), []);
}
You don't have to add the filenames in both capitalized and lowercase forms to fileArray; you can convert each filename to lowercase when filtering. You can also store the filenames in a Set, and you don't need extensionArray at all, since you're checking for the filenames directly. Once you have the list of file paths from the getFilePaths function, filter them by checking whether the lowercased base name of each path is present in the set.
const fs = require('fs').promises
const path = require('path')

const filenames = new Set([
  'index.html',
  'index.htm',
  'index.aspx',
  'default.html',
  'default.htm',
  'default.aspx',
])

const getFilePaths = async (dir) => {
  let files = await fs.readdir(dir)
  files = await Promise.all(
    files.map(async (file) => {
      const filePath = path.join(dir, file)
      const stats = await fs.stat(filePath)
      if (stats.isDirectory()) {
        return getFilePaths(filePath)
      }
      return filePath
    })
  )
  return files.flat()
}

const filterFiles = async (dir) => {
  const paths = await getFilePaths(dir)
  const filteredFiles = paths.filter((filePath) => {
    // path.basename handles both / and \ separators
    const filename = path.basename(filePath)
    return filenames.has(filename.toLowerCase())
  })
  console.log(filteredFiles)
}

filterFiles('.')
I have the following upload function where I'm trying to upload two images to Firebase and combine their download URLs in a collection.
My second storageRef keeps overwriting the first one, which results in only the compressed image being uploaded and me getting the link for the compressed image twice in my collection.
How do I fix this?
const onUpload = async () => {
  if (file) {
    const options = {
      maxWidthOrHeight: 250,
    };
    const fullStorageRef = storage.ref();
    const fullFileRef = fullStorageRef.child(file.name);
    await fullFileRef.put(file);
    const fullUrl = await fullFileRef.getDownloadURL();
    const thumbnailStorageRef = storage.ref();
    const compressedFile = await imageCompression(file, options);
    const compressedFileRef = thumbnailStorageRef.child(compressedFile.name);
    await compressedFileRef.put(compressedFile);
    const thumbnailUrl = await compressedFileRef.getDownloadURL();
    db.collection("images").add({
      created: firebase.firestore.FieldValue.serverTimestamp(),
      full: fullUrl,
      thumbnail: thumbnailUrl,
    });
    setFile(null);
  }
};
file.name and compressedFile.name were the same, so the second upload overwrote the first.
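A minimal sketch of the fix: give each upload a distinct child path so the second put() cannot overwrite the first. The path builders and the "full/" and "thumbnails/" folder names are my own, for illustration:

```javascript
// Hypothetical path builders; the folder names are assumptions.
const fullPath = (name) => `full/${name}`;
const thumbnailPath = (name) => `thumbnails/${name}`;

console.log(fullPath("cat.png"));      // -> full/cat.png
console.log(thumbnailPath("cat.png")); // -> thumbnails/cat.png

// In onUpload this would become (sketch, not verified against your setup):
// const fullFileRef = storage.ref().child(fullPath(file.name));
// const compressedFileRef = storage.ref().child(thumbnailPath(compressedFile.name));
```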
The "INDEX" sheet is created with all its data, and the sheets titled with the "product code" are created but completely blank. The console.log inside the inner forEach loop does print the data on each iteration, yet insertRow won't populate the sheets. Can someone please tell me what I'm doing wrong?
index.js
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
const fs = require('fs-extra')
const gcs = require('@google-cloud/storage')();
const path = require('path');
const os = require('os');
const ExcelJS = require('exceljs');
exports.createCSV = functions.firestore
  .document('reports/{reportId}')
  .onCreate(async (snap, context) => {
    // Step 1. Set main variables
    console.log("Began");
    const reportId = context.params.reportId;
    const fileName = `reports/${reportId}.xlsx`;
    console.log(fileName);
    const tempFilePath = path.join(os.tmpdir(), fileName);
    // Reference report in Firestore
    const db = admin.firestore();
    const reportRef = db.collection('reports').doc(reportId);
    // Reference Storage Bucket
    const storage = gcs.bucket('gs://stock-check-48f78.appspot.com');
    const workbook = new ExcelJS.Workbook();
    // Step 2. Query collection
    try {
      const querySnapshot = await db.collection('stores').get();
      // create array of order data
      const indexWorksheet = workbook.addWorksheet("INDEX");
      querySnapshot.forEach(async doc => {
        //stores.push(doc.data());
        const rowValues = [];
        rowValues[1] = doc.data().code.toString();
        rowValues[2] = doc.data().name;
        rowValues[3] = doc.data().address;
        indexWorksheet.addRow(rowValues);
        const worksheet = workbook.addWorksheet(doc.data().code.toString());
        const storeId = doc.id;
        console.log("Store id is: " + storeId);
        const querySnap = await db.collection('stores').doc(storeId).collection('products').get();
        querySnap.forEach(async a => {
          //console.log(a.data());
          const productValues = [];
          productValues[1] = a.data().name;
          productValues[2] = a.data().packaging;
          productValues[3] = a.data().category;
          productValues[4] = a.data().stock.toString();
          productValues[5] = a.data().order.toString();
          productValues[6] = a.data().last_updated;
          worksheet.insertRow(1, productValues);
        });
      });
      // Step 4. Write the file to cloud function tmp storage
      console.log("Filename is: ");
      console.log(fileName);
      console.log(tempFilePath);
      const buffer = await workbook.xlsx.writeBuffer();
      await fs.outputFile(tempFilePath, buffer);
      console.log("Uploaded");
      // Step 5. Upload the file to Firebase cloud storage
      const file = await storage.upload(tempFilePath, {
        destination: fileName
      });
      console.log("Uploaded to bucket");
      return reportRef.update({
        status: 'complete'
      });
    } catch (err) {
      return console.log(err);
    }
  });
Update
I made these changes and still get the same result... (see the edited code above)
Try using Promise.all:
await Promise.all(querySnap.docs.map(async a => {
  //console.log(a.data());
  const productValues = [];
  productValues[1] = a.data().name;
  productValues[2] = a.data().packaging;
  productValues[3] = a.data().category;
  productValues[4] = a.data().stock.toString();
  productValues[5] = a.data().order.toString();
  productValues[6] = a.data().last_updated;
  worksheet.insertRow(1, productValues);
}));
I am working on a cooking Android app where Firebase is the backend. I need to upload multiple images of a recipe to Firebase Storage and then store the download URLs in the Firebase database.
I managed to upload the files to Firebase, but I am having some trouble getting the download URLs of these files.
I created an array of promises to upload the files, and another array to store the URL of each file, which I get when its upload task finishes.
Here is my code:
var promises = [];
for (var i = 0; i < prepaImages.length; i++) {
  //alert(prepaImages[i].name);
  var storageRef = firebase.storage().ref("receipes" + "/" + category + "/" + title + "/" + uploadTime + prepaImages[i].name);
  var uploadTask = storageRef.put(prepaImages[i]);
  promises.push(uploadTask);
  uploadTask.on('state_changed', snapshot => {
    var percentage = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
    $("#prepaImageUploadprogress").html(Math.round(percentage) + "%");
    $("#prepaImageUploadprogress").attr("style", "width:" + percentage + "%");
  }, error => { alert(error) }, () => {
    uploadTask.snapshot.ref.getDownloadURL().then(downloadURL => {
      //prepaImagesUrl+="+"+downloadURL;
      prepaImagesUrl.push(downloadURL);
    });
  });
}
The problem is that I am getting an array whose length is the number of uploaded files minus one (it should equal the number of uploaded files), and every entry has the same value (the same download URL).
Any help will be appreciated. Thank you.
I think the problem is with the promises. I suggest you use Promise.all and await; your code will be more reliable. Here is my solution for multiple file upload (adapt it to your variable names):
const array = Array.from({ length: prepaImages.length }, (value, index) => index);
const uploadedImages = await Promise.all(array.map(async index => {
  const image = prepaImages[index];
  const metadata = { contentType: image.type };
  const storageRef = firebase.storage().ref(`receipes/${category}/${title}/${uploadTime}${image.name}`);
  const uploadTask = storageRef.put(image, metadata);
  const url = await new Promise((resolve, reject) => {
    uploadTask.on('state_changed', snapshot => {
      const percentage = (snapshot.bytesTransferred / snapshot.totalBytes) * 100;
      $('#prepaImageUploadprogress').html(`${Math.round(percentage)}%`);
      $('#prepaImageUploadprogress').attr('style', `width: ${percentage}%`);
    }, error => reject(error),
    async () => {
      const downloadUrl = await uploadTask.snapshot.ref.getDownloadURL();
      resolve(downloadUrl);
    });
  });
  return { name: image.name, url };
}));
uploadedImages will contain an array with the image names and download URLs. You can do this without await, of course, but I prefer it this way.
UPDATE:
Here is my own code (without error handling) to achieve this. I should mention that I'm using it with React and Redux, via the redux-firestore and react-redux-firebase wrappers for Firebase/Firestore, but these are just wrappers:
export const addNewWork = work => async (dispatch, getState, { getFirebase, getFirestore }) => {
  const { files, ...restWork } = work;
  const firebase = getFirebase();
  const firestore = getFirestore();
  const storageRef = firebase.storage().ref();
  const array = Array.from({ length: files.length }, (value, index) => index);
  const uploadedFiles = await Promise.all(array.map(async index => {
    const file = files[index];
    const metadata = { contentType: file.type };
    const uploadTask = storageRef.child(`works/${file.name}`).put(file, metadata);
    const url = await new Promise((resolve, reject) => {
      uploadTask.on('state_changed', () => {}, error => reject(error), async () => {
        const downloadUrl = await uploadTask.snapshot.ref.getDownloadURL();
        resolve(downloadUrl);
      });
    });
    return { name: file.name, url };
  }));
  await firestore.collection('works').add({
    ...restWork,
    image: uploadedFiles[0], // Use only one image for the clean example
    createdAt: new Date()
  });
};