There is HTML code written for a download button. The complete SharePoint 2013-hosted application's functionality is written in JS, and there is a WCF solution using C# to get data from an SAP system. One page in the application has a download button; once the user clicks it, the file should be downloaded from Azure Blob Storage.
Kindly help with how we can achieve this.
I have tested this in my environment. You can use the JavaScript function below to download files from Azure Blob Storage:
async function main() {
  const { BlobServiceClient, StorageSharedKeyCredential } = require("@azure/storage-blob");

  const account = "storage account name";
  const accountKey = "storage account key";
  const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);

  const blobServiceClient = new BlobServiceClient(
    `https://${account}.blob.core.windows.net`,
    sharedKeyCredential
  );

  const containerClient = blobServiceClient.getContainerClient("container name");
  const blobClient = containerClient.getBlobClient("blob file name to be downloaded");

  // Saves the blob to a local file with the given name
  const response = await blobClient.downloadToFile("blob file name to be downloaded");
}

main();
I'm trying to decrypt a zip file using crypto. The file is in S3 storage, and I have the password for the file but no info about the IV.
I'm using crypto.createDecipher but getting an error that createDecipher is deprecated. I saw some posts saying to use createDecipheriv, but I don't have any IV that I can use.
Below is a sample code -
function demo(entry) {
  const password = 'mypassword';
  const algorithm = 'aes-256-cbc';
  const decrypt = crypto.createDecipher(algorithm, password); // deprecated
  const res = entry.stream().pipe(decrypt);
  const uploadParams = {
    Body: res,
    Bucket: myS3bucket,
    Key: myfilename, // note: the S3 SDK expects "Key" with a capital K
  };
  uploadfile(uploadParams);
}
I'm using unzipper to unzip the file and get entry as an object for each file, so I'm just passing that object to the demo function.
Please help me out, as I'm new to streams and the crypto lib.
I'm trying to upload a file to Azure Blob Storage. What I've done so far:
npm i @azure/identity @azure/storage-blob
Generate SAS query parameters with a user delegation key:
import { DefaultAzureCredential } from '@azure/identity';
import {
  BlobServiceClient,
  BlockBlobClient,
  ContainerSASPermissions,
  generateBlobSASQueryParameters,
} from '@azure/storage-blob';

async function generateSas() {
  const startDate = new Date();
  const expiryDate = new Date();
  startDate.setTime(startDate.getTime() - 100 * 60 * 1000);        // 100 minutes back
  expiryDate.setTime(expiryDate.getTime() + 100 * 60 * 60 * 1000); // 100 hours ahead
  const credential = new DefaultAzureCredential();
  const blobServiceClient = new BlobServiceClient(STORAGE, credential);
  const key = await blobServiceClient.getUserDelegationKey(startDate, expiryDate);
  return generateBlobSASQueryParameters({
    containerName: CONTAINER,
    startsOn: startDate,
    expiresOn: expiryDate,
    permissions: ContainerSASPermissions.parse('rwl'),
  }, key, ACCOUNT).toString();
}
Use the generated SAS to upload
async function upload(sasToken: string) {
  const blobClient = new BlockBlobClient(
    `https://${ACCOUNT}.blob.core.windows.net/${CONTAINER}/test.json?${sasToken}`);
  const content = 'some content';
  const response = await blobClient.upload(content, content.length);
}
Before I run this, I do az login with my account.
The error:
(node:19584) UnhandledPromiseRejectionWarning: RestError: This request is not authorized to perform
this operation using this permission.
If I copy a SAS from Azure Storage Explorer with the same login, the code works! So I assume that there is some way to retrieve a valid SAS for my account.
I suspect that this is a permission issue.
After analyzing the Can't list file system of azure datalake with javascript and ManagedIdentityCredential failed when used to list blob containers #5539 issues closely, I think that the Owner role is not sufficient for uploading blobs to your blob storage account. You'll have to use one of the Storage Blob Data * roles (like Storage Blob Data Owner) before you can upload blobs.
So, try adding the Storage Blob Data Owner role to your intended user and running the code again.
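For reference, that role assignment can also be made from the Azure CLI; a sketch, where the assignee object ID and the scope path are placeholders to be filled in for your account:

```shell
# Grant the signed-in user data-plane access to the storage account
# (<user-object-id>, <sub-id>, <rg>, and <account> are placeholders)
az role assignment create \
  --assignee "<user-object-id>" \
  --role "Storage Blob Data Owner" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
```

Note that role assignments can take a few minutes to propagate before the upload starts succeeding.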
I am working on a React web app in which I am trying to read an Excel file on the client side, as below.
import XLSX from "xlsx";
const targetCsvPath = window.location.origin + "/XRayConfig.xlsx";
const workbook = XLSX.readFile(targetCsvPath)
const json = XLSX.utils.sheet_to_json(workbook.Sheets.FOV);
But this gives the error TypeError: _fs.readFileSync is not a function. When I run this code snippet using Node, it runs flawlessly. I think the error occurs because client-side JavaScript does not run on Node.
window.location.origin points to the public folder of the React app, and the Excel file is in that folder.
This link almost answers the question, but there the Excel file is uploaded from the client side using an input tag and then processed. My Excel file is on the server side. How can I solve this?
I am answering my own question. Using file system APIs does not work in client-side JavaScript, as it does not run on Node. So the Excel content should first be fetched as a blob, and then that blob can be used.
The following solution works for me.
import XLSX from "xlsx";

const targetCsvPath = window.location.origin + "/XRayConfig.xlsx";

const reader = new FileReader();
reader.onload = function (e) {
  const workbook = XLSX.read(e.target.result, { type: "binary" });
  // your operations on the workbook come here
};

fetch(targetCsvPath)
  .then((response) => response.blob())
  .then((data) => {
    reader.readAsBinaryString(data);
  })
  .catch((err) => console.log(err));
I would like to use the storage service from Firebase with a Node.js API (hosted on Firebase Functions) to allow users to upload their avatars.
So I read the doc from https://firebase.google.com/docs/storage/web/start
and I do:
admin.js
const admin = require('firebase-admin');
const config = require('./config.js');
admin.initializeApp(config);
const db = admin.firestore();
const storage = admin.storage();
module.exports = { admin, db, storage };
user.js
const { admin, db, storage } = require('../util/admin');
exports.postAvatar = async (request, response) => {
  const storageRef = storage.ref();
};
but I have the following error: storage.ref is not a function
Is something missing from the documentation?
The console.log of storage const is:
Storage {
INTERNAL: StorageInternals {},
storageClient: Storage {...},
appInternal: FirebaseApp {...}
}
admin.storage() returns a Storage object. If you want to use it to refer to a file in your default storage bucket, you should use its bucket() method with no parameters; it will give you a Bucket object from the Google Cloud Node.js SDK.
There is no method called ref() anywhere in that SDK. It's not much like the JavaScript web client SDK; you will have to learn a different but similar API to work with content using the Cloud Storage Node SDK. The Admin SDK essentially just wraps this API.
const file = storage.bucket().file('/path/to/file');
Try the code below.
const bucket = storage.bucket(); // you can also pass your bucket name from config
const file = bucket.file('path/to/file'); // Bucket has file(), not ref()
Also check this answer. TypeError: firebase.storage is not a function
I'd like to upload files with uniform names by generating a md5 hash client-side. However, I didn't manage to successfully pass the Firebase Storage security rules at this point:
match /{userUid}/{photoHash} {
  allow create: if request.resource.md5Hash == photoHash;
}
Client-side code snippet uploading the photo:
const compressedPhoto = await this.compressPhoto(photo)
const base64 = await this.toBase64(compressedPhoto)
const photoHash = CryptoJS.MD5(compressedPhoto).toString(CryptoJS.enc.Base64)
const uploadTask = this.storageRef.child(photoHash).put(compressedPhoto)