I have a bunch of VHD files stored on a private server, accessible through a URL.
I am trying to upload these VHD files directly to my Azure storage account using the Azure JavaScript npm libraries. The VHDs have to be uploaded as page blobs. I tried using the uploadPagesFromURL() method of the PageBlobClient, but with no success. My code looks roughly like this:
async function uploadVHD(accessToken, srcUrl) {
  try {
    // Get credentials from accessToken
    const creds = new StorageSharedKeyCredential(storageAccount.name, storageAccount.key);
    // Get blobServiceClient
    const blobServiceClient = new BlobServiceClient(`https://${storageAccount.name}.blob.core.windows.net`, creds);
    // Create container
    const containerClient = blobServiceClient.getContainerClient("vhd-images");
    await containerClient.createIfNotExists();
    // Embed basic-auth credentials in the source URL
    const src = srcUrl.replace('https://', 'https://username:password@');
    // Upload to blob storage
    const pageBlobClient = containerClient.getPageBlobClient("Test.vhd");
    // Get file size of the VHD from the Content-Length header
    const fileSize = parseInt((await axiosRequest(src, { method: "HEAD" })).headers["content-length"], 10);
    const uploadResponse = await pageBlobClient.uploadPagesFromURL(src, 0, 0, fileSize);
    return uploadResponse;
  } catch (error) {
    return error;
  }
}
It is not possible to upload the page blob from your URL directly. You need to read the data from the URL yourself, then upload it using the uploadPages method.
axios.get(URL, {
    responseType: 'arraybuffer'
  })
  .then((response) => {
    console.log(response.data);
    console.log(response.data.length);
    // upload page blob...
  }).catch((error) => {
    // handle error
  });

// uploadPages method (note that it returns a promise and must be awaited)
const uploadResponse = await pageBlobClient.uploadPages(data, 0, dataLength);
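Putting the download and upload together, here is a minimal sketch (an illustration, not the library's prescribed method). It assumes the VHD fits in memory and that its size is a multiple of 512 bytes, since page blob ranges must be 512-byte aligned and uploadPages accepts at most 4 MB per call:

const axios = require("axios");

// Sketch: download the whole VHD, create the page blob at its exact size,
// then upload it in 4 MB chunks (the per-call maximum for uploadPages)
async function downloadAndUploadVhd(pageBlobClient, src) {
  const response = await axios.get(src, { responseType: "arraybuffer" });
  const data = Buffer.from(response.data);
  // A page blob is created with a fixed size (must be a multiple of 512)
  await pageBlobClient.create(data.length);
  const CHUNK = 4 * 1024 * 1024;
  for (let offset = 0; offset < data.length; offset += CHUNK) {
    const chunk = data.subarray(offset, Math.min(offset + CHUNK, data.length));
    await pageBlobClient.uploadPages(chunk, offset, chunk.length);
  }
}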
I'm trying to accomplish the following task: I need to download an image from a URL, upload it to S3 storage, and return the location of the uploaded file. I'm using async/await functions, but the call logs Promise { <pending> } and only prints the location a few seconds later. I want to get the location only after the promise has resolved. Here is my code:
// Space config
const spaceEndPoint = new AWS.Endpoint("fra1.digitaloceanspaces.com");
const s3 = new AWS.S3({
  endpoint: spaceEndPoint,
  accessKeyId: "xxxxxxxxx",
  secretAccessKey: "xxxxxxxxxxxxxxxxx",
});

// Download image from url
const downloadImage = async (url) => {
  try {
    const file = axios
      .get(url, {
        responseType: "stream",
      })
      .then((res) => res.data)
      .catch((err) => console.log(err));
    return file;
  } catch (err) {
    console.log(err);
  }
};
// Upload to space
const upload = async (fileUrl) => {
  // Get file name from url
  const fileName = path.basename(fileUrl);
  // Path to save tmp file
  const localFilePath = path.resolve(__dirname, "../downloads", fileName);
  // Download file
  const file = await downloadImage(fileUrl);
  // Write file to disk
  await file.pipe(fs.createWriteStream(localFilePath));
  // Upload params
  const params = {
    Bucket: "sunday",
    Body: fs.createReadStream(localFilePath),
    Key: path.basename(fileName),
    ContentType: "application/octet-stream",
    ACL: "public-read",
  };
  const { Location } = await s3.upload(params).promise();
  return Location;
};
console.log(
  upload(
    "https://i.pinimg.com/474x/e9/62/7c/e9627ce6fe731ba49597d3a83e21e398.jpg"
  ).then((data) => data)
);
// Result:
Promise { <pending> }
https://sunday.fra1.digitaloceanspaces.com/e9627ce6fe731ba49597d3a83e21e398.jpg
So I want to get the location only once the promise has resolved.
Thanks in advance for the help!
Your function upload is async and thus always returns a promise, which has to be awaited too. Await your upload call. If you're in an environment that doesn't support top-level await, use .then to log the result instead, or put the outer logging code in an async helper function.
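For example, a minimal sketch of the fixed logging:

// Log the location only once the promise has resolved
upload(
  "https://i.pinimg.com/474x/e9/62/7c/e9627ce6fe731ba49597d3a83e21e398.jpg"
)
  .then((location) => console.log(location)) // prints the final URL, not a pending promise
  .catch((err) => console.error(err));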
I am looking for a way to download a binary PDF file, which I get through an API. I have the file in the local app file system, but I would like to save it to the phone's storage.
const pdf = Buffer.from(res.data, "binary").toString("base64");
const fileUri = FileSystem.documentDirectory + `${encodeURI("generated")}.pdf`;
FileSystem.writeAsStringAsync(fileUri, pdf, {
  encoding: FileSystem.EncodingType.Base64,
}).then((respond) => {
  downloadPdf(fileUri);
  // Sharing.shareAsync(fileUri); // Here I can share the file with other apps
});
const downloadPdf = async (uri) => {
  // Linking.openURL(uri); // #approach 1
  // MediaLibrary.saveToLibraryAsync(uri).then(res => { // #approach 2
  //   console.log(res)
  // }).catch(err => {
  //   console.log(err)
  // })
  const permissions = await MediaLibrary.getPermissionsAsync();
  try {
    const asset = await MediaLibrary.createAssetAsync(uri); // #approach 3
    const album = await MediaLibrary.getAlbumAsync("Downloads");
    if (album == null) {
      await MediaLibrary.createAlbumAsync("Downloads", asset, false);
    } else {
      await MediaLibrary.addAssetsToAlbumAsync([asset], album, false);
    }
  } catch (e) {
    console.log(e);
  }
};
I have tried different ways to do it, but I've stopped here with expo-media-library, which gives me an
"Unable to copy file into external storage" error.
Is this a good direction, or do you have better solutions?
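One detail stands out in the snippet above: getPermissionsAsync only reads the current permission status and never prompts the user, and its result is never checked. A minimal sketch of the same flow that requests the permission first (my assumption; whether this alone fixes "Unable to copy file into external storage" depends on the device and Android version):

const downloadPdf = async (uri) => {
  // requestPermissionsAsync actually prompts; getPermissionsAsync only reads the status
  const { granted } = await MediaLibrary.requestPermissionsAsync();
  if (!granted) {
    console.log("media library permission denied");
    return;
  }
  const asset = await MediaLibrary.createAssetAsync(uri);
  const album = await MediaLibrary.getAlbumAsync("Downloads");
  if (album == null) {
    await MediaLibrary.createAlbumAsync("Downloads", asset, false);
  } else {
    await MediaLibrary.addAssetsToAlbumAsync([asset], album, false);
  }
};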
Hello all, I hope everyone is doing well.
I have an issue uploading multiple images to Cloudinary via ReactJS.
Here is the input field:
<input
  type="file"
  className="form-control"
  required
  onChange={(e) => setImages(e.target.files)}
  multiple
/>
On change, I store all the files in a state variable:
const [images, setImages] = useState([]);
Now I loop over the state, upload each file to Cloudinary, and extract the URL of each image; the code is given below:
for (const file of images) {
  async function upload() {
    const formData = new FormData();
    formData.append("file", file);
    formData.append("upload_preset", "DummyPreset"); // Replace the preset name with your own
    formData.append("api_key", "0300545648961"); // Replace API key with your own Cloudinary key
    // Make an AJAX upload request using Axios (replace Cloudinary URL below with your own)
    await axios
      .post(
        "https://api.cloudinary.com/v1_1/Dummycloud/image/upload",
        formData,
        {
          headers: { "X-Requested-With": "XMLHttpRequest" },
        }
      )
      .then((response) => {
        const data = response.data;
        const fileURL = data.secure_url; // You should store this URL for future references in your app
        console.log(fileURL);
      });
  }
  upload();
}
Here I'm able to extract each link as fileURL and log it with console.log(fileURL).
As you can see, all the URLs are extracted. Now I want to push the extracted URLs into an array and send them to my Express server, where I'll store them in the DB.
Please let me know how to push each URL into a state array as soon as it is extracted.
Here's the solution I ended up with. Thanks for the contributions.
var ImagesUrlArray = [];
for (const file of images) {
  async function upload() {
    const formData = new FormData();
    formData.append("file", file);
    formData.append("upload_preset", "DummyPreset"); // Replace the preset name with your own
    formData.append("api_key", "0300545648961"); // Replace API key with your own Cloudinary key
    // Make an AJAX upload request using Axios (replace Cloudinary URL below with your own)
    await axios
      .post(
        "https://api.cloudinary.com/v1_1/Dummycloud/image/upload",
        formData,
        {
          headers: { "X-Requested-With": "XMLHttpRequest" },
        }
      )
      .then((response) => {
        const data = response.data;
        var fileURL = data.secure_url; // You should store this URL for future references in your app
        ImagesUrlArray = [...ImagesUrlArray];
        ImagesUrlArray.push(fileURL);
        // Once every file has been uploaded, send all URLs to the server
        if (ImagesUrlArray.length === images.length) {
          axios
            .post("http://localhost:5000/register", {
              fullname: Data.fullname,
              email: Data.email,
              pass: Data.pass,
              cpass: Data.cpass,
              phone: Data.phone,
              imagee: ImagesUrlArray,
            })
            .then((response) => setdataa(response));
        }
      });
  }
  upload();
}
// This function returns a promise that resolves to the uploaded file's URL
// (same fields and Cloudinary URL as in the question)
const uploadFile = (file) => {
  const formData = new FormData();
  formData.append("file", file);
  formData.append("upload_preset", "DummyPreset");
  return axios
    .post("https://api.cloudinary.com/v1_1/Dummycloud/image/upload", formData)
    .then((response) => response.data.secure_url);
};

// images is a FileList, so spread it into a real array before mapping
Promise.all([...images].map(uploadFile)).then((fileURLs) => storeFileURLs(fileURLs));
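Building on that, here is a sketch of sending the collected URLs to the Express server in one request, reusing the names from the question (Data, setdataa, images); storeFileURLs above would be this kind of function:

// One POST with the whole array, instead of checking lengths on every upload
const handleUpload = async () => {
  const fileURLs = await Promise.all([...images].map(uploadFile));
  const response = await axios.post("http://localhost:5000/register", {
    fullname: Data.fullname,
    email: Data.email,
    pass: Data.pass,
    cpass: Data.cpass,
    phone: Data.phone,
    imagee: fileURLs,
  });
  setdataa(response);
};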
I connected a TypeScript function to Azure Blob Storage through the REST API, and this works fine. Now I want to get each blob and read its contents.
I tried the code below, but it returns an error:
const blobServiceClient = new BlobServiceClient(
  `https://${accountName}.blob.core.windows.net?${sasToken}`,
  pipeline
);
const containerClient = blobServiceClient.getContainerClient(containerName);
console.log(containerClient);
// exists() returns a promise, so it has to be awaited
if (!(await containerClient.exists())) {
  console.log("the container does not exist");
  await containerClient.create();
}
const client = containerClient.getBlockBlobClient(this.currentFile.name);
// Name of the uploaded blob
console.log(this.currentFile.name);
// Metadata of the blob
console.log(client);
// List each blob in the container
for await (const blob of containerClient.listBlobsFlat()) {
  console.log('\t', blob.name);
  const blockBlobClient = containerClient.getBlockBlobClient(blob.name);
  const downloadBlockBlobResponse = await blockBlobClient.download(0);
  console.log('\nDownloaded blob content...');
  console.log('\t', await streamToString(downloadBlockBlobResponse.readableStreamBody));
}
async function streamToString(readableStream) {
  return new Promise((resolve, reject) => {
    const chunks = [];
    readableStream.on("data", (data) => {
      chunks.push(data.toString());
    });
    readableStream.on("end", () => {
      resolve(chunks.join(""));
    });
    readableStream.on("error", reject);
  });
}
The error is:
ERROR Error: Uncaught (in promise): TypeError: Cannot read property 'on' of undefined TypeError: Cannot read property 'on' of undefined
How can I solve this problem?
Thanks.
Download the official sample code below.
It runs normally on my side. Check whether your local environment is missing dependencies, or whether the permissions on the storage account need to be set. Note that the error means readableStreamBody is undefined: as the sample's comments point out, readableStreamBody is only available in the Node.js runtime, while in browsers the downloaded data is exposed as blobBody.
// Copyright (c) Microsoft Corporation.
// Licensed under the MIT license.

/*
  Setup: Enter your storage account name and shared key in main()
*/

import {
  BlobServiceClient,
  StorageSharedKeyCredential,
  BlobDownloadResponseModel
} from "@azure/storage-blob";

// Load the .env file if it exists
import * as dotenv from "dotenv";
dotenv.config();

export async function main() {
  // Enter your storage account name and shared key
  const account = process.env.ACCOUNT_NAME || "pans*****age";
  const accountKey = process.env.ACCOUNT_KEY || "IHa48xxo+0anyKQ2GzQ2K*******ZBxgJ0VotCpGs/PMftkebb9UFqyg==";

  // Use StorageSharedKeyCredential with storage account and account key
  // StorageSharedKeyCredential is only available in Node.js runtime, not in browsers
  const sharedKeyCredential = new StorageSharedKeyCredential(account, accountKey);

  // ONLY AVAILABLE IN NODE.JS RUNTIME
  // DefaultAzureCredential will first look for Azure Active Directory (AAD)
  // client secret credentials in the following environment variables:
  //
  // - AZURE_TENANT_ID: The ID of your AAD tenant
  // - AZURE_CLIENT_ID: The ID of your AAD app registration (client)
  // - AZURE_CLIENT_SECRET: The client secret for your AAD app registration
  //
  // If those environment variables aren't found and your application is deployed
  // to an Azure VM or App Service instance, the managed service identity endpoint
  // will be used as a fallback authentication source.
  // const defaultAzureCredential = new DefaultAzureCredential();

  // You can find more TokenCredential implementations in the [@azure/identity](https://www.npmjs.com/package/@azure/identity) library
  // to use client secrets, certificates, or managed identities for authentication.

  // Use AnonymousCredential when url already includes a SAS signature
  // const anonymousCredential = new AnonymousCredential();

  // List containers
  const blobServiceClient = new BlobServiceClient(
    // When using AnonymousCredential, following url should include a valid SAS or support public access
    `https://${account}.blob.core.windows.net`,
    sharedKeyCredential
  );

  let i = 1;
  for await (const container of blobServiceClient.listContainers()) {
    console.log(`Container ${i++}: ${container.name}`);
  }

  // Create a container
  const containerName = `newcontainer${new Date().getTime()}`;
  const containerClient = blobServiceClient.getContainerClient(containerName);
  const createContainerResponse = await containerClient.create();
  console.log(`Create container ${containerName} successfully`, createContainerResponse.requestId);

  // Create a blob
  const content = "hello, 你好";
  const blobName = "newblob" + new Date().getTime();
  const blockBlobClient = containerClient.getBlockBlobClient(blobName);
  const uploadBlobResponse = await blockBlobClient.upload(content, Buffer.byteLength(content));
  console.log(`Upload block blob ${blobName} successfully`, uploadBlobResponse.requestId);

  // List blobs
  i = 1;
  for await (const blob of containerClient.listBlobsFlat()) {
    console.log(`Blob ${i++}: ${blob.name}`);
  }

  // Get blob content from position 0 to the end
  // In Node.js, get downloaded data by accessing downloadBlockBlobResponse.readableStreamBody
  // In browsers, get downloaded data by accessing downloadBlockBlobResponse.blobBody
  const downloadBlockBlobResponse: BlobDownloadResponseModel = await blockBlobClient.download(0);
  console.log(
    "Downloaded blob content",
    await streamToString(downloadBlockBlobResponse.readableStreamBody!)
  );

  // Delete container
  await containerClient.delete();
  console.log("deleted container");
}

// A helper method used to read a Node.js readable stream into string
async function streamToString(readableStream: NodeJS.ReadableStream) {
  return new Promise((resolve, reject) => {
    const chunks: string[] = [];
    readableStream.on("data", (data) => {
      chunks.push(data.toString());
    });
    readableStream.on("end", () => {
      resolve(chunks.join(""));
    });
    readableStream.on("error", reject);
  });
}

main().catch((err) => {
  console.error("Error running sample:", err.message);
});
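One more note: if code like the question's runs in a browser rather than in Node.js, readableStreamBody is undefined, which is exactly the "Cannot read property 'on' of undefined" error. In that case read blobBody instead, as the comments in the sample say. A minimal sketch:

// Browser-side variant: the downloaded data is exposed as blobBody
// (a Promise<Blob>), not readableStreamBody
async function downloadBlobText(blockBlobClient) {
  const downloadResponse = await blockBlobClient.download(0);
  const blob = await downloadResponse.blobBody; // undefined in Node.js, a Blob in browsers
  return await blob.text(); // resolves to the blob's contents as a string
}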
I need to create a zip file containing PDFs I receive from AWS storage, and I am trying to do this with ADM-ZIP in Node.js, but I can't read the final file.zip.
Here is the code:
var zip = new AdmZip();

// add file directly
var content = data.Body.buffer;
zip.addFile("test.pdf", content, "entry comment goes here");
// console.log(content)

// add local file
zip.addLocalFile(`./tmp/boletos/doc.pdf`);

// get everything as a buffer
var willSendthis = zip.toBuffer();
console.log(willSendthis);

// or write everything to disk
zip.writeZip("test.zip", `../../../tmp/boletos/${datastring}.zip`);
As it is, this only creates a separate .zip for each file.
I was also facing this issue and looked through a lot of SO posts. This is how I was able to create a zip with multiple files from download URLs. Please keep in mind that I'm unsure whether this is best practice or whether it will blow up memory.
The server builds the zip from a list of ids of the resources requested by the client:
const https = require("https"); // needed for https.get below
const zip = new AdmZip();

await Promise.all(sheetIds.map(async (sheetId) => {
  const downloadUrl = await this.downloadSds({ sheetId, userId, memberId });
  if (downloadUrl) {
    await new Promise((resolve) => https.get(downloadUrl, (res) => {
      const data = [];
      res.on('data', (chunk) => {
        data.push(chunk);
      }).on('end', () => {
        // Concatenate the response chunks and add them to the zip as one file
        const buffer = Buffer.concat(data);
        resolve(zip.addFile(`${sheetId}.pdf`, buffer));
      });
    }));
  } else {
    console.log('could not download');
  }
}));

const zipFile = zip.toBuffer();
I then used downloadjs in my React.js client to download.
const result = await sds.bulkDownloadSds(payload);
if (result.status > 399) {
  return store.rejectWithValue({
    errorMessage: result?.message || 'Error',
    redirect: result.redirect,
  });
}
const filename = 'test1.zip';
const document = await result.blob();
download(document, filename, 'zip');
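For completeness, the client's result.blob() call implies the server responds with the zip buffer and suitable headers. A hedged sketch with a hypothetical Express route (the route path and buildZip helper are my own names, not from the original answer):

// Hypothetical route: send the buffer produced by zip.toBuffer()
app.post("/bulk-download-sds", async (req, res) => {
  const zipFile = await buildZip(req.body.sheetIds); // the zip-building code above
  res.set({
    "Content-Type": "application/zip",
    "Content-Disposition": 'attachment; filename="sheets.zip"',
  });
  res.send(zipFile);
});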