How to make a clipboard stream from guacamole-common.js? - javascript

I am using Guacamole.Client's createClipboardStream to create a clipboard stream, and I write to the stream on the paste event, but the remote clipboard is still empty.
const stream = client.createClipboardStream("text/plain");
stream.sendBlob(data);

I had the same issue, but then found that, per the guacamole-common.js documentation, the OutputStream.sendBlob method expects a Base64-encoded string representing the blob.
Here's a snippet from my (admittedly early) code that does this:
const blobToBase64 = (blob) => {
  return new Promise((res, _) => {
    const reader = new FileReader();
    // reader.result is a data URL: "data:<mime>;base64,<data>"
    reader.onloadend = () => res(reader.result);
    reader.readAsDataURL(blob);
  });
};

const sendBlobBasedOnMimeType = async (item, mimeType) => {
  const blob = await item.getType(mimeType);
  const blobAsDataUrl = await blobToBase64(blob);
  // Strip the "data:<mime>;base64," prefix; sendBlob expects bare Base64
  const blobAsB64 = blobAsDataUrl.split(",")[1];
  console.log("size of b64 blob", blobAsB64.length);
  const stream = guac.current.createClipboardStream(mimeType, "remote");
  // Close the stream once the server acknowledges the blob
  stream.onack = () => {
    stream.sendEnd();
  };
  stream.sendBlob(blobAsB64);
};
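For context, here is roughly how I wire this up to a paste event. This is a sketch from my own setup, not part of the Guacamole API: navigator.clipboard.read() needs clipboard-read permission, and sendBlobBasedOnMimeType is the helper above.

document.addEventListener("paste", async () => {
  // Read whatever is on the local clipboard (requires permission)
  const items = await navigator.clipboard.read();
  for (const item of items) {
    // Forward every available representation to the remote clipboard
    for (const mimeType of item.types) {
      await sendBlobBasedOnMimeType(item, mimeType);
    }
  }
});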

Related

How to add custom metadata to a PDF file using React JS?

I'm taking a PDF file as input and using JavaScript to add custom metadata to it, but I'm not getting a satisfactory result.
Below is a sample method I used to add custom metadata: the file is first converted to a Blob and the metadata is then attached to it, but when we convert the Blob to base64, download the file, and check its properties, the metadata is not there.
const blobToBase64 = (blob: any) =>
  new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.readAsDataURL(blob);
    reader.onload = () => resolve(reader.result);
    reader.onerror = (error) => reject(error);
  });

const updatePDFMetaData = (file: any, metadata: any) => {
  let convertBlobToBase64: any;
  const selectedFile = file;
  const reader = new FileReader();
  reader.readAsArrayBuffer(selectedFile);
  reader.onload = async (event: any) => {
    const fileBuffer: any = event?.target?.result;
    const blob: any = new Blob([fileBuffer], { type: selectedFile.type });
    // Note: this only sets JavaScript properties on the Blob object;
    // it does not write anything into the PDF bytes themselves
    Object.keys(metadata).forEach((key: any) => {
      blob[key] = metadata[key];
    });
    convertBlobToBase64 = await blobToBase64(blob);
    console.log("convertBlobToBase64", convertBlobToBase64);
  };
};
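Setting properties on a Blob only changes the in-memory JavaScript object; the underlying PDF bytes are untouched, which is why the metadata never shows up in the downloaded file. To actually embed metadata, the document itself has to be rewritten with a PDF library. A minimal sketch using pdf-lib (the library choice and the mapping to the standard Info fields are my assumptions, not something from the question):

import { PDFDocument } from "pdf-lib";

// Embed standard metadata fields into the PDF document itself
const updatePDFMetaData = async (file, metadata) => {
  const pdfDoc = await PDFDocument.load(await file.arrayBuffer());
  if (metadata.title) pdfDoc.setTitle(metadata.title);
  if (metadata.author) pdfDoc.setAuthor(metadata.author);
  if (metadata.subject) pdfDoc.setSubject(metadata.subject);
  if (metadata.keywords) pdfDoc.setKeywords(metadata.keywords); // array of strings
  const updatedBytes = await pdfDoc.save(); // Uint8Array with the rewritten PDF
  return new Blob([updatedBytes], { type: "application/pdf" });
};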

Converting image to base64 - image becomes invisible

I'm trying to encode an image to base64 (so I can later send it to a backend server this way). Everything seems to work until I use JSON.stringify() on the object that holds the encoded image.
I think it gets lost in JSON.stringify(), and I can't seem to find a solution. I've been working on this issue for weeks and couldn't find an answer anywhere. Please help!
const [baseImage, setBaseImage] = useState('');
const [baseImageCorrect, setBaseImageCorrect] = useState('');

const convertBase64 = (file) => {
  return new Promise((resolve, reject) => {
    const fileReader = new FileReader();
    fileReader.readAsDataURL(file);
    fileReader.onload = () => {
      resolve(fileReader.result);
    };
    fileReader.onerror = (error) => {
      reject(error);
      console.log(error);
    };
  });
};

const uploadImage = async (e) => {
  const file = e.target.files[0];
  const base64 = await convertBase64(file);
  const base64RemovedType = base64.split(',')[1];
  setBaseImage(`${base64RemovedType}`);
};

useEffect(() => {
  setBaseImageCorrect(baseImage);
  console.log('current:' + baseImageCorrect);
  // prints out a long string with the RIGHT information
}, [baseImage, baseImageCorrect]);

const EncodedImage = JSON.stringify({
  fileBase64: (baseImageCorrect, { encoding: 'base64' }),
});
console.log(EncodedImage);
// PRINTS THIS: {"fileBase64":{"encoding":"base64"}}, without the encoded image string
I am assuming you need the baseImageCorrect key and the encoding key at the same level. In your version, (baseImageCorrect, { encoding: 'base64' }) uses the JavaScript comma operator, which evaluates both operands but returns only the last one, so the base64 string is silently discarded.
Use this instead:

const EncodedImage = JSON.stringify({
  fileBase64: { baseImageCorrect, encoding: 'base64' },
});
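If the backend instead expects the string directly under fileBase64, a flat shape would work too (a guess at the intended payload, not something stated in the question):

const EncodedImage = JSON.stringify({
  fileBase64: baseImageCorrect,
  encoding: 'base64',
});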

JavaScript Azure Blob Storage move blob

I have a NodeJS backend which uses the official blob storage library (@azure/storage-blob) from Microsoft to manage my Blob Storage:
https://www.npmjs.com/package/@azure/storage-blob
I need to move a blob from one folder to another.
Unfortunately, I can't find any documentation for that.
What I did until now is:
const { BlobServiceClient } = require("@azure/storage-blob");

const blobServiceClient = BlobServiceClient.fromConnectionString(process.env.storageconnection);
const containerClient = blobServiceClient.getContainerClient('import');
const blobClient = containerClient.getBlobClient('toImport/' + req.body.file);
const downloadBlockBlobResponse = await blobClient.download();
// ... do some stuff with the contents of the file
As you can see in the code, I read a file from the folder "toImport". After that, I want to move the file to another folder, "finished". Is that possible? Or do I need to create a copy of the file and delete the old one?
A move operation as such is not supported in Azure Blob Storage. What you have to do is copy the blob from source to destination, monitor the copy progress (because the copy operation is asynchronous), and delete the source blob once the copy is complete.
For copying, the method you would want to use is beginCopyFromURL(string, BlobBeginCopyFromURLOptions).
Please see this code:
const { BlobServiceClient } = require("@azure/storage-blob");

const connectionString = "DefaultEndpointsProtocol=https;AccountName=account-name;AccountKey=account-key;EndpointSuffix=core.windows.net";
const container = "container-name";
const sourceFolder = "source";
const targetFolder = "target";
const blobName = "blob.png";

async function moveBlob() {
  const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
  const containerClient = blobServiceClient.getContainerClient(container);
  const sourceBlobClient = containerClient.getBlobClient(`${sourceFolder}/${blobName}`);
  const targetBlobClient = containerClient.getBlobClient(`${targetFolder}/${blobName}`);
  console.log('Copying source blob to target blob...');
  const copyResult = await targetBlobClient.beginCopyFromURL(sourceBlobClient.url);
  console.log('Blob copy operation started successfully...');
  console.log(copyResult);
  do {
    console.log('Checking copy status...');
    const blobCopiedSuccessfully = await checkIfBlobCopiedSuccessfully(targetBlobClient);
    if (blobCopiedSuccessfully) {
      break;
    }
    // Wait a second between checks so we don't hammer the service
    await new Promise((resolve) => setTimeout(resolve, 1000));
  } while (true);
  console.log('Now deleting source blob...');
  await sourceBlobClient.delete();
  console.log('Source blob deleted successfully....');
  console.log('Move operation complete.');
}

async function checkIfBlobCopiedSuccessfully(targetBlobClient) {
  const blobPropertiesResult = await targetBlobClient.getProperties();
  const copyStatus = blobPropertiesResult.copyStatus;
  return copyStatus === 'success';
}

moveBlob();
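Note that the poller returned by beginCopyFromURL can also wait for completion itself, which removes the manual status loop. A condensed sketch of that variant, using the same clients as above:

// Let the SDK's poller wait until the copy finishes, then delete the source
const poller = await targetBlobClient.beginCopyFromURL(sourceBlobClient.url);
await poller.pollUntilDone();
await sourceBlobClient.delete();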
The previous solution seems to work, but I don't like using an infinite loop.
So here is an alternative way to move a blob file:
const move = async (
  fileName: string,
  src: string,
  dest: string
) => {
  try {
    // connectionString comes from your configuration
    const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);
    logger.info(`Move storage file [ ${src} -> ${dest} | ${fileName} ]`);
    const srcContainerClient = blobServiceClient.getContainerClient(src);
    const destContainerClient = blobServiceClient.getContainerClient(dest);
    const blobClient = srcContainerClient.getBlobClient(fileName);
    // Download the source blob into memory
    const downloadBlockBlobResponse = await blobClient.download();
    const buffer = await streamToBuffer(
      downloadBlockBlobResponse.readableStreamBody!
    );
    // Upload to the destination first, then delete the source
    const blockBlobClient = destContainerClient.getBlockBlobClient(fileName);
    await blockBlobClient.upload(buffer, buffer.length);
    await blobClient.delete();
    return `${this.storageUrl}/${destContainerClient.containerName}/${fileName}`;
  } catch (e) {
    throw new Error(
      `Fail to move storage file [ ${src} -> ${dest} | ${fileName} ]`
    );
  }
};
const streamToBuffer = async (readableStream: NodeJS.ReadableStream): Promise<Buffer> => {
  return new Promise((resolve, reject) => {
    const chunks: Buffer[] = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => {
      resolve(Buffer.concat(chunks));
    });
    readableStream.on("error", reject);
  });
};
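Usage would look something like this (the file and container names are placeholders):

// Move report.pdf from the "toImport" container to the "finished" container
await move("report.pdf", "toImport", "finished");

Note that this approach downloads the whole blob into memory before re-uploading it, so it is best suited to small files.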

How can I read the data in an Excel file with ReactJS or JavaScript using the path to the file?

I want to read the contents of the file directly by using the file path. I can do this by having the file selected, but I don't know how to do it using the direct file path. I could not find any examples or sources for this. Below is how I read the file by selecting it from the input.
import * as XLSX from 'xlsx';

var items = [];

readExcel = (file) => {
  const promise = new Promise((resolve, reject) => {
    const fileReader = new FileReader();
    fileReader.readAsArrayBuffer(file);
    fileReader.onload = (e) => {
      const bufferArray = e.target.result;
      const wb = XLSX.read(bufferArray, { type: "buffer" });
      const wsname = wb.SheetNames[0];
      const ws = wb.Sheets[wsname];
      const data = XLSX.utils.sheet_to_json(ws);
      resolve(data);
    };
    fileReader.onerror = (error) => {
      reject(error);
    };
  });
  promise.then((d) => {
    this.items = d;
    console.log(this.items);
    // fill dictionary
    this.dictionary = Object.assign({}, ...this.items.map((x) => ({ [x.PartNumber]: x.Cost })));
    console.log(this.dictionary);
  });
};

<input
  type="file"
  onChange={(e) => {
    const file = e.target.files[0];
    this.readExcel(file);
  }}
/>
I believe this should work:

const req = new XMLHttpRequest();
req.responseType = "arraybuffer";
req.open("GET", "https://.../MyExcelFile.xlsx", true);
req.onload = () => {
  const bufferArray = req.response;
  const wb = XLSX.read(bufferArray, { type: "buffer" });
  ...
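For what it's worth, the same idea with the fetch API; the URL is a placeholder, and this only works for files your page is allowed to fetch (same origin or CORS-enabled):

// Fetch the workbook by URL and parse it with SheetJS
const readExcelFromUrl = async (url) => {
  const response = await fetch(url);
  const bufferArray = await response.arrayBuffer();
  const wb = XLSX.read(new Uint8Array(bufferArray), { type: "array" });
  const ws = wb.Sheets[wb.SheetNames[0]];
  return XLSX.utils.sheet_to_json(ws);
};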
I couldn't find a direct read operation. I converted the Excel file to JSON format and got my job done.

Sharp JS: Error with input as Buffer made from base64

I was trying to create an API endpoint for rotating images uploaded from the client side. I'm sending images as base64, converted from a Blob (from a simple <input> tag), as follows:
const addImageBase64 = async (fileData) => {
  const file = fileData;
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    // Resolves with a data URL: "data:image/png;base64,<data>"
    reader.onload = (event) => {
      resolve(event.target.result);
    };
    reader.onerror = (err) => {
      reject(err);
    };
    reader.readAsDataURL(file);
  });
};
Then, on the server side, this is what the endpoint looks like:
app.post("/api/rotate-image", async (req, res) => {
  try {
    let buffer = Buffer.from(req.body.imageData, "base64"); // not working
    let array = new Uint8Array(buffer); // not working
    const image = await sharp(buffer)
      .rotate(180)
      .png({ quality: 100 })
      .toBuffer();
    console.log("success");
    res.status(200).send({
      success: true,
      result: image,
    });
  } catch (e) {
    console.warn(e);
  }
});
And here, every attempt of mine ends with '[Error: Input buffer contains unsupported image format]', whether I pass the Buffer or the Uint8Array. Can anyone help me with this issue? What is the right input type for Sharp that actually works?
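A likely cause, assuming the base64 string comes from the addImageBase64 helper above: readAsDataURL returns a full data URL, so req.body.imageData still carries the "data:image/png;base64," prefix, and Buffer.from(..., "base64") then decodes garbage that Sharp cannot parse. A minimal sketch of stripping the prefix first:

// Remove a possible data-URL prefix before decoding the base64 payload
const base64Data = req.body.imageData.replace(/^data:image\/\w+;base64,/, "");
const buffer = Buffer.from(base64Data, "base64");
const image = await sharp(buffer)
  .rotate(180)
  .png({ quality: 100 })
  .toBuffer();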
