I'd like to convert an AudioBuffer to a Blob so that I can create an ObjectURL from it and then download the audio file.
let rec = new Recorder(async (chunks) => {
  const blob = new Blob(chunks, {
    type: 'audio/mp3'
  });
  const arrayBuffer = await blob.arrayBuffer();
  const audioContext = new AudioContext();
  await audioContext.decodeAudioData(arrayBuffer, (audioBuffer) => {
    // How do I now convert the AudioBuffer into an ArrayBuffer => Blob?
  });
});
An AudioBuffer contains non-interleaved Float32Array PCM samples for each decoded audio channel. For a stereo AudioBuffer it will contain 2 channels. Those channels need to be interleaved first, and then the interleaved PCM needs a WAV header prepended to it so the result can be downloaded and played as a WAV.
// Float32Array samples
const [left, right] = [audioBuffer.getChannelData(0), audioBuffer.getChannelData(1)]

// interleaved
const interleaved = new Float32Array(left.length + right.length)
for (let src = 0, dst = 0; src < left.length; src++, dst += 2) {
  interleaved[dst] = left[src]
  interleaved[dst + 1] = right[src]
}

// get WAV file bytes and audio params of your audio source
const wavBytes = getWavBytes(interleaved.buffer, {
  isFloat: true, // floating point or 16-bit integer
  numChannels: 2,
  sampleRate: audioBuffer.sampleRate, // use the source buffer's rate
})
const wav = new Blob([wavBytes], { type: 'audio/wav' })

// create download link and append to DOM
const downloadLink = document.createElement('a')
downloadLink.href = URL.createObjectURL(wav)
downloadLink.setAttribute('download', 'my-audio.wav') // name file
document.body.appendChild(downloadLink)
The supporting functions are below:
// Returns Uint8Array of WAV bytes
function getWavBytes(buffer, options) {
  const type = options.isFloat ? Float32Array : Int16Array // 16-bit PCM is signed
  const numFrames = buffer.byteLength / type.BYTES_PER_ELEMENT

  const headerBytes = getWavHeader(Object.assign({}, options, { numFrames }))
  const wavBytes = new Uint8Array(headerBytes.length + buffer.byteLength)

  // prepend header, then add pcmBytes
  wavBytes.set(headerBytes, 0)
  wavBytes.set(new Uint8Array(buffer), headerBytes.length)

  return wavBytes
}
// adapted from https://gist.github.com/also/900023
// returns Uint8Array of WAV header bytes
function getWavHeader(options) {
  const numFrames = options.numFrames
  const numChannels = options.numChannels || 2
  const sampleRate = options.sampleRate || 44100
  const bytesPerSample = options.isFloat ? 4 : 2
  const format = options.isFloat ? 3 : 1

  const blockAlign = numChannels * bytesPerSample
  const byteRate = sampleRate * blockAlign
  const dataSize = numFrames * blockAlign

  const buffer = new ArrayBuffer(44)
  const dv = new DataView(buffer)

  let p = 0

  function writeString(s) {
    for (let i = 0; i < s.length; i++) {
      dv.setUint8(p + i, s.charCodeAt(i))
    }
    p += s.length
  }

  function writeUint32(d) {
    dv.setUint32(p, d, true)
    p += 4
  }

  function writeUint16(d) {
    dv.setUint16(p, d, true)
    p += 2
  }

  writeString('RIFF')             // ChunkID
  writeUint32(dataSize + 36)      // ChunkSize
  writeString('WAVE')             // Format
  writeString('fmt ')             // Subchunk1ID
  writeUint32(16)                 // Subchunk1Size
  writeUint16(format)             // AudioFormat (3 = IEEE float, 1 = PCM) https://i.stack.imgur.com/BuSmb.png
  writeUint16(numChannels)        // NumChannels
  writeUint32(sampleRate)         // SampleRate
  writeUint32(byteRate)           // ByteRate
  writeUint16(blockAlign)         // BlockAlign
  writeUint16(bytesPerSample * 8) // BitsPerSample
  writeString('data')             // Subchunk2ID
  writeUint32(dataSize)           // Subchunk2Size

  return new Uint8Array(buffer)
}
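Note that getWavBytes also supports 16-bit output (isFloat: false), but the Float32 samples must be converted to signed 16-bit integers first. A minimal sketch of that conversion (the floatTo16BitPCM helper name is mine, not part of the answer above):

// convert Float32 samples in [-1, 1] to 16-bit signed PCM
function floatTo16BitPCM(float32Samples) {
  const int16 = new Int16Array(float32Samples.length)
  for (let i = 0; i < float32Samples.length; i++) {
    const s = Math.max(-1, Math.min(1, float32Samples[i])) // clamp
    int16[i] = s < 0 ? s * 0x8000 : s * 0x7fff              // scale
  }
  return int16
}

// usage: getWavBytes(floatTo16BitPCM(interleaved).buffer, { isFloat: false, numChannels: 2, sampleRate: audioBuffer.sampleRate })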
A minor edit to AnthumChris's answer above: the following function generalizes the interleaving to handle any number of channels instead of assuming stereo input.
convertAudioBufferToBlob(audioBuffer) {
  var channelData = [],
    totalLength = 0,
    channelLength = 0;
  for (var i = 0; i < audioBuffer.numberOfChannels; i++) {
    channelData.push(audioBuffer.getChannelData(i));
    totalLength += channelData[i].length;
    if (i == 0) channelLength = channelData[i].length;
  }

  // interleave all channels
  const interleaved = new Float32Array(totalLength);
  for (
    let src = 0, dst = 0;
    src < channelLength;
    src++, dst += audioBuffer.numberOfChannels
  ) {
    for (var j = 0; j < audioBuffer.numberOfChannels; j++) {
      interleaved[dst + j] = channelData[j][src];
    }
  }

  // get WAV file bytes and audio params of your audio source
  const wavBytes = this.getWavBytes(interleaved.buffer, {
    isFloat: true, // floating point or 16-bit integer
    numChannels: audioBuffer.numberOfChannels,
    sampleRate: audioBuffer.sampleRate, // use the source buffer's rate
  });
  const wav = new Blob([wavBytes], { type: "audio/wav" });
  return wav;
},
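A quick usage sketch, assuming the method lives on the same object as getWavBytes (the file name here is just an example):

// hypothetical usage: create and download the WAV
const wav = this.convertAudioBufferToBlob(audioBuffer);
const link = document.createElement('a');
link.href = URL.createObjectURL(wav);
link.download = 'my-audio.wav'; // example file name
link.click();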
Related
Recently I've been working on a voice platform that lets users talk to each other under certain conditions. However, playback only seems to be smooth when your ping is really low.
Here's the structure currently:
Raw PCM Audio From User -> WebSocket -> Sent to all clients that meet a certain condition
There's nothing particularly special about my WebSocket. It's made in Java and just relays the data received from clients to the other clients (a temporary setup).
Issue:
For users with a decent amount of ping (>100ms), the audio cuts out (choppy) at certain parts and doesn't seem to arrive fast enough, even with the code I have in place. If a video of this would help, let me know!
This is the code I have for recording and playback currently (using AudioWorkletProcessors)
Recording:
buffers = [];
buffersLength = 0;

process(inputs, _, parameters) {
  const [input] = inputs;
  if (parameters.muted[0] == 1) {
    return true;
  }

  // Create one buffer from this input
  const dataLen = input[0].byteLength;
  const channels = input.length;

  const bufferForSocket = new Uint8Array(dataLen * channels + 9);
  bufferForSocket.set(new Uint8Array([channels]), 0);
  bufferForSocket.set(numberToArrayBuffer(sampleRate), 1);
  bufferForSocket.set(numberToArrayBuffer(input[0].byteLength), 5);
  for (let i = 0; i < channels; i++) {
    bufferForSocket.set(new Uint8Array(input[i].buffer), 9 + dataLen * i);
  }

  // Add buffers to a list
  this.buffers.push(bufferForSocket);
  this.buffersLength += bufferForSocket.byteLength;

  // If we have 25 buffers, send them off to the websocket
  if (this.buffers.length >= 25) {
    const combinedBuffer = new Uint8Array(
      this.buffersLength + 4 + 4 * this.buffers.length
    );
    combinedBuffer.set(numberToArrayBuffer(this.buffers.length), 0);
    let offset = 4;
    for (let i = 0; i < this.buffers.length; i++) {
      const buffer = this.buffers[i];
      combinedBuffer.set(numberToArrayBuffer(buffer.byteLength), offset);
      combinedBuffer.set(buffer, offset + 4);
      offset += buffer.byteLength + 4;
    }

    this.buffers.length = 0;
    this.buffersLength = 0;

    // This is what sends to the WebSocket
    this.port.postMessage(combinedBuffer.buffer);
  }

  return true;
}
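(Note: the snippets reference a numberToArrayBuffer helper that isn't shown. Judging by the little-endian getUint32 reads in onBuffer below, it is presumably something like this sketch:)

// assumed helper: encode a number as 4 little-endian bytes,
// matching the getUint32(..., true) reads on the receiving side
function numberToArrayBuffer(value) {
  const bytes = new Uint8Array(4);
  new DataView(bytes.buffer).setUint32(0, value, true);
  return bytes;
}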
Playback:
class extends AudioWorkletProcessor {
  buffers = new LinkedList();
  timeTillNextBuffer = 0;

  process(_, outputs, __) {
    const [output] = outputs;
    const linkedBuffers = this.buffers.last();

    // If we aren't currently playing a buffer and we cannot play a buffer right now, return
    if (
      linkedBuffers == null ||
      (linkedBuffers.buffers.length === 0 && this.timeTillNextBuffer > Date.now())
    )
      return true;

    const buffer = linkedBuffers.buffers.removeLast();

    // Current buffer is finished, remove it
    if (linkedBuffers.index === linkedBuffers.buffers.length) {
      this.buffers.removeLast();
    }

    if (buffer === null) {
      return true;
    }

    const inputData = buffer.channels;

    // put audio to output
    for (let channel = 0; channel < outputs.length; channel++) {
      const channelData = inputData[channel];
      const outputData = output[channel];
      for (let i = 0; i < channelData.length; i++) {
        outputData[i] = channelData[i];
      }
    }

    return true;
  }

  static get parameterDescriptors() {
    return [];
  }

  onBuffer(buffer) {
    // buffer is ArrayBuffer
    const buffersGiven = new DataView(buffer, 0, 4).getUint32(0, true);

    let offset = 4;
    const buffers = new LinkedList();
    const linkedBuffers = { index: 0, buffers };

    // Read buffers from WebSocket (created in the snippet above)
    for (let i = 0; i < buffersGiven; i++) {
      const bufferLength = new DataView(buffer, offset, 4).getUint32(0, true);
      const numberOfChannels = new DataView(buffer, offset + 4, 1).getUint8(0);
      const sampleRate = new DataView(buffer, offset + 5, 4).getUint32(0, true);
      const channelLength = new DataView(buffer, offset + 9, 4).getUint32(0, true);

      const channels = [];
      for (let j = 0; j < numberOfChannels; j++) {
        const start = offset + 13 + j * channelLength;
        channels[j] = new Float32Array(buffer.slice(start), 0, channelLength / 4);
      }

      buffers.push({ channelLength, numberOfChannels, sampleRate, channels });
      offset += bufferLength + 4;
    }
    this.buffers.push(linkedBuffers);

    // Jitter buffer
    this.timeTillNextBuffer = 50 - this.buffers.length * 25 + Date.now();
  }

  constructor() {
    super();
    this.port.onmessage = (e) => this.onBuffer(e.data); // Data directly from WebSocket
  }
}
I've heard that using Atomics can help, because of how the AudioContext plays blank audio when process returns without filling the output. Any tips would be greatly appreciated! Also, if anything is unclear, please let me know!
My code has something of a jitter-buffer system, and it doesn't seem to work at all. The audio that a user receives from me (low ping) is clear. However, what they (high ping) send me is choppy. Furthermore, this choppy audio seems to build up, and it gets more delayed the more packets I receive.
I've done two days of extensive research on the topic, but there is no really well-explained piece that works.
So the flow is the following:
load an mp3 (store-bought, CBR 320) into wavesurfer
apply all the changes you need
download the processed result back to an mp3 file (without using a server)
I've seen online apps that can do this; nothing is transmitted to a server, it all happens in the browser.
When we inspect wavesurfer, we have access to these:
The goal would be to use the references already available from wavesurfer to produce the mp3 for download.
From my understanding, this can be done with MediaRecorder, the WebCodecs API, or a library like lamejs.
I've tried to find a working example for the first two methods, but without luck. I also tried lamejs using the example provided on their git, but I'm getting errors from the lib that are hard to debug, most likely related to providing the wrong input.
So far I've only managed to download a wav file using the following script:
handleCopyRegion = (region, instance) => {
  var segmentDuration = region.end - region.start;

  var originalBuffer = instance.backend.buffer;
  var emptySegment = instance.backend.ac.createBuffer(
    originalBuffer.numberOfChannels,
    Math.ceil(segmentDuration * originalBuffer.sampleRate),
    originalBuffer.sampleRate
  );
  for (var i = 0; i < originalBuffer.numberOfChannels; i++) {
    var chanData = originalBuffer.getChannelData(i);
    var emptySegmentData = emptySegment.getChannelData(i);
    var mid_data = chanData.subarray(
      Math.ceil(region.start * originalBuffer.sampleRate),
      Math.ceil(region.end * originalBuffer.sampleRate)
    );
    emptySegmentData.set(mid_data);
  }

  return emptySegment;
};
bufferToWave = (abuffer, offset, len) => {
  var numOfChan = abuffer.numberOfChannels,
    length = len * numOfChan * 2 + 44,
    buffer = new ArrayBuffer(length),
    view = new DataView(buffer),
    channels = [],
    i,
    sample,
    pos = 0;

  // write WAVE header
  setUint32(0x46464952); // "RIFF"
  setUint32(length - 8); // file length - 8
  setUint32(0x45564157); // "WAVE"
  setUint32(0x20746d66); // "fmt " chunk
  setUint32(16); // length = 16
  setUint16(1); // PCM (uncompressed)
  setUint16(numOfChan);
  setUint32(abuffer.sampleRate);
  setUint32(abuffer.sampleRate * 2 * numOfChan); // avg. bytes/sec
  setUint16(numOfChan * 2); // block-align
  setUint16(16); // 16-bit (hardcoded in this demo)
  setUint32(0x61746164); // "data" - chunk
  setUint32(length - pos - 4); // chunk length

  // write interleaved data
  for (i = 0; i < abuffer.numberOfChannels; i++)
    channels.push(abuffer.getChannelData(i));

  while (pos < length) {
    for (i = 0; i < numOfChan; i++) {
      // interleave channels
      sample = Math.max(-1, Math.min(1, channels[i][offset])); // clamp
      sample = (0.5 + sample < 0 ? sample * 32768 : sample * 32767) | 0; // scale to 16-bit signed int
      view.setInt16(pos, sample, true); // update data chunk
      pos += 2;
    }
    offset++; // next source sample
  }

  // create Blob
  return new Blob([buffer], { type: "audio/wav" });

  function setUint16(data) {
    view.setUint16(pos, data, true);
    pos += 2;
  }

  function setUint32(data) {
    view.setUint32(pos, data, true);
    pos += 4;
  }
};
const cutSelection = this.handleCopyRegion(
  this.wavesurfer.regions.list.cut,
  this.wavesurfer
);
const blob = this.bufferToWave(cutSelection, 0, cutSelection.length);
// you can now download the wav from the blob
Is there a way to skip making a wav and produce an mp3 right away? If not, can the mp3 be made from that wav, and if so, how?
I mainly tried to use wavesurfer.backend.buffer as input, because this reference is an AudioBuffer, and .getChannelData(0|1) gives you the left and right channels. But I didn't accomplish anything; maybe I'm thinking about this wrong.
Alright, here are the steps we need to follow:
Get the buffer data from the wavesurfer player
Analyze the buffer to get the number of channels (stereo or mono), the channel data, and the sample rate
Use the lamejs library to convert the buffer to an MP3 blob
Then we can get the download link from the blob
Here is a quick DEMO
and also the JS code:
function downloadMp3() {
  var MP3Blob = analyzeAudioBuffer(wavesurfer.backend.buffer);
  console.log('here is your mp3 url:');
  console.log(URL.createObjectURL(MP3Blob));
}
function analyzeAudioBuffer(aBuffer) {
  let numOfChan = aBuffer.numberOfChannels,
    btwLength = aBuffer.length * numOfChan * 2 + 44,
    btwArrBuff = new ArrayBuffer(btwLength),
    btwView = new DataView(btwArrBuff),
    btwChnls = [],
    btwIndex,
    btwSample,
    btwOffset = 0,
    btwPos = 0;

  setUint32(0x46464952); // "RIFF"
  setUint32(btwLength - 8); // file length - 8
  setUint32(0x45564157); // "WAVE"
  setUint32(0x20746d66); // "fmt " chunk
  setUint32(16); // length = 16
  setUint16(1); // PCM (uncompressed)
  setUint16(numOfChan);
  setUint32(aBuffer.sampleRate);
  setUint32(aBuffer.sampleRate * 2 * numOfChan); // avg. bytes/sec
  setUint16(numOfChan * 2); // block-align
  setUint16(16); // 16-bit
  setUint32(0x61746164); // "data" - chunk
  setUint32(btwLength - btwPos - 4); // chunk length

  for (btwIndex = 0; btwIndex < aBuffer.numberOfChannels; btwIndex++)
    btwChnls.push(aBuffer.getChannelData(btwIndex));

  while (btwPos < btwLength) {
    for (btwIndex = 0; btwIndex < numOfChan; btwIndex++) {
      // interleave btwChnls
      btwSample = Math.max(-1, Math.min(1, btwChnls[btwIndex][btwOffset])); // clamp
      btwSample = (0.5 + btwSample < 0 ? btwSample * 32768 : btwSample * 32767) | 0; // scale to 16-bit signed int
      btwView.setInt16(btwPos, btwSample, true); // write 16-bit sample
      btwPos += 2;
    }
    btwOffset++; // next source sample
  }

  let wavHdr = lamejs.WavHeader.readHeader(new DataView(btwArrBuff));

  // de-interleave the samples into left/right for stereo
  let data = new Int16Array(btwArrBuff, wavHdr.dataOffset, wavHdr.dataLen / 2);
  let leftData = [];
  let rightData = [];
  for (let i = 0; i < data.length; i += 2) {
    leftData.push(data[i]);
    rightData.push(data[i + 1]);
  }
  var left = new Int16Array(leftData);
  var right = new Int16Array(rightData);

  // STEREO
  if (wavHdr.channels === 2)
    return bufferToMp3(wavHdr.channels, wavHdr.sampleRate, left, right);
  // MONO
  else if (wavHdr.channels === 1)
    return bufferToMp3(wavHdr.channels, wavHdr.sampleRate, data);

  function setUint16(data) {
    btwView.setUint16(btwPos, data, true);
    btwPos += 2;
  }

  function setUint32(data) {
    btwView.setUint32(btwPos, data, true);
    btwPos += 4;
  }
}
function bufferToMp3(channels, sampleRate, left, right = null) {
  var buffer = [];
  var mp3enc = new lamejs.Mp3Encoder(channels, sampleRate, 128);
  var remaining = left.length;
  var samplesPerFrame = 1152;

  for (var i = 0; remaining >= samplesPerFrame; i += samplesPerFrame) {
    var mp3buf;
    if (!right) {
      var mono = left.subarray(i, i + samplesPerFrame);
      mp3buf = mp3enc.encodeBuffer(mono);
    } else {
      var leftChunk = left.subarray(i, i + samplesPerFrame);
      var rightChunk = right.subarray(i, i + samplesPerFrame);
      mp3buf = mp3enc.encodeBuffer(leftChunk, rightChunk);
    }
    if (mp3buf.length > 0) {
      buffer.push(mp3buf); // already an Int8Array
    }
    remaining -= samplesPerFrame;
  }

  var d = mp3enc.flush();
  if (d.length > 0) {
    buffer.push(new Int8Array(d));
  }

  var mp3Blob = new Blob(buffer, { type: 'audio/mpeg' });
  //var bUrl = window.URL.createObjectURL(mp3Blob);
  // send the download link to the console
  //console.log('mp3 download:', bUrl);
  return mp3Blob;
}
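If you want to actually trigger the download rather than just log the object URL, a small sketch along these lines should work (the file name is just an example):

// hypothetical usage: download the MP3 blob produced above
const mp3Blob = analyzeAudioBuffer(wavesurfer.backend.buffer);
const link = document.createElement('a');
link.href = URL.createObjectURL(mp3Blob);
link.download = 'result.mp3'; // example file name
link.click();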
Let me know if you have any questions about the code.
I'm trying to convert one of my JavaScript functions into a Swift 5 function, but even after extensive searching I couldn't find anything on it.
function toArrayBuffer(type, ts, x, y, z, sc) {
  let userId = userID.getUserId();
  //console.log("Found GPS UserID: " + userId);
  if (userId == "Unauthor") {
    //console.log("GPS did not find the userID")
    return null;
  }

  let buffer = new ArrayBuffer(8 * 4); // 32 bytes

  // Offset: 0
  let float64Bytes = new Float64Array(buffer);
  float64Bytes[0] = ts;

  // Offset: 8
  let bytes = new Float32Array(buffer);
  bytes[2] = x;
  bytes[3] = y;
  bytes[4] = z;
  bytes[5] = sc;

  // Offset: 24
  let uint8Bytes = new Uint8Array(buffer);
  uint8Bytes[24] = type;
  for (let i = 0; i < 8; i++) {
    if (userId.charCodeAt(i)) {
      // note: index 25 + 7 = 32 is past the end of this 32-byte buffer,
      // so the last character is silently dropped
      uint8Bytes[i + 25] = userId.charCodeAt(i);
    }
  }
  return buffer;
}
Basically I tried to build the same function with var byteArray = [UInt8](stringVal.utf8), but a UInt8 can only store values up to 255, and I need to store an epoch timestamp, so that doesn't work either. Any help would be appreciated.
🖐
I'm using WaveSurfer.js for my project, where we can edit audio.
For that I use the Regions plugin.
When the user clicks the Finish button, I want to export the result as an audio file (mp3/wav).
To get the peaks of the region the user selected, I do:
var json = wavesurfer.backend.getPeaks(960, wavesurfer.regions.list["wavesurfer_j99v7ophop8"].start, wavesurfer.regions.list["wavesurfer_j99v7ophop8"].end)
This works, but I want to export it as an audio file, not as JSON.
Thanks in advance
See this answer on creating a WAV audio file from a buffer.
So the code and method might look like this:
Method:
// Convert an audio-buffer segment to a Blob using WAVE representation
// The returned Object URL can be set directly as a source for an Audio element.
// (C) Ken Fyrstenberg / MIT license
function bufferToWave(abuffer, offset, len) {
  var numOfChan = abuffer.numberOfChannels,
    length = len * numOfChan * 2 + 44,
    buffer = new ArrayBuffer(length),
    view = new DataView(buffer),
    channels = [], i, sample,
    pos = 0;

  // write WAVE header
  setUint32(0x46464952); // "RIFF"
  setUint32(length - 8); // file length - 8
  setUint32(0x45564157); // "WAVE"
  setUint32(0x20746d66); // "fmt " chunk
  setUint32(16); // length = 16
  setUint16(1); // PCM (uncompressed)
  setUint16(numOfChan);
  setUint32(abuffer.sampleRate);
  setUint32(abuffer.sampleRate * 2 * numOfChan); // avg. bytes/sec
  setUint16(numOfChan * 2); // block-align
  setUint16(16); // 16-bit (hardcoded in this demo)
  setUint32(0x61746164); // "data" - chunk
  setUint32(length - pos - 4); // chunk length

  // write interleaved data
  for (i = 0; i < abuffer.numberOfChannels; i++)
    channels.push(abuffer.getChannelData(i));

  while (pos < length) {
    for (i = 0; i < numOfChan; i++) { // interleave channels
      sample = Math.max(-1, Math.min(1, channels[i][offset])); // clamp
      sample = (0.5 + sample < 0 ? sample * 32768 : sample * 32767) | 0; // scale to 16-bit signed int
      view.setInt16(pos, sample, true); // update data chunk
      pos += 2;
    }
    offset++; // next source sample
  }

  // create Blob and return an Object URL for it
  return (URL || webkitURL).createObjectURL(new Blob([buffer], { type: "audio/wav" }));

  function setUint16(data) {
    view.setUint16(pos, data, true);
    pos += 2;
  }

  function setUint32(data) {
    view.setUint32(pos, data, true);
    pos += 4;
  }
}
Usage (note that this version returns an object URL, not a buffer):
let waveUrl = bufferToWave(app.engine.wavesurfer.backend.buffer, 0, app.engine.wavesurfer.backend.buffer.length);
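Since the returned value is an object URL, you can use it directly; a quick sketch (the element selector and file name are just examples):

// hypothetical usage: play the generated WAV, or download it
document.querySelector('audio').src = waveUrl;

const a = document.createElement('a');
a.href = waveUrl;
a.download = 'region.wav'; // example file name
a.click();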
I'm trying to write an application that does end-to-end encryption for files with JS in the browser. However, I don't seem to be able to get all files to decrypt correctly.
TL;DR: As it's impractical to encrypt files bigger than 1MB as a whole, I'm trying to encrypt them chunk by chunk. After doing so, I write the encrypted words (from CryptoJS's WordArray) into a blob. For decryption, I read the file, split it into chunks according to the map generated while encrypting, and try to decrypt them. The problem is the decrypted result is empty (sigBytes: 0)!
I guess I'm not reading the chunks correctly while decrypting. Please take a look at the code below for the function getBlob (writing data to the blob) and the last part of decryptFile (reading chunks).
More explanation
I'm using CryptoJS AES with default settings.
Right now my code looks like this:
function encryptFile (file, options, resolve, reject) {
  if (!options.encrypt) {
    return resolve(file)
  }
  if (!options.processor || !options.context) {
    return reject('No encryption method.')
  }

  function encryptBlob (file, optStart, optEnd) {
    const start = optStart || 0
    let stop = optEnd || CHUNK_SIZE
    if (stop > file.size - 1) {
      stop = file.size
    }

    const blob = file.slice(start, stop)
    const fileReader = new FileReader()

    fileReader.onloadend = function () {
      if (this.readyState !== FileReader.DONE) return

      const index = Math.ceil(optStart / CHUNK_SIZE)
      const result = CryptoJS.lib.WordArray.create(this.result)
      encryptedFile[index] = encrypt(result)
      chunksResolved++

      if (chunksResolved === count) {
        const {sigBytes, sigBytesMap, words} = getCipherInfo(encryptedFile)
        const blob = getBlob(sigBytes, words)
        resolve(blob, Object.keys(sigBytesMap))
      }
    }

    fileReader.readAsArrayBuffer(blob)
  }

  let chunksResolved = 0
  const encryptedFile = []
  const CHUNK_SIZE = 1024 * 1024
  const count = Math.ceil(file.size / CHUNK_SIZE)
  const encrypt = value => options.processor.call(
    options.context, value, 'file',
    (v, k) => CryptoJS.AES.encrypt(v, k))

  for (let start = 0; (start + CHUNK_SIZE) / CHUNK_SIZE <= count; start += CHUNK_SIZE) {
    encryptBlob(file, start, start + CHUNK_SIZE - 1)
  }
}
As you can see, I'm reading the file chunk by chunk (each chunk is 1MB, or fileSize % 1MB for the last one) as an ArrayBuffer, converting it to a WordArray for CryptoJS to understand, and encrypting it.
After encrypting all the chunks, I write each of their words to a blob (using code I found in CryptoJS's issues on Google Code, shown below), and I guess this is what goes wrong. I also generate a map of where the encrypted chunks end, so I can later use it to pull the chunks out of the binary file for decryption.
And here's how I decrypt the files:
function decryptFile (file, sigBytesMap, filename, options, resolve, reject) {
  if (!options.decrypt) {
    return resolve(file)
  }
  if (!options.processor || !options.context) {
    return reject('No decryption method.')
  }

  function decryptBlob (file, index, start, stop) {
    const blob = file.slice(start, stop)
    const fileReader = new FileReader()

    fileReader.onloadend = function () {
      if (this.readyState !== FileReader.DONE) return

      const result = CryptoJS.lib.WordArray.create(this.result)
      decryptedFile[index] = decrypt(result)
      chunksResolved++

      if (chunksResolved === count) {
        const {sigBytes, words} = getCipherInfo(decryptedFile)
        const finalFile = getBlob(sigBytes, words)
        resolve(finalFile, filename)
      }
    }

    fileReader.readAsArrayBuffer(blob)
  }

  let chunksResolved = 0
  const count = sigBytesMap.length
  const decryptedFile = []
  const decrypt = value => options.processor.call(
    options.context, value, 'file',
    (v, k) => CryptoJS.AES.decrypt(v, k))

  for (let i = 0; i < count; i++) {
    decryptBlob(file, i, parseInt(sigBytesMap[i - 1]) || 0, parseInt(sigBytesMap[i]) - 1)
  }
}
Decryption works exactly like encryption, but it doesn't produce anything. The chunks are no longer 1MB; they are delimited by the sigBytes values recorded in the map. But there is no result from the decryption: sigBytes is 0.
Here's the code for generating the blob and getting sigBytesMap:
function getCipherInfo (ciphers) {
  const sigBytesMap = []
  const sigBytes = ciphers.reduce((tmp, cipher) => {
    tmp += cipher.sigBytes || cipher.ciphertext.sigBytes
    sigBytesMap.push(tmp)
    return tmp
  }, 0)
  const words = ciphers.reduce((tmp, cipher) => {
    return tmp.concat(cipher.words || cipher.ciphertext.words)
  }, [])
  return {sigBytes, sigBytesMap, words}
}

function getBlob (sigBytes, words) {
  const bytes = new Uint8Array(sigBytes)
  for (let i = 0; i < sigBytes; i++) {
    bytes[i] = (words[i >>> 2] >>> (24 - (i % 4) * 8)) & 0xff
  }
  return new Blob([bytes])
}
I'm guessing the issue is the method I'm using to read the encrypted chunks. Or maybe the method writing them!
I should also mention that previously I did something different for encryption: I stringified each WordArray I got from CryptoJS.AES.encrypt using the toString method with the default encoding (which I believe is CryptoJS.enc.Hex), but some files didn't decrypt correctly. It seemed to depend not on the size of the original file but on its type. Again, I'm guessing!
It turns out the problem was that the WordArray returned by CryptoJS.AES.decrypt(value, key) has up to 4 extra words of padding that should not be included in the final result. CryptoJS unpads the result, but it only adjusts sigBytes accordingly and doesn't change words. So when decrypting, pop those extra words before writing the chunks to the file: 4 words for full chunks and 3 for the smaller one (the last chunk).
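A minimal sketch of that cleanup, assuming decrypted is the WordArray returned by CryptoJS.AES.decrypt (the helper name is mine, not from CryptoJS):

// trim the leftover padding words so words matches sigBytes again
function stripPaddingWords (decrypted) {
  const neededWords = Math.ceil(decrypted.sigBytes / 4)
  while (decrypted.words.length > neededWords) {
    decrypted.words.pop() // drops the 3-4 extra padding words
  }
  return decrypted
}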
Check this issue.
import CryptoJS from "crypto-js";

async function encryptBlobToBlob(blob: Blob, secret: string): Promise<Blob> {
  const wordArray = CryptoJS.lib.WordArray.create(await blob.arrayBuffer());
  const result = CryptoJS.AES.encrypt(wordArray, secret);
  // result.toString() serializes the ciphertext as an OpenSSL-style Base64 string
  return new Blob([result.toString()]);
}

export async function decryptBlobToBlob(blob: Blob, secret: string): Promise<Blob> {
  const decryptedRaw = CryptoJS.AES.decrypt(await blob.text(), secret);
  return new Blob([wordArrayToByteArray(decryptedRaw)]);
}

// split one 32-bit word into up to 4 big-endian bytes, honoring `length`
function wordToByteArray(word, length) {
  const ba = [];
  const xFF = 0xff;
  if (length > 0) ba.push(word >>> 24);
  if (length > 1) ba.push((word >>> 16) & xFF);
  if (length > 2) ba.push((word >>> 8) & xFF);
  if (length > 3) ba.push(word & xFF);
  return ba;
}

// convert a CryptoJS WordArray to a Uint8Array, respecting sigBytes
function wordArrayToByteArray({ words, sigBytes }: { sigBytes: number; words: number[] }) {
  const result = [];
  let bytes;
  let i = 0;
  while (sigBytes > 0) {
    bytes = wordToByteArray(words[i], Math.min(4, sigBytes));
    sigBytes -= bytes.length;
    result.push(bytes);
    i++;
  }
  return new Uint8Array(result.flat());
}

async function main() {
  const secret = "bbbb";
  const blob = new Blob(["1".repeat(1e3)]);
  const encryptedBlob = await encryptBlobToBlob(blob, secret);
  console.log("encrypted blob size", encryptedBlob.size);
  const decryptedBlob = await decryptBlobToBlob(encryptedBlob, secret);
  console.log("decryptedBlob", decryptedBlob);
  console.log(await decryptedBlob.text());
}
main();