new Function() inside of AudioWorklet - javascript

I want to create an audio editor where you can connect nodes together to build custom audio components. Every time the nodes change, they get compiled into JavaScript and run through new Function() for better performance. I just read that it is possible to create an AudioWorklet, which runs on a separate thread. Now I am wondering whether both ideas can be combined: my algorithm gets passed to the AudioWorklet as a string of JavaScript code, which is then turned into a function using new Function(codeString) inside the constructor, and the worklet's process() function somehow calls that custom function.
Is this possible in some way, or am I asking for too much? I would like to get a "yes, that's possible" or a "no, sorry" before I spend hours trying to get it to work...
Thanks for your help,
dogefromage

With the help of AKX's comment, I put together this solution. The code inside the string will later be replaced by a compiler.
function generateProcessor()
{
    return (`
class TestProcessor extends AudioWorkletProcessor
{
    process(inputs, outputs)
    {
        const input = inputs[0];
        const output = outputs[0];
        for (let channel = 0; channel < output.length; ++channel) {
            for (let i = 0; i < output[channel].length; i++) {
                output[channel][i] = 0.01 * Math.acos(input[channel][i]);
            }
        }
        return true;
    }
}
registerProcessor('test-processor', TestProcessor);
`);
}
const button = document.querySelector('#button');
button.addEventListener('click', async (e) =>
{
    const audioContext = new AudioContext();
    await audioContext.audioWorklet.addModule(
        URL.createObjectURL(new Blob([
            generateProcessor()
        ], { type: "application/javascript" })));
    const oscillator = new OscillatorNode(audioContext);
    const testProcessor = new AudioWorkletNode(audioContext, 'test-processor');
    oscillator.connect(testProcessor).connect(audioContext.destination);
    oscillator.start();
});
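For completeness, the variant originally described in the question (a fixed processor module that receives the generated code as a string and compiles it with new Function() in its constructor) also seems workable, subject to the page's CSP allowing dynamic code. A rough sketch, where the processorOptions field name userCode and the per-sample function signature are made up for illustration:

// processor.js - a fixed module, loaded once with audioContext.audioWorklet.addModule('processor.js')
class CompiledProcessor extends AudioWorkletProcessor {
    constructor(options) {
        super();
        // The generated source arrives as a string via processorOptions
        // and is compiled once here, not on every render quantum.
        const code = options.processorOptions.userCode;
        this.kernel = new Function('sample', code);
    }
    process(inputs, outputs) {
        const input = inputs[0];
        const output = outputs[0];
        for (let channel = 0; channel < output.length; ++channel) {
            for (let i = 0; i < output[channel].length; i++) {
                // guard against a missing input channel
                output[channel][i] = this.kernel(input[channel] ? input[channel][i] : 0);
            }
        }
        return true;
    }
}
registerProcessor('compiled-processor', CompiledProcessor);

and on the main thread, after addModule has loaded that fixed module:

const node = new AudioWorkletNode(audioContext, 'compiled-processor', {
    processorOptions: { userCode: 'return 0.01 * Math.acos(sample);' }
});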


NodeJS: Using Pipe To Write A File From A Readable Stream Gives Heap Memory Error

I am trying to create 150 million lines of data and write the data into a csv file so that I can insert the data into different databases with little modification.
I am using a few functions to generate seemingly random data and pushing the data into the writable stream.
The code that I have right now cannot handle the memory load.
After a few hours of research, I am starting to think that I should not be pushing each row at the end of the for loop, because it seems that the pipe method simply cannot keep up with garbage collection this way.
Also, I found a few Stack Overflow answers and NodeJS docs that recommend against using push at all.
However, I am very new to NodeJS and I feel like I am blocked and do not know how to proceed from here.
If someone can provide me any guidance on how to proceed and give me an example, I would really appreciate it.
Below is a part of my code to give you a better understanding of what I am trying to achieve.
P.S. -
I have found a way to successfully handle the memory issue without using the pipe method at all --I used the drain event-- but I had to start from scratch, and now I am curious to know whether there is a simple way to handle this memory issue without completely changing this bit of code.
Also, I have been trying to avoid using any library because I feel there should be a relatively easy tweak to make this work without one, but please tell me if I am wrong. Thank you in advance.
// This is my target number of data
const targetDataNum = 150000000;
// Create readable stream
const readableStream = new Stream.Readable({
    read() {}
});
// Create writable stream
const writableStream = fs.createWriteStream('./database/RDBMS/test.csv');
// Write columns first
writableStream.write('id, body, date, dp\n', 'utf8');
// Then, push a number of data to the readable stream (150M in this case)
for (var i = 1; i <= targetDataNum; i += 1) {
    const id = i;
    const body = lorem.paragraph(1);
    const date = randomDate(new Date(2014, 0, 1), new Date());
    const dp = randomNumber(1, 1000);
    const data = `${id},${body},${date},${dp}\n`;
    readableStream.push(data, 'utf8');
};
// Pipe readable stream to writeable stream
readableStream.pipe(writableStream);
// End the stream
readableStream.push(null);
Since you're new to streams, maybe start with an easier abstraction: generators. Generators generate data only when it is consumed (just like Streams should), but they don't have buffering and complicated constructors and methods.
This is just your for loop, moved into a generator function:
function* generateData(targetDataNum) {
    for (var i = 1; i <= targetDataNum; i += 1) {
        const id = i;
        const body = lorem.paragraph(1);
        const date = randomDate(new Date(2014, 0, 1), new Date());
        const dp = randomNumber(1, 1000);
        yield `${id},${body},${date},${dp}\n`;
    }
}
In Node 12, you can create a Readable stream directly from any iterable, including generators and async generators:
const stream = Readable.from(generateData(targetDataNum), { encoding: 'utf8' });
stream.pipe(writableStream);
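For reference, a minimal end-to-end sketch of this approach might look like the following; it assumes the lorem, randomDate and randomNumber helpers from the question are in scope, and uses the built-in stream module's Readable.from (Node 12+):

const fs = require('fs');
const { Readable } = require('stream');

function* generateData(targetDataNum) {
    for (let i = 1; i <= targetDataNum; i += 1) {
        // lorem, randomDate and randomNumber come from the question's code
        const body = lorem.paragraph(1);
        const date = randomDate(new Date(2014, 0, 1), new Date());
        const dp = randomNumber(1, 1000);
        yield `${i},${body},${date},${dp}\n`;
    }
}

const writableStream = fs.createWriteStream('./test.csv');
writableStream.write('id, body, date, dp\n', 'utf8');

// Rows are generated lazily, so backpressure from the file stream
// naturally throttles the generator.
Readable.from(generateData(150000000), { encoding: 'utf8' }).pipe(writableStream);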
I suggest trying a solution like the following:
const { Readable } = require('readable-stream');

class CustomReadable extends Readable {
    constructor(max, options = {}) {
        super(options);
        this.targetDataNum = max;
        this.i = 1;
    }

    _read(size) {
        if (this.i <= this.targetDataNum) {
            // your code to build the csv content
            this.push(data, 'utf8');
            this.i += 1;
            return;
        }
        this.push(null);
    }
}

const rs = new CustomReadable(150000000);
rs.pipe(ws); // ws: your writable file stream
Just complete it with your portion of code to fill the csv, and create the writable stream.
With this solution you leave calling push to the internal _read stream method, which is invoked repeatedly until this.push(null) is called. Before, you were probably filling the internal stream buffer too fast by calling push manually in a loop, which caused the out-of-memory error.
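As a rough sketch of what the completed class might look like, reusing the lorem, randomDate and randomNumber helpers from the question (assumed to be in scope) and a plain fs write stream:

const fs = require('fs');
const { Readable } = require('readable-stream');

class CustomReadable extends Readable {
    constructor(max, options = {}) {
        super(options);
        this.targetDataNum = max;
        this.i = 1;
    }

    _read(size) {
        if (this.i <= this.targetDataNum) {
            const id = this.i;
            const body = lorem.paragraph(1);          // helpers from the question
            const date = randomDate(new Date(2014, 0, 1), new Date());
            const dp = randomNumber(1, 1000);
            this.push(`${id},${body},${date},${dp}\n`, 'utf8');
            this.i += 1;
            return;
        }
        this.push(null);
    }
}

const ws = fs.createWriteStream('./test.csv');
ws.write('id, body, date, dp\n', 'utf8');
new CustomReadable(150000000).pipe(ws);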
Try piping to the WritableStream before you start pumping data into the ReadableStream, and yield before you write the next chunk.
...
// Write columns first
writableStream.write('id, body, date, dp\n', 'utf8');
// Pipe readable stream to writeable stream
readableStream.pipe(writableStream);
// Then, push a number of data to the readable stream (150M in this case)
for (var i = 1; i <= targetDataNum; i += 1) {
    const id = i;
    const body = lorem.paragraph(1);
    const date = randomDate(new Date(2014, 0, 1), new Date());
    const dp = randomNumber(1, 1000);
    const data = `${id},${body},${date},${dp}\n`;
    readableStream.push(data, 'utf8');
    // somehow YIELD for the STREAM to drain out.
};
...
The entire Stream implementation of Node.js relies on the fact that the wire is slow and that the CPU can actually have downtime before the next chunk of data comes in from the stream source, or before the next chunk of data has been written to the stream destination.
In the current implementation, since the for loop has booked up the CPU, there is no downtime for the actual piping of the data to the write stream. You will be able to catch this if you watch cat test.csv, which will not change while the loop is running.
As (I am sure) you know, pipe helps guarantee that the data you are working with is buffered in memory only in chunks and not as a whole. But that guarantee only holds true if the CPU gets enough downtime to actually drain the data.
Having said all that, I wrapped your entire code into an async IIFE and ran it with an await on setImmediate, which ensures that I yield for the stream to drain the data.
let fs = require('fs');
let Stream = require('stream');

(async function () {
    // This is my target number of data
    const targetDataNum = 150000000;

    // Create readable stream
    const readableStream = new Stream.Readable({
        read() { }
    });

    // Create writable stream
    const writableStream = fs.createWriteStream('./test.csv');

    // Write columns first
    writableStream.write('id, body, date, dp\n', 'utf8');

    // Pipe readable stream to writeable stream
    readableStream.pipe(writableStream);

    // Then, push a number of data to the readable stream (150M in this case)
    for (var i = 1; i <= targetDataNum; i += 1) {
        console.log(`Pushing ${i}`);
        const id = i;
        const body = `body${i}`;
        const date = `date${i}`;
        const dp = `dp${i}`;
        const data = `${id},${body},${date},${dp}\n`;
        readableStream.push(data, 'utf8');
        await new Promise(resolve => setImmediate(resolve));
    };

    // End the stream
    readableStream.push(null);
})();
This is what top looks like pretty much the whole time I am running this.
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
15213 binaek ** ** ****** ***** ***** * ***.* 0.5 *:**.** node
Notice the %MEM which stays more-or-less static.
You were running out of memory because you were pre-generating all the data in memory before you wrote any of it to disk. Instead, you need a strategy to write it as you generate, so you don't have to hold large amounts of data in memory.
It does not seem like you need .pipe() here because you control the generation of the data (it's not coming from some random readStream).
So, you can just generate the data and immediately write it and handle the drain event when needed. Here's a runnable example (this creates a very large file):
const { once } = require('events');
const fs = require('fs');

// This is my target number of data
const targetDataNum = 150000000;

async function run() {
    // Create writable stream
    const writableStream = fs.createWriteStream('./test.csv');

    // Write columns first
    writableStream.write('id, body, date, dp\n', 'utf8');

    // Then, write rows directly to the writable stream (150M in this case)
    for (let i = 1; i <= targetDataNum; i += 1) {
        const id = i;
        const body = lorem.paragraph(1);
        const date = randomDate(new Date(2014, 0, 1), new Date());
        const dp = randomNumber(1, 1000);
        const data = `${id},${body},${date},${dp}\n`;
        const canWriteMore = writableStream.write(data);
        if (!canWriteMore) {
            // wait for stream to be ready for more writing
            await once(writableStream, "drain");
        }
    }
    writableStream.end();
}

run().then(() => {
    console.log("done");
}).catch(err => {
    console.log("got rejection: ", err);
});
// placeholders for the functions that were being used
function randomDate(low, high) {
    let rand = randomNumber(low.getTime(), high.getTime());
    return new Date(rand);
}

function randomNumber(low, high) {
    return Math.floor(Math.random() * (high - low)) + low;
}

const lorem = {
    paragraph: function() {
        return "random paragraph";
    }
};

Tone.js Tone.BufferSource: buffer is either not set or not loaded

Tone.BufferSource: buffer is either not set or not loaded. This error occurs in a try/catch block. It only occurs when I trigger the update function repeatedly, or sometimes randomly.
When this error occurs, my audio just turns off for a brief moment.
The logic behind my code: when the program starts, the create function is invoked in the constructor, creating a Tone.Sequence. Later on, when I change/update track parameters, I call the update function,
which calls loopProcessor with the new/updated tracks. But when I trigger update, which triggers the loopProcessor function, it runs into "Tone.BufferSource: buffer is either not set or not loaded". How can I work around this problem?
My code:
import Tone from "tone";

export function create(tracks, beatNotifier) {
    const loop = new Tone.Sequence(
        loopProcessor(tracks, beatNotifier),
        [...new Array(16)].map((_, i) => i),
        "16n"
    );
    Tone.Transport.bpm.value = 120;
    Tone.Transport.start();
    return loop;
}

export function update(loop, tracks, beatNotifier) {
    loop.callback = loopProcessor(tracks, beatNotifier);
    return loop;
}

function loopProcessor(tracks, beatNotifier) {
    const urls = tracks.reduce((acc, {name}) => {
        return {...acc, [name]: `http://localhost:3000/src/sounds/${name}.[wav|wav]`};
    }, {});
    const keys = new Tone.Players(urls, {
        fadeOut: "64n"
    }).toMaster();
    return (time, index) => {
        beatNotifier(index);
        tracks.forEach(({name, vol, muted, note, beats}) => {
            if (beats[index]) {
                try {
                    var vel = Math.random() * 0.5 + 0.5;
                    keys
                        .get(name)
                        .start(time, 0, note, 0, vel);
                    keys
                        .get(name).volume.value = muted
                            ? -Infinity
                            : vol;
                } catch (e) {
                    console.log("error", e);
                }
            }
        });
    };
}
I had this problem recently and found a solution that worked for my case.
Tone.js doesn't like it when you initialise an audio buffer inside a function (which is what you're doing when you call new Tone.Players inside loopProcessor).
To get around this, at the top of your code declare a new global buffer, buffer1 = new Tone.Buffer(url1), for each url that you need. https://tonejs.github.io/docs/r13/Buffer
Then inside loopProcessor just replace the urls with each buffer and a name tag, and you shouldn't have any problems. So new Tone.Players({"name1": buffer1, "name2": buffer2, ...})
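A minimal sketch of how that could look in the question's code; the two track names and URLs are placeholders, and it simply follows the Tone.Buffer / Tone.Players usage described above:

import Tone from "tone";

// Create the buffers once, at module load, so they are fetched and decoded
// before the sequencer callback ever runs.
const buffers = {
    kick: new Tone.Buffer("http://localhost:3000/src/sounds/kick.wav"),
    snare: new Tone.Buffer("http://localhost:3000/src/sounds/snare.wav")
};

function loopProcessor(tracks, beatNotifier) {
    // Pass the preloaded buffers instead of URLs, so no loading happens here.
    const keys = new Tone.Players(buffers, { fadeOut: "64n" }).toMaster();
    return (time, index) => {
        // ...same callback body as before...
    };
}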

Socket.io emitting values inside ES6 class

I wonder if any smart individuals could show me how to implement Socket.IO in an OOP environment with ES6 classes. The main problem I keep running into with Socket.IO is passing around the server object, in my case called 'io'. Almost every example I've seen of Socket.IO has been pure spaghetti code: one file with many socket-related events and logic.

First I tried to pass the server object, io, to the new class's constructor, but for some reason you end up with a nasty "RangeError: Maximum call stack size exceeded" error message. Then I tried to wrap my classes in a module.exports function whose parameter receives the io object. That is fine for the first class: say I pass the io object into my Game, great, works as expected. But when I try to pass the io object down to the Round class (Game holds an array of Rounds), I can't, because requiring inside modules/functions is bad practice in NodeJS; require should be global. So I'm once again back at the same issue.
app.js (where I require the main sockets file)
const io = socketio(server, { origins: '*:*' });
...
require('./sockets')(io);
sockets/index.js (where I initialize my game server and handle incoming messages from client sockets)
const actions = require('../actions.js');
const chatSockets = require('./chat-sockets');
const climbServer = require('./climb-server');
const authFunctions = require('../auth-functions');

module.exports = (io) => {
    io.on('connection', (client) => {
        console.log('client connected...');

        // Standard join, verify the requested room; if it exists let the client join it.
        client.on('join', (data) => {
            console.log(data);
            console.log(`User ${data.username} tries to join ${data.room}`);
            console.log(`Client joined ${data.room}`);
            client.join(data.room);
        });

        client.on('disconnect', () => {
            console.log('Client disconnected');
        });

        client.on(actions.CREATE_GAME, (hostParticipant) => {
            console.log('CREATE_GAME', hostParticipant);
            // Authorize socket sender by token?
            // Create a new game, and set the host to the host participant
            climbServer.createGame(io, hostParticipant);
        });

        client.on(actions.JOIN_GAME, (tokenizedGameId) => {
            console.log('JOIN_GAME');
            const user = authFunctions.getPayload(tokenizedGameId.token);
            // Authorize socket sender by token?
            // Create a new game, and set the host to the host participant
            const game = climbServer.findGame(tokenizedGameId.content);
            game.joinGame(user);
        });
    });
};
climbServer.js (my game server that keeps track of active games)
const actions = require('../actions.js');
const Game = require('../models/game');

const climbServer = { games: { }, gameCount: 0 };

climbServer.createGame = (io, hostParticipant) => {
    // Create a new game instance
    const newGame = new Game(hostParticipant);
    console.log('New game object created', newGame);

    // Store it in the list of games
    climbServer.games[newGame.id] = newGame;

    // Keep track
    climbServer.gameCount += 1;

    // Notify clients that a new game was created
    io.sockets.in('climb').emit(actions.CLIMB_GAME_CREATED, newGame);
};

climbServer.findGame = gameId => climbServer.games[gameId];

module.exports = climbServer;
Game.js (ES6 class that SHOULD be able to emit to all connected sockets)
const UUID = require('uuid');
const Round = require('./round');

class Game {
    // Constructor
    constructor(hostParticipant) {
        this.id = UUID();
        this.playerHost = hostParticipant;
        this.playerClient = null;
        this.playerCount = 1;
        this.rounds = [];
        this.timestamp = Date.now();
    }

    joinGame(clientParticipant) {
        console.log('Joining game', clientParticipant);
        this.playerClient = clientParticipant;
        this.playerCount += 1;

        // Start the game by creating the first round
        return this.createRound();
    }

    createRound() {
        console.log('Creating new round at Game: ', this.id);
        const newRound = new Round(this.id);
        return this.rounds.push(newRound);
    }
}

module.exports = Game;
Round.js (ES6 class that is used by the Game class, stored in a rounds array)
const actions = require('../actions.js');

class Round {
    constructor(gameId) {
        console.log('Initializing round of gameId', gameId);
        this.timeLeft = 60;
        this.gameId = gameId;
        this.winner = null;
        this.timestamp = Date.now();

        // Start countdown when class is instantiated
        this.startCountdown();
    }

    startCountdown() {
        const countdown = setInterval(() => {
            // broadcast to every client
            io.sockets.in(this.gameId).emit(actions.ROUND_TIMER, { gameId: this.gameId, timeLeft: this.timeLeft });
            if (this.timeLeft === 0) {
                // when no time left, stop counting down
                clearInterval(countdown);
                this.onRoundEnd();
            } else {
                // Countdown
                this.timeLeft -= 1;
                console.log('Countdown', this.timeLeft);
            }
        }, 1000);
    }

    onRoundEnd() {
        // Evaluate who won
        console.log('onRoundEnd: ', this.gameId);
    }
}

module.exports = Round;
TO SUMMARIZE with a question: How can I pass a reference of io to my classes so that I'm able to emit to connected sockets within these classes?
This doesn't necessarily have to be ES6 classes; it can be NodeJS objects using the .prototype property. I just want a maintainable way to handle my game server with sockets... ANY HELP IS APPRECIATED!
After hours upon hours I figured out a solution. If anyone runs into the same thing, check my solution out below. Not the best, but much better than putting all socket-related code in one file...
climbServer.js, updated to consume Game.js as a factory. Focus on the first line, where the factory is required.
const GameFactory = require('../models/game');

const climbServer = { games: { }, gameCount: 0 };

climbServer.createGame = (io, hostParticipant) => {
    // Create a new game instance
    const Game = GameFactory(io);
    const newGame = new Game(hostParticipant);
    console.log('New game object created', newGame);

    // Store it in the list of games
    climbServer.games[newGame.id] = newGame;

    // Keep track
    climbServer.gameCount += 1;
    return newGame;
};

climbServer.findGame = gameId => climbServer.games[gameId];

module.exports = climbServer;
The trick is to use this factory pattern where you first declare:
const GameFactory = require('../models/game');
Then initialize the factory by passing in the Socket.IO server object, in my case 'io'. If you pass it in via the constructor you end up with a RangeError, therefore this is the only way I found. Once again, not certain how this code performs in comparison to spaghetti code.
const Game = GameFactory(io);
Finally, you can now instantiate instances of your class:
const newGame = new Game(hostParticipant);
If anyone has improvements or thoughts, please leave me a comment. I'm still uncertain about the quality of this code.
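For context, the factory module itself isn't shown above. A minimal sketch of what models/game.js might look like with this pattern (an illustration, not the poster's exact file; the Round factory is assumed to follow the same pattern):

// models/game.js - exports a factory so the class can close over io
const UUID = require('uuid');
const RoundFactory = require('./round'); // assumed to follow the same factory pattern

module.exports = (io) => {
    const Round = RoundFactory(io);

    return class Game {
        constructor(hostParticipant) {
            this.id = UUID();
            this.playerHost = hostParticipant;
            this.playerClient = null;
            this.playerCount = 1;
            this.rounds = [];
            this.timestamp = Date.now();
        }

        joinGame(clientParticipant) {
            this.playerClient = clientParticipant;
            this.playerCount += 1;
            return this.createRound();
        }

        createRound() {
            const newRound = new Round(this.id);
            return this.rounds.push(newRound);
        }
    };
};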

How to make Gnome Shell extension query for changes

I've been battling the horrendous Gnome API documentation and came up with this extension:
const St = imports.gi.St;
const Main = imports.ui.main;
const Tweener = imports.ui.tweener;
const GLib = imports.gi.GLib;

let label;

function init() {
    label = new St.Bin({ style_class: 'panel-label' });
    let stuff = GLib.spawn_command_line_sync("cat /home/user/temp/hello")[1].toString();
    let text = new St.Label({ text: stuff });
    label.set_child(text);
}
function enable() {
    Main.panel._rightBox.insert_child_at_index(label, 0);
}

function disable() {
    Main.panel._rightBox.remove_child(label);
}
This should read whatever is in the hello file and display it in the top panel. However, if I change the contents of the hello file, I have to restart Gnome for that new content to be shown. Now, surely there is a way to do this dynamically but I just couldn't find anything in the documentation. The message in the panel should basically always mirror whatever is in the file. Any ideas how to do this?
You'll want to obtain a Gio.File handle for your hello file, and then monitor it (Gio comes from imports.gi.Gio):
const Gio = imports.gi.Gio;

let helloFile = Gio.File.new_for_path('/home/user/temp/hello');
let monitor = helloFile.monitor(Gio.FileMonitorFlags.NONE, null);
monitor.connect('changed', function (file, otherFile, eventType) {
    // change your UI here
});
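As a sketch of what "change your UI here" could look like for this extension, re-reading the file and updating the label from the question (assuming a GJS version where imports.byteArray is available):

const ByteArray = imports.byteArray;

monitor.connect('changed', function (file, otherFile, eventType) {
    // Re-read the file and refresh the text shown in the panel.
    let [ok, contents] = file.load_contents(null);
    if (ok) {
        let text = new St.Label({ text: ByteArray.toString(contents) });
        label.set_child(text);
    }
});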
This worked for me. It will refresh the label value every 30 seconds.
Add the following import
const Mainloop = imports.mainloop;
In your init method
Mainloop.timeout_add(30000, function () {
    let stuff = GLib.spawn_command_line_sync("your_command")[1].toString();
    let text = new St.Label({ text: stuff });
    label.set_child(text);
    return true;
});
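A caveat worth noting, beyond the original answer: Mainloop.timeout_add returns a source id, and it is common to keep it and remove it with Mainloop.source_remove in disable() so the callback stops when the extension is disabled. A minimal sketch, reusing the label and enable/disable functions from the question:

let timeoutId = null;

function enable() {
    Main.panel._rightBox.insert_child_at_index(label, 0);
    timeoutId = Mainloop.timeout_add(30000, function () {
        let stuff = GLib.spawn_command_line_sync("your_command")[1].toString();
        let text = new St.Label({ text: stuff });
        label.set_child(text);
        return true; // keep the timeout running
    });
}

function disable() {
    Main.panel._rightBox.remove_child(label);
    if (timeoutId !== null) {
        Mainloop.source_remove(timeoutId);
        timeoutId = null;
    }
}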

Play Audio through JS with Overlapping

I have the following JS code for a canvas based game.
var EXPLOSION = "sounds/explosion.wav";

function playSound(str, vol) {
    var snd = new Audio();
    snd.src = str;
    snd.volume = vol;
    snd.play();
}

function createExplosion() {
    playSound(EXPLOSION, 0.5);
}
This works, however it sends a server request to download the sound file every time it is called. Alternatively, if I declare the Audio object beforehand:
var snd = new Audio();
snd.src = EXPLOSION;
snd.volume = 0.5;

function createExplosion() {
    snd.play();
}
This works, however if the createExplosion function is called before the sound is finished playing, it does not play the sound at all. This means that only a single playthrough of the sound file is allowed at a time - and in scenarios that multiple explosions are taking place it doesn't work at all.
Is there any way to properly play an audio file multiple times overlapping with itself?
I was looking for this for ages in a Tetris game I'm building, and I think this solution is the best.
function playSoundMove() {
    var sound = document.getElementById("move");
    sound.load();
    sound.play();
}
Just have it loaded and ready to go.
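This assumes an audio element with that id already exists in the page, along these lines (the file name is a placeholder):

<audio id="move" src="sounds/move.wav" preload="auto"></audio>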
You could just duplicate the node with cloneNode() and play() that duplicate node.
My audio element looks like this:
<audio id="knight-audio" src="knight.ogg" preload="auto"></audio>
and I have an onClick listener that does just that:
function click() {
    const origAudio = document.getElementById("knight-audio");
    const newAudio = origAudio.cloneNode();
    newAudio.play();
}
And since the audio element isn't going to be displayed, you don't actually have to attach the node to anything.
I verified client-side and server-side that Chrome only tries to download the audio file once.
Caveats: I'm not sure about performance impacts, since on my site this clip doesn't get played more than ~40 times maximum per page. You might have to clean up the audio nodes if you're doing something much larger than that?
Try this:
(function() {
    var snds = {};
    window.playSound = function(str, vol) {
        if (!snds[str]) (snds[str] = new Audio()).src = str;
        snds[str].volume = vol;
        snds[str].play();
    };
})();
Then the first time you call it it will fetch the sound, but every time after that it will reuse the same sound object.
EDIT: You can also preload with duplicates to allow the sound to play more than once at a time:
(function() {
    var snds = {};
    window.playSound = function(str, vol) {
        if (!snds[str]) {
            snds[str] = [new Audio()];
            snds[str][0].src = str;
        }
        var snd = snds[str], pointer = 0;
        // find an element that isn't currently playing, creating one if needed
        while (!snd[pointer].paused && !snd[pointer].ended) {
            pointer++;
            if (pointer >= snd.length) {
                snd.push(new Audio());
                snd[pointer].src = str;
            }
        }
        snd[pointer].volume = vol;
        snd[pointer].play();
    };
})();
Note that this will send multiple requests if you play the sound overlapping itself too much, but it should return Not Modified very quickly and will only do so if you play it more times than you have previously.
In my game I'm using preloading, but only after the sound is first triggered (it's not so smart to skip preloading entirely, or to preload everything on page load; some sounds never get played in a given session at all, so why load them).
const audio = {};
audio.dataload = { 'entity': false, 'entityes': [], 'n': 0 };

audio.dataload.ordernum = function() {
    audio.dataload.n = (audio.dataload.n + 1) % 10;
    return audio.dataload.n;
};

audio.dataload.play = function() {
    // lazily create the Audio and its clones on first use
    if (!audio.dataload.entity) {
        audio.dataload.entity = new Audio('/some.mp3');
        for (let i = 0; i < 10; i++) {
            audio.dataload.entityes.push(audio.dataload.entity.cloneNode());
        }
    }
    audio.dataload.entityes[audio.dataload.ordernum()].play();
};

audio.dataload.play(); // plays the sound and preloads the clones when they aren't loaded yet
I've created a class that allows for layered audio. This is very similar to other answers where another node with the same src is created, but this class will only do that if necessary: if a node it has already created has finished playing, it will replay that existing node.
Another tweak is that it initially fetches the audio and uses a blob URL as the src. I do this for efficiency, so the src doesn't have to be fetched externally every single time a new node is created.
class LayeredAudio {
    url;
    samples = [];

    constructor(src) {
        fetch(src)
            .then(response => response.blob())
            .then((blob) => {
                this.url = URL.createObjectURL(blob);
                this.samples[0] = new Audio(this.url);
            });
    }

    play() {
        // reuse a paused sample if one exists, otherwise add a new one
        if (!this.samples.find(e => e.paused)?.play()) {
            this.samples.push(new Audio(this.url));
            this.samples[this.samples.length - 1].play();
        }
    }
}
const aud = new LayeredAudio("URL");
aud.play()
Relying more on memory than processing time, we can make an array of multiple clones of the Audio and then play them in order:
function gameSnd() {
    tick_wav = new Audio('sounds/tick.wav');
    victory_wav = new Audio('sounds/victory.wav');

    counter = 0;
    ticks = [];
    for (var i = 0; i < 10; i++)
        ticks.push(tick_wav.cloneNode());

    tick = function() {
        counter = (counter + 1) % 10;
        ticks[counter].play();
    };

    victory = function() {
        victory_wav.play();
    };
}
When I tried some of the other solutions there was some delay, but I may have found a better alternative. This will plow through a good chunk of memory if you make the audio array's length high. I doubt you will need to play the same audio more than 10 times at the same time, but if you do just make the array length longer.
var audio = new Array(10);
// The length of the audio array is how many times
// the audio can overlap
var audioIndex = 0; // index of the next clip to play
for (var i = 0; i < audio.length; i++) {
    audio[i] = new Audio("your audio");
}

function PlayAudio() {
    // Whenever you want to play it call this function
    audio[audioIndex].play();
    audioIndex++;
    if (audioIndex > audio.length - 1) {
        audioIndex = 0;
    }
}
I have found this to be the simplest way to overlap the same audio with itself:
<button id="btn" onclick="clickMe()">ding</button>
<script>
    function clickMe() {
        const newAudio = new Audio("./ding.mp3");
        newAudio.play();
    }
</script>
