React/NextJS - Audio player - javascript

I need to stream audio using an mp3 endpoint URL.
I'm new to audio streaming, so I could really use some help. Thanks in advance.
This is what I have tried:
const SongRow = ({ track }) => {
  const player = useRef();

  const playSong = () => {
    player.src = track.preview_url;
    player.play();
  };

  return (
    <div className="songRow" onClick={() => playSong()}>
      <audio ref={player} />
      <div className="songRow__info">
        <h1>{track.name}</h1>
      </div>
    </div>
  );
};
Error: TypeError: Cannot add property src, object is not extensible
I'm not sure how I'm supposed to proceed with this. I also couldn't find any relevant docs covering this with functional components.

You need to access the current property on the ref.
From the docs:
a reference to the node becomes accessible at the current attribute of the ref.
So it would be player.current.src.
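Applied to your handler, the fix would look like this (play() also returns a Promise, so you may want to catch rejections):
const playSong = () => {
  player.current.src = track.preview_url;
  player.current.play().catch(console.error);
};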

Related

Save a selected image/video via input in storage and show it on refresh

I'm using localStorage in React to save the paths to image/video files. The paths are correct since the images/videos display properly in their given components. The paths also save correctly to storage because when I pull from storage, I see they are the same as what was saved.
However, the image path only "lasts" for a few refreshes or back buttons, and then suddenly, the image won't be found anymore, even though the path is unchanged. The video is never "found" after any refresh or back button (even though the path is correctly fetched) — it always says the video can't be found.
If the browser is closed and reopened, neither image nor video is found anymore.
I get the path using an input component like so:
<input
  ref={inputRef}
  type="file"
  accept="image/*,video/mp4,video/x-m4v,video/*,audio/x-m4a,audio/*"
  name="image"
  onChange={(event) => loadFileFromDevice(event)}
/>
I save the path to both state and localStorage like so:
const loadFileFromDevice = (event: any) => {
  let media = event.target.files[0];
  if (media === null || media === undefined)
    // Means you canceled
    return;
  let path = URL.createObjectURL(media);
  let isImg = media.type.indexOf("image") != -1;
  if (isImg) {
    setImgSrc(path);
    setVideoSrc("");
    localStorage.setItem(IMG_KEY, path);
  } else {
    setImgSrc("");
    setVideoSrc(path);
    localStorage.setItem(VIDEO_KEY, path);
  }
};
I initialize state using the following:
// Outside component
const getValue = (storageKey: string, defaultVal: string) => {
  const item = localStorage.getItem(storageKey);
  if (item === null || item === undefined || item.length === 0) return defaultVal;
  return item;
};

// Inside component
const [imgSrc, setImgSrc] = useState(getValue(IMG_KEY, ""));
const [videoSrc, setVideoSrc] = useState(getValue(VIDEO_KEY, ""));
I render the image and video in the following components:
<img src={imgSrc} width="100%" height="100%" />
<video src={videoSrc} width="100%" height="100%" autoPlay loop muted />
As mentioned, the path is fetched fine and renders the images/videos correctly as long as the browser isn't refreshed. If I do refresh, the images will still be found (usually) but not the video.
If I refresh a few more times or let a few minutes pass and then refresh, the image will no longer be found. If I close and reopen the browser, neither image nor video is found. The file locations have not changed.
Why can't I reload the image/video I selected after a refresh/back button/close?
You are creating an ephemeral URL to the media - its lifecycle is tied to the document object. After reloading the page, the saved URL is no longer valid - it no longer points to the file. Via MDN:
The URL lifetime is tied to the document in the window on which it was created. The new object URL represents the specified File object or Blob object.
I'd recommend using IndexedDB to achieve your goal.
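As a rough sketch of that approach (untested; the database and store names here are made up), you could persist the File itself in IndexedDB and recreate a fresh object URL from it on load:
const openDb = () =>
  new Promise((resolve, reject) => {
    const req = indexedDB.open("media-db", 1); // hypothetical database name
    req.onupgradeneeded = () => req.result.createObjectStore("files");
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });

const saveFile = async (key, file) => {
  const db = await openDb();
  const tx = db.transaction("files", "readwrite");
  tx.objectStore("files").put(file, key); // Files/Blobs are structured-cloneable
  return new Promise((resolve, reject) => {
    tx.oncomplete = resolve;
    tx.onerror = () => reject(tx.error);
  });
};

const loadFile = async (key) => {
  const db = await openDb();
  return new Promise((resolve, reject) => {
    const req = db.transaction("files").objectStore("files").get(key);
    req.onsuccess = () => resolve(req.result); // a File/Blob, or undefined
    req.onerror = () => reject(req.error);
  });
};

// On load: recreate the object URL instead of reusing the stale one, e.g.
// loadFile(IMG_KEY).then(file => file && setImgSrc(URL.createObjectURL(file)));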
A URL created with URL.createObjectURL() points to an in-memory file, not the one on the hard drive of the user. And it has a limited lifetime, as they say on MDN's documentation:
The URL lifetime is tied to the document in the window on which it was created. The new object URL represents the specified File object or Blob object.
A common way to keep files is through cached responses. For example, the code below works in vanilla HTML, CSS, and JavaScript, and could easily be adapted to React:
<input type="file" name="image" id="" />
<img src="" alt="" />
const createResponseObjectAndSaveInCache = (media) => {
  const response = new Response(media, {
    status: 200,
    statusText: "Ok",
    headers: {
      "content-type": media.type,
      "content-length": media.size,
      "X-file": media.name,
    },
  });
  window.caches.open("cachename").then((cache) => {
    let name = response.headers.get("X-file");
    let url = new URL(`/${Date.now()}/${name}`, location.origin);
    cache.put(url, response);
    localStorage.setItem("media", url.href);
  });
};
const displayFromCache = async () => {
  const href = localStorage.getItem("media");
  const cache = await caches.open("cachename");
  let response = await cache.match(href);
  if (!response) {
    return;
  }
  let blob = await response.blob();
  document.querySelector("img").src = URL.createObjectURL(blob);
};

document.querySelector("input").addEventListener("change", (e) => {
  let media = e.target.files[0];
  if (media) {
    document.querySelector("img").src = URL.createObjectURL(media);
    createResponseObjectAndSaveInCache(media);
  }
});

// On load, display cached media if there is any
displayFromCache();
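To adapt this to the question's React component, a minimal sketch (assuming the setImgSrc state setter from the question) would restore the cached media on mount:
useEffect(() => {
  const restore = async () => {
    const href = localStorage.getItem("media");
    if (!href) return;
    const cache = await caches.open("cachename");
    const response = await cache.match(href);
    if (response) setImgSrc(URL.createObjectURL(await response.blob()));
  };
  restore();
}, []);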

React, component not re-rendering after change in an array state (not the same as others)

I'm trying to make a page that gets pictures from a server and, once all pictures are downloaded, displays them, but for some reason the page doesn't re-render when I update the state.
I've seen the other answers to this question saying that you have to pass a fresh array to the setImages function and not an updated version of the previous array. I'm doing that, but it still doesn't work.
(The interesting thing is that if I put a console.log in a useEffect, it does log the text when the array is updated, but the page does not show the updated information.)
If anyone can help out would be greatly appreciated!
Here is my code.
export function Profile() {
  const user = JSON.parse(window.localStorage.getItem("user"));
  const [imgs, setImages] = useState([]);
  const [num, setNum] = useState(0);
  const [finish, setFinish] = useState(false);

  const getImages = async () => {
    if (finish) return;
    let imgarr = [];
    let temp = num;
    let filename = "";
    let local = false;
    while (temp < num + 30) {
      fetch("/get-my-images?id=" + user.id + "&logged=" + user.loggonToken + "&num=" + temp)
        .then(response => {
          if (response.status !== 200) {
            setFinish(true);
            temp = num + 30;
            local = true;
          }
          filename = response.headers.get("File-Name");
          return response.blob()
        })
        .then(function(imageBlob) {
          if (local) return;
          const imageObjectURL = URL.createObjectURL(imageBlob);
          imgarr[temp - num] = <img name={filename} alt="shot" className="img" src={imageObjectURL} key={temp} />
          temp++;
        });
    }
    setNum(temp)
    setImages(prev => [...prev, ...imgarr]);
  }

  async function handleClick() {
    await getImages();
  }

  return (
    <div>
      <div className="img-container">
        {imgs.map(i => {
          return (
            i.props.name && <div className="img-card">
              <div className="img-tag-container" onClick={(e) => handleView(i.props.name)}>{i}</div>
              <div className="img-info">
                <h3 className="title" onClick={() => handleView(i.props.name)}>{i.props.name.substr(i.props.name.lastIndexOf("\\") + 1)}<span>{i.props.isFlagged ? "Flagged" : ""}</span></h3>
              </div>
            </div>
          )
        })}
      </div>
      <div className="btn-container"><button className="load-btn" disabled={finish} onClick={handleClick}>{imgs.length === 0 ? "Load Images" : "Load More"}</button></div>
    </div>
  )
}
I think your method of creating the new array is correct. You are passing an updater callback to the useState() updater function which returns a concatenation of the previous images and the new images, which should return a fresh array.
When using collection-based state variables, I highly recommend setting the key property of rendered children. Have you tried assigning a unique key to <div className="img-card">? It appears that i.props.name is unique enough to work as a key.
Keys are how React associates individual items in a collection to their corresponding rendered DOM elements. They are especially important if you modify that collection. Whenever there's an issue with rendering collections, I always make sure the keys are valid and unique. Even if adding a key doesn't fix your issue, I would still highly recommend keeping it for performance reasons.
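For example, a sketch of the render from the question with only the key added:
{imgs.map(i => (
  i.props.name && (
    <div className="img-card" key={i.props.name}>
      {/* ...rest of the card as before... */}
    </div>
  )
))}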
This is related to how JavaScript arrays behave as reference types.
As for the console.log: the console shows the value at the moment it prints, not when the log statement ran, so it can show the later-updated array even though the page didn't re-render.
There are several approaches.
const getImages = async () => {
  ... ...
  setNum(temp)
  const newImages = [...imgs, ...imgarr];
  setImages(newImages);
}
const getImages = async () => {
  ... ...
  setNum(temp)
  setImages(prev => JSON.parse(JSON.stringify([...prev, ...imgarr])));
}
const getImages = async () => {
  ... ...
  setNum(temp)
  setImages(prev => [...prev, ...imgarr].slice(0));
}
Maybe one of these could work.
Hope it will be helpful for you.
OK, the problem for me was that the server was not sending a proper filename header, so it was always null and the condition i.props.name was never true... lol, sorry for the confusion.
So the moral of this story is: always make sure it's not something else in your code causing the bad behavior before starting to look for other solutions...

Web Audio Related React Problems

This is my first Stack Overflow question, so I'm sorry in advance.
This is a Web Audio API question relating to React Hooks (specifically useContext/useReducer - the dream team).
BASICALLY... I've been trying to use the Web Audio API to create an oscillator and a slider to control it. So far, so good; in vanilla JS I managed it by using setInterval() and listening for the changes:
setInterval(() => {
  if (!osc) {
    console.log("Oscillator is stopped");
  } else {
    let freqSliderVal = document.getElementById("freq-slide").value;
    osc.frequency.value = freqSliderVal;
    osc.type = selectedWaveform;
    console.log(`Oscillator is playing. Frequency value is ${freqSliderVal}`);
  }
}, 50);
I can change the frequency of the note and the waveform without the note stopping, and everything's grand. You can probably see where this one's going: React basically hates this because, as you can predict, every time you move the slider the effect re-runs, since the audio context is inside a useEffect. I'm aware that with the frequency in the dependencies it re-runs every time the frequency changes, but that was the only way I could get it to actually register the change in frequency:
useEffect(() => {
  let audioContext = new AudioContext();
  let osc = audioContext.createOscillator();
  osc.type = waveform;
  osc.frequency.value = freq;
  osc.connect(audioContext.destination);
  osc.start(audioContext.currentTime);
  osc.stop(audioContext.currentTime + 3);
  audioContextRef.current = audioContext;
  audioContext.suspend();
  return () => osc.disconnect(audioContext.destination);
}, [freq, waveform]);
How could I make it so that:
a) I can move the fader in real time to control the frequency of the output?
b) I can change the waveform (controlled with a Context and linked to another component), also in real time?
Any help you can provide would be absolutely wonderful, as I'm beginning to really hate React now after everything started so wonderfully.
Thanks!
Sam
I created a separate component with the input slider that does what you want. I defined an audioContext outside of the component to avoid re-defining it on every state update.
Since I don't know how you handle start and stop, I just start the oscillator on component load, but you can easily change that with a useEffect in your component. The way it is now, you will hear the sound when you move the slider.
The component is the following:
import React, { useEffect, useState, useRef } from 'react';

const audioContext = new AudioContext();
const osc = audioContext.createOscillator();
osc.type = 'square';
osc.start();

const Osc = (props) => {
  const [freq, setFrequency] = useState(0);
  const { waveform } = props;

  const onSlide = (e) => {
    const { target } = e;
    setFrequency(target.value);
  };

  useEffect(() => {
    osc.frequency.value = freq;
    osc.connect(audioContext.destination);
    return () => osc.disconnect(audioContext.destination);
  }, [freq]);

  useEffect(() => {
    osc.type = waveform;
  }, [waveform]);

  return (
    <input
      name="freqSlide"
      type="range"
      min="20"
      max="1000"
      onChange={(e) => onSlide(e)}
    />
  );
};

export default Osc;
To use it you first need to import it:
import Osc from './osc';
And use it in your render function:
<Osc waveform="square" />
The waveform is a property since you said you update it from a different component so you can update it here and the update will be reflected in the component.
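One caveat (not from the answer above, just a general autoplay-policy note): browsers typically keep a fresh AudioContext suspended until a user gesture, so you may need to resume it from a click handler before the oscillator becomes audible. A minimal sketch:
const handleStart = () => {
  // resume() returns a Promise; safe to call even if already running
  if (audioContext.state === 'suspended') {
    audioContext.resume();
  }
};

// e.g. <button onClick={handleStart}>Start</button> rendered alongside <Osc />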

Play audio with React

Whatever I do, I get an error message while trying to play a sound:
Uncaught (in promise) DOMException.
After searching on Google, I found that it should appear if I autoplayed the audio before any user action on the page, but that's not the case for me. I even did this:
componentDidMount() {
  let audio = new Audio('sounds/beep.wav');
  audio.load();
  audio.muted = true;
  document.addEventListener('click', () => {
    audio.muted = false;
    audio.play();
  });
}
But the message still appears and the sound doesn't play. What should I do?
The audio is an HTMLMediaElement, and calling play() returns a Promise, so it needs to be handled. Depending on the size of the file, it's usually ready to go, but if it is not loaded (e.g. a pending promise), it will throw an "AbortError" DOMException.
You can check whether it has loaded first, then catch the error to suppress the message. For example:
componentDidMount() {
  this.audio = new Audio('sounds/beep.wav')
  this.audio.load()
  this.playAudio()
}

playAudio() {
  const audioPromise = this.audio.play()
  if (audioPromise !== undefined) {
    audioPromise
      .then(_ => {
        // autoplay started
      })
      .catch(err => {
        // catch dom exception
        console.info(err)
      })
  }
}
Another pattern that has worked well without showing that error is creating the component as an HTML audio element with the autoPlay attribute and then rendering it as a component where needed. For example:
const Sound = ({ soundFileName, ...rest }) => (
  <audio autoPlay src={`sounds/${soundFileName}`} {...rest} />
)

const ComponentToAutoPlaySoundIn = () => (
  <>
    ...
    <Sound soundFileName="beep.wav" />
    ...
  </>
)
Simple error tone
If you want something as simple as playing a simple error tone (for non-visual feedback in a barcode scanner environment, for instance), and don't want to install dependencies, etc. - it can be pretty simple. Just link to your audio file:
import ErrorAudio from './error.mp3'
And in the code, reference it, and play it:
var AudioPlay = new Audio(ErrorAudio);
AudioPlay.play();
Only discovered this after messing around with more complicated options.
I think it would be better to use this component (https://github.com/justinmc/react-audio-player) instead of direct DOM manipulation.
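A rough usage sketch, going by that project's README (treat the exact props as an assumption):
import ReactAudioPlayer from 'react-audio-player';

<ReactAudioPlayer
  src="sounds/beep.wav"  // path assumed from the question
  autoPlay
  controls
/>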
It is very straightforward indeed:
const [, setMuted] = useState(true)

useEffect(() => {
  const player = new Audio('./sound.mp3');
  const playPromise = player.play();
  if (playPromise !== undefined)
    playPromise.then(() => setMuted(false)).catch(() => setMuted(false));
}, [])
I hope it works now :)

Change playout delay in WebRTC stream

I'm trying to cast a live MediaStream (eventually from the camera) from peerA to peerB, and I want peerB to receive the live stream in real time and then replay it with an added delay. Unfortunately it isn't possible to simply pause the stream and resume with play(), since it jumps forward to the live moment.
So I have figured out that I can use MediaRecorder + SourceBuffer to rewatch the live stream: record the stream, append the buffers to MSE (SourceBuffer), and play it 5 seconds later.
This works great on the local device (stream). But when I try to use MediaRecorder on the receiver's MediaStream (from pc.onaddstream), it looks like it gets some data and is able to append the buffer to the sourceBuffer, however it does not replay. Sometimes I get just one frame.
const [pc1, pc2] = localPeerConnectionLoop()
const canvasStream = canvas.captureStream(200)

videoA.srcObject = canvasStream
videoA.play()

// Note: using two MediaRecorders at the same time seems problematic
// But this one works
// stream2mediaSorce(canvasStream, videoB)
// setTimeout(videoB.play.bind(videoB), 5000)

pc1.addTransceiver(canvasStream.getTracks()[0], {
  streams: [ canvasStream ]
})

pc2.onaddstream = (evt) => {
  videoC.srcObject = evt.stream
  videoC.play()

  // Note: using two MediaRecorders at the same time seems problematic
  // THIS DOES NOT WORK
  stream2mediaSorce(evt.stream, videoD)
  setTimeout(() => videoD.play(), 2000)
}

/**
 * Turn a MediaStream into a SourceBuffer
 *
 * @param {MediaStream} stream Live stream to record
 * @param {HTMLVideoElement} videoElm Video element to play the recorded video in
 * @return {undefined}
 */
function stream2mediaSorce (stream, videoElm) {
  const RECORDER_MIME_TYPE = 'video/webm;codecs=vp9'
  const recorder = new MediaRecorder(stream, { mimeType: RECORDER_MIME_TYPE })

  const mediaSource = new MediaSource()
  videoElm.src = URL.createObjectURL(mediaSource)
  mediaSource.onsourceopen = (e) => {
    const sourceBuffer = mediaSource.addSourceBuffer(RECORDER_MIME_TYPE);

    const fr = new FileReader()
    fr.onerror = console.log
    fr.onload = ({ target }) => {
      console.log(target.result)
      sourceBuffer.appendBuffer(target.result)
    }
    recorder.ondataavailable = ({ data }) => {
      console.log(data)
      fr.readAsArrayBuffer(data)
    }
    setInterval(recorder.requestData.bind(recorder), 1000)
  }

  console.log('Recorder created')
  recorder.start()
}
Do you know why it won't play the video?
I have created a fiddle with all the necessary code to try it out; the JavaScript tab has the same code as above (the HTML is mostly irrelevant and does not need to be changed).
Some people try to reduce the latency, but I actually want to increase it to ~10 seconds, to rewatch something you did wrong in a golf swing or something, and to avoid MediaRecorder altogether if possible.
EDIT:
I found something called "playout-delay" in some RTC extension
that allows the sender to control the minimum and maximum latency from capture to render time
https://webrtc.org/experiments/rtp-hdrext/playout-delay/
How can I use it?
Will it be of any help to me?
Update: there is a new feature that will enable this, called playoutDelayHint.
We want to provide means for JavaScript applications to set their preferences on how fast they want to render audio or video data. As fast as possible might be beneficial for applications which concentrate on real-time experience. For others, additional data buffering may provide a smoother experience in case of network issues.
Refs:
https://discourse.wicg.io/t/hint-attribute-in-webrtc-to-influence-underlying-audio-video-buffering/4038
https://bugs.chromium.org/p/webrtc/issues/detail?id=10287
Demo: https://jsfiddle.net/rvekxns5/
Though I was only able to set a max of 10 s in my browser, it's up to the UA vendor to do the best it can with the resources available.
import('https://jimmy.warting.se/packages/dummycontent/canvas-clock.js')
.then(({AnalogClock}) => {
  const {canvas} = new AnalogClock(100)
  document.querySelector('canvas').replaceWith(canvas)

  const [pc1, pc2] = localPeerConnectionLoop()
  const canvasStream = canvas.captureStream(200)

  videoA.srcObject = canvasStream
  videoA.play()

  pc1.addTransceiver(canvasStream.getTracks()[0], {
    streams: [ canvasStream ]
  })

  pc2.onaddstream = (evt) => {
    videoC.srcObject = evt.stream
    videoC.play()
  }

  $dur.onchange = () => {
    pc2.getReceivers()[0].playoutDelayHint = $dur.valueAsNumber
  }
})
<!-- all the irrelevant part, that you don't need to know anything about -->
<h3 style="border-bottom: 1px solid">Original canvas</h3>
<canvas id="canvas" width="100" height="100"></canvas>
<script>
  function localPeerConnectionLoop(cfg = {sdpSemantics: 'unified-plan'}) {
    const setD = (d, a, b) => Promise.all([a.setLocalDescription(d), b.setRemoteDescription(d)]);
    return [0, 1].map(() => new RTCPeerConnection(cfg)).map((pc, i, pcs) => Object.assign(pc, {
      onicecandidate: e => e.candidate && pcs[i ^ 1].addIceCandidate(e.candidate),
      onnegotiationneeded: async e => {
        try {
          await setD(await pc.createOffer(), pc, pcs[i ^ 1]);
          await setD(await pcs[i ^ 1].createAnswer(), pcs[i ^ 1], pc);
        } catch (e) {
          console.log(e);
        }
      }
    }));
  }
</script>
<h3 style="border-bottom: 1px solid">Local peer (PC1)</h3>
<video id="videoA" muted width="100" height="100"></video>
<h3 style="border-bottom: 1px solid">Remote peer (PC2)</h3>
<video id="videoC" muted width="100" height="100"></video>
<label> Change playoutDelayHint
  <input type="number" value="1" id="$dur">
</label>
