Playing audio in Safari using .play() - javascript

I'm looking to play an audio file using custom controls to trigger the .play() method on a custom button. To be clear, I'm not trying to have the audio auto-play. Everything works perfectly in Chrome, but in Safari I get the error:
Unhandled Promise Rejection: NotAllowedError: The request is not
allowed by the user agent or the platform in the current context,
possibly because the user denied permission.
The project is built using React and React Router, so I'm wondering if it's fixable in my useEffect() hook. I've tried enabling controls on the audio element and hiding them with CSS, but no luck.
import React, { useState, useRef, useEffect } from "react";
import { BrowserRouter as Router, Switch, Route } from "react-router-dom";
import { gsap } from "gsap";

function RadioPlayerNav(props) {
  const audioEl = useRef(null);
  const [isPlaying, setIsPlaying] = useState(false);
  const playingTitle = document.querySelector(".radio-player-nav .title p");

  const toPX = (value) => {
    return (parseFloat(value) / 100) * (/vh/gi.test(value) ? window.innerHeight : window.innerWidth);
  };

  const radioPlayerGSAP = gsap.to(".radio-player-nav .title p", {
    x: toPX("-5vw"),
    duration: 4,
    ease: "none",
    yoyo: true,
    repeat: -1,
    delay: 1,
    repeatDelay: 1,
    paused: true,
  });

  useEffect(() => {
    if (isPlaying) {
      audioEl.current.play();
      radioPlayerGSAP.play();
      // radioPlayerGSAP.reversed(4, false);
    } else {
      audioEl.current.pause();
    }
  }, [isPlaying]);

  return (
    <div className="radio-player-nav">
      <div className="radio-player-controls">
        <audio src="src/current-radio-mix.mp3" ref={audioEl} preload="auto"></audio>
        <i
          className={isPlaying ? "fas fa-pause cursor-hover" : "fas fa-play cursor-hover"}
          onClick={() => {
            setIsPlaying(!isPlaying);
          }}
        ></i>
        <div className="title">
          <p>MIXED FEELINGS M0001</p>
        </div>
        <a href="src/current-radio-mix.mp3" download="Mixed Feelings M0001">
          <i className="fas fa-download cursor-hover"></i>
        </a>
      </div>
    </div>
  );
}

export default RadioPlayerNav;
You can find the full GitHub repo for the project here: https://github.com/nallstott/mixed-feelings/tree/master

It turns out Safari requires you to use useLayoutEffect instead of useEffect to accomplish this. I'm leaving the post up since I didn't see anything previously that gave the answer, along with the article that solved it for me, in case anyone else hits this issue with <audio> on Safari.
https://lukecod.es/2020/08/27/ios-cant-play-youtube-via-react-useeffect/
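For reference, here's a minimal sketch of that change applied to the effect above (assuming the rest of the component stays exactly the same):

import React, { useState, useRef, useLayoutEffect } from "react";

// Same effect body as before, but useLayoutEffect runs synchronously after the
// render is committed, which keeps the .play() call close enough to the user
// gesture for Safari to allow it.
useLayoutEffect(() => {
  if (isPlaying) {
    audioEl.current.play();
    radioPlayerGSAP.play();
  } else {
    audioEl.current.pause();
  }
}, [isPlaying]);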

I also play audio in my app, and it cycles through several files. I was able to get around this by placing my audio files in a map and using a separate function to trigger playback.
import hold_female from '../../assets/audio/Female/hold_female.mp3';
import exhale_female from '../../assets/audio/Female/exhale_female.mp3';
import inhale_female from '../../assets/audio/Female/inhale_female.mp3';
import hold_male from '../../assets/audio/Male/hold_male.mp3';
import exhale_male from '../../assets/audio/Male/exhale_male.mp3';
import inhale_male from '../../assets/audio/Male/inhale_male.mp3';

// Props here...

createAudio('Exhale_female', exhale_female); // These place the audio into a map under the name provided.
createAudio('Inhale_female', inhale_female);
createAudio('Hold_female', hold_female);
createAudio('Exhale_male', exhale_male);
createAudio('Inhale_male', inhale_male);
createAudio('Hold_male', hold_male);

const BreatheTest: FC<BreathingProps> = ({ gender }) => {
  const [stageText, setStageText] = useState<string>('Inhale');
  const [index, setIndex] = useState<number>(0);
  const [milliseconds, setMilliseconds] = useState<number>(0); // Set to 0 so the audio plays right away and there is no delay.
  const captions = ['Inhale', 'Hold', 'Exhale', 'Hold'];

  const playAudioFiles = () => {
    playAudio(`${stageText}_${gender}`);
  };

  useEffect(() => {
    const timeout = setTimeout(() => {
      stopAll(); // Stop all the previous audio files if they are running.
      setStageText(captions[index]);
      setIndex(index === 3 ? 0 : index + 1);
      setMilliseconds(isSafari ? 4500 : 4350); // Sets the timeout to the time of the audio files.
      playAudioFiles(); // Plays the audio files as the useEffect runs.
    }, milliseconds);
    return () => clearTimeout(timeout);
  }, [index]);

  // ... render method and everything else.
};
My app is for controlling breathing, and this is how I got past the error you are seeing. From what I have read, iOS just requires some kind of user trigger to start any media, audio or video. Routing every play through these shared play functions seems to satisfy Safari.
It may not work for you or for the way your code is structured, but since this is where we're discussing how to get around iOS's audio restrictions, this is another approach.
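The createAudio / playAudio / stopAll helpers aren't shown above; here is a minimal sketch of what they could look like, assuming a module-level Map of Audio elements keyed by name:

const audioMap = new Map();

// Create each Audio element once, up front, so later plays reuse it.
const createAudio = (name, src) => {
  audioMap.set(name, new Audio(src));
};

const playAudio = (name) => {
  const audio = audioMap.get(name);
  if (!audio) return;
  audio.currentTime = 0;
  // play() returns a promise; catching it keeps Safari's NotAllowedError
  // from showing up as an unhandled rejection.
  audio.play().catch((err) => console.warn(`Could not play ${name}:`, err));
};

const stopAll = () => {
  audioMap.forEach((audio) => audio.pause());
};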

Related

Is there any way to add lazy loading for the video tag with JavaScript?

At the moment, I am working on a project that requires me to add three videos to the homepage, but loading them all at once slows the page load considerably.
Also, I want to use the <video/> tag instead of an <iframe/> because I want the autoplay functionality.
What's the best way to do this in React? I'm using Next.js and Chakra UI.
You can use IntersectionObserver and do it as below. For React, all you have to do is add the code below in a useEffect with an empty dependency array.
const video = document.querySelector("video");

function handleIntersection(entries) {
  entries.map(async (entry) => {
    if (entry.isIntersecting) {
      const res = await fetch("/video.mp4");
      const data = await res.blob();
      video.src = URL.createObjectURL(data);
    }
  });
}

const observer = new IntersectionObserver(handleIntersection);
observer.observe(video);

<video autoplay muted loop playsinline></video>
Also I used a video with a relative path to avoid possible CORS issues.
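A sketch of how that might be wrapped in a React component, using a ref instead of querySelector and unobserving after the first fetch so the video isn't downloaded more than once:

import React, { useEffect, useRef } from "react";

function LazyVideo() {
  const videoRef = useRef(null);

  useEffect(() => {
    const video = videoRef.current;
    const observer = new IntersectionObserver(async ([entry]) => {
      if (entry.isIntersecting) {
        const res = await fetch("/video.mp4");
        video.src = URL.createObjectURL(await res.blob());
        observer.unobserve(video); // fetch only once
      }
    });
    observer.observe(video);
    return () => observer.disconnect(); // clean up on unmount
  }, []);

  return <video ref={videoRef} autoPlay muted loop playsInline />;
}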
I found a way to do it using '@react-hook/intersection-observer':
import useIntersectionObserver from '@react-hook/intersection-observer'
import { useRef } from 'react'

const LazyIframe = () => {
  const containerRef = useRef()
  const lockRef = useRef(false)
  const { isIntersecting } = useIntersectionObserver(containerRef)
  if (isIntersecting) {
    lockRef.current = true
  }
  return (
    <div ref={containerRef}>
      {lockRef.current && (
        <video
          src={"add video source here"}
          type="video/mp4"
        ></video>
      )}
    </div>
  )
}

Manage a single instance of a Next.js component separately for Lottie animations?

I am building a Next.js component called SocialMediaIcon. This component should receive a single Lottie JSON path and render a Lottie SVG; when the user hovers over the animation it should start playing, and when the mouse leaves it should reset the animation to position 0.
import { useEffect, useRef, useState } from 'react'
import { LottiePlayer } from 'lottie-web'

function SocialMediaIcon({ iconPath }) {
  const ref = useRef(null)
  const anim = useRef(null)
  const [lottie, setLottie] = useState(null)

  useEffect(() => {
    import('lottie-web').then((Lottie) => setLottie(Lottie.default))
  }, [])

  useEffect(() => {
    if (lottie && ref.current) {
      anim.current = lottie.loadAnimation({
        container: ref.current,
        renderer: 'svg',
        loop: true,
        autoplay: false,
        // path to your animation file, place it inside public folder
        animationData: iconPath,
      })
      return () => anim.current.destroy()
    }
  }, [lottie])

  const handleAnimation = () => {
    lottie.play()
  }

  const handlemouseOut = () => {
    lottie.goToAndStop(0)
  }

  return (
    <div
      ref={ref}
      className="lottie h-20 rounded-lg bg-white p-2"
      onMouseEnter={handleAnimation}
      onMouseLeave={handlemouseOut}
    ></div>
  )
}

export default SocialMediaIcon
And this is how I am rendering the component in the parent component:
<SocialMediaIcon iconPath={githubJson} />
<SocialMediaIcon iconPath={twitterJson} />
The problem I'm having is that when I hover over one component, all the components play the animation instead of just the one being hovered.
I have already tried:
1. Using an anim = useRef() hook and passing it to lottie.play(anim.current), which resulted in no animation being played at all.
2. Passing the animation to lottie.play(lottie.animation), thinking every instance has its own animation, but the animation did not play.
It would be greatly appreciated if you could help me understand this on a deeper level. I think I'm missing an important concept; I'm not sure if it's about Lottie or Next.js in general.
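For what it's worth, here is a sketch of driving the AnimationItem stored in anim.current (the instance returned by loadAnimation) instead of the module-level lottie object, so only the hovered icon is affected; this is an assumption, not a confirmed fix:

// Sketch: call play()/goToAndStop() on this instance's AnimationItem,
// not on the lottie module, so each SocialMediaIcon controls only its own animation.
const handleAnimation = () => {
  if (anim.current) anim.current.play()
}

const handlemouseOut = () => {
  if (anim.current) anim.current.goToAndStop(0, true) // reset to frame 0
}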

Unable to play a different song using setState method in ReactJs

// TuneContainer.js
import React, { useState } from 'react'
import './TuneContainer.css'

function TuneContainer(props) {
  const [isPlaying, setIsPlaying] = useState(false)
  const [isPaused, setIsPaused] = useState(true)
  const audio = document.querySelector('audio')

  const audioControls = () => {
    if (isPaused) {
      console.log(isPlaying)
      console.log(isPaused)
      setIsPlaying(!isPlaying)
      setIsPaused(!isPaused)
      console.log(isPlaying)
      console.log(isPaused)
      audio.play()
    } else {
      setIsPlaying(!isPlaying)
      setIsPaused(!isPaused)
      audio.pause()
    }
  }

  return (
    <>
      <div className="tune-container">
        <div className="info-class">
          <img src={props.imgsrc} className="tune-img" alt={props.imgalt} onClick={audioControls}></img>
          <audio src={props.audiosrc} id="tune" loop hidden></audio>
        </div>
      </div>
    </>
  )
}

export default TuneContainer
The above is the code for the container, which consists of the image; when clicked, it plays the song in an infinite loop until paused again by clicking the image. Below is the main page, which renders TuneContainer and passes it props.
// HomePage.js
import React from 'react'
import NavigationBar from './NavigationBar'
import TuneContainer from './TuneContainer'
import Bird from '../images/Bird.svg'
import BirdWhistling from '../audios/Bird-whistling.mp3'
import Leaves from '../images/Leaves.svg'
import LeavesRustling from '../audios/Rustling-leaves.mp3'

function HomePage() {
  return (
    <>
      <NavigationBar />
      <div className="container">
        <TuneContainer audiosrc={BirdWhistling} imgsrc={Bird} imgalt="Bird by Ana María Lora Macias from the Noun Project" />
        <TuneContainer audiosrc={LeavesRustling} imgsrc={Leaves} imgalt="leaves by KP Arts from the Noun Project" />
      </div>
    </>
  )
}

export default HomePage
So here, when I click on the bird image, I hear the chirping sounds, since those are the props passed. The second TuneContainer has a different image and audio altogether. However, when the leaf image is clicked, it still plays the chirping sound. So I believe the audio source is not getting updated properly. Can someone please point out where I am making a mistake?
P.S.: Before someone asks, I have checked all the routes and filenames, and no, the two audio files contain different songs.
Although I know that SO highly recommends asking only one question per post, I will just add my second question here, since it is closely related and requires no extra code.
Q: When I check the console, the values printed (by the console.log statements) are false, true, false, true. I believe it should print false, true, true, false, since I print once before the setState calls and once after them. Why this behaviour?
Because document.querySelector('audio') will always return the first audio element in the document, which in your case is the bird chirping sound.
You can use a unique identifier for each TuneContainer: put that id on your audio tag and query-select that id, which will point to the correct audio element (see the sketch after the code below).
Another way would be to use a useRef to get the audio element:
// TuneContainer.js
...
const audioRef = React.useRef(null);
/* const audio = document.querySelector('audio') */

const audioControls = () => {
  if (isPaused) {
    console.log(isPlaying)
    console.log(isPaused)
    setIsPlaying(!isPlaying)
    setIsPaused(!isPaused)
    console.log(isPlaying)
    console.log(isPaused)
    // audio.play()
    audioRef.current.play();
  } else {
    setIsPlaying(!isPlaying)
    setIsPaused(!isPaused)
    // audio.pause()
    audioRef.current.pause();
  }
};
...
...
return (
  ...
  ...
  <audio ref={audioRef} src={props.audiosrc} id="tune" loop hidden></audio>
  ...
);
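And for completeness, a rough sketch of the first suggestion (a unique id per TuneContainer), assuming a hypothetical tuneId prop passed in from HomePage (e.g. tuneId="bird", tuneId="leaves"):

import React from 'react'

// Hypothetical: each TuneContainer gets its own id derived from props.tuneId.
function TuneContainer(props) {
  const audioControls = () => {
    // Select this container's own audio element by its unique id.
    const audio = document.getElementById(`tune-${props.tuneId}`)
    audio.paused ? audio.play() : audio.pause()
  }
  return (
    <div className="tune-container">
      <img src={props.imgsrc} className="tune-img" alt={props.imgalt} onClick={audioControls}></img>
      <audio src={props.audiosrc} id={`tune-${props.tuneId}`} loop hidden></audio>
    </div>
  )
}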

Facing Problems in Converting to Functional Components in React

This is a very small two-page project, so this won't take much time.
I am trying to convert this GitHub repo from a class-based component to functional components. I am very close, but the logic is just not working properly.
The useState hook is the main problem: the values are not being saved. So I tried a different approach.
This is the expected output which is the live demo of the original project:
https://matt-eric.github.io/web-audio-fft-visualization-with-react-hooks/
And this is where I am. This is the sandbox link.
Ignore the "The error you provided does not contain a stack trace." error for now. Click on the x and refresh the small project window (not your browser tab) a couple of times until the audio plays on refresh. This is needed because Chrome blocks audio from playing on load.
I want the audio to play on the click of the button and not on load, but it is not working.
Thanks to whoever takes a look at it.
There's some cleanup needed but this is working. https://codesandbox.io/s/xenodochial-wildflower-c5b35
All I really did was put all the functions in useCallback, make audioFile a ref, and then add a toggleAudio function that either plays or pauses the audio depending on its current state. One of the biggest problems I saw was that you were trying to initialize the audio on click, but that really should be done on mount; then the audio just starts when you click. Also, if you initialize on every click it causes errors, because it's already initialized.
Let me know if you have any questions!
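A rough sketch of what that toggleAudio could look like (the linked sandbox has the full, working version; this assumes the audio element is kept in a ref):

import { useCallback, useRef } from "react";
import soundFile from "../audio/water.mp3";

// Inside the component: keep the Audio element in a ref so it survives re-renders.
const audioFile = useRef(new Audio(soundFile));

const toggleAudio = useCallback(() => {
  const audio = audioFile.current;
  if (audio.paused) {
    // play() returns a promise that can reject outside a user gesture
    audio.play().catch(console.warn);
  } else {
    audio.pause();
  }
}, []);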
Using your sandbox, I found a couple of things missing:
1. You needed to memoize your audioFile (since you create a new Audio and it never changes).
2. Your functions need to be stable, and therefore need to be React hook functions, specifically useCallback functions.
3. In your onClick handler in the demo (the start button) you called initializeAudioAnalyser, but that was already initialized by your useEffect in functionCont.jsx and doesn't need to be initialized again. Once I removed this, it all worked.
Here is the fixed up code that is now playing the audio:
functionCont.jsx:
import React, { useCallback, useEffect, useMemo, useState } from "react";
import VisualDemo from "./functionViz";
// import VisualDemo from "./VisualDemo";
import soundFile from "../audio/water.mp3";

const FunctionCont = () => {
  const audioFile = useMemo(() => {
    const audio = new Audio();
    audio.crossOrigin = "anonymous";
    return audio;
  }, []);

  const [audioData, setAudioData] = useState();
  let frequencyBandArray = [...Array(25).keys()];

  const initializeAudioAnalyser = useCallback(() => {
    const audioContext = new AudioContext();
    const source = audioContext.createMediaElementSource(audioFile);
    const analyser = audioContext.createAnalyser();
    audioFile.src = soundFile;
    analyser.fftSize = 64;
    source.connect(audioContext.destination);
    source.connect(analyser);
    setAudioData(analyser);
    audioFile.play();
  }, [audioFile]);

  useEffect(() => {
    initializeAudioAnalyser();
  }, [initializeAudioAnalyser]);

  const getFrequencyData = useCallback((styleAdjuster) => {
    const bufferLength = audioData.frequencyBinCount;
    const amplitudeArray = new Uint8Array(bufferLength);
    audioData.getByteFrequencyData(amplitudeArray);
    styleAdjuster(amplitudeArray);
  }, [audioData]);

  return (
    <div>
      <VisualDemo
        frequencyBandArray={frequencyBandArray}
        getFrequencyData={getFrequencyData}
        // audioData={audioData}
        audioFile={audioFile}
      />
    </div>
  );
};

export default FunctionCont;
functionViz.jsx
import React, { useRef, useEffect, useState } from "react";
import Paper from "@material-ui/core/Paper";
import IconButton from "@material-ui/core/IconButton";
import Tooltip from "@material-ui/core/Tooltip";
import EqualizerIcon from "@material-ui/icons/Equalizer";
import { makeStyles } from "@material-ui/core/styles";
import "../stylesheets/App.scss";

const useStyles = makeStyles((theme) => ({
  flexContainer: {
    display: "flex",
    flexWrap: "wrap",
    justifyContent: "center",
    paddingTop: "25%"
  }
}));

const VisualDemo = (props) => {
  const classes = useStyles();
  const amplitudeValues = useRef(null);

  function adjustFreqBandStyle(newAmplitudeData) {
    amplitudeValues.current = newAmplitudeData;
    let domElements = props.frequencyBandArray.map((num) =>
      document.getElementById(num)
    );
    for (let i = 0; i < props.frequencyBandArray.length; i++) {
      let num = props.frequencyBandArray[i];
      domElements[num].style.backgroundColor = `rgb(0, 255, ${amplitudeValues.current[num]})`;
      domElements[num].style.height = `${amplitudeValues.current[num]}px`;
    }
  }

  function runSpectrum() {
    props.getFrequencyData(adjustFreqBandStyle);
    requestAnimationFrame(runSpectrum);
  }

  function handleStartButtonClick() {
    requestAnimationFrame(runSpectrum);
  }

  return (
    <div>
      <div>
        <Tooltip title="Start" aria-label="Start" placement="right">
          <IconButton
            id="startButton"
            onClick={handleStartButtonClick}
            // disabled={!!props.audioData ? true : false}
          >
            <EqualizerIcon />
          </IconButton>
        </Tooltip>
      </div>
      <div className={classes.flexContainer}>
        {props.frequencyBandArray.map((num) => (
          <Paper
            className={"frequencyBands"}
            elevation={4}
            id={num}
            key={num}
          />
        ))}
      </div>
    </div>
  );
};

export default VisualDemo;

Detect date changes and send notification at user-set time in React Native

I am writing a reminder App and I am faced with the challenge of actually detecting date and time changes without the component unmounting and re-mounting.
I want to be able to send a push notification when the date and time is the same as specified by the user.
I have done a couple of Google and YouTube searches, but it seems no one really has an article or video on this.
The first solution: you can store the data on a server and send the notification from the server.
The second solution: you can use Headless JS in React Native to run a background process while your app is in the background. You can read about Headless JS at the link below:
https://facebook.github.io/react-native/docs/0.60/headless-js-android
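For the second option, a minimal sketch of registering a headless task on the JavaScript side (the Android native setup is described in the linked docs); the task name 'ReminderCheck' and the notification logic are placeholders:

// index.js
import { AppRegistry } from 'react-native';

// Placeholder task: compare the stored reminder time with the current time
// and fire a local notification here using whichever notification library you use.
const reminderTask = async (taskData) => {
  console.log('Headless task ran with', taskData);
};

AppRegistry.registerHeadlessTask('ReminderCheck', () => reminderTask);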
You can use the Luxon library to get the difference between the reminder time and now. Then you call setTimeout with the milliseconds value from the diff, which synchronizes the timer with the system clock. See my code below:
import React, { useEffect, useState } from 'react';
import { View } from 'react-native';
import { DateTime } from 'luxon';

const getDateTimeDiffFromNow = (
  finishSeconds: number,
): {
  minutes: number,
  milliseconds: number,
  expired?: boolean,
} => {
  const now = DateTime.local();
  if (now.toSeconds() >= finishSeconds) {
    return { minutes: 0, milliseconds: 0, expired: true };
  }
  const finish = DateTime.fromSeconds(finishSeconds);
  return finish
    .diff(now, ['minutes', 'milliseconds'])
    .toObject();
};

const DateTimer = ({ finishSeconds }: Props) => {
  const [diff, setDiff] = useState(() =>
    getDateTimeDiffFromNow(finishSeconds),
  );

  useEffect(() => {
    if (diff.expired) {
      return;
    }
    const timeoutId = setTimeout(
      () => setDiff(getDateTimeDiffFromNow(finishSeconds)),
      diff.milliseconds,
    );
    return () => clearTimeout(timeoutId);
  }, [diff, finishSeconds]);

  return (<View />);
};
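A hypothetical usage, passing the reminder time as a Unix timestamp in seconds:

// Hypothetical: render the timer for a reminder set 10 minutes from now.
const finishSeconds = DateTime.local().plus({ minutes: 10 }).toSeconds();
<DateTimer finishSeconds={finishSeconds} />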
