Play local audio wav file with react-app from docker - javascript

I'm trying to use react-player to play local wav files.
I'm not sure of the following.
Does the player support wav files?
When I'm running the react-app through docker what should I put in the url prop?
Docker-compose volume setup
volumes:
  - /Users/user/app/sounds:/app/audioExport
My app is running on http://localhost:3000
I tried the following
var path = "http://localhost:3000/app/audioExport/sound.wav"
var path = "/app/audioExport/sound.wav"
var path = "/Users/user/app/sounds/sound.wav"
I tried the static method ReactPlayer.canPlay(path)
I get "Invalid URI. Load of media resource failed."
In the console I get an empty src prop
<audio src="" style="width: 100%; height: 100%;" preload="auto" controls="" crossorigin="true"></audio>
Here is the component
<ReactPlayer
  className='react-player fixed-bottom'
  controls={true}
  width='100%'
  height='100%'
  type="audio/wav"
  forceAudio={true}
  config={{
    file: {
      attributes: {
        crossOrigin: "true",
      }
    }
  }}
  url={path}
/>

For people who might stumble upon this question: I learned that one cannot play local files from a React app without importing the sounds into the app. This is for security reasons; I was able to figure that out because hosted sounds played fine while local files gave an error. This led me to create an API endpoint which streams the wav file to the React app.
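For reference, here is a minimal sketch of what such an endpoint can look like, assuming an Express server running next to the React app and the /app/audioExport directory from the docker-compose volume above; the route name and port are placeholders, not the exact code I used.

// Minimal sketch: an Express endpoint that streams a wav file to the React app.
const express = require('express');
const fs = require('fs');
const path = require('path');

const app = express();
const AUDIO_DIR = '/app/audioExport'; // the docker-compose volume mount

app.get('/api/audio/:name', (req, res) => {
  // basename() keeps the request from escaping the audio directory
  const filePath = path.join(AUDIO_DIR, path.basename(req.params.name));
  if (!fs.existsSync(filePath)) {
    return res.sendStatus(404);
  }
  res.set('Content-Type', 'audio/wav');
  fs.createReadStream(filePath).pipe(res);
});

app.listen(4000);

With this in place, the player's url prop points at the endpoint instead of a filesystem path, e.g. "http://localhost:4000/api/audio/sound.wav".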

Related

Streaming over Janus using Vue.js

I've run Janus on a server and it works fine, but I'm struggling to find a way to stream to the Janus server. I could not find any sample code out there. I am developing with Vue.js; is there any library to stream over Janus? I just need a sample.
Janus Gateway has a streaming plugin. You need to enable the plugin and use some software like gstreamer or ffmpeg to transfer video data to the streaming plugin.
This Q&A shows how to do it on a Raspberry Pi.
https://superuser.com/questions/1458442/h264-stream-to-janus-using-ffmpeg-on-raspberry-pi
A short summary is below.
Set up Janus gateway streaming plugin
The streaming plugin configuration file is /opt/janus/etc/janus/janus.plugin.streaming.jcfg. (official documentation)
You can find several sample configurations there. For example,
rtp-sample receives VP8/Opus video streaming data via RTP.
If you want to stream H.264 video, you can edit the configuration to add this setting.
h264-sample: {
    type = "rtp"
    id = 10
    description = "H.264 live stream"
    audio = false
    video = true
    videoport = 8004
    videopt = 126
    videortpmap = "H264/90000"
    videofmtp = "profile-level-id=42e01f;packetization-mode=1"
    secret = "somesecretid"
}
After editing the configuration, you need to restart Janus gateway.
Start video streaming
You can send video data to the Janus streaming plugin via RTP. For example, if you have FFmpeg, you can do something like this.
$ ffmpeg \
-f v4l2 -thread_queue_size 8192 -input_format yuyv422 \
-video_size 1280x720 -framerate 10 -i /dev/video0 \
-c:v h264_omx -profile:v baseline -b:v 1M -bf 0 \
-flags:v +global_header -bsf:v "dump_extra=freq=keyframe" \
-max_delay 0 -an -bufsize 1M -vsync 1 -g 10 \
-f rtp rtp://127.0.0.1:8004/
This command reads video data from /dev/video0 and sends it as an H.264 RTP stream to localhost.
Please note that the video parameters and the output RTP port number (8004 in the example above) must correspond to the Janus streaming plugin configuration.
Prepare Vue.js frontend
The next step is the frontend. You can create a web frontend to view the streaming using janus.js, which is bundled with Janus gateway. As described in the official documentation, you can use janus.js as a JavaScript module. But when you want to use it from Vue.js, you will need exports-loader.
For example, you can create a Vue.js 2 project and add janus.js like this.
$ vue create mystreaming
$ cd mystreaming
$ yarn add git://github.com/meetecho/janus-gateway.git
$ yarn add exports-loader --dev
To add the Webpack configuration, create a vue.config.js file with the following content.
const webpack = require('webpack')

module.exports = {
  configureWebpack: {
    plugins: [
      new webpack.ProvidePlugin({ adapter: 'webrtc-adapter' })
    ],
    module: {
      rules: [
        {
          test: require.resolve('janus-gateway'),
          loader: 'exports-loader',
          options: {
            exports: 'Janus',
          },
        }
      ]
    }
  }
}
Then, you can import the Janus object in your Vue.js module like this.
import { Janus } from 'janus-gateway';
Then you can use janus.js and receive the streaming video through its API.
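As a rough sketch (not taken from the example project mentioned below), attaching to the streaming plugin with the classic janus.js API looks roughly like this; the server URL is a placeholder and the stream id matches the h264-sample mountpoint configured above. Newer janus.js releases replace onremotestream with onremotetrack, so adjust for your version.

import { Janus } from 'janus-gateway';

// Placeholder server URL; the stream id matches the mountpoint configured above.
const JANUS_URL = 'http://your-janus-host:8088/janus';
const STREAM_ID = 10;

export function watchStream(videoElement) {
  Janus.init({
    debug: 'all',
    callback: function () {
      const janus = new Janus({
        server: JANUS_URL,
        success: function () {
          let streaming = null;
          janus.attach({
            plugin: 'janus.plugin.streaming',
            success: function (handle) {
              streaming = handle;
              // Ask the plugin to start sending the mountpoint's stream.
              streaming.send({ message: { request: 'watch', id: STREAM_ID } });
            },
            onmessage: function (msg, jsep) {
              if (jsep) {
                // Answer the plugin's offer; we only receive media.
                streaming.createAnswer({
                  jsep: jsep,
                  media: { audioSend: false, videoSend: false },
                  success: function (answer) {
                    streaming.send({ message: { request: 'start' }, jsep: answer });
                  },
                  error: function (err) { Janus.error('createAnswer error', err); }
                });
              }
            },
            onremotestream: function (stream) {
              // Hook the incoming MediaStream up to a <video> element.
              Janus.attachMediaStream(videoElement, stream);
            },
            error: function (err) { Janus.error('attach error', err); }
          });
        },
        error: function (err) { Janus.error(err); }
      });
    }
  });
}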
I uploaded an example Vue.js project, which might also help you.

Video is not showing in page when linked locally but shows up when the source is a URL

So I was practicing some basic stuff on a cloud IDE called Goorm and I wanted to create a simple page showing a video I uploaded directly to the server. The ejs file and the video are in the same folder; however, if I set the src of the video tag to "media/dog.webm" the video does not show up and the page is just blank, but if I set the source to an online URL where the video is uploaded and can be accessed, it loads.
<div style="text-align:center">
  <h1>
    DOG
  </h1>
  <video autoplay id="video1" width="420">
    <source src="https://somewebsite.com/dog.webm" type="video/webm">
    Your browser shall not play this video
  </video>
</div>
This code works, but I have almost no idea why "media/dog.webm" or "dog.webm" do not work as the src. I've tried setting the source as the complete directory, in the home folder, and in the same directory as the .ejs file and the .js file. I believe that somehow, when I run "node index.js", it has no access to media content, so it cannot load locally stored media into the page made from the .ejs file.
The code I was expecting to work is below; it works if I put it into an HTML file on my machine, but doesn't when Goorm runs it from the ejs. I'm 99% sure it's just a small thing I am missing, but I can't find a solution anywhere.
<div style="text-align:center">
  <h1>
    DOG
  </h1>
  <video autoplay id="video1" width="420">
    <source src="media/dog.webm" type="video/webm">
    Your browser shall not play this video
  </video>
</div>
Some more info: other simple ejs pages that do not load media files show up without problems. I am running on port 3000. EJS version is 3.1.2, Express version is 4.17.1, Node.js version is v10.16.3, Ubuntu 18.04.2 LTS x86_64.
Edit: adding below the .js file I am running with Node.js.
let express = require("express")
let app = express()

app.get("/", function(req, res){
  res.render("home.ejs")
})

app.get("/ovo/:coisa", function(req, res){
  let ags = req.params.coisa
  res.render("ogs.ejs", {coisaVar: ags})
})

app.get("/dog", function(req, res){
  res.render("dog.ejs")
})

app.listen(3000, function(){
  console.log("Server has started!")
})
You can't serve your static files (images, fonts, videos) just like that. You are using a server (Node.js); the browser requests the static files referenced in the HTML and your server has to serve them individually.
So, as a first step, create a public folder in the root of your project and move your video into it.
Now configure Express to serve whatever is in your public directory.
let express = require("express")
let app = express()

// this is the line express is waiting for to serve your video !!!
app.use(express.static('public'))

app.get("/", function(req, res){
  res.render("home.ejs")
})

app.get("/ovo/:coisa", function(req, res){
  let ags = req.params.coisa
  res.render("ogs.ejs", {coisaVar: ags})
})

app.get("/dog", function(req, res){
  res.render("dog.ejs")
})

app.listen(3000, function(){
  console.log("Server has started!")
})
This is the doc you should take a look at: https://expressjs.com/en/starter/static-files.html
All good, now try to access your video as /media/dog.webm in your HTML. Let me know how it goes. Good luck!
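For example, assuming the file now lives at public/media/dog.webm, the source tag in dog.ejs would become:

<video autoplay id="video1" width="420">
  <source src="/media/dog.webm" type="video/webm">
  Your browser shall not play this video
</video>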
Try the relative path "./media/", otherwise it will try to go to the document root.
<div style="text-align:center">
  <h1>
    DOG
  </h1>
  <video autoplay id="video1" width="420">
    <source src="./media/dog.webm" type="video/webm">
    Your browser shall not play this video
  </video>
</div>

How to download files and store them to s3 by nodejs (through heroku)?

I want to know how to download files and store them in AWS S3 through Heroku, without using a web page.
I'm using wget in Node.js to download audio files, and I want to re-encode them with ffmpeg. I have to store the file first, but Heroku doesn't provide space to store it.
Then I found that Heroku can connect with AWS S3; however, the sample code is a page for the user to upload, and I just want to do it from code.
This is the code that hasn't connected with S3 yet.
const wget = spawn('wget', ['-O', 'upload_audio/' + sender_psid + '.aac', audioUrl]);

// spit stdout to screen
wget.stdout.on('data', function (data) { process.stdout.write(data.toString()); });

// spit stderr to screen
wget.stderr.on('data', function (data) { process.stdout.write(data.toString()); });

wget.on('close', function (code) {
  console.log(process.env.PATH);
  const ffmpeg = spawn('ffmpeg', ['-i', 'upload_audio/' + sender_psid + '.aac', '-ar', '16000', 'upload_audio/' + sender_psid + '.wav', '-y']);
  ...
});
And I get this error:
upload_audio/1847432728691358.aac: No such file or directory
...
Error: ENOENT: no such file or directory, open 'upload_audio/1847432728691358.wav'
How can I solve the problem?
Could anyone give some suggestions, please?
Or maybe it can be done without using S3?
I don't recommend using S3 until you've tried the NodeJS Static Files API. On the server, that looks like this:
app.use(express.static(path.join(__dirname, 'my_audio_files_folder')));
More on NodeJS's Static Files API available here.
AWS S3 seems to add unneeded complexity, but if you have your heart set on it, have a look at this thread.
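For completeness, a minimal sketch of the S3 route, assuming the aws-sdk v2 package and a bucket name supplied through a Heroku config var (S3_BUCKET is a made-up name here), could look like this once ffmpeg has finished re-encoding:

// Minimal sketch: upload the re-encoded wav to S3.
// S3_BUCKET and the AWS credentials are assumed to be Heroku config vars.
const fs = require('fs');
const AWS = require('aws-sdk');

const s3 = new AWS.S3(); // reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment

function uploadWav(localPath, key) {
  return s3.upload({
    Bucket: process.env.S3_BUCKET,
    Key: key, // e.g. 'audio/' + sender_psid + '.wav'
    Body: fs.createReadStream(localPath),
    ContentType: 'audio/wav'
  }).promise();
}

// Call this from the ffmpeg 'close' handler in the question's code:
// uploadWav('upload_audio/' + sender_psid + '.wav', sender_psid + '.wav')
//   .then(function (result) { console.log('Uploaded to', result.Location); })
//   .catch(console.error);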

meteor android app won't connect to server

I have a multiplayer bingo game which I made, and I'm hosting it on Modulus. After uploading the project to Modulus and installing their MongoDB for my project, the multiplayer game works well in the browser. BUT when I tried to meteor build the game and install the resulting Android .apk file on my Meizu device, the installation was aborted with a parse error, so I decided to import the whole project into Android Studio and build the apk file from there. After installing the apk from Android Studio the application opens up, but it seems that it has no connection to the Modulus hosting server / Modulus DB. Can anyone help and point out the mistakes I made? The build command was:
meteor build ../pathToBuild --server serverAdress
any help?
Well, I solved this by hard-coding the server's URL inside the code:
var theURL = "http://myServer";
if (process.env.NODE_ENV === "development") {
  // home
  theURL = "http://myServer";
  // office
  //theURL = "http://192.168.10.30:3000";
}

Meteor.absoluteUrl.defaultOptions.rootUrl = theURL;
process.env.ROOT_URL = theURL;
process.env.MOBILE_ROOT_URL = theURL;
process.env.MOBILE_DDP_URL = theURL;
process.env.DDP_DEFAULT_CONNECTION_URL = theURL;
did the trick for me
This is generally to do with the way the server is started rather than the app. Try specifying --mobile-server with the same address you used to build it when running the server you want the app to connect to:
meteor --mobile-server http://app.server:port
If this works, or if you don't run the app with the meteor command directly, then you can set the DDP_DEFAULT_CONNECTION_URL environment variable in the server directory you are running from, to avoid having to specify it each time you run the app.
Alternatively, I think that if you upgrade to Meteor 1.3 and rebuild/deploy, this bug should disappear.
More info is in this thread; it's pretty long and there are a few other things in there that can affect this, but this method works for me for my Cordova Android app.

node.js express 4.13.3 serveStatic not serving mp3 or ogg files

I'm using Express 4.13.3 and the serve-static npm library. It serves static assets fine EXCEPT those with mp3 or ogg extensions. Looking at the documentation I haven't found anything to indicate that this is configurable, and others appear to be loading audio and video without issues on earlier versions of Express.
The directory is simple:
/public/assets/image.jpg   works ok
/public/assets/audio.mp3   404?!
/public/assets/audio.ogg   404?!
The code is simple:
const express = require('express');
const serveStatic = require('serve-static');

const app = express();
app.use(serveStatic(__dirname + '/public'));
app.use('/', router);
I can use node.js to return files but that seems a poor choice given the number of audio files. Any idea on what might be wrong?
Doh, it was actually a load balancer in front of the Node server not recognizing the static file extensions, so adding .mp3 and .ogg to HAProxy's configuration fixed it:
acl url_static path_end -i .jpg .gif .png .css .js .html .ico .mp3 .ogg
