I'm just starting with React Native. I need to play a song inside a WebView and be able to control it (play/stop) even when the screen is locked.
Does anyone have any ideas, or has anyone worked on this before?
Yes, this is possible.
We have a React Native app where users can edit their profile via a WebView.
Profiles include a media section with a custom-built audio player that uses the Web Audio API (https://developer.mozilla.org/en-US/docs/Web/API/Web_Audio_API). All controls, including play and stop, work as expected.
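For illustration, a stripped-down sketch of that kind of in-page player looks roughly like this (the URL and function names are placeholders, not our actual code):

const ctx = new AudioContext();
let source;

async function play(url) {
  // Browsers often suspend the context until a user gesture.
  if (ctx.state === 'suspended') await ctx.resume();
  // Fetch and decode the track, then start playback.
  const response = await fetch(url);
  const arrayBuffer = await response.arrayBuffer();
  const audioBuffer = await ctx.decodeAudioData(arrayBuffer);
  source = ctx.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(ctx.destination);
  source.start();
}

function stop() {
  if (source) source.stop();
}

// play('https://example.com/song.mp3');   // placeholder URL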
On the other hand, I don't think you can control the WebView when the screen is locked, though this part is completely untested. You will probably need a React Native audio player to achieve that.
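If you end up going the native route, one library commonly used for background and lock-screen playback is react-native-track-player. A rough sketch, with the API written from memory and made-up track data, so double-check it against the library's docs:

import TrackPlayer from 'react-native-track-player';

async function startPlayback() {
  // Initialise the player once, then queue and play a track.
  await TrackPlayer.setupPlayer();
  await TrackPlayer.add({
    url: 'https://example.com/song.mp3',   // placeholder track data
    title: 'Example Song',
    artist: 'Example Artist',
  });
  await TrackPlayer.play();
}

async function stopPlayback() {
  await TrackPlayer.pause();
}

The library also expects a playback service to be registered (see its documentation) so that audio keeps running in the background and lock-screen controls appear.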
Related
I am making a web app that uses the Web Speech API for text-to-speech. The problem I am facing is that whenever the screen goes off or the browser window is switched, particularly on mobile devices, the speech synthesis just stops. When the text to be read takes a long time, the device screen times out and the reading progress is lost, which is really bad UX.
The basic synthesis controller is created by const synth = window.speechSynthesis; and is attached to window.
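For illustration, the basic speaking pattern is roughly:

const synth = window.speechSynthesis;

function speak(text) {
  // Queue an utterance on the window-level synthesiser.
  const utterance = new SpeechSynthesisUtterance(text);
  utterance.onend = () => console.log('Finished speaking');
  synth.speak(utterance);
}

speak('Hello, this is a test.');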
I have worked around this with a dirty hack: keeping the screen on using NoSleep.js, which essentially plays a video in the background to keep the device awake.
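The workaround looks roughly like this (NoSleep.js must be enabled from a user gesture; the speakBtn id is just illustrative):

// Assumes NoSleep.js is already loaded on the page.
const noSleep = new NoSleep();

// NoSleep has to be enabled from a user gesture, e.g. a "speak" button.
document.getElementById('speakBtn').addEventListener('click', () => {
  noSleep.enable();   // keeps the screen awake while speech is playing
});

function onSpeechFinished() {
  noSleep.disable();  // let the screen sleep again
}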
However, I noticed that some music players can play audio when the browser is minimised and even when the screen is turned off. Example: wynk.in
Is it possible to achieve the same in my case? Any input on how it is done in music apps? Would attaching to anything other than window help?
Link to my WebApp: https://yakshag.github.io/tts.html
Link to my JS Script: https://yakshag.github.io/js/tts.js
PS: I am a beginner in JavaScript :p
I want to make a webpage that streams my phone's camera to the page so that I can add a filter on top while it records. Sort of like Snapchat, but in the browser instead of an app.
So far, all I can find is the option to record a video and then display it afterwards in the browser. However, I am looking for a live-stream option.
Is there any way to implement this?
You can use the HTML5 getUserMedia API; here's a tutorial:
https://developer.mozilla.org/en-US/docs/Web/API/MediaDevices/getUserMedia
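A minimal sketch of wiring the live camera stream into a video element (the element id is illustrative):

// Request the rear camera and pipe the live stream into a <video> element.
const video = document.getElementById('preview');   // e.g. <video autoplay playsinline>

navigator.mediaDevices.getUserMedia({ video: { facingMode: 'environment' }, audio: false })
  .then((stream) => {
    video.srcObject = stream;
  })
  .catch((err) => {
    console.error('Camera access denied or unavailable:', err);
  });

From there you can draw the video frames onto a canvas to layer a filter over the live feed.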
Situation: I have to design a webpage that works on both desktop and mobile devices and plays an embedded YouTube video (not autostarted), which I can then interact with to get the play state etc. (I use this to count the total time watched, excluding pause/buffer/stop time).
I've managed to build the website using both the IFrame API and the JavaScript API from YouTube.
However, the problem is: with the JavaScript API, the player works on desktop and the view gets counted too, but it doesn't work when I view the page on a mobile device. The place where it should show the video (the div tag) doesn't react.
With the IFrame API, everything works in terms of functionality, but when I press play on the video the views do not get counted, whether accessed from desktop or mobile. I've tested the view count several times, with different IPs etc., but while the JavaScript API page gets counted instantly, the IFrame API page is still not getting counted.
Does anyone have any suggestion to this problem?
Just to recap, the crucial aspects of the webpage that I need are:
Embedded YouTube player
Being able to interact with the player (e.g. getState() or getDuration())
Website fully functional with mobile access
Valid view count
Thank you all in advance!! :)
Are you sure that the views are really being counted via the JavaScript API? I have always read that view counts are only considered valid when playback is initiated using the actual YT play button, meaning you can't initiate play via an API call and have the view counted.
Here is a somewhat dated post from the Google/YouTube team explaining this - https://groups.google.com/forum/#!msg/youtube-api-gdata/7SsbvOJMWL0/rBCBqnFaxRgJ
I'm trying to find something that says this has changed but so far no luck.
EDIT:
So I went back to the IFrame and JavaScript API pages, and if you do a search for 'view count' the first piece of content you'll find is a line stating:
"Note: A playback only counts toward a video's official view count if it is initiated via a native play button in the player."
Re-reading your post, I'm not sure whether you are using the APIs to play the video or just to get data about its current state. If you are using them to actually initiate playback, then the views are not being counted. If you are seeing something different, I'd love to see it and learn from it.
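For what it's worth, here is a rough sketch of using the IFrame API only to observe state, so the user still initiates playback with the native play button (the element id, video id and variable names are illustrative):

let player;
let watched = 0;      // accumulated seconds of actual playback
let lastTick = null;

// Called automatically by the IFrame API once it has loaded.
function onYouTubeIframeAPIReady() {
  player = new YT.Player('player', {          // <div id="player"> placeholder
    videoId: 'VIDEO_ID',                      // placeholder video id
    events: { onStateChange: onStateChange },
  });
}

function onStateChange(event) {
  if (event.data === YT.PlayerState.PLAYING) {
    lastTick = player.getCurrentTime();
  } else if (lastTick !== null) {
    // Paused, buffering or ended: add the elapsed playback time.
    watched += player.getCurrentTime() - lastTick;
    lastTick = null;
  }
}

This is only approximate (it doesn't handle seeking), but it keeps playback initiation on the native button, which is what the view-count rule seems to require.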
I have a webpage that "plays a video" using sprite sheets. The page is mobile-optimized, so it can get loaded into Android and iOS WebViews. What I'd like to know is when the page is visible, so that I only start playing the video after that. I don't want users to catch the video mid-stream because the WebView lags in presenting itself.
I can see that some developers might wait until the page has finished pulling in all of its assets before making it visible to the user, so I don't want the "video" to start before that time. I can't rely on window.onload because that event fires even when the WebView isn't onscreen or visible.
How can I accomplish this from the client side, with some JavaScript, preferably?
[Edit] To be clear, I don't have any control over the native WebView. You can load web pages into a WebView that isn't onscreen and push the view or add it to the on-screen layout at a later time. My issue is that when my webpage's URL is loaded into a WebView, I can't tell when the WebView comes onscreen.
Take a look at the Safari Web Content Guide. Scroll down to the Supported Events table. I am thinking (or hoping) that the pageshow event will do what you are hoping for. There is also the focus event.
Looks like using these events for mobile Safari would be as easy as
<body onpageshow="onPageShow();">
I am less familiar with Android, but I will look into it real quick.
EDIT: The onpageshow solution should work the same way in Android 2.2 and above as it does in iOS 4.0 and above. As for whether it works the way you need it to, I am not entirely sure. Let me know!
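Something along these lines should work if you prefer addEventListener over the inline attribute (startVideo stands in for your own playback function):

// Start the sprite-sheet "video" only once the page is actually shown/focused.
let started = false;

function startOnce() {
  if (!started) {
    started = true;
    startVideo();   // stand-in for your existing playback function
  }
}

window.addEventListener('pageshow', startOnce);
window.addEventListener('focus', startOnce);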
It is not possible to control the WebView itself using JavaScript. If it's not too late to change the design of the app, using native APIs will give you more control over the WebView.
You could insert a timeout in the webpage before loading the video. It might be worth a shot.
You can use the PhoneGap library:
document.addEventListener("deviceready", onDeviceReady, false);
function onDeviceReady() {
// Now safe to use the PhoneGap API
}
PhoneGap is very good for handling events and other actions in a WebView.
I am trying to integrate a webpage into an iPad application, something like the Final Hour app, which is a native app built around slideshows; some of the slideshow pages have a small section that is loaded from a website.
The only way I would have known this is that, when I didn't have internet access, the small area that the website loads into said there was no internet connection.
How can I implement something like this? I understand the website aspect, but I don't understand what iOS API they use to set up some sort of canvas or frame to hold the website.
Here is an image of the app. The webpage would be loaded into the "blue" box outline.
You are indeed looking for a UIWebView. You need to build the controls yourself if you want your users to navigate within the website. If you just want to show one page, with no Back or Reload button, then you can use it as is.
Note that you should try to make it clear to your users that your app might not work properly if no Internet connection is available.