How can I send a livestream to YouTube using Node.js (JavaScript)?

You may know the software called OBS Studio: knowing only the live stream key, it sends a live stream to YouTube.
I assume it is using some kind of YouTube API to do that. If that's the case, what is that API? Can I use it with Node.js?

Google offers an API that is quite rich in features: the YouTube Live Streaming API.
You may begin your journey by reading the official getting started doc: YouTube Live Streaming API Overview. Then I recommend absorbing these two important documents: Life of a Broadcast and Understanding Broadcasts and Streams.
Depending on the type of application you intend to develop (desktop app or server-side web app), you need to get acquainted with the OAuth 2.0 authentication/authorization flows (since all the endpoints of this API require OAuth): OAuth 2.0 Flow: Installed apps or OAuth 2.0 Flow: Server-side web apps.
As for Node.js, Google has made a client library available: the Google API Client Library for Node.js (alpha), as well as some Node.js sample code that, unfortunately, does not yet include programs exercising the Live Streaming API.
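To give an idea of what this looks like in practice, here is a minimal sketch using that client library to create a broadcast and a stream and bind them together (the equivalent of what OBS needs a stream key for). It assumes an already-authorized OAuth2 client; the titles and CDN settings are placeholders:

```javascript
// Sketch: create and bind a broadcast + stream with the googleapis
// client (npm install googleapis). `auth` is assumed to be an
// already-authorized OAuth2 client.
const { google } = require('googleapis');

async function createLiveBroadcast(auth) {
  const youtube = google.youtube({ version: 'v3', auth });

  // 1. Create the broadcast: the YouTube "event" that viewers watch.
  const broadcast = await youtube.liveBroadcasts.insert({
    part: ['snippet', 'status', 'contentDetails'],
    requestBody: {
      snippet: {
        title: 'My Node.js livestream', // placeholder title
        scheduledStartTime: new Date(Date.now() + 60000).toISOString(),
      },
      status: { privacyStatus: 'unlisted' },
    },
  });

  // 2. Create the stream: the ingestion point that carries the video
  //    (this is where the OBS-style stream key comes from).
  const stream = await youtube.liveStreams.insert({
    part: ['snippet', 'cdn'],
    requestBody: {
      snippet: { title: 'Node.js ingest' },
      cdn: { frameRate: '30fps', resolution: '720p', ingestionType: 'rtmp' },
    },
  });

  // 3. Bind the two together.
  await youtube.liveBroadcasts.bind({
    id: broadcast.data.id,
    part: ['id', 'contentDetails'],
    streamId: stream.data.id,
  });

  // The RTMP URL + stream key you would feed to an encoder:
  console.log(stream.data.cdn.ingestionInfo.ingestionAddress);
  console.log(stream.data.cdn.ingestionInfo.streamName);
}
```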

There is the YouTube Live Streaming API. It is an HTTP-based API, so you will be able to access it from Node.js, as well as from basically any other programming language capable of making HTTP requests.
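Because it is plain HTTP, a single authorized request is enough to talk to it. A sketch of listing your broadcasts with nothing but fetch; the access token is assumed to come from a prior OAuth 2.0 flow:

```javascript
// Plain HTTP call to the Live Streaming API; `accessToken` is an
// assumption (obtained via an OAuth 2.0 flow beforehand).
async function listBroadcasts(accessToken) {
  const res = await fetch(
    'https://www.googleapis.com/youtube/v3/liveBroadcasts?part=snippet,status&mine=true',
    { headers: { Authorization: `Bearer ${accessToken}` } }
  );
  const body = await res.json();
  console.log(body.items);
}
```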


Connecting WebSockets from browser to other applications

Presentation
My application is split into 3 parts:
- a C# .NET 5.0 desktop application that harvests and distributes data;
- an Angular 10 static application that controls the C# app (start/stop) and displays the data as charts for the user;
- an Azure Functions REST API used as a backend for the Angular static application (.NET Core 3.1).
My goal is to have a WebSocket used for "real time" communication between the C# and Angular apps instead of relying on HTTP requests to the Azure Functions REST API.
I don't use a "classic" server like Express.js; Azure Functions replaces it fully.
Because of that, I'm using the Azure Web PubSub service to host the sockets.
Problem: I cannot connect the WebSocket I create from Angular (the browser application).
The problem doesn't come from Web PubSub itself, because the WebSockets are fully working (sending & receiving) between my C# app, Azure Functions, and even some Node.js test scripts.
Also, the WebSocket created in my browser app can communicate with itself, but no other app gets the messages sent from it.
Question: Can I communicate from the browser to the other apps in real time (WebSocket-based) with my current architecture, and if so, how?
The Azure Web PubSub documentation page has no examples of what I'm trying to achieve, and neither does searching the internet.
An Express.js server linked to the app could be used as a middleman between the browser app and the C# app, but I want to avoid that if possible.
I just want to know if what I'm trying to do is possible.
Also, the app may be redone in React.js, so non-Angular-specific answers are preferred.
The Web PubSub logstream sample (https://learn.microsoft.com/azure/azure-web-pubsub/tutorial-subprotocol?tabs=javascript) sounds like a similar scenario.
Your Angular web app is similar to the logstream's web application, which joins the stream group.
Your C# .NET 5.0 desktop application is similar to the logstream's message publisher application (https://learn.microsoft.com/azure/azure-web-pubsub/tutorial-subprotocol?tabs=csharp#publish-messages-from-client), which sends messages to the stream group.
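Applied to your setup, the browser side of that tutorial boils down to connecting with the json.webpubsub.azure.v1 subprotocol and joining the same group the C# app publishes to. A sketch, where the negotiate path and group name are placeholders, and the client access token your Function returns must grant the joinLeaveGroup and sendToGroup roles:

```javascript
// Browser client for Azure Web PubSub using the subprotocol that
// allows joining groups and publishing to them directly.
async function connect() {
  // Your Azure Function generates this URL with the Web PubSub
  // server SDK; '/api/negotiate' is a placeholder path.
  const { url } = await (await fetch('/api/negotiate')).json();

  const ws = new WebSocket(url, 'json.webpubsub.azure.v1');

  ws.onopen = () => {
    // Join the same group the C# desktop app publishes to.
    ws.send(JSON.stringify({ type: 'joinGroup', group: 'stream' }));
  };

  ws.onmessage = (event) => {
    const msg = JSON.parse(event.data);
    if (msg.type === 'message' && msg.group === 'stream') {
      console.log('from group:', msg.data);
    }
  };

  // Publish to the group so the other apps (not just this socket)
  // receive the message.
  const publish = (data) =>
    ws.send(JSON.stringify({ type: 'sendToGroup', group: 'stream', data }));

  return { ws, publish };
}
```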

How can I use Google Speech API with streaming from a web app?

I am currently trying to use the Google Speech API to do live speech-to-text transcription in a web application. In order to do that I have to use the streaming recognition RPC (gRPC). I know there are multiple client libraries, but none of them gives the possibility to stream the audio directly from the web app to the Google Speech API, and there are no plain JavaScript libraries.
I also know it is probably possible to do this by setting up a WebSocket connection between the front-end and the backend and then, in my case, using the Node.js client library to stream to the Google Speech API. However, this seems unnecessarily complex.
Is there really no supported way of using the streaming recognition directly from a web app?
Does anyone know how this could be done?
EDIT
I haven't gotten as far as actually sending a stream to the service, which is the crux of my question.
Let me rephrase: is there a way to send an audio stream to the Google Speech API directly from the browser/microphone? My app is written in JavaScript (Angular).
I've used IBM Watson S2T before, and they provide a JavaScript SDK (available through Bower) that can transcribe audio from the microphone directly to the service without passing it through a backend layer.
Regards,
Kjetil
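
For reference, the WebSocket-relay approach mentioned in the question is a short program with the official @google-cloud/speech Node.js client. A minimal sketch; the port, audio encoding, and message framing are assumptions:

```javascript
// Relay server: the browser sends raw audio chunks over a WebSocket,
// and we pipe them into Google Speech streaming recognition.
const WebSocket = require('ws');                 // npm install ws
const speech = require('@google-cloud/speech');  // npm install @google-cloud/speech

const client = new speech.SpeechClient();
const wss = new WebSocket.Server({ port: 8080 }); // port is an assumption

wss.on('connection', (ws) => {
  // One recognize stream per browser connection.
  const recognizeStream = client
    .streamingRecognize({
      config: {
        encoding: 'LINEAR16',   // assumes the browser sends 16-bit PCM
        sampleRateHertz: 16000, // must match the browser's capture rate
        languageCode: 'en-US',
      },
      interimResults: true,
    })
    .on('data', (data) => {
      // Forward transcripts back to the browser as they arrive.
      const transcript = data.results[0]?.alternatives[0]?.transcript;
      if (transcript) ws.send(JSON.stringify({ transcript }));
    })
    .on('error', console.error);

  ws.on('message', (chunk) => recognizeStream.write(chunk));
  ws.on('close', () => recognizeStream.end());
});
```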

Is a Spring Boot microservice with a non-Java front-end client possible?

I've implemented the shell of a microservices-based REST API application. I have simply followed the guides in Pivotal Spring's own documentation, using Eureka and Ribbon for load balancing. Everything works. I have a discovery server and a handful of independent services which register with the discovery server.
Now, my problem is that I might prefer not to write my client-side app in Java - maybe in Angular or Node.js, etc. However, in the examples I've followed, the load balancing and connecting to the discovery server are all done in Java.
Is it possible to use JavaScript to do the same things that the Eureka client does with the Spring Boot microservices so that I don't need to be constrained in my choices of browser client technology? Does anybody have any advice for how this should be approached? I had difficulty finding any articles that cover this, to be honest.
Yes, definitely; you can choose whatever technology you like for the front-end application. From your front-end application, you make calls to the API endpoints that your Spring Boot application exposes.
You might want to expose your services via a single API gateway that routes requests to the designated microservices using your discovery server.
Actually, you should not be doing load balancing/service discovery etc. in the front-end, so the question of whether it is possible in JavaScript, or with which libraries, is beside the point.
Typically you'll have an API gateway or a (load-balancing) proxy which works with your service registry and routes requests accordingly. In our current project we use Consul as the service registry and Nginx + consul-template as the proxy. We plan to migrate to some API gateway.
With this setup your front-end connects to just one central endpoint, which does the load balancing/routing to individual service instances behind the scenes. Thus your front-end will not need to implement anything like Eureka/Ribbon.
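In code, that means the browser needs nothing more than ordinary HTTP calls to the gateway. A sketch; the gateway URL and route are hypothetical:

```javascript
// The browser only ever talks to the gateway; discovery and load
// balancing happen server-side, so no Eureka/Ribbon in JavaScript.
const API_BASE = 'https://gateway.example.com'; // hypothetical gateway

async function getOrders(userId) {
  // The gateway routes /orders/** to a healthy order-service instance.
  const res = await fetch(`${API_BASE}/orders?userId=${encodeURIComponent(userId)}`);
  if (!res.ok) throw new Error(`Gateway returned ${res.status}`);
  return res.json();
}
```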

JavaScript real-time voice streaming and processing in a Django backend

Hi, I'm currently working on a project where I want to stream users' voice in real time using JS - from the user's perspective, think Google's speech recognition API demo.
So far I have tried a few jQuery libraries, but they don't seem to work as I expected: either they were incompatible with the web browser, they couldn't detect the microphone, or sending to the server failed.
Recently I was exploring WebRTC, and it seems it could do the job, but I'm not sure if it's possible to stream from the web browser to a Django backend.
I don't want to use either Node.js or Java applets.
I would appreciate any help, with the JS side as well as with receiving the voice stream in Django. Thank you!
There are two separate parts here to consider: signaling and media.
The signaling part (as well as the application logic) can be handled by django. The media part can't.
In order to handle the media part you will need to use a media server that receives and processes that data - the low-level media processing parts are usually implemented in C/C++. See http://kurento.org for a media server framework that may fit your needs (though it isn't written in Python).
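For the browser side of the capture, a minimal sketch of shipping microphone audio to a WebSocket endpoint (for example one served by Django Channels) looks like this; the endpoint path, codec, and chunk interval are assumptions:

```javascript
// Browser side: capture the microphone and ship compressed audio
// chunks over a WebSocket.
async function streamMicrophone() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const ws = new WebSocket('wss://example.com/ws/audio/'); // hypothetical endpoint

  // MediaRecorder emits compressed chunks (typically Opus in WebM).
  const recorder = new MediaRecorder(stream, { mimeType: 'audio/webm;codecs=opus' });
  recorder.ondataavailable = (event) => {
    if (event.data.size > 0 && ws.readyState === WebSocket.OPEN) {
      ws.send(event.data); // each Blob is one chunk of the live stream
    }
  };

  ws.onopen = () => recorder.start(250); // emit a chunk every 250 ms
  ws.onclose = () => recorder.stop();
}
```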

How to build an XMPP-over-WebSockets web chat application like GTalk

How can I build an XMPP-over-WebSockets web chat application like GTalk using JavaScript + HTML or ASP.NET?
It should support file transfer, video conferencing, and private or group chat, and allow multiple users and servers to communicate with each other.
If you want your chat system to work with Google's GTalk, or if you want to create your own chat server and build a private system, you will need to implement XMPP in JavaScript.
These links will help you:
http://professionalxmpp.com/
https://github.com/maxpowel/jQuery-XMPP-plugin
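To give a feel for what XMPP over WebSockets looks like in JavaScript, here is a minimal connect-and-message sketch with Strophe.js (the library behind the links above); the server URL and JIDs are placeholders:

```javascript
// Connect to an XMPP server over WebSocket (RFC 7395) with Strophe.js.
const connection = new Strophe.Connection('wss://chat.example.com/xmpp-websocket');

connection.connect('user@example.com', 'password', (status) => {
  if (status === Strophe.Status.CONNECTED) {
    connection.send($pres()); // announce presence ("online")
    connection.send(
      $msg({ to: 'friend@example.com', type: 'chat' })
        .c('body')
        .t('Hello over XMPP!')
    );
  }
});
```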
There are two ways to achieve your goal:
You can build your own chat solution from scratch (a backend implementing the XMPP XEPs, etc., plus a client-side app).
To make it easier, you can use a ready-made backend and SDK provided by a BaaS provider. That way you can concentrate on building the client-side solution and its UI, while you already have a working backend and a set of requests for connecting to and using it.
You can try ConnectyCube, since it has both chat and video chat. For video chat they offer two options: a WebRTC peer-to-peer solution and an SFU-based one.
