
GROUP VIDEO CHAT - audio doesn't work with vid constraints set to false. #7

Open
mmvphil opened this issue Dec 28, 2018 · 11 comments

mmvphil commented Dec 28, 2018

REGARDING group_video_chat:
WHEN we SET
var mediaConstraints = {
    audio: true,
    video: false
};
then when we connect, it connects fine and data is being passed (as seen by analytics), HOWEVER there is no audio.
Is this something to do with autoplay? This was done on two desktops.

What could we adjust in main.js to solve this problem?
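For context, the audio-only capture step on its own is just the plain browser API below. This is only a minimal sketch of that step, not the project's main.js; the 'local-video' element id is an assumption.

var mediaConstraints = { audio: true, video: false };

navigator.mediaDevices.getUserMedia(mediaConstraints)
    .then(stream => {
        // attach the audio-only stream to the local element
        // (the local preview is typically muted to avoid echo) ...
        document.getElementById('local-video').srcObject = stream;
    })
    .catch(err => console.log('getUserMedia failed: ', err));
// ... but any element playing a REMOTE stream must not be muted,
// or an audio-only call connects yet stays silent.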


mmvphil commented Dec 28, 2018

The web-rtc example works with video: false, audio: true just fine, so the problem must be something in the way
that group-video deals with the HTML elements.


mmvphil commented Dec 29, 2018

OK, so I tried the updated one ('STUN TURN VIDEO') and I can get it to connect for a while, then it errors with:
InvalidStateError: setRemoteDescription needs to be called before addIceCandidate
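That error usually just means a remote ICE candidate was applied before the remote description was set. The Xirsys p2p class handles the ordering internally, but as a general pattern (the names below are illustrative, not the library's API) early candidates can simply be buffered:

// Sketch only: 'pc' is an RTCPeerConnection; handler names are made up for illustration.
const pendingCandidates = [];
let haveRemoteDesc = false;

async function onRemoteDescription(pc, desc) {
    await pc.setRemoteDescription(desc);
    haveRemoteDesc = true;
    // flush any candidates that arrived before the description
    while (pendingCandidates.length) {
        await pc.addIceCandidate(pendingCandidates.shift());
    }
}

function onRemoteCandidate(pc, candidate) {
    if (haveRemoteDesc) {
        pc.addIceCandidate(candidate).catch(err => console.log('addIceCandidate: ', err));
    } else {
        pendingCandidates.push(candidate);
    }
}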

--------HERE IS MY HACKED CODE OF ---------------
BLABLABLA:3443/stun_turn_video/?xuid=user100333&calllist=user100444-
//----------------------------------------------------------------------
'use strict';

// Getting references to page DOM for video calling.
const localVideoEl = document.getElementById('local-video');

var remoteVideoEl=[];
remoteVideoEl[1] = document.getElementById('remote-video1');
remoteVideoEl[2] = document.getElementById('remote-video2');
remoteVideoEl[3] = document.getElementById('remote-video3');

var nextavail=0;
var clist=[];
clist[1]=0; clist[2]=0; clist[3]=0; clist[4]=0;
var incallwith=[];
incallwith[1]=0; incallwith[2]=0; incallwith[3]=0; incallwith[4]=0;
var mediaConstraints = { audio: true, video: false };

var localStream,//local audio and video stream
remoteStream,//remote audio and video stream
ice,//ice server query.
sig,//sigaling
peer,//peer connection.
media;//cam and mic class

/* if url has callid wait for other user in list with id to call
else if no id in url create a sharable url with this username. */
var username,//local username created dynamically.
remoteCallID,//id of remote user
inCall = false,//flag true if user in a call, or false if not.
channelPath = '';//set this variable to specify a channel path

//custom: check URL for "ch" var, and set the channel accordingly
var ch = decodeURI( (RegExp('ch' + '=' + '(.+?)(&|$)').exec(location.search)||[,null])[1] );
if(ch != 'null' ) channelPath = ch;
console.log('channel path: ',channelPath);

//if there is no remoteCallID show sharable link to call user.

function callRemotePeer(x){
console.log('calling remote peer '+x)
peer.callPeer(x);
}

// Get Xirsys ICE (STUN/TURN)
function doICE(){
console.log('doICE ');
if(!ice){
ice = new $xirsys.ice('/webrtc',{channel:channelPath});
ice.on(ice.onICEList, onICE);
}
}

function onICE(evt){
console.log('onICE ',evt);
if(evt.type == ice.onICEList){
getMyMedia();
}
}

//Get local user media
function getMyMedia(){
console.log('getMyMedia()');
//setup media
if(!media){
media = new $xirsys.media();
media.on(media.DEVICES_UPDATED, onMediaDevices);//returns list of media devices on local user machine.
media.on(media.ON_LOCAL_STREAM, onMediaDevices);//returns a/v stream of local user.
//listen for camera changes.
$('#ctrlMenu #camList').on('click', e => {
let $targ = $(e.target);
//if local stream exists, check if we are selecting the same device we currently are using. otherwise this is false.
let isDup = !!localStream ? hasMedia($targ.text(),localStream.getVideoTracks()) : false;//false;
//console.log('dup ',isDup);
if(isDup) return;

        if(typeof(mediaConstraints.video) != 'object') mediaConstraints.video = {};
        //update mediaConstraints object with new device.
        mediaConstraints.video.deviceId = {
            exact: $targ.attr('id')
        }
        console.log('*main*  cam selected - mediaConstraints: ',mediaConstraints);
        getMyMedia();
    })
    $('#ctrlMenu #micList').on('click', e => {
        let $targ = $(e.target);
        //if local stream exists, check if we are selecting the same device we currently are using. otherwise this is false.
        let isDup = !!localStream ? hasMedia($targ.text(),localStream.getAudioTracks()) : false;//false;
        //console.log('dup ',isDup);
        if(isDup) return;

        if(typeof(mediaConstraints.audio) != 'object') mediaConstraints.audio = {};
        //update mediaConstraints object with new device.
        mediaConstraints.audio.deviceId = {
            exact: $(e.target).attr('id')
        }
        console.log('*main*  mic selected - mediaConstraints: ',mediaConstraints);
        getMyMedia();
    });
}
//gets stream object of local users a/v
media.getUserMedia(mediaConstraints)
    .then(
        str => {
            console.log('*main*  getUser Media stream: ',str);
            setLocalStream(str);
            //create signal if null
            if(!sig) doSignal();
            //if the peer is created, update our media
            if(!!peer) peer.updateMediaStream(localStream);
        }
    ).catch(
        err => {
            console.log('Could not get Media: ', err);
            alert('Could not get Media!! Please check your camera and mic.');
        }
    );

}

//Get Xirsys Signaling service
function doSignal(){
sig = new $xirsys.signal( '/webrtc', username,{channel:channelPath} );
sig.on('message', msg => {
let pkt = JSON.parse(msg.data);
//console.log('main signal message! ',pkt);
let payload = pkt.p;//the actual message data sent
let meta = pkt.m;//meta object
let msgEvent = meta.o;//event label of message
let toPeer = meta.t;//msg to user (if private msg)
let fromPeer = meta.f;//msg from user
//remove the peer path to display just the name not path.
if(!!fromPeer) {
let p = fromPeer.split("/");
fromPeer = p[p.length - 1];
}
switch (msgEvent) {
//first Connect Success!, list of all peers connected.
case "peers":
//this is first call when you connect,
onReady();
// if we are connecting to a remote user and remote
// user id is found in the list then initiate call
if(!!remoteCallID) {
let users = payload.users;
if(users.indexOf(remoteCallID) > -1){
callRemotePeer(remoteCallID);
}
}
break;
//peer gone.
case "peer_removed":
if(fromPeer == remoteCallID) onStopCall();
break;
}
})
}

//Ready - We have our ICE servers, our Media and our Signaling.
function onReady(){
console.log('* onReady!');
// setup peer connector, pass signal, our media and iceServers list.
let isTURN = getURLParameter("isTURN") == 'true';//get force turn var.
console.log('isTURN ',isTURN);
peer = new $xirsys.p2p(sig,localStream,(!ice ? {} : {iceServers:ice.iceServers}), {forceTurn:isTURN});
//add listener when a call is started.
peer.on(peer.peerConnSuccess, onStartCall);
docalllist();
}
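// Call each peer id in clist, staggering the calls 2 seconds apart via setTimeout.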
function docalllist(){
for (var u=1; u<5; u++){
if ( typeof(clist[u]) !== "undefined" && clist[u] !== null && clist[u]!='0') {
var xtime=(u-1)*2000;
window.setTimeout(callRemotePeer, xtime, clist[u]);
}
}
}
// A peer call started, update the UI to show remote video.
function onStartCall(evt){
console.log('main onStartCall ',evt);
let remoteId = evt.data;
setRemoteStream(peer.getLiveStream(remoteId),evt.data);
remoteCallID = remoteId;
//inCall = true;
}

function onStopCall() {
console.log('main onStopCall ');
if( inCall ){
peer.hangup(remoteCallID);
}
inCall = false;
remoteCallID = null;
}

/* UI METHODS */

//sets local user media to video object.
function setLocalStream(str){
console.log('main setLocalStream & Video obj ',str);
localStream = str;
localVideoEl.srcObject = localStream;
}
//sets remote user media to video object.
function setRemoteStream(str,x){
nextavail++;
if (nextavail>4){ nextavail=1;}
remoteStream = str;
remoteVideoEl[nextavail].srcObject = remoteStream;
incallwith[nextavail]=x; // xid should be next avail call slot
console.log('xxxxxxxxxxxx setRemoteStream ',str,'id is '+x+" dom vid ="+nextavail);
}
//update the list of media sources on UI
function onMediaDevices( e ){
console.log('main onMediaDevices: ',e,' data: ',e.data);
switch(e.type){
case media.DEVICES_UPDATED:
updateDevices(e.data);
break;
case media.ON_LOCAL_STREAM:
//update list with selected.
setSelectedDevices(e.data.getTracks());
break;
}
}

function updateDevices(devices){
const mics = devices.audioin,
cams = devices.videoin;
console.log('main updateDevices - mics:',mics,', cams:',cams);
const camList = $('#ctrlMenu #camList'),
micList = $('#ctrlMenu #micList');
//camToggle = $('#ctrlMenu #camToggle'),
//micToggle = $('#ctrlMenu #micToggle');

micList.empty();
mics.forEach(device => {
    micList.append('<li><a id="'+device.deviceId+'" data-group-id="'+device.groupId+'" class="btn" role="button">'+device.label+'</a></li>')
});

camList.empty();
cams.forEach(device => {
    camList.append('<li><a id="'+device.deviceId+'" data-group-id="'+device.groupId+'" class="btn" role="button">'+device.label+'</a></li>')
});

}

function setSelectedDevices(devices){
console.log('main setSelectedDevices: ',devices);
//console.log('- video: ',devices.getVideoTracks() );
devices.forEach(device => {
switch(device.kind){
case 'audio':
console.log('- audio toggle: ',device);
$('#ctrlMenu #micToggle').html(device.label.substr(0,20) + '');
break;
case 'video':
console.log('- video toggle: ',device);
$('#ctrlMenu #camToggle').html(device.label.substr(0,20) + '');
break;
}
})
}

/* TOOLS */

function hasMedia(label,tracks){
console.log('tracks: ',tracks,', label: ',label );
let l=tracks.length, i, hasIt = false;
for(i=0; i<l; i++){
let track = tracks[i];
if(track.label.indexOf(label) > -1){
hasIt = true;
break;
}
}
return hasIt;
}

//gets URL parameters
function getURLParameter(name) {
let ret = decodeURI( (RegExp(name + '=' + '(.+?)(&|$)').exec(location.search)||[,null])[1] )
return ret == 'null' ? null : ret;
};

window.onload = () => {
var xuid = (getURLParameter("xuid") || '').trim();//guard against a missing xuid param
var calllist = getURLParameter("calllist");
if (calllist!=null){
var z=calllist.split('-');
for (var u=0; u<z.length; u++){
if ( typeof(z[u]) !== "undefined" && z[u] !== null ) { clist[u+1]=z[u].trim() }
}
}
console.log('array clist= '+clist[1]+' '+clist[2]+' '+clist[3]+' '+clist[4]);
console.log('pretty loaded!! your id'+xuid+' list '+calllist);
username = xuid;//use the xuid passed in the URL as the local username
doICE();
};


mmvphil commented Dec 29, 2018

All I really want is to be able to give a link that auto-joins people to an audio conference.
THAT'S IT .. been working on this for months. THOUGHT XIRSYS would help but they have zero support and WEBRTC in general is CRAP, no doubt thanks to the non-cooperation of big corporations doing their best to stunt technology and any sort of competition.
TOTALLY DISGUSTED

Lazarus404 (Member) commented:

Hi @mmvphil,

I do apologise; we don't have anyone actively watching the GitHub comments. GitHub doesn't send messages directly unless you follow them specifically, and it also makes for lousy ticketing software :-)

What we do advise is that you post a message to [email protected], which will place the message directly in our tech support platform. Questions are usually answered the same day (or the next working day if posted on the weekend). We do pride ourselves on having a very responsive support system, regardless of company size or problem, even if you're using someone else's platform, so I do apologise if you have had a bad experience so far. Let's see if we can rectify that.

Regarding your issue, note that we are 100% client agnostic. You do not need a special SDK to work with our platform. We work with ANY WebRTC library or front-end out-of-the-box, provided your platform doesn't require the TURN secret-key (long-term credential) system. This is typically only used in WebRTC by third-party SFUs. In such circumstances, we can support your needs, but usually only through a dedicated server.

In your circumstance, you should be able to get your requirement satisfied with any existing example.

I believe your problem lies in still having some video-related configuration in your application. The audio plays fine but the video element times out, resulting in your message. I might be slightly off here; I am a TURN server expert, not a WebRTC API expert. My colleagues are probably better placed to answer this, and I will present the problem to them so you get a better response.

In the meantime, have you tried:

https://www.webrtc-experiment.com/audio-broadcast/#4630474595939506

@muaz-khan is an exceptional coder and most of his examples work with Xirsys out-of-the-box.

Regards,
Lee


mmvphil commented Jan 2, 2019

Lazarus, thank you for the response.

Let me state this clearly ..
We have a game in which up to 4 people join a room to play.

We want these 4 people to be able to talk (audio) with each other. NOT one-to-many, NOT video.

If YOUR EXAMPLE 'GROUP VIDEO CHAT' worked with constraints audio: true, video: false, our problem would be solved.

All we want to do is make a dynamic link before our game starts that our app can iframe into the DOM.
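For the "dynamic link + iframe" part, a rough sketch (reusing the xuid/calllist query-string scheme from the hacked code above; the host, path, container id and function name are placeholders, not anything Xirsys ships) could look like:

function joinAudioRoom(myId, otherIds) {
    // build the room URL in the same xuid/calllist format used above
    var url = 'https://YOUR-HOST:3443/stun_turn_video/'
        + '?xuid=' + encodeURIComponent(myId)
        + '&calllist=' + encodeURIComponent(otherIds.join('-'));
    var frame = document.createElement('iframe');
    frame.src = url;
    frame.allow = 'microphone'; // without this, current browsers block getUserMedia inside the iframe
    document.getElementById('game-root').appendChild(frame); // placeholder container id
}

// e.g. joinAudioRoom('user100333', ['user100444', 'user100555']);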


mmvphil commented Jan 2, 2019

Not sure why GitHub had it all bold .. regarding this issue.. if you can't easily make your 'group video chat' work with audio: true, video: false constraints, then there is NO WAY we can. This is not our wheelhouse, and WebRTC is THE MOST frustrating platform we have ever come across. It's really shameful that there is NO CONSISTENT solution to a SIMPLE AUDIO CONF ROOM after 5 years.. yes, muaz-khan seems to have some examples that work 'for a while', BUT THE DOCUMENTATION IS ALL OVER THE PLACE
AND HARD TO FOLLOW until yet again standards are changed. This is why we wanted to try Xirsys. At this point we are so discouraged we are thinking about scrapping WebRTC altogether. It's a real shame that WebRTC players are so cheeky.

Lazarus404 (Member) commented:

I understand your frustration. The biggest contributing factor is that WebRTC has only recently hit a public v1.0 status that is agreed upon by all browsers. The API has been in development for some time and so there is a lot of documentation that will no longer be 100% accurate or cross-browser supported.

Many of my colleagues will be back at work today. I will ensure they reach out to you and help you get this issue sorted ASAP.


mmvphil commented Jan 2, 2019

Thanks brother.. yeah, WebRTC has such great potential, but it's hard to overcome entrenched, closed-structure corporations like the browser makers and telecom oligopolies.


noeplease commented Jan 2, 2019

@mmvphil As Lee guessed, the problem you are experiencing is related to the configuration in your application. By default, the video element attached to the DOM for each incoming peer stream is given a muted attribute. The muted attribute was set by default to achieve autoplay on iOS/mobile devices.

In order to enable an audio-only stream in the Group Video Chat application, you must not only alter the media constraints but also remove the muted attribute set on the remote video elements added to the DOM.

main.js - line 259:

'<video class="vid-obj" autoplay playsinline></video>' +
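In other words, each remote stream should end up on an element without the muted attribute. A sketch of the idea only (the function name and container id are illustrative, not the shipped main.js):

// Attach an incoming remote stream to an element WITHOUT the muted attribute;
// muting is only appropriate for the local preview, otherwise an audio-only
// call connects but stays silent.
function attachRemoteStream(stream, containerId) {
    var el = document.createElement('video'); // an <audio> element also works for audio-only rooms
    el.autoplay = true;
    el.playsInline = true;
    el.classList.add('vid-obj');
    el.srcObject = stream;
    document.getElementById(containerId).appendChild(el);
}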

Since you have already contacted [email protected], we will be sending you a copy of the Group Video Chat application's main.js file, which will contain the required alterations to make it an Audio Chat application.


mmvphil commented Jan 3, 2019

Wow I will try this.. I'm so excited now I'm jumping out of my chair.. thanks!


mmvphil commented Jan 3, 2019

FIXED! I saw muted before, but my brain registered it on the RECORDING element, not the playing one.
Thanks for the help!
