I got the library ('react-native-webrtc') to work, and I can receive an audio stream. But on iOS, with the microphone permission granted, I can see the orange dot in the top right corner of the screen indicating that the device is recording, even though it shouldn't be. I just want to watch/listen to the stream; the microphone should never be activated.

iOS sees: WebRTC peer connection + declared microphone permission (NSMicrophoneUsageDescription in Info.plist) = orange indicator shown

What I researched: according to the WebRTC developers, this is "Working As Intended", because "WebRTC is designed for full-duplex" and "the orange dot does not mean that audio is recorded but that the microphone device is opened." I'm not muting anything; there is no microphone stream to mute. The orange indicator appears because WebRTC reserves microphone resources "just in case", even for receive-only connections.
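To make the pattern concrete, here is what a receive-only connection looks like in plain WebRTC terms. This is a minimal sketch, not my actual code (mediasoup-client manages the RTCPeerConnection internally), and it assumes a recent react-native-webrtc with Unified Plan / addTransceiver support:

import { RTCPeerConnection } from 'react-native-webrtc';

async function createRecvOnlyConnection() {
  const pc = new RTCPeerConnection({ iceServers: [] });

  // No local track is ever added; the audio m-line is negotiated
  // as a=recvonly, so nothing on this side should need the microphone.
  pc.addTransceiver('audio', { direction: 'recvonly' });

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // ...signal the offer, apply the remote answer, and play the remote
  // track delivered via the 'track' event. getUserMedia() is never called.
  return pc;
}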

So I was wondering: is there a solution to this problem? Anything works for me: pure WebRTC, a mediasoup-based approach, a low-level Objective-C fix, etc.

To restate: I'm only receiving audio, but iOS still shows the orange indicator, I believe because WebRTC reserves the microphone device even for receive-only connections. Here is the relevant code:

import { Device } from 'mediasoup-client';

// I only create a mediasoup Device to RECEIVE audio
const device = new Device();
await device.load({ routerRtpCapabilities: response.routerRtpCapabilities });

// The device reports it CAN produce, even though I never asked for mic access
console.log('Device can produce audio:', device.canProduce('audio'));
// ^ This logs TRUE even though I never called getUserMedia().
// (As far as I understand, canProduce() only compares the device's codec
// capabilities against the router's; it does not itself touch the microphone.)

// I only create a CONSUMER transport (receive-only)
if (response.audioPidsToCreate.length > 0) {
    const firstPid = response.audioPidsToCreate[0];
    await createConsumerTransport(firstPid); // Only consuming, not producing
}
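For completeness, createConsumerTransport is essentially the standard mediasoup-client recv-transport setup. A sketch of what it does (the signalling event names here are illustrative placeholders, not my exact ones):

const createConsumerTransport = async (pid) => {
  // Ask the server for receive-transport parameters
  const params = await socket.emitWithAck('createConsumerTransport', { pid });

  // createRecvTransport() builds a transport that can only consume
  const transport = device.createRecvTransport(params);

  // DTLS connect handshake, relayed over the signalling channel
  transport.on('connect', async ({ dtlsParameters }, callback, errback) => {
    try {
      await socket.emitWithAck('connectConsumerTransport', { dtlsParameters, pid });
      callback();
    } catch (err) {
      errback(err);
    }
  });

  await consumeAudio(transport, pid);
};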

And here is consumeAudio itself:

import { MediaStream } from 'react-native-webrtc';

const consumeAudio = async (transport, audioPid) => {
  // I'm ONLY consuming (receiving) audio from the server
  const consumerParams = await socket.emitWithAck('consumeMedia', {
    rtpCapabilities: device.rtpCapabilities,
    pid: audioPid,
    kind: 'audio'
  });

  const consumer = await transport.consume(consumerParams);

  // Create a MediaStream ONLY for playback - no local tracks added
  const remoteStream = new MediaStream();
  remoteStream.addTrack(consumer.track); // Remote track only
  setRemoteStream(remoteStream);

  // Never call getUserMedia, never access the microphone
};
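One way to double-check that nothing is being sent (a sketch; getStats() is part of the mediasoup-client Consumer API, though I have not verified its behavior under React Native):

// A receive-only consumer should only report inbound-rtp stats;
// an outbound-rtp entry would mean something is actually sending.
const report = await consumer.getStats();
report.forEach((stat) => {
  if (stat.type === 'outbound-rtp') {
    console.warn('Unexpected outbound RTP:', stat);
  }
});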

Logs:

  • LOG rn-webrtc:pc:DEBUG 1 ctor // WebRTC peer connection created
  • LOG rn-webrtc:pc:DEBUG 1 setRemoteDescription // Only receiving
  • LOG rn-webrtc:pc:DEBUG 1 ontrack // Only getting remote tracks