Resolution test does not use selected camera #262
Comments
A naïve replacement of the …

I am also having this same issue on multiple Windows boxes; only the Default camera is used regardless of video selection.

Maybe if you replace the current …
In a system with multiple cameras (I have 5 listed), select different cameras and re-run the resolution tests: the camera used will not always be the camera selected. This does, however, work correctly with the microphone.
Browsers and versions affected
Should be all - see Analysis below.
For completeness: macOS 10.13.3, Chrome Version 65.0.3325.181 (Official Build) (64-bit).
Steps to reproduce
In a system with multiple cameras (I have 5 listed), select different cameras using the hamburger menu and re-run the resolution test. In my case CamTwist is the first camera listed, but ManyCam Virtual Camera (or sometimes the FaceTime HD Camera) was the one used; I had selected the fourth option, a Logitech Brio, in this case. This is visible by expanding a resolution test and comparing the output to the expectations; a sketch for double-checking which camera actually answered follows below.
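A quick way to confirm which device actually delivered the stream, independently of the testrtc UI, is to request a specific camera by deviceId and inspect the resulting track. This is a hedged sketch using only standard browser APIs; it is not part of the testrtc code:

```javascript
// List the available cameras, request one of them explicitly, and log
// which device actually delivered the stream. Runs in the browser console
// on any https page; it is not testrtc code.
navigator.mediaDevices.enumerateDevices().then(function(devices) {
  var cameras = devices.filter(function(d) { return d.kind === 'videoinput'; });
  var wanted = cameras[cameras.length - 1];  // e.g. the last camera in the list
  return navigator.mediaDevices.getUserMedia({
    video: {deviceId: {exact: wanted.deviceId}}
  }).then(function(stream) {
    var track = stream.getVideoTracks()[0];
    // track.label names the physical camera; with the bug described here,
    // the resolution test can end up on a different device than the one
    // selected in the UI.
    console.log('Requested:', wanted.label, '- got:', track.label);
    track.stop();
  });
}).catch(function(error) {
  console.error('Camera check failed:', error);
});
```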
Expected results
The resolution tests should be run with the camera selected in the configuration options.
Actual results
The camera used is chosen by the system and not the user.
Analysis
In testrtc-main.html there is a method which is intended to inject the selected settings into the GUM (getUserMedia) call.
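The helper itself is not quoted in this report. As a rough, hedged sketch, a wrapper of that kind merges the deviceIds chosen in the settings UI into the constraints before delegating to the browser; the names, signature, and structure below are assumptions for illustration, not the actual testrtc-main.html code:

```javascript
// Illustrative sketch only, not the actual testrtc implementation.
// audioSource / videoSource stand in for the deviceIds chosen in the
// settings UI (null means "let the browser pick the default").
var audioSource = null;
var videoSource = null;

function doGetUserMedia(constraints, onSuccess, onFailure) {
  // Inject the selected microphone, keeping any existing audio constraints.
  if (constraints.audio && audioSource) {
    if (typeof constraints.audio !== 'object') {
      constraints.audio = {};
    }
    constraints.audio.deviceId = {exact: audioSource};
  }
  // Inject the selected camera on top of whatever resolution constraints
  // the individual test asked for.
  if (constraints.video && videoSource) {
    if (typeof constraints.video !== 'object') {
      constraints.video = {};
    }
    constraints.video.deviceId = {exact: videoSource};
  }
  navigator.mediaDevices.getUserMedia(constraints)
      .then(onSuccess)
      .catch(onFailure);
}
```

The key point is that the injection only happens for tests that request their stream through this wrapper.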
In mictest.js we can see where this is called (which is why microphone selection works correctly); however, in camresolutionstest.js it is not: the code bypasses doGetUserMedia and calls navigator.mediaDevices.getUserMedia directly, bypassing the source injection.
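A hedged sketch of the kind of change the analysis points to: route the resolution test through the shared wrapper (sketched above) instead of calling the browser API directly. The function name and surrounding structure here are hypothetical, not quoted from camresolutionstest.js:

```javascript
// Hypothetical shape of the resolution test's capture step; the real
// camresolutionstest.js code is paraphrased, not quoted.
function captureAtResolution(width, height, onStream, onError) {
  var constraints = {
    audio: false,
    video: {width: {exact: width}, height: {exact: height}}
  };

  // Current behaviour (per the analysis above): the browser is asked
  // directly, so the camera selected in the settings UI is never added
  // to the constraints.
  //   navigator.mediaDevices.getUserMedia(constraints)
  //       .then(onStream).catch(onError);

  // Going through the shared wrapper would let it inject the selected
  // camera's deviceId before getUserMedia runs.
  doGetUserMedia(constraints, onStream, onError);
}
```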