Multi External Camera #121
Here's an example, but it's for a single USB camera: https://github.com/pedroSG94/USBStreaming. I'm also dealing with multi-camera playback today, and the lack of documentation is a big headache!
Hello, do you mean using this class? I will need to check it, but I don't have multiple cameras so I can't fully test it. I can only test opening one camera using that class.
Currently, I'm trying other libraries. https://github.com/shiyinghan/UVCAndroid can play multiple cameras properly, and I can use a TextureView or SurfaceView as the container. But when using a TextureView or SurfaceView as the RTSP server's screen, I have a problem: the rtsp:// output is always black! @pedroSG94, can you look at the problem for me? (The snippet below adds the remaining SurfaceTextureListener stubs so it compiles.)

```java
TextureView textureView = new TextureView(getContext());
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(@NonNull SurfaceTexture surface, int width, int height) {
        streamPlayerSurfaceViewUsbCameraHelper.startPreview();
        streamPlayerSurfaceViewUsbCameraHelper.addSurface(surface, false);
        RtspServerStream rtspServerStream = new RtspServerStream(getContext(), 17554, new ConnectCheckerEvent() {
            @Override
            public void onStreamEvent(StreamEvent streamEvent, String s) {
                Log.d(TAG, s);
            }
        });
        rtspServerStream.getGlInterface().setAspectRatioMode(AspectRatioMode.Fill);
        Size previewSize = streamPlayerSurfaceViewUsbCameraHelper.getPreviewSize();
        boolean prepareVideo = rtspServerStream.prepareVideo(previewSize.width, previewSize.height, 2000 * 1024, 25, 0, 90);
        boolean prepareAudio = rtspServerStream.prepareAudio(48000, false, 128000);
        rtspServerStream.startStream();
        rtspServerStream.startPreview(textureView);
    }

    @Override
    public void onSurfaceTextureSizeChanged(@NonNull SurfaceTexture surface, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(@NonNull SurfaceTexture surface) {
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(@NonNull SurfaceTexture surface) { }
});
```
I can't help you like that; I need full example code to test it. You have an internal surface, which is the surface you need to render to with an external API (CameraX, a USB library, etc). This surface receives frames from those APIs and is used internally to work. You can access that surface using a VideoSource in the start method like here: The startPreview method is only a method you use to show the result of the render in the library; basically, it copies the stream result into that surface. That surface is not used to read frames, it is used to write frames provided by the surface of the VideoSource. So you need to create a new VideoSource like CameraXSource and use your streamPlayerSurfaceViewUsbCameraHelper as a source, replacing CameraX.
@pedroSG94 Thanks for the reply. I realized the problem and managed to get it running, but the screen seems to have black edges, and no matter how I set the resolution the picture is always vertical; I'm trying to figure out how to deal with it! It's worth noting that isRunning must return false the first time it's called, otherwise the stream won't enter the started state, i.e. it won't render the SurfaceTexture of the video source.

```java
RtspServerStream rtspServerStream = new RtspServerStream(getContext(), 17554, new ConnectCheckerEvent() {
    @Override
    public void onStreamEvent(StreamEvent streamEvent, String s) {
        Log.d(TAG, s);
    }
}, new VideoSource() {
    @Override
    protected boolean create(int width, int height, int fps, int rotation) {
        return true;
    }

    @Override
    public void start(@NonNull SurfaceTexture surfaceTexture) {
        // Feed the USB camera frames into the library's internal surface
        streamPlayerSurfaceViewUsbCameraHelper.addSurface(surfaceTexture, false);
    }

    @Override
    public void stop() { }

    @Override
    public void release() { }

    @Override
    public boolean isRunning() {
        return false;
    }
}, new MicrophoneSource());
GlStreamInterface glInterface = rtspServerStream.getGlInterface();
glInterface.setAspectRatioMode(AspectRatioMode.Fill);
glInterface.setAutoHandleOrientation(false);
// Note: previewSize is fetched but a hard-coded 1080x1920 is used below
Size previewSize = streamPlayerSurfaceViewUsbCameraHelper.getPreviewSize();
boolean prepareVideo = rtspServerStream.prepareVideo(1080, 1920, 6000 * 1024, 25, 0, 90);
boolean prepareAudio = rtspServerStream.prepareAudio(48000, false, 128000);
rtspServerStream.startStream();
```
Try to add this line before startStream:

```java
rtspServerStream.getGlInterface().forceOrientation(OrientationForced.LANDSCAPE);
```

If the problem is not solved, share a photo of the result in both cases (with the line and without it).
About the running value: I recommend you create a boolean field and change its state in the start/stop methods. Also, if you have a method to remove the surface from streamPlayerSurfaceViewUsbCameraHelper, use it in the stop method to detach the surface.
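A minimal sketch along those lines, assuming the helper exposes a removeSurface counterpart to addSurface (that method name is hypothetical and depends on the USB camera library in use):

```java
VideoSource usbSource = new VideoSource() {
    private boolean running = false;
    private SurfaceTexture currentSurface = null;

    @Override
    protected boolean create(int width, int height, int fps, int rotation) {
        return true; // nothing to allocate; the USB helper owns the camera
    }

    @Override
    public void start(@NonNull SurfaceTexture surfaceTexture) {
        currentSurface = surfaceTexture;
        streamPlayerSurfaceViewUsbCameraHelper.addSurface(surfaceTexture, false);
        running = true;
    }

    @Override
    public void stop() {
        if (currentSurface != null) {
            // Hypothetical detach call; use whatever your helper provides
            streamPlayerSurfaceViewUsbCameraHelper.removeSurface(currentSurface);
            currentSurface = null;
        }
        running = false;
    }

    @Override
    public void release() { }

    @Override
    public boolean isRunning() {
        return running;
    }
};
```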
@pedroSG94 Thank you for your answers. I have managed to achieve the results I wanted, and your answers have been very helpful. Thanks again!
Great! I'm glad that you solved the problem.
I will; I'll create a sample at a later date.
@dixtdf Thank you very much; I would be very grateful for an example. The only thing I managed to do was switch between the connected cameras. Now I'm trying to find a solution so that all cameras are displayed on the screen at the same time.
@milazki Sorry, I'm a bit busy with work, so the full example will have to come later; here is a simple one you can try. When a device is inserted, the StateCallback automatically calls onAttach, and I believe this part of the code can help you. CameraHelper should be kept as close to a global singleton as possible; frequent camera switching may lead to crashes. I've been debugging forceOrientation and prepareVideo(rotation=90) since late yesterday and they feel a bit strange. Maybe the orientation of my USB camera is wrong, but it basically does what I want, so I can't ask for too much!

```java
usbCameraHelper = new CameraHelper();
usbCameraHelper.setStateCallback(new ICameraHelper.StateCallback() {
    @Override
    public void onAttach(UsbDevice device) {
        // Ask the user for permission to access the newly attached USB camera.
        UsbManager usbManager = (UsbManager) getSystemService(Context.USB_SERVICE);
        // FLAG_MUTABLE is required on Android 12+ so the system can attach the result extras.
        PendingIntent permissionIntent = PendingIntent.getBroadcast(getContext(), 0,
                new Intent(ACTION_USB_PERMISSION), PendingIntent.FLAG_MUTABLE);
        usbManager.requestPermission(device, permissionIntent);
    }

    @Override
    public void onDeviceOpen(UsbDevice device, boolean isFirstOpen) {
        usbCameraHelper.openCamera();
    }

    @Override
    public void onCameraOpen(UsbDevice device) {
        usbCameraHelper.startPreview();
        // Derive a unique RTSP port from the numeric tail of the device name
        // (e.g. /dev/bus/usb/001/002 -> "002" -> port 10556).
        String deviceName = StringUtils.substring(device.getDeviceName(), StringUtils.lastIndexOf(device.getDeviceName(), "/") + 1);
        RtspServerStream rtspServerStream = new RtspServerStream(getContext(), 10554 + Integer.parseInt(deviceName), new ConnectCheckerEvent() {
            @Override
            public void onStreamEvent(StreamEvent streamEvent, String s) {
                Log.d(TAG, s);
            }
        }, new VideoSource() {
            @Override
            protected boolean create(int width, int height, int fps, int rotation) {
                return true;
            }

            @Override
            public void start(@NonNull SurfaceTexture surfaceTexture) {
                // Feed the USB camera frames into the stream's internal surface.
                usbCameraHelper.addSurface(surfaceTexture, false);
            }

            @Override
            public void stop() { }

            @Override
            public void release() { }

            @Override
            public boolean isRunning() {
                return false;
            }
        }, new MicrophoneSource());
        GlStreamInterface glInterface = rtspServerStream.getGlInterface();
        glInterface.setAspectRatioMode(AspectRatioMode.Adjust);
        glInterface.forceOrientation(OrientationForced.NONE);
        Size previewSize = usbCameraHelper.getPreviewSize();
        boolean prepareVideo = rtspServerStream.prepareVideo(1920, 1080, 6000 * 1024, 25, 0, 90);
        boolean prepareAudio = rtspServerStream.prepareAudio(48000, false, 128000);
        if (prepareVideo && prepareAudio) {
            rtspServerStream.startStream();
        }
    }
});
```
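For completeness: the example above requests USB permission but doesn't show handling of the result. Here is a minimal sketch of the matching receiver; ACTION_USB_PERMISSION is the same custom action string used for the PendingIntent, and selectDevice is assumed from UVCAndroid's ICameraHelper:

```java
// Sketch only: handles the result of usbManager.requestPermission(...) above.
private final BroadcastReceiver usbPermissionReceiver = new BroadcastReceiver() {
    @Override
    public void onReceive(Context context, Intent intent) {
        if (ACTION_USB_PERMISSION.equals(intent.getAction())) {
            UsbDevice device = intent.getParcelableExtra(UsbManager.EXTRA_DEVICE);
            boolean granted = intent.getBooleanExtra(UsbManager.EXTRA_PERMISSION_GRANTED, false);
            if (granted && device != null) {
                // selectDevice is assumed from UVCAndroid's ICameraHelper; once the
                // device is selected and opened, onDeviceOpen/onCameraOpen fire.
                usbCameraHelper.selectDevice(device);
            }
        }
    }
};
// Register it before requesting permission, e.g. in onCreate():
// registerReceiver(usbPermissionReceiver, new IntentFilter(ACTION_USB_PERMISSION));
```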
@dixtdf Thanks for the hint, but unfortunately I didn't succeed. I would be glad to see an example of how you manage to solve this problem.
My approach to multiple USB playback is to automatically create an RTSP server when a USB camera is inserted, and then play the video directly from its RTSP address. This avoids frequent switching of the camera, although it may consume more memory. The VideoSource is the most important part of the example I provided; once you learn how to write data to the SurfaceTexture of the VideoSource, you will have succeeded. My suggestion is to avoid using CameraHelper at first: try rendering the USB video to a regular view, and only then try creating an RTSP server once that works.
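As a rough illustration of that approach, each camera's stream would be reachable at a URL derived from the same port scheme as the earlier example; the IP below is an assumption, and the exact URL format depends on your network and the server library version:

```java
// Sketch: build the per-camera RTSP URL for the port scheme used above.
String androidDeviceIp = "192.168.1.50"; // the phone/tablet LAN IP (assumption)
int port = 10554 + Integer.parseInt(deviceName); // deviceName as in the earlier example
String rtspUrl = "rtsp://" + androidDeviceIp + ":" + port;
// Each connected camera can then be opened in any RTSP client (e.g. VLC) or an
// in-app player, avoiding frequent switching of the physical camera.
```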
Hi, Pedro!
I recently studied how Android works with an external camera, and I found your feature/extra-video-source branch. I managed to use it to output video from my USB camera.
In addition, I got acquainted with the issues and found this one: #110. In it, ronaldsampaio implemented his own openCamera function for CameraClient from the AndroidUSB library (#110 (comment)), which helped solve the problem with video output.
I have a question: is there any way to simultaneously output all the cameras connected to an Android device (for example, two cameras connected via a hub) and stream everything at once? Do I need to implement my own openCamera with a SurfaceTexture for MultiCamera?
Thanks