
Any new direction on using this library with USB Cameras (UVC Camera) #110

Closed
ronaldsampaio opened this issue Feb 6, 2024 · 37 comments

@ronaldsampaio

Is there any new approach or implementation regarding the use of the Android USB Camera library to create an RTSP server?
If not, could anyone point out what would be needed to implement this?
Thanks

@pedroSG94
Owner

pedroSG94 commented Feb 7, 2024

Hello,

Using the main project (RootEncoder) you can add a custom VideoSource like this one:
https://github.com/pedroSG94/RootEncoder/blob/feature/extra-video-source/app/src/main/java/com/pedro/streamer/rotation/CameraXSource.kt

This example supports CameraX, but you can use any source that renders to a SurfaceTexture. If you can find a way to make that library render to a SurfaceTexture, you can create your own VideoSource using the USB camera.

This was added to the library recently, so RTSP-Server is not updated to support it yet, but I will add it soon. For now, create a VideoSource in the RootEncoder example and you can reuse it with RTSP-Server.

@ronaldsampaio
Author

Hello,
Thanks for your quick reply! I'll try this solution and give some feedback here later.

@ronaldsampaio
Author

Hi @pedroSG94 ,
So what do the two methods create and start actually do inside the VideoSource? What am I supposed to implement in them?
I think I'll have to change them to use a USB camera. How are those two methods used by the RtspStream class?

Should the surfaceTexture that comes in the start method be used to start the camera?
Thanks in advance!

@ronaldsampaio
Author

Also, I have this callback that gives me encoded data

class EncodedPreview : IEncodeDataCallBack{
    override fun onEncodeData(
        data: ByteArray?,
        size: Int,
        type: IEncodeDataCallBack.DataType,
        timestamp: Long
    ) {
        TODO("Not yet implemented")
    }

}

Is there a way I can use it to feed this ByteArray into the RTSP server?

@pedroSG94
Owner

pedroSG94 commented Feb 8, 2024

Hi @pedroSG94, So what do the two methods create and start actually do inside the VideoSource? What am I supposed to implement in them? I think I'll have to change them to use a USB camera. How are those two methods used by the RtspStream class?

Should the surfaceTexture that comes in the start method be used to start the camera? Thanks in advance!

Hello,

  • The create method configures the camera resolution and fps. In this method you should check whether your camera supports the resolution, and return true on success or false on failure. This method is only called when you call the prepareVideo method.
  • The start method is used to start the camera, and the SurfaceTexture must be used for it. With a USB camera, starting the preview in this method should be the way to go.
  • The stop method should be used to stop the preview.
  • The release method is called only if the VideoSource is replaced by another VideoSource. In this method you should release any pending resources.

Keep in mind that:

  • calling stop and then start again must be supported
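
As a rough illustration, a custom VideoSource skeleton along those lines might look like this (the override signatures follow the VideoSource base class used in this thread; everything under UsbCameraClient is a hypothetical facade for whatever the USB library actually exposes):

import android.graphics.SurfaceTexture
// VideoSource comes from RootEncoder; adjust the import to your library version.

class UsbCameraSource(private val usbCamera: UsbCameraClient) : VideoSource() {

    private var running = false

    // Called once per prepareVideo: validate and apply resolution/fps.
    override fun create(width: Int, height: Int, fps: Int): Boolean {
        if (!usbCamera.supportsResolution(width, height, fps)) return false
        usbCamera.configure(width, height, fps)
        return true
    }

    // Called by startStream/startRecord/startPreview: render into the given SurfaceTexture.
    override fun start(surfaceTexture: SurfaceTexture) {
        this.surfaceTexture = surfaceTexture
        usbCamera.startPreview(surfaceTexture)
        running = true
    }

    // Must support stop() followed by start() again.
    override fun stop() {
        usbCamera.stopPreview()
        running = false
    }

    // Called only when this source is replaced by another VideoSource.
    override fun release() {
        usbCamera.close()
    }

    override fun isRunning(): Boolean = running
}

// Hypothetical facade over the USB camera library, only to keep the sketch self-contained.
interface UsbCameraClient {
    fun supportsResolution(width: Int, height: Int, fps: Int): Boolean
    fun configure(width: Int, height: Int, fps: Int)
    fun startPreview(surfaceTexture: SurfaceTexture)
    fun stopPreview()
    fun close()
}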

Also, I have this callback that gives me encoded data

class EncodedPreview : IEncodeDataCallBack{
    override fun onEncodeData(
        data: ByteArray?,
        size: Int,
        type: IEncodeDataCallBack.DataType,
        timestamp: Long
    ) {
        TODO("Not yet implemented")
    }

}

Is there a way I can use it to feed this ByteArray into the RTSP server?

For this case, if you already have H.264 data, you can use the RtspServer class directly, like here:
https://github.com/pedroSG94/RTSP-Server/blob/master/rtspserver/src/main/java/com/pedro/rtspserver/RtspServerCamera2.kt#L61
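
As a very rough sketch of that path, assuming your callback really delivers H.264 frames plus their SPS/PPS (the RtspServer method names below follow the linked RtspServerCamera2.kt; verify them against your library version):

import android.media.MediaCodec
import java.nio.ByteBuffer
// RtspServer and ConnectChecker imports depend on your RTSP-Server/RootEncoder versions.

class EncodedH264Forwarder(connectChecker: ConnectChecker, port: Int) {
    private val rtspServer = RtspServer(connectChecker, port)

    // Call once when SPS/PPS are known (vps is only needed for H.265).
    fun start(sps: ByteBuffer, pps: ByteBuffer) {
        rtspServer.setVideoInfo(sps, pps, null)
        rtspServer.startServer()
    }

    // Forward each frame received in IEncodeDataCallBack.onEncodeData.
    fun onEncodedFrame(data: ByteArray, size: Int, timestampUs: Long) {
        val info = MediaCodec.BufferInfo().apply { set(0, size, timestampUs, 0) }
        rtspServer.sendVideo(ByteBuffer.wrap(data, 0, size), info)
    }

    fun stop() = rtspServer.stopServer()
}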

@ronaldsampaio
Author

Hi,
So using rtspStream I need to call prepareVideo() and prepareAudio() too? If so, where should I get information like bitrate, sample rate, etc. to pass to those methods?

@pedroSG94
Owner

Hello,

Using rtspStream you only need to call prepareVideo and prepareAudio once, before using startStream, startRecord or startPreview, instead of every time you call startStream or startRecord.

So this is a valid call sequence using rtspStream (this would be invalid using another class like RtspCamera2):
prepareVideo, prepareAudio, startPreview, startStream, stopStream, startStream, stopStream, startRecord, stopPreview, stopRecord

About the bitrate: you can select it using prepareVideo and prepareAudio:
https://github.com/pedroSG94/RootEncoder/blob/master/library/src/main/java/com/pedro/library/base/StreamBase.kt#L98
https://github.com/pedroSG94/RootEncoder/blob/master/library/src/main/java/com/pedro/library/base/StreamBase.kt#L120

You don't need a bitrate inside the VideoSource because the source is not supposed to be encoded. The source only needs to render to a SurfaceTexture for video, or send PCM buffers for audio, which are encoded internally.
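
As a sketch of that prepare-once pattern (the URL, view and bitrate values are placeholders; the calls follow the StreamBase API referenced above):

import android.view.TextureView
// RtspStream import depends on your RootEncoder version.

fun demoLifecycle(rtspStream: RtspStream, textureView: TextureView, url: String) {
    // Prepare once...
    if (rtspStream.prepareVideo(1280, 720, 4000000) &&
        rtspStream.prepareAudio(48000, false, 128000)
    ) {
        // ...then start/stop freely, in any valid order, without preparing again.
        rtspStream.startPreview(textureView)
        rtspStream.startStream(url)
        rtspStream.stopStream()
        rtspStream.startStream(url)
        rtspStream.stopStream()
        rtspStream.stopPreview()
    }
}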

@ronaldsampaio
Author

ronaldsampaio commented Feb 10, 2024

Hello,
So I'm trying to implement it like the following.

This is my USBCameraSource

class USBCameraSource(
    private val context: Context,
    private val cameraClient: CameraClient,
    private val aspectRatioTextureView: AspectRatioTextureView,
): VideoSource(), LifecycleOwner {

    init {
        Log.d("camera_str", "USBCameraSource INIT!")
    }

    private val lifecycleRegistry = LifecycleRegistry(this)

    override fun create(width: Int, height: Int, fps: Int): Boolean {
        return try {
            Log.d("camera_str", "INSIDE CREATE USBCameraSource")
            cameraClient.setRenderSize(width,height)
            true
        } catch (e : Exception){
            Log.d("camera_str","EXCEPTION ON CREATE USBCAMERASOURCE -> $e")
            false
        }

    }

    override fun start(surfaceTexture: SurfaceTexture) {
        this.surfaceTexture = surfaceTexture
        aspectRatioTextureView.setSurfaceTexture(surfaceTexture)
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_START)
        //cameraClient.openCamera(aspectRatioTextureView)
    }

    override fun stop() {
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_STOP)
        cameraClient.let {
            it.closeCamera()
            surfaceTexture = null
        }
    }

    override fun release() {
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_DESTROY)
    }

    override fun isRunning(): Boolean {
        return cameraClient.isCameraOpened()!!
    }
    override val lifecycle: Lifecycle
        get() = lifecycleRegistry
}
class StreamingController(private val context: Context) : ClientListener, ConnectChecker {

    private lateinit var rtspStream: RtspStream

    private var portNum = 18554
    private var prepared = false

    fun setUpServer(cameraClient: CameraClient, aspectRatioTextureView: AspectRatioTextureView) {
        rtspStream = RtspStream(
            context,
            this,
            USBCameraSource(context,cameraClient,aspectRatioTextureView),
            MicrophoneSource()
        )
        //rtspStream.changeVideoSource(usbCameraSource)
    }

    fun startStream(){
        if(rtspStream.prepareVideo(1280, 720, 3500) && rtspStream.prepareAudio(48000,false,320)){
            rtspStream.startStream("rtsp://107881f1a991.entrypoint.cloud.wowza.com/app-04C9G6g0")
        }
        else{
            Log.d("camera_str","ERROR PREPARING VIDEO OR AUDIO")
        }

        Log.d("Streaming", "STARTED SERVER!")

    }

I call setUpServer and startStream inside my ViewModel. I use the ViewModel inside my composable screen, as in this AndroidView:

AndroidView(
                    factory = { context1 ->
                        AspectRatioTextureView(context1).apply {
                            this.surfaceTextureListener = (object : TextureView.SurfaceTextureListener{
                                val _tag = "camera_streaming"
                                override fun onSurfaceTextureAvailable(
                                    surface: SurfaceTexture,
                                    width: Int,
                                    height: Int
                                ) {
                                    cameraClient.openCamera(this@apply)
                                }

                                override fun onSurfaceTextureSizeChanged(
                                    surface: SurfaceTexture,
                                    width: Int,
                                    height: Int
                                ) {
                                    Log.d(_tag, "onSurfaceTextureSIZECHANGED")
                                    cameraClient.setRenderSize(width, height)
                                }

                                override fun onSurfaceTextureDestroyed(surface: SurfaceTexture): Boolean {
                                    Log.d(_tag, "onSurfaceTextureDESTROYED")
                                    cameraClient.closeCamera()
                                    return true
                                }

                                override fun onSurfaceTextureUpdated(surface: SurfaceTexture) {
                                    //Log.d(_tag, "onSurfaceTextureUPDATED")
                                }

                            })
                        }.also {
                            usbCameraViewModel.startVideoStreaming(cameraClient, it)

                        }
                    }
                )

With this implementation I'm getting the error Connection Failed: sps or pps is null.
Does this error mean that the stream is not getting video data? As in, the surface that I configured is not sending anything?
My camera preview (which is shown when I call cameraClient.openCamera(AspectRatioTextureView)) appears only when I don't try to stream, i.e. when I don't call usbCameraViewModel.startVideoStreaming(cameraClient, it).

Thank you so much for your help so far! I'll be sponsoring the project with a few bucks.

@pedroSG94
Owner

Hello,

According to your error, I think the SurfaceTexture is not rendering properly. We can check it like this:
Copy the RtmpStream class:
https://github.com/pedroSG94/RootEncoder/blob/master/library/src/main/java/com/pedro/library/rtmp/RtmpStream.kt
Then add logs or breakpoints here:
https://github.com/pedroSG94/RootEncoder/blob/master/library/src/main/java/com/pedro/library/rtmp/RtmpStream.kt#L88
And here:
https://github.com/pedroSG94/RootEncoder/blob/master/library/src/main/java/com/pedro/library/rtmp/RtmpStream.kt#L84
If those methods are never called after calling startRecord or startStream, it means your SurfaceTexture is not being rendered properly.
Also, check that you disabled OpenGL in cameraClient (it is enabled by default):
https://github.com/jiangdongguo/AndroidUSBCamera/blob/ebfd9437eb6e23e30f583a0151474b2fa6267fca/libausbc/src/main/java/com/jiangdg/ausbc/CameraClient.kt#L649
Disabling it is needed to work with a SurfaceTexture the way you did:
https://github.com/jiangdongguo/AndroidUSBCamera/blob/ebfd9437eb6e23e30f583a0151474b2fa6267fca/libausbc/src/main/java/com/jiangdg/ausbc/CameraClient.kt#L153
https://github.com/jiangdongguo/AndroidUSBCamera/blob/ebfd9437eb6e23e30f583a0151474b2fa6267fca/libausbc/src/main/java/com/jiangdg/ausbc/CameraClient.kt#L155

If this fails, maybe the best way is to modify that method to work as we need:
https://github.com/jiangdongguo/AndroidUSBCamera/blob/ebfd9437eb6e23e30f583a0151474b2fa6267fca/libausbc/src/main/java/com/jiangdg/ausbc/CameraClient.kt#L132
Maybe copy that class, modify the method to set my SurfaceTexture directly instead of the cameraView, and use the width and height that the view needs. If this works, you can add a preview using my startPreview method.
Also, make sure your code works with a normal cameraView before modifying it, to rule out code errors (basically, remove aspectRatioTextureView.setSurfaceTexture and check whether the preview works fine).

If you get stuck testing this, you can share a code example with me and maybe I can try to test it (I'm not sure I will be able to, because I don't have a USB camera, but in Android 14 it is possible to connect your device as a webcam, so maybe that can work for testing).
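
For reference, a sketch of building the CameraClient with OpenGL disabled (the builder calls mirror the ones used later in this thread; CameraUvcStrategy and CameraRequest come from the AndroidUSBCamera library):

// Sketch: CameraClient built with GLES disabled so the library can render
// directly into the SurfaceTexture we pass it (per the linked CameraClient.kt lines).
val cameraClient = CameraClient.newBuilder(context).apply {
    setEnableGLES(false) // enabled by default
    setCameraStrategy(CameraUvcStrategy(context))
    setRawImage(false)
    setCameraRequest(
        CameraRequest.Builder()
            .setPreviewWidth(1280)
            .setPreviewHeight(720)
            .create()
    )
}.build()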

@ronaldsampaio
Author

ronaldsampaio commented Feb 13, 2024

I think I'm getting closer!
I've cloned and modified the AndroidUSBCamera library and added the following:

    fun openCamera(surfaceTexture : SurfaceTexture, width: Int, height: Int){
        initEncodeProcessor()
        val listener = object : RenderManager.CameraSurfaceTextureListener {
            override fun onSurfaceTextureAvailable(surfaceTexture: SurfaceTexture?) {
                surfaceTexture?.let {
                    mCamera?.startPreview(mRequest!!, it)
                    mCamera?.addPreviewDataCallBack(this@CameraClient)
                }
            }
        }
        mRenderManager?.startRenderScreen(width, height, Surface(surfaceTexture), listener)
        mRenderManager?.setRotateType(mDefaultRotateType)
    }

My create and start functions from USBCameraSource now look like this:

    override fun create(width: Int, height: Int, fps: Int): Boolean {
        mWidth = width
        mHeight = height
        return try {
            Log.d("camera_str", "INSIDE CREATE USBCameraSource")
            cameraClient.setRenderSize(width,height)
            true
        } catch (e : Exception){
            Log.d("camera_str","EXCEPTION ON CREATE USBCAMERASOURCE -> $e")
            false
        }

    }

    override fun start(surfaceTexture: SurfaceTexture) {
        this.surfaceTexture = surfaceTexture
        //aspectRatioTextureView.setSurfaceTexture(surfaceTexture)
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_START)
        cameraClient.openCamera(surfaceTexture,mWidth,mHeight)
    }

And setUpServer and startStream from StreamingController:

    fun setUpServer(cameraClient: CameraClient, aspectRatioTextureView: AspectRatioTextureView) {
        rtspServer = RtspServer(this, portNum)
        _aspectRatioTextureView = aspectRatioTextureView
        rtspStream = RtspStream(
            context,
            this,
            USBCameraSource(context,cameraClient,aspectRatioTextureView),
            MicrophoneSource()
        )
        //rtspStream.changeVideoSource(usbCameraSource)

    }

    fun startStream(){
        if(rtspStream.prepareVideo(1280, 720, 3500) && rtspStream.prepareAudio(48000,false,320)){
            rtspStream.startStream("rtsp://localhost:8554/mystream")
        }
        //rtspStream.startStream("rtsp://rtspstream:0f0c888313b4e035a937b04a52b505e8@zephyr.rtsp.stream/movie")
        else{
            Log.d("camera_str","ERROR PREPARING VIDEO OR AUDIO")
        }

Now I'm getting a preview (it looks like the width and height are inverted, because the image has black strips on the sides and is deformed).
I'm trying to test by sending to a local server built with https://github.com/bluenviron/mediamtx
And I'm getting the error:

Connection Failed: Error configure stream, failed to connect to localhost/127.0.0.1 (port 8554) from /127.0.0.1 (port 41556) after 5000ms: isConnected failed: ECONNREFUSED (Connection refused)

This looks like a simple problem to solve, but I don't have much experience with auth in RTSP streams. Is there an easier way to test it? Or how can I set up authentication against localhost?

@pedroSG94
Owner

Hello,

This is good news.
localhost in the URL means streaming to yourself: localhost is your own Android device.
You need to set the local IP of the machine where you installed the mediamtx server, normally something like 192.168.x.x, where x depends on your local network and device. You can check the IP with a shell command (ifconfig or ipconfig, depending on your OS). For example, if mediamtx runs on a machine at 192.168.1.107, the URL would be rtsp://192.168.1.107:8554/mystream.

About the rotated and deformed image: I need a photo to help you solve it (if possible, a photo where the wrong orientation is easy to see, for example of an object like a table or a monitor). I also need to know the resolution you selected to open the camera and the resolution selected in my prepareVideo method.

@ronaldsampaio
Author

ronaldsampaio commented Feb 13, 2024

Finally, I was able to do it! Thank you so much, man!
About the screen, here is some info:

  • This is where I create the cameraClient
@Composable
fun rememberCameraClient(context: Context): CameraClient = remember {
    CameraClient.newBuilder(context).apply {
        setEnableGLES(true)
        setCameraStrategy(CameraUvcStrategy(context))
        setRawImage(false)
        setCameraRequest(
            CameraRequest.Builder()
                .setFrontCamera(false)
                .setPreviewWidth(1280)
                .setPreviewHeight(720)
                .create()
        )
        openDebug(true)

    }.build()
}
  • And the prepareVideo() call is here:
    fun startStream(){
        if(rtspStream.prepareVideo(1280, 720, 3500) && rtspStream.prepareAudio(48000,false,320)){
            rtspStream.startStream("rtsp://192.168.1.107:8554/mystream")
        }
        //rtspStream.startStream("rtsp://rtspstream:0f0c888313b4e035a937b04a52b505e8@zephyr.rtsp.stream/movie")
        else{
            Log.d("camera_str","ERROR PREPARING VIDEO OR AUDIO")
        }

        Log.d("Streaming", "STARTED SERVER!")
        //else Log.e("Streaming", "Error preparing audio or video")

    }
  • A picture demonstrating the distortion:
    Screenshot_20240213_191140

One thing I should mention is that the captured picture has a normal aspect ratio.

@pedroSG94
Owner

rtspStream.prepareVideo(1280, 720, 3500) && rtspStream.prepareAudio(48000,false,320)

You should fix the bitrate. 3500 is 3.5 kbps, but the recommended video bitrate for 1280x720 is 4 Mbps (4000000), and 128 kbps (128000) is recommended for audio.
You have a guide about it here:
https://support.google.com/youtube/answer/2853702?hl=en
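
Applied to the call quoted above, that would be something like:

// 4 Mbps video for 1280x720 and 128 kbps audio, per the guide linked above
rtspStream.prepareVideo(1280, 720, 4000000) && rtspStream.prepareAudio(48000, false, 128000)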

About the image.

One thing I should mention is that the captured picture has a normal aspect ratio.

What do you mean? The stream result received from mediamtx using a player?
If so, maybe the image is distorted because you are using AspectRatioTextureView as the preview. Are you using my startPreview method to render your preview?
If you are not, try it, but with a normal TextureView or SurfaceView (remember to wait until the view is ready before calling startPreview, like here).

Also, remember that using RtspStream (this applies to every class that uses StreamBase as its parent) you only need to call prepareVideo and prepareAudio once in the whole lifecycle of the object; you only need to call them again if you want to change the video or audio parameters. So after a successful prepareVideo and prepareAudio, you can start and stop anything (stream, record or preview) as many times as you need without worrying about it.

@pedroSG94
Owner

I'm going to start adding support for StreamBase in RTSP-Server tomorrow.
I think this should be ready in 1 or 2 days, depending on my time.

@ronaldsampaio
Author

ronaldsampaio commented Feb 14, 2024

What do you mean? The stream result received from mediamtx using a player?

I mean a picture captured with cameraClient.capturePhoto()

Thanks for the bitrate info.

I'm going to start adding support for StreamBase in RTSP-Server tomorrow. I think this should be ready in 1 or 2 days, depending on my time.

You mean that I'll be able to create an RTSP server with my phone and add this surfaceTexture as the source of the video stream?
Because that is what I really need: creating a server and using the feed from the USB camera as the video source.
The encoded-data callback that I tried to use earlier didn't work; that's why I went down this SurfaceTexture road.
Is there a way to adapt what I've made (sending to an RTSP server) so the phone itself becomes the RTSP server? (I think this is the purpose of this lib.)

@pedroSG94
Owner

You mean that I'll be able to create an RTSP server with my phone and add this surfaceTexture as the source of the video stream?
Because that is what I really need: creating a server and using the feed from the USB camera as the video source.
The encoded-data callback that I tried to use earlier didn't work; that's why I went down this SurfaceTexture road.
Is there a way to adapt what I've made (sending to an RTSP server) so the phone itself becomes the RTSP server? (I think this is the purpose of this lib.)

Yes, I will add this as soon as possible (it should be doable today, maybe; I'm already on it).

I mean a picture captured with cameraClient.capturePhoto()

Can you tell me if the stream result image has distortion?
Basically, startStream like you did and get the stream from mediamtx using, for example, the VLC player.

@ronaldsampaio
Author

Yes, I will add this as soon as possible (it should be doable today, maybe; I'm already on it).

Amazing man! Today I'll be sponsoring the project. This helped me a lot.

Can you tell me if the stream result image has distortion? Basically, startStream like you did and get the stream from mediamtx using, for example, the VLC player.

Yes, the stream is distorted, as the screenshot shows. Only the captured photo looks fine.

@pedroSG94
Owner

pedroSG94 commented Feb 14, 2024

Amazing man! Today I'll be sponsoring the project. This helped me a lot.

I just updated the library to support StreamBase; check the readme for the version. You can use the RtspServerStream class for it.

Yes, the stream is distorted, as the screenshot shows. Only the captured photo looks fine.

  • First of all, the best idea is to discard AspectRatioTextureView and use a normal SurfaceView or TextureView as the preview. This way we can rule out AspectRatioTextureView doing something that distorts the image.
  • After that, you can test the rotation parameter of the prepareVideo method with values 0, 90, 180 and 270 to check whether the orientation is involved (I will need at least photos with values 0 and 90 to check).
  • Also, a photo using this method, to remove the code related to aspect ratio:
rtmpCamera?.getGlInterface()?.setAspectRatioMode(AspectRatioMode.NONE)

Show me a photo of the stream result (on the player side) and of the preview.

The idea is to debug the possible cause so we can fix it properly. If we can't find the cause, we can develop a filter to force rendering exactly as we need, but I think that should be the last resort.

@ronaldsampaio
Author

The screenshot of the preview:
Screenshot_20240214_205644

And the preview on the streaming side:
Screenshot (1)

I think the distortion went away after using a TextureView instead of AspectRatioTextureView. But how can I make the preview fill the whole black box of the screen that is available for it?

Here are the implementations:

The Composable Scaffold used to show the preview

    Scaffold(modifier = Modifier.fillMaxSize()) {
        Column(
            modifier = Modifier.fillMaxSize(),
            horizontalAlignment = Alignment.CenterHorizontally,
            verticalArrangement = Arrangement.Center
        ) {
            Box(modifier = Modifier.height(200.dp))
            Box(Modifier.height(300.dp)) {
                AndroidView(
                    factory = { context1 ->
                        TextureView(context1).apply {
                            this.surfaceTextureListener = (object : TextureView.SurfaceTextureListener{
                                val _tag = "camera_streaming"
                                override fun onSurfaceTextureAvailable(
                                    surface: SurfaceTexture,
                                    width: Int,
                                    height: Int
                                ) {
                                    usbCameraViewModel.startVideoStreaming(cameraClient, this@apply)
                                }

                                override fun onSurfaceTextureSizeChanged(
                                    surface: SurfaceTexture,
                                    width: Int,
                                    height: Int
                                ) {
                                    Log.d(_tag, "onSurfaceTextureSIZECHANGED")
                                    cameraClient.setRenderSize(width, height)
                                }

                                override fun onSurfaceTextureDestroyed(surface: SurfaceTexture): Boolean {
                                    Log.d(_tag, "onSurfaceTextureDESTROYED")
                                    cameraClient.closeCamera()
                                    return true
                                }

                                override fun onSurfaceTextureUpdated(surface: SurfaceTexture) {
                                    //Log.d(_tag, "onSurfaceTextureUPDATED")
                                }

                            })
                        }
                    }
                )
            }

The cameraClient creation

@Composable
fun rememberCameraClient(context: Context): CameraClient = remember {
    CameraClient.newBuilder(context).apply {
        setEnableGLES(true)
        setCameraStrategy(CameraUvcStrategy(context))
        setRawImage(false)
        setCameraRequest(
            CameraRequest.Builder()
                .setFrontCamera(false)
                .setPreviewWidth(1280)
                .setPreviewHeight(720)
                .create()
        )
        openDebug(true)

    }.build()
}

startVideoStreaming from USBCameraViewModel, which makes the calls to StreamingController:

    fun startVideoStreaming(
        cameraClient: CameraClient,
        textureView: TextureView
    ) {
        streamingController.setUpServer(cameraClient)
        streamingController.startStream()
        startPreview(textureView)
    }
    (...)
    fun startPreview(textureView: TextureView){
        streamingController.startPreview(textureView)
    }

StreamingController

class StreamingController(private val context: Context) : ClientListener, ConnectChecker {
    private lateinit var rtspServerStream : RtspServerStream
    private var portNum = 18554
    private var prepared = false


    fun setUpServer(cameraClient: CameraClient) {
        rtspServerStream = RtspServerStream(context,portNum,this,USBCameraSource(context,cameraClient),MicrophoneSource())

    }


    fun startStream(){
        if(rtspServerStream.prepareVideo(1280, 720, 4000000, rotation = 90) && rtspServerStream.prepareAudio(48000,false,128000)){
            rtspServerStream.startStream()
        }
        else{
            Log.d("camera_str","ERROR PREPARING VIDEO OR AUDIO")
        }

        Log.d("Streaming", "STARTED SERVER!")
        //else Log.e("Streaming", "Error preparing audio or video")

    }
    (...)
    fun startPreview(textureView: TextureView){
        rtspServerStream.startPreview(textureView)
    }

If I don't call rtspServerStream.startPreview(textureView), the preview is not shown.

@pedroSG94
Owner

pedroSG94 commented Feb 15, 2024

If I don't call rtspServerStream.startPreview(textureView), the preview is not shown.

Yes, this is the expected behavior.

For now, try changing this:

                                    Log.d(_tag, "onSurfaceTextureSIZECHANGED")
                                    cameraClient.setRenderSize(width, height)

to 1280x720. And use this:

rtmpCamera?.getGlInterface()?.setAspectRatioMode(AspectRatioMode.NONE)

I will try to set up my own environment to test it against your code, but getting a full example is probably the best way to reproduce the case.

@ronaldsampaio
Author

Hi,

I will try to set up my own environment to test it against your code, but getting a full example is probably the best way to reproduce the case.

I'll create a minimal example of my implementation for testing.

rtmpCamera?.getGlInterface()?.setAspectRatioMode(AspectRatioMode.NONE)

How can I change configurations for an rtmpCamera? AFAIK I'm not using it anywhere. Is there any method to access an underlying RtmpCamera?

@pedroSG94
Owner

How can I change configurations for an rtmpCamera? AFAIK I'm not using it anywhere. Is there any method to access an underlying RtmpCamera?

Sorry for the confusion. RtspServerStream has the same method, because it comes from StreamBase.

I'll create a minimal example of my implementation for testing.

Ok, I will wait for it.

@ronaldsampaio
Author

So it got better with
rtspServerStream.getGlInterface().setAspectRatioMode(AspectRatioMode.Fill)

Using rtspServerStream.getGlInterface().setAspectRatioMode(AspectRatioMode.NONE) gave me a distorted preview too.

But one thing I noticed is that the stream stayed the same square it already was.
Here is the test application:
https://github.com/ronaldsampaio/USBStreaming

@pedroSG94
Copy link
Owner

pedroSG94 commented Feb 16, 2024

Hello,

I finally found the cause of your case. Try updating your Gradle dependencies to my latest master commits, which contain a new method that solves this case:

  implementation 'com.github.pedroSG94:RTSP-Server:7469cdcb78'
  implementation 'com.github.pedroSG94.RootEncoder:library:1627f7f62b'

With this commit, you should call this in your server setup:

rtspServerStream.getGlInterface().setAspectRatioMode(AspectRatioMode.Adjust)
rtspServerStream.getGlInterface().forceOrientation(OrientationForced.LANDSCAPE)

This should fix your preview like this (this is the expected result):
Screenshot_20240216_021610
And the stream result is full screen.

@ronaldsampaio
Author

Hello, sorry for the delay.
After changing my setUpServer to this:

    fun setUpServer(cameraClient: CameraClient) {
        rtspServerStream = RtspServerStream(context,portNum,this,USBCameraSource(context,cameraClient),
            MicrophoneSource()
        )
        rtspServerStream.getGlInterface().setAspectRatioMode(AspectRatioMode.Adjust)
        rtspServerStream.getGlInterface().forceOrientation(OrientationForced.LANDSCAPE)

    }

I got these results:
Screenshot (3)
In the streaming client.

Screenshot_20240217_134901_USBStreaming
In the phone app that I shared with you (and also in the main app that I'm developing).

@pedroSG94
Owner

I sent you a PR to fix the error and add sanity checks:
ronaldsampaio/USBStreaming#1

The cause of your result is that you need to remove rotation = 90 from prepareVideo. I forgot to mention that change.

@pedroSG94
Owner

One more thing: I added captureImage from the USB library to CameraUsbSource, but you can use my captureImage implementation if you want. My implementation also captures the filters, if you add filters with my library:

rtspServerStream.getGlInterface().takePhoto { bitmap -> 
      //save the bitmap or do anything you want. The bitmap should have the same resolution as the stream
}

@ronaldsampaio
Author

ronaldsampaio commented Feb 18, 2024

Great stuff, man! It worked like a charm! Thank you a lot for all the support!
I have only one more question that I'd like some direction on:
What are your thoughts on streaming the camera image with bounding-box overlays? The use case is object detection.
What I have now is the rectangle coordinates of the bounding boxes.
Is there a way to draw those boxes on the streamed video? That is, changing both the preview and the stream.

@pedroSG94
Owner

pedroSG94 commented Feb 18, 2024

Yes, you can do it.
You can use AndroidViewFilterRender for it. You have an example here:
https://github.com/pedroSG94/RootEncoder/blob/master/app/src/main/java/com/pedro/streamer/openglexample/OpenGlRtmpActivity.java#L168
With this, you can create a view that draws rectangles as needed (for example, a custom view with a transparent background that you draw over using a canvas).

Alternatively, you can create your own filter extending BaseFilterRender, but that requires OpenGL knowledge, so it may be more complex.

Edit:

AndroidViewFilterRender can't support SurfaceView or TextureView, because AndroidViewFilterRender gets its data from a Canvas (the canvas of the View), and both of those views use a Surface or a SurfaceTexture to draw. So you need to use a custom class that extends View.
I recommend starting with a view that draws something simple, like a circle, to make sure everything works as you expect and to resolve possible questions first.
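
A minimal sketch of that idea for the bounding-box use case (the overlay extends View as required; the commented attach calls follow the linked example, but check the exact glInterface filter method in your RootEncoder version):

import android.content.Context
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint
import android.graphics.RectF
import android.view.View

// Transparent overlay that draws the latest detection boxes. It extends View
// (not SurfaceView/TextureView) so AndroidViewFilterRender can read its Canvas.
class BoundingBoxOverlay(context: Context) : View(context) {

    private val paint = Paint().apply {
        style = Paint.Style.STROKE
        strokeWidth = 6f
        color = Color.RED
    }
    @Volatile private var boxes: List<RectF> = emptyList()

    init { setBackgroundColor(Color.TRANSPARENT) }

    // Call from the object-detection callback with the new rectangle coordinates.
    fun updateBoxes(newBoxes: List<RectF>) {
        boxes = newBoxes
        postInvalidate() // request a redraw with the new detections
    }

    override fun onDraw(canvas: Canvas) {
        super.onDraw(canvas)
        boxes.forEach { canvas.drawRect(it, paint) }
    }
}

// Attaching it to preview and stream (method names as in the linked example):
// val overlay = BoundingBoxOverlay(context)
// val filter = AndroidViewFilterRender().apply { setView(overlay) }
// rtspServerStream.getGlInterface().addFilter(filter)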

@ronaldsampaio
Author

But after that, how can I change which surfaceTexture the RtspServerStream will use to upload the video image? (If that is how this is done.)

@pedroSG94
Owner

I don't understand you.
If you add the filter, the filter affects both the preview and the stream result image in real time.

@ronaldsampaio
Author

I don't understand you. If you add the filter, the filter affects both the preview and the stream result image in real time.

Hum, okay. So the path would be:

  1. Create a CustomView extending View.
  2. Change the surfaceTexture of this view to the one from the start method of VideoSource (can I do this?).
  3. Draw what I want on that CustomView.

@pedroSG94
Owner

No, step 2 is not correct. It should be something like this:

  1. Create a CustomView extending View.
  2. Create an XML layout that contains that CustomView at full size.
  3. Add the XML containing the CustomView to the stream using an AndroidViewFilterRender.
  4. Modify the CustomView in real time as you need, using the onDraw method inside the CustomView.

Steps 2 and 3 can be summarized with code like this:
https://github.com/pedroSG94/RootEncoder/blob/master/app/src/main/java/com/pedro/streamer/openglexample/OpenGlRtmpActivity.java#L169
You can see that I'm adding an XML layout to the stream using AndroidViewFilterRender. In that case the XML contains buttons, but you can replace the buttons with any view (the CustomView in this case).

This is a basic example of a view that modifies what it draws every second:

class CircleTestView(context: Context, attrs: AttributeSet) : View(context, attrs) {

  private var radius = 0f
  private var rendering = true
  private var increment = true
  private val paint = Paint()

  init {
    this.setBackgroundColor(Color.TRANSPARENT)
    CoroutineScope(Dispatchers.IO).launch {
      while (rendering) {
        delay(1000)
        if (radius >= 500) increment = false
        else if (radius <= 0) increment = true
        if (increment) radius += 100 else radius -= 100
        postInvalidate() // request a redraw so the new radius is actually drawn
      }
    }
  }

  override fun onDraw(canvas: Canvas) {
    paint.style = Paint.Style.FILL
    canvas.drawCircle(width / 2f, height / 2f, radius / 2f, paint)
    super.onDraw(canvas)
  }

  /**
   * onCreate equivalent
   */
  override fun onAttachedToWindow() {
    super.onAttachedToWindow()
  }

  /**
   * onDestroy equivalent
   */
  override fun onDetachedFromWindow() {
    super.onDetachedFromWindow()
    rendering = false
  }
}

You can use this class as an example of steps 1 and 4. You can copy this class into my project and add it to the layout_android_filter XML layout to test (remove all the buttons and add it as a full-size view to test).

@ronaldsampaio
Author

Thanks for all the support! I'm closing the issue with one question: will you make a new release with the master commits from RTSP-Server and RootEncoder that I'm using?

@pedroSG94
Owner

Hello,

Yes, I have plans to create a release with these changes (everything is currently in the master branch).

But first, I want to finish Opus codec support (RTSP and SRT) and add it to rtspserver too, but I'm stuck on the mpegts packetization.

After finishing that, I will update the library with everything.

@ronaldsampaio
Author

ronaldsampaio commented Feb 22, 2024

Hello,

Yes, I have plans to create a release with these changes (everything is currently in the master branch).

But first, I want to finish Opus codec support (RTSP and SRT) and add it to rtspserver too, but I'm stuck on the mpegts packetization.

After finishing that, I will update the library with everything.

Alright, great! Please let me know when you finish!
Thanks

@pedroSG94
Owner

Hello,

I updated the library versions. You can check the readme.
