Any new direction on using this library with USB Cameras (UVC Camera) #110
Comments
Hello,

Using the main project (RootEncoder) you can add a custom VideoSource like in this: This example supports CameraX, but you can use any source that renders into a SurfaceTexture. If you can find a way to make that library render into a SurfaceTexture, you can create your own VideoSource using the USB camera. This was added to the library recently, so RTSP-Server is not updated to support it yet, but I will add it soon. For now, create a VideoSource in the RootEncoder example and you can reuse it with RTSP-Server.
Hello,
Hi @pedroSG94, should the SurfaceTexture that comes in the start method be used to start the camera?
Also, I have this callback that gives me encoded data:

```kotlin
class EncodedPreview : IEncodeDataCallBack {
    override fun onEncodeData(
        data: ByteArray?,
        size: Int,
        type: IEncodeDataCallBack.DataType,
        timestamp: Long
    ) {
        TODO("Not yet implemented")
    }
}
```

Is there a way I can use it to feed this ByteArray to RtspServer?
Hello,

Keep in mind that:

For this case, if you already have H.264 data, you can use the RtspServer class directly, like here:
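A rough sketch of that path (the method names below are from my recollection of the RTSP-Server API and may differ between versions, so check the library sources before relying on them):

```kotlin
// Sketch, assuming RtspServer's API for pre-encoded data (names may vary by version).
val rtspServer = RtspServer(connectChecker, 8554)
rtspServer.startServer()

// Before sending frames, the server needs the SPS/PPS extracted from the encoder output.
rtspServer.setVideoInfo(sps, pps, null) // vps is only needed for H.265

// Then, inside the encoder callback, forward each encoded buffer.
fun onEncodedFrame(h264Buffer: ByteBuffer, info: MediaCodec.BufferInfo) {
    rtspServer.sendVideo(h264Buffer, info)
}
```

If your callback gives you a ByteArray (as IEncodeDataCallBack above does), you would wrap it with `ByteBuffer.wrap(data)` and build the corresponding `MediaCodec.BufferInfo` yourself.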
Hi,
Hello,

Using rtspStream you only need to call prepareVideo and prepareAudio once before using startStream, startRecord or startPreview, instead of each time you call startStream or startRecord. So this is a valid call sequence using rtspStream (this is invalid using other classes like RtspCamera2):

About the bitrate: you can select the bitrate using prepareVideo and prepareAudio. You don't need a bitrate inside VideoSource because the source is not supposed to be encoded. The source only needs to render into a SurfaceTexture for video, or send PCM buffers for audio, which will be encoded internally.
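The lifecycle described above can be sketched like this (a sketch only; the resolutions, bitrates and the `endpoint`/`surfaceView` names are placeholders, and the prepareAudio parameter order is assumed from the calls elsewhere in this thread):

```kotlin
// Prepare once, then start/stop as many times as needed.
if (rtspStream.prepareVideo(1280, 720, 4_000_000) &&
    rtspStream.prepareAudio(48000, true, 128_000)
) {
    rtspStream.startPreview(surfaceView)  // preview only, no network
    rtspStream.startStream(endpoint)      // start streaming
    rtspStream.stopStream()               // stop; no re-prepare needed
    rtspStream.startStream(endpoint)      // start again directly
}
```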
Hello, This is my USBCameraSource:

```kotlin
class USBCameraSource(
    private val context: Context,
    private val cameraClient: CameraClient,
    private val aspectRatioTextureView: AspectRatioTextureView,
) : VideoSource(), LifecycleOwner {

    init {
        Log.d("camera_str", "USBCameraSource INIT!")
    }

    private val lifecycleRegistry = LifecycleRegistry(this)

    override fun create(width: Int, height: Int, fps: Int): Boolean {
        return try {
            Log.d("camera_str", "INSIDE CREATE USBCameraSource")
            cameraClient.setRenderSize(width, height)
            true
        } catch (e: Exception) {
            Log.d("camera_str", "EXCEPTION ON CREATE USBCAMERASOURCE -> $e")
            false
        }
    }

    override fun start(surfaceTexture: SurfaceTexture) {
        this.surfaceTexture = surfaceTexture
        aspectRatioTextureView.setSurfaceTexture(surfaceTexture)
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_START)
        //cameraClient.openCamera(aspectRatioTextureView)
    }

    override fun stop() {
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_STOP)
        cameraClient.let {
            it.closeCamera()
            surfaceTexture = null
        }
    }

    override fun release() {
        lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_DESTROY)
    }

    override fun isRunning(): Boolean {
        return cameraClient.isCameraOpened()!!
    }

    override val lifecycle: Lifecycle
        get() = lifecycleRegistry
}
```

And this is my StreamingController:

```kotlin
class StreamingController(private val context: Context) : ClientListener, ConnectChecker {

    private lateinit var rtspStream: RtspStream
    private var portNum = 18554
    private var prepared = false

    fun setUpServer(cameraClient: CameraClient, aspectRatioTextureView: AspectRatioTextureView) {
        rtspStream = RtspStream(
            context,
            this,
            USBCameraSource(context, cameraClient, aspectRatioTextureView),
            MicrophoneSource()
        )
        //rtspStream.changeVideoSource(usbCameraSource)
    }

    fun startStream() {
        if (rtspStream.prepareVideo(1280, 720, 3500) && rtspStream.prepareAudio(48000, false, 320)) {
            rtspStream.startStream("rtsp://107881f1a991.entrypoint.cloud.wowza.com/app-04C9G6g0")
        } else {
            Log.d("camera_str", "ERROR PREPARING VIDEO OR AUDIO")
        }
        Log.d("Streaming", "STARTED SERVER!")
    }
}
```

I call the AndroidView like this:

```kotlin
AndroidView(
    factory = { context1 ->
        AspectRatioTextureView(context1).apply {
            this.surfaceTextureListener = (object : TextureView.SurfaceTextureListener {
                val _tag = "camera_streaming"

                override fun onSurfaceTextureAvailable(
                    surface: SurfaceTexture,
                    width: Int,
                    height: Int
                ) {
                    cameraClient.openCamera(this@apply)
                }

                override fun onSurfaceTextureSizeChanged(
                    surface: SurfaceTexture,
                    width: Int,
                    height: Int
                ) {
                    Log.d(_tag, "onSurfaceTextureSIZECHANGED")
                    cameraClient.setRenderSize(width, height)
                }

                override fun onSurfaceTextureDestroyed(surface: SurfaceTexture): Boolean {
                    Log.d(_tag, "onSurfaceTextureDESTROYED")
                    cameraClient.closeCamera()
                    return true
                }

                override fun onSurfaceTextureUpdated(surface: SurfaceTexture) {
                    //Log.d(_tag, "onSurfaceTextureUPDATED")
                }
            })
        }.also {
            usbCameraViewModel.startVideoStreaming(cameraClient, it)
        }
    }
)
```

With this implementation I'm getting the error: Connection Failed: sps or pps is null. Thank you so much for your help until now! I'll be sponsoring the project with a few bucks.
Hello,

According to your error, I think the SurfaceTexture is not rendering properly. We can check it like this:

If this fails, maybe the best way is to modify that method to work as we need:

If you get stuck testing this, you can share a code example with me and maybe I can try to test it (I'm not sure I will be able to, because I don't have a USB camera, but in Android 14 you can connect your device as a webcam, so maybe that can work for testing).
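One generic way to check whether a SurfaceTexture is actually receiving frames (this is not the specific method pedroSG94 refers to above, just a diagnostic sketch using the standard `android.graphics.SurfaceTexture` API; note that RootEncoder's GL pipeline may already own this listener, so use it only as a temporary check):

```kotlin
// Temporary diagnostic: log whether frames ever arrive on the SurfaceTexture.
var frames = 0L
surfaceTexture.setOnFrameAvailableListener {
    frames++
    if (frames % 30 == 0L) Log.d("camera_str", "frames rendered: $frames")
}
// If this never logs, the USB camera is not drawing into the SurfaceTexture,
// so the encoder gets no input, which would explain "sps or pps is null".
```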
I think I'm getting closer!

```kotlin
fun openCamera(surfaceTexture: SurfaceTexture, width: Int, height: Int) {
    initEncodeProcessor()
    val listener = object : RenderManager.CameraSurfaceTextureListener {
        override fun onSurfaceTextureAvailable(surfaceTexture: SurfaceTexture?) {
            surfaceTexture?.let {
                mCamera?.startPreview(mRequest!!, it)
                mCamera?.addPreviewDataCallBack(this@CameraClient)
            }
        }
    }
    mRenderManager?.startRenderScreen(width, height, Surface(surfaceTexture), listener)
    mRenderManager?.setRotateType(mDefaultRotateType)
}
```

My create and start functions from USBCameraSource look like:

```kotlin
override fun create(width: Int, height: Int, fps: Int): Boolean {
    mWidth = width
    mHeight = height
    return try {
        Log.d("camera_str", "INSIDE CREATE USBCameraSource")
        cameraClient.setRenderSize(width, height)
        true
    } catch (e: Exception) {
        Log.d("camera_str", "EXCEPTION ON CREATE USBCAMERASOURCE -> $e")
        false
    }
}

override fun start(surfaceTexture: SurfaceTexture) {
    this.surfaceTexture = surfaceTexture
    //aspectRatioTextureView.setSurfaceTexture(surfaceTexture)
    lifecycleRegistry.handleLifecycleEvent(Lifecycle.Event.ON_START)
    cameraClient.openCamera(surfaceTexture, mWidth, mHeight)
}
```

And setUpServer and startStream from StreamingController:

```kotlin
fun setUpServer(cameraClient: CameraClient, aspectRatioTextureView: AspectRatioTextureView) {
    rtspServer = RtspServer(this, portNum)
    _aspectRatioTextureView = aspectRatioTextureView
    rtspStream = RtspStream(
        context,
        this,
        USBCameraSource(context, cameraClient, aspectRatioTextureView),
        MicrophoneSource()
    )
    //rtspStream.changeVideoSource(usbCameraSource)
}

fun startStream() {
    if (rtspStream.prepareVideo(1280, 720, 3500) && rtspStream.prepareAudio(48000, false, 320)) {
        rtspStream.startStream("rtsp://localhost:8554/mystream")
        //rtspStream.startStream("rtsp://rtspstream:0f0c888313b4e035a937b04a52b505e8@zephyr.rtsp.stream/movie")
    } else {
        Log.d("camera_str", "ERROR PREPARING VIDEO OR AUDIO")
    }
}
```

Now I'm getting a preview (it looks like the width and height are inverted, because the image has black strips on the sides and is deformed), but also this error:

Connection Failed: Error configure stream, failed to connect to localhost/127.0.0.1 (port 8554) from /127.0.0.1 (port 41556) after 5000ms: isConnected failed: ECONNREFUSED (Connection refused)

This looks like a simple problem to solve, but I don't have a lot of experience with auth in RTSP streams. Is there an easier way to test it? Or how can I set up authentication against localhost?
Hello, This is good news. About the rotated and deformed image: I need a photo to help you solve it (if possible, a photo where I can easily understand the bad orientation, for example capturing an object like a table or a monitor). Also, I need to know the resolution you selected to open the camera and the resolution selected in my prepareVideo method.
Finally I was able to do it! Thank you so much man!

```kotlin
@Composable
fun rememberCameraClient(context: Context): CameraClient = remember {
    CameraClient.newBuilder(context).apply {
        setEnableGLES(true)
        setCameraStrategy(CameraUvcStrategy(context))
        setRawImage(false)
        setCameraRequest(
            CameraRequest.Builder()
                .setFrontCamera(false)
                .setPreviewWidth(1280)
                .setPreviewHeight(720)
                .create()
        )
        openDebug(true)
    }.build()
}

fun startStream() {
    if (rtspStream.prepareVideo(1280, 720, 3500) && rtspStream.prepareAudio(48000, false, 320)) {
        rtspStream.startStream("rtsp://192.168.1.107:8554/mystream")
        //rtspStream.startStream("rtsp://rtspstream:0f0c888313b4e035a937b04a52b505e8@zephyr.rtsp.stream/movie")
    } else {
        Log.d("camera_str", "ERROR PREPARING VIDEO OR AUDIO")
    }
    Log.d("Streaming", "STARTED SERVER!")
    //else Log.e("Streaming", "Error preparing audio or video")
}
```

One thing I should mention is that the captured picture has a normal aspect ratio.
You should fix the bitrate. 3500 is 3.5 kbps, but the recommended bitrate for 1280x720 video is 4 Mbps (4000000), and for audio it is recommended to use 128 kbps (128000). About the image:
What do you mean? The stream result received from mediamtx using a player? Also, remember that using RtspStream (this applies to all classes that use StreamBase as parent) you only need to call prepareVideo and prepareAudio once in the whole lifecycle of the object; it is only necessary to call them again if you want to change the video or audio parameters. So after a successful prepareVideo and prepareAudio you can start and stop anything (stream, record or preview) as many times as you need without worrying about it.
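To avoid this unit mistake, a tiny pair of helpers (hypothetical, not part of the library) makes the units explicit; prepareVideo and prepareAudio take bits per second:

```kotlin
// Hypothetical helpers to make bitrate units explicit; RootEncoder's
// prepareVideo/prepareAudio expect bits per second.
fun kbps(value: Int): Int = value * 1_000
fun mbps(value: Int): Int = value * 1_000_000

fun main() {
    // Recommended values from the discussion above, e.g.
    // rtspStream.prepareVideo(1280, 720, mbps(4)) and prepareAudio(48000, false, kbps(128)).
    println(mbps(4))   // video bitrate for 1280x720 -> 4000000
    println(kbps(128)) // audio bitrate -> 128000
}
```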
I'm going to start to add support for StreamBase in RTSP-Server tomorrow.
I mean a picture captured with cameraClient.capturePhoto(). Thanks for the bitrate info.
You mean that I'll be able to create an RTSP server with my phone and add this SurfaceTexture as the source of the video stream?
Yes, I will add this as soon as possible (it should be done today, maybe; I'm already on it).
Can you tell me if the stream result image has distortion?
Amazing man! Today I'll be sponsoring the project. This helped me a lot.
Yes, the stream is distorted as the screenshot shows. Only the captured photo looks fine.
I just updated the library to support StreamBase; check the readme to know the version. You can use the RtspServerStream class for it.
```kotlin
rtmpCamera?.getGlInterface()?.setAspectRatioMode(AspectRatioMode.NONE)
```

Show me a photo of the stream result (on the player side) and of the preview. The idea is to debug the possible reason so we can fix it properly. If we can't find the reason, we can develop a filter to force rendering exactly as we need, but I think that should be the last resort.
Yes, this is the expected way. For now, try changing this:

to 1280x720. And use this:

I will try to create my own environment to test it according to your code, but maybe getting a full example is the best way to reproduce the case.
Hi,

I'll create a minimal example of my implementation for tests.

How can I change configurations for a rtmpCamera? AFAIK I'm not using it anywhere. Is there any method to access an underlying RtmpCamera?
Sorry for the confusion. RtspServerStream should have the same method, because that method comes from StreamBase.
Ok, I will wait for it.
So it got better using that. But one thing I noticed is that the stream stayed square, just like it already was.
I made you a PR to fix the error and add sanity checks: The reason for your result is that you need to remove rotation = 90 from prepareVideo. I forgot that change.
One more thing: I added captureImage from the USB library to CameraUsbSource, but you can use my captureImage implementation if you want. My implementation also captures filters if you add filters with my library:

```kotlin
rtspServerStream.getGlInterface().takePhoto { bitmap ->
    //save the bitmap or do anything you want. The bitmap should have the same resolution as the stream
}
```
Great stuff man! Worked like a charm! Thank you a lot for all the support!
Yes, you can do it. Alternatively, you can create your own filter extending BaseFilterRender, but you need to know OpenGL, so maybe this is more complex.

Edit: AndroidViewFilterRender can't support SurfaceView or TextureView, because AndroidViewFilterRender gets its data from a Canvas (the canvas of the View), and both of those views use a Surface or a SurfaceTexture to draw. So you need to use a custom class that extends View.
But after that, how can I change which SurfaceTexture the RtspServerStream will use to upload the video image? (If this is the way things are done.)
I don't understand you.
Hum. Okay. So the path would be:
No, step 2 is not correct. It should be something like this:

Steps 2 and 3 can be summarized using code like this. This is a basic example of a view that modifies itself in real time each second:

```kotlin
class CircleTestView(context: Context, attrs: AttributeSet) : View(context, attrs) {

    private var radius = 0f
    private var rendering = true
    private var increment = true
    private val paint = Paint()

    init {
        this.setBackgroundColor(Color.TRANSPARENT)
        CoroutineScope(Dispatchers.IO).launch {
            while (rendering) {
                delay(1000)
                if (radius >= 500) increment = false
                else if (radius <= 0) increment = true
                if (increment) radius += 100 else radius -= 100
            }
        }
    }

    override fun onDraw(canvas: Canvas) {
        paint.style = Paint.Style.FILL
        canvas.drawCircle(width / 2f, height / 2f, radius / 2f, paint)
        super.onDraw(canvas)
    }

    /**
     * onCreate equivalent
     */
    override fun onAttachedToWindow() {
        super.onAttachedToWindow()
    }

    /**
     * onDestroy equivalent
     */
    override fun onDetachedFromWindow() {
        super.onDetachedFromWindow()
        rendering = false
    }
}
```

You can use this class as an example of steps 1 and 4. You can copy this class into my project and add it to the layout_android_filter XML layout to test it (remove all buttons and add it as a full-size view).
Thanks for all the support! I'm closing the issue with one question: will you make a new release with the master commits from RTSP-Server and RootEncoder that I'm using?
Hello, Yes, I have plans to create a release with these changes (everything is currently in the master branch). But first, I want to finish Opus codec support (RTSP and SRT) and add it to rtspserver too, but I'm stuck supporting MPEG-TS packetization. After finishing this, I will update the library with everything.
Alright, great! Please let me know when you finish!
Hello, I updated the library versions. You can check the readme.
Is there any new approach or implementation regarding using this Android USB camera library to create an RTSP server?
If not, could anyone point out the direction that would be needed to implement this?
Thanks