Is there any way to clone the camera feed from CameraManager? #226
Comments
Note that for my purposes, the camera feed is just one part of the display, so I don't anticipate using the recording capability for the camera feed alone. However, it would be really appealing if the recording capability had an option to record the whole device screen (including any objects or effects overlaid on the feed) instead of just the camera feed. Recording the whole screen might be only slightly different from recording the camera feed alone: a lot of use cases probably use self.view as the bounding rectangle already, so we may be talking about the same screen real estate, just with the pixels coming from a slightly different source.
One thing I would like to add, in case an attempt is made to capture the whole screen for recording: I use emitters in my designs. I tried some code to take a screenshot, but it wasn't capturing the CAEmitterLayer objects/cells. There's probably a way to get those objects rendered into the result, but I'm not familiar with a technique.
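One technique that might help here (a hedged sketch, not verified against CameraManager): `drawHierarchy(in:afterScreenUpdates:)` snapshots the view as it is composited on screen, and is reported to capture more layer types than `layer.render(in:)`. Whether it picks up live CAEmitterLayer particles may still vary by iOS version, so it's worth testing on a device:

```swift
import UIKit

// Hedged sketch: drawHierarchy(in:afterScreenUpdates:) snapshots the
// view as composited on screen, which can include layers that
// layer.render(in:) misses. Verify on device that emitter particles
// actually appear in the result.
func snapshot(of view: UIView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: view.bounds)
    return renderer.image { _ in
        view.drawHierarchy(in: view.bounds, afterScreenUpdates: true)
    }
}
```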
Using CameraManager for that purpose is not possible because it only supports a single previewLayer. Since that's a very specific use case, which would ultimately change the behaviour of the lib, we are not considering doing that. That being said, you probably need an AVCaptureSession with an AVCaptureVideoPreviewLayer for every one of your clone UIViews.
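A sketch of the suggestion above (illustrative only, not CameraManager API): a single AVCaptureSession can in fact drive more than one AVCaptureVideoPreviewLayer, so a second session may not be required just for a visual clone:

```swift
import AVFoundation
import UIKit

// Illustrative sketch, not CameraManager API: one capture session
// feeding two preview layers, one per "clone" view.
let session = AVCaptureSession()
session.sessionPreset = .high

if let camera = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

// Two preview layers sharing the same session.
let previewA = AVCaptureVideoPreviewLayer(session: session)
let previewB = AVCaptureVideoPreviewLayer(session: session)
previewA.videoGravity = .resizeAspectFill
previewB.videoGravity = .resizeAspectFill

// Attach each layer to its own host view, e.g.:
// cloneViewA.layer.addSublayer(previewA)
// cloneViewB.layer.addSublayer(previewB)
session.startRunning()
```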
Maybe I can try instantiating two objects and using one for each session, although I expect that is more expensive in both memory and throughput, causing extra load on the CPU and/or GPU. Like so:

```swift
let cameraManager = CameraManager()
let cameraManager_2 = CameraManager()
```
I will let you know if this works, and provide a video link if it does. Do you know if others have tried the two-instantiation approach? Since it really is the same feed, it would be nice if it were optimized as well as possible.
Please consider trying to use an array inside your code, letting users access a clone feed with a slightly different call that returns a clone preview layer with no recording capability. I don't think it really changes the behavior of your pod if you can figure out a way to tweak it without breaking any code for existing users. If all you do is add new APIs so users have more options, that might help bring even more users to your pod. If you think users who use cloned feeds or whole-screen recording might experience degraded performance, just put a cautionary note on the websites that guide devs in integrating CameraManager. With hardware advancing, degraded performance might be a non-issue soon, if not already.
I sent a message through your website expounding on how that might be done.
Another thing you might consider is consulting someone at Apple to find out if they know of a way to get the desired effect. Maybe they could add something to the AVCapture APIs to assist. With CPU and GPU speed increasing, and with the very artistic effects that can be done with a camera feed, the hardware might be able to support clone feeds now, or not long from now, without degrading quality because of throughput limitations.
My use case is very artistic, kind of like Photoshop, and having multiple
views of the same feed at the fingertips of users could be a novel and
unique feature.
There may be some value in including that feature in your pod, if it plays well and if you can provide users with an example video showcase and example code to get it working.
Also, even though your pod currently only records the camera feed, if you
can provide an option to the user to capture the whole screen for recording
(which includes your camera feed) and overlaid effects, I think that would
add quite a bit of value to your pod.
Giving your users the flexibility to choose either whole screen recording,
or just the camera feed recording might just be a few dozen lines of code
changes in your pod.
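For apps that just need the whole screen, Apple's ReplayKit may already cover this without any changes to the pod. A hedged sketch using the standard RPScreenRecorder API (function names here are my own, untested in this context):

```swift
import ReplayKit
import UIKit

// Hedged sketch: ReplayKit records the whole app screen, which would
// include the camera preview layer plus any overlaid views or emitters.
let recorder = RPScreenRecorder.shared()

func startWholeScreenRecording() {
    guard recorder.isAvailable else { return }
    recorder.startRecording { error in
        if let error = error {
            print("Could not start recording: \(error)")
        }
    }
}

func stopWholeScreenRecording(from viewController: UIViewController) {
    recorder.stopRecording { previewVC, error in
        // ReplayKit hands back a preview controller the user can
        // trim, save, or share from.
        if let previewVC = previewVC {
            viewController.present(previewVC, animated: true)
        }
    }
}
```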
Let me share a few links to show you my use cases, and how those use cases could be even more fun with clone views or whole-screen recording, with fun and unique .gif files as an end goal:
https://www.youtube.com/watch?v=OOSxJhyzCVw&feature=youtu.be
https://www.youtube.com/watch?v=DJQzwmXBRa0&feature=youtu.be
If you have any suggestions, please share.
And thanks for the info.
Mike
I tried two instantiations, but it didn't work, as shown below (not all code shown, just the pertinent parts):

```swift
let cameraView_1 = UIView()
let cameraView_2 = UIView()
let cameraManager_1 = CameraManager()
let cameraManager_2 = CameraManager()
```
The feed only went to one of the views.
There might be merit in checking with Apple about clone-feed possibilities. Since your team has standing with this pod, an enhancement request asking for a way to produce clone feeds might get serious consideration on their side (if they don't already have a way of doing that). Cloned feeds could be fun in advertising effects. Imagine 10 cloned feeds coordinated with size reduction or spacing in appealing ways, with coordinated rotation or scaling too.
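One possibility along those lines is CAReplicatorLayer, which tiles copies of a sublayer with a per-instance transform. This is a sketch only: CAReplicatorLayer has documented limits with asynchronously drawn layers, so whether a live AVCaptureVideoPreviewLayer replicates correctly needs verifying on a device, and the session/function names here are illustrative, not CameraManager API:

```swift
import AVFoundation
import UIKit

// Illustrative sketch: replicate one preview layer visually instead of
// running extra capture sessions. Verify on device — replicator layers
// may not mirror asynchronously drawn content such as live video.
func tiledPreview(session: AVCaptureSession, copies: Int) -> CAReplicatorLayer {
    let preview = AVCaptureVideoPreviewLayer(session: session)
    preview.frame = CGRect(x: 0, y: 0, width: 120, height: 120)
    preview.cornerRadius = 60          // circular clone feeds
    preview.masksToBounds = true

    let replicator = CAReplicatorLayer()
    replicator.instanceCount = copies
    // Offset and slightly shrink each successive copy.
    var transform = CATransform3DMakeTranslation(130, 0, 0)
    transform = CATransform3DScale(transform, 0.9, 0.9, 1)
    replicator.instanceTransform = transform
    replicator.addSublayer(preview)
    return replicator
}
```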
On your end, are there any objects that could be made into an array to help make cloning possible? Poking around, I wondered if this might be made into an array:

```swift
fileprivate weak var embeddingView: UIView?
```
That's about as far as I can run with this I think.
Thanks for looking this over.
Mike
Thanks for the feedback, but as I mentioned before, we don't plan on changing the lib for your very specific use case since our goal is to mimic the default iOS camera in terms of functionality.
Well, if you ever think of trying some things along those lines or get curious, consider prototyping the concept and weighing its merits. Or consider a new pod for the new functionality. Your team does well at that level.
You might consider updating your website and documentation to let people
looking over your pod know that it makes a great live camera feed, even if
there is no intent to record. I would link some video showing the
flexibility that is possible, and maybe include detailed code for attaching
gesture recognizers.
I added gesture recognizers to my UIView so that it can be dragged, pinched, and rotated, and I added supporting code to get a circular feed. The YouTube links I shared earlier showcase that. The touch-to-drag needed some special logic: touches on the left side of the view drag it around the screen, while touches on the right side go to the exposure control. I had to set cameraManager.shouldEnableExposure based on where the touch was located. Worked like a charm.
```swift
// Touches on the right half control exposure; the early return
// keeps them from also dragging the view.
if cameraFeedTouchpoint.x > cameraView.frame.width / 2 {
    cameraManager.shouldEnableExposure = true
    return
}
cameraManager.shouldEnableExposure = false
```
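The drag/pinch/rotate setup described above might look roughly like this (a sketch; `cameraView` and the class name are placeholders for whatever hosts the feed):

```swift
import UIKit

final class CameraFeedViewController: UIViewController {
    let cameraView = UIView()   // hosts the CameraManager preview

    override func viewDidLoad() {
        super.viewDidLoad()
        cameraView.isUserInteractionEnabled = true
        cameraView.addGestureRecognizer(
            UIPanGestureRecognizer(target: self, action: #selector(drag(_:))))
        cameraView.addGestureRecognizer(
            UIPinchGestureRecognizer(target: self, action: #selector(pinch(_:))))
        cameraView.addGestureRecognizer(
            UIRotationGestureRecognizer(target: self, action: #selector(rotate(_:))))
    }

    @objc private func drag(_ g: UIPanGestureRecognizer) {
        // Move the view by the pan delta, then reset the translation
        // so each callback reports an incremental offset.
        let t = g.translation(in: view)
        cameraView.center.x += t.x
        cameraView.center.y += t.y
        g.setTranslation(.zero, in: view)
    }

    @objc private func pinch(_ g: UIPinchGestureRecognizer) {
        cameraView.transform = cameraView.transform.scaledBy(x: g.scale, y: g.scale)
        g.scale = 1
    }

    @objc private func rotate(_ g: UIRotationGestureRecognizer) {
        cameraView.transform = cameraView.transform.rotated(by: g.rotation)
        g.rotation = 0
    }
}
```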
BTW, this is the best pod I have ever incorporated into an app!
EmitSoEasy has been released. It's got the live camera feed using CameraManager. It's flexible and can be cropped to a circular feed. See the lower left corner of the app toolbars to start it. Try tiling the background with text or a photo behind the camera feed and move the phone around. That is a lot of art content! Now you might see why I wanted to be able to record the whole screen of the device instead of just the camera video stream. And you might see how a cloned copy could have made for fun content. Imagine 3 or 4 circular camera feeds on screen and recording that effect! Users may like it.

https://apps.apple.com/us/app/emitsoeasy/id1500308841

EmitSoEasy can produce interesting artistic content very quickly. It lets you work with Memoji and emoji, and you can tile with text, Memoji, webpages, and photos too. You can export tiles, text, photo, and web page clippings with transparent backgrounds. Those exports can be used in other places like Photoshop, etc. If anyone is using Photoshop, please look at the following photos showcasing the export and tile features. It might be fun and creative to use exports from EmitSoEasy in Photoshop or other products. The EmitSoEasy steps are pretty easy.

If you want tiles: it only took me one minute to create each image.
Just for display purposes, I would like to clone the camera feed to multiple UIView objects at the same time. It could be fun to display two or more identical camera feeds that appear synced! Any cloned feeds would not need video recording capability. Please see https://stackoverflow.com/questions/61295613/is-there-a-way-to-copy-a-uiview-objects-video-content-into-a-second-uiview-im
Does anyone know if there is a way to clone them in my own code? Or could there be an internal array kept inside the pod that would allow clones (with no recording capability)?