[14:50:36] autr (~textual@port-92-193-237-239.dynamic.as20676.net) joined the channel
[14:50:36] Topic is Welcome to the libcamera.org official development channel | Bugs are tracked in https://bugs.libcamera.org/
[14:50:36] Set by ChanServ on 22. June 2021 at 14:30:15 CEST
[14:50:36] Mode is +Snt
[15:00:06] Hello 👋
[15:00:09] I haven’t used IRC much so ’scuse if it’s bad etiquette to post big blocks of text…
[15:00:43] I’ve been taking a look at libcamera-apps and am feeling the urge to make an addon for OpenFrameworks (C++ creative coding toolkit)
[15:01:49] Bit of context is OF has helper classes for Pixels and Textures which then play nicely with whatever GL / windowing is used for the platform (ie. Android or Linux)… so I’m curious about how Preview streams work before I go any further
[15:03:18] First guess is that a preview stream is basically a lower-prio thread for displaying a texture to screen - so not blocking the main Raw or Video or Still stream? So more geared toward CLI usage?
[15:05:56] In this case for the OF addon it would basically work similarly to the “Video Grabber” (which is the class for opening UVC cameras) - and use only a Raw stream - then on the callback copy the pixel buffer to an OF-world-Pixels-object or OF-world-Texture-object (based on configuration at setup)
[15:07:22] Does this sound like the right approach or should I use a Video stream from libcamera (ie. are there some optimisations or nice things I would be missing out on)?
[15:09:39] And one more question - when playing with MMAL I ended up categorising parameters as things which have to be set __before__ the camera is opened (ie. FPS, width, height) and things that can be set __while__ the camera is opened (ie. shutter speed, exposure)
[15:10:08] How might I best get a rundown of those capabilities from libcamera?
[15:10:31] hi autr
[15:10:36] and welcome
[15:11:00] Hi pinchartl :)
[15:12:31] that's quite a few questions :-)
[15:12:42] first of all, libcamera doesn't deal with the display directly
[15:13:28] Disconnected for Sleep Mode
[15:14:57] autr (~textual@port-92-193-237-239.dynamic.as20676.net) joined the channel
[15:15:29] welcome back :-)
[15:15:42] so I was saying
[15:15:43] Arg, laptop went to sleep for a second there!
[15:15:46] Thanks :)
[15:15:46] first of all, libcamera doesn't deal with the display directly
[15:16:05] That makes sense
[15:16:14] please also note that libcamera-apps are applications developed by Raspberry Pi. they use libcamera, and aim at replicating the features of the applications for the MMAL-based camera stack
[15:16:27] Got it
[15:16:28] so terminology, and even features, may differ. I haven't checked in detail
[15:17:18] in libcamera we have the concept of stream roles. this is actually about to change, but currently we have four roles: raw, viewfinder, video and still capture
[15:17:32] viewfinder is what you've called preview I think
[15:17:45] those roles are hints, to tell libcamera the expected usage of a stream
[15:18:06] so, for instance, by default libcamera will configure a raw stream with a raw format, but you can change all parameters
[15:18:41] for your use case, the video role is likely best, as it's intended to capture video for processing
[15:19:12] there's no notion in libcamera of priority between streams (at least at this point)
[15:19:49] implementing an OF addon that will behave like the video grabber sounds like the right way to go
[15:19:55] two things to note
[15:20:51] one, the callbacks from libcamera that signal that a buffer is ready must not block for an extended period of time. if you need to perform costly operations there (including copying to an OF object), that should be deferred to another thread
[15:21:03] two, copying should be avoided if possible
[15:21:31] I don't know how OF is implemented, but the best way would be to use the buffers from OF and give them to libcamera, so that frames can be captured there directly
[15:21:54] if that's not possible, maybe OF has the ability to go the other way around, using a buffer provided by libcamera, without a copy operation?
[15:22:35] regarding your last question
[15:22:46] we also have parameters that have to be configured before starting the camera
[15:23:04] those are essentially the parameters set in StreamConfiguration (pixel format, width, height)
[15:23:38] the rest is set through controls, which can be set at Camera::start() time, or during capture, through requests (Request::controls())
[15:24:00] I'm working on improving that API to expose which controls can be set in requests and which need to be set before starting capture
[15:27:39] Awesome
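
A rough sketch of what that pre-start configuration could look like with libcamera's C++ API, assuming a single camera and picking an arbitrary 640x480 RGB888 configuration purely for illustration:

    #include <memory>

    #include <libcamera/libcamera.h>

    using namespace libcamera;

    int main()
    {
        /* Enumerate cameras and take exclusive control of the first one. */
        auto cm = std::make_unique<CameraManager>();
        cm->start();
        if (cm->cameras().empty())
            return 1;
        std::shared_ptr<Camera> camera = cm->cameras()[0];
        camera->acquire();

        /*
         * The role is only a hint: it selects sensible defaults for video
         * capture, but every field can still be overridden afterwards.
         */
        std::unique_ptr<CameraConfiguration> config =
            camera->generateConfiguration({ StreamRole::VideoRecording });
        StreamConfiguration &cfg = config->at(0);
        cfg.size.width = 640;               /* example values only */
        cfg.size.height = 480;
        cfg.pixelFormat = formats::RGB888;

        /* validate() adjusts whatever the pipeline cannot provide as-is. */
        if (config->validate() == CameraConfiguration::Invalid)
            return 1;
        camera->configure(config.get());

        /* ... allocate buffers, create and queue requests, start() ... */

        camera->release();
        cm->stop();
        return 0;
    }

StreamConfiguration also has a bufferCount field, which is presumably where a "number of buffers" setting would plug in; since validate() may adjust values, the resulting configuration should be read back rather than assumed.
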
[15:27:45] Re. hints - is this like a “preset” for configuration (ie. a Raw stream could be created but options modified so it’s identical to a Video stream) - or does it change how the stream works in some other way?
[15:28:09] it's a preset
[15:28:23] right now it only selects the pixel format and default size
[15:28:26] OF for sure lets you put a reference to pixels or a char[] into the ofPixels object (with a flag for pixel type), so it doesn't need to be copied - but my basic understanding is that this could also mean you show frames where half of the array is the last frame and half is the next frame - and the callback is also a signal to say “here is a full frame”?
[15:28:35] OK, got it thank you
[15:28:46] I'm reworking the configuration API extensively, so there will be changes there, hopefully for the best :-)
[15:29:18] the callbacks will notify you when a buffer is ready to be consumed
[15:29:24] you can then take that buffer and pass it to OF
[15:29:40] it will not be touched by libcamera anymore until you explicitly give it back to libcamera through a request
[15:29:47] so you can cycle between buffers
[15:30:18] capture a buffer from libcamera, give it to OF, and when OF is done with it (I suppose it can signal that somehow), queue it back to libcamera
[15:30:27] Ahh so you could sort of ping between two buffers - one is being used, one is being written to?
[15:32:06] NiksDev (~NiksDev@192.91.101.31) joined the channel
[15:32:28] you'll probably need three buffers at least
[15:33:37] OK awesome, is there an example of this in libcamera-apps?
[15:33:42] if OF is using buffer A, and libcamera is capturing to buffer B, libcamera will need a buffer C in the queue, to use right after B completes
[15:34:13] Got it
[15:34:19] otherwise, if you wait for completion of B to give it to OF, and for OF to release A, which you then give back to libcamera, it may be that by the time you give A back, the next frame has already started arriving. you would then lose it
[15:34:54] That makes perfect sense
[15:35:11] Likely I will return to check back here with my homework :)
[15:35:48] in practice you may even need more buffers, as libcamera may require more than one buffer in the queue for proper operation. that's also something we're working on, to make that requirement explicit. you can start with 3 buffers, just make sure not to hardcode 3 in your code but make it generic, so it can be increased later
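
To make that buffer cycle concrete, here is a sketch of the allocation and completion flow, assuming the camera was configured as in the earlier snippet; handOffToOF() and returnToLibcamera() are hypothetical names standing in for whatever hand-over mechanism the OF addon ends up using:

    #include <memory>
    #include <vector>

    #include <libcamera/libcamera.h>

    using namespace libcamera;

    static std::shared_ptr<Camera> camera;            /* set up elsewhere */
    static std::unique_ptr<FrameBufferAllocator> allocator;
    static std::vector<std::unique_ptr<Request>> requests;

    /*
     * Hypothetical hand-over point: push the buffer to the OF side (ideally
     * just queue it for another thread) instead of processing it here.
     */
    static void handOffToOF(Request *request, FrameBuffer *buffer)
    {
        (void)request;
        (void)buffer;
    }

    /*
     * Completion handler: must not block. libcamera will not touch the
     * buffer again until the request is queued back.
     */
    static void requestComplete(Request *request)
    {
        if (request->status() == Request::RequestCancelled)
            return;

        for (const auto &bufferPair : request->buffers())
            handOffToOF(request, bufferPair.second);
    }

    /*
     * Called once the consumer has released the buffer(s): recycle the
     * request so the buffer re-enters libcamera's queue.
     */
    static void returnToLibcamera(Request *request)
    {
        request->reuse(Request::ReuseBuffers);
        camera->queueRequest(request);
    }

    static void startCapture(Stream *stream)
    {
        /* One request per allocated buffer; keep the count configurable. */
        allocator = std::make_unique<FrameBufferAllocator>(camera);
        allocator->allocate(stream);

        for (const std::unique_ptr<FrameBuffer> &buffer : allocator->buffers(stream)) {
            std::unique_ptr<Request> request = camera->createRequest();
            request->addBuffer(stream, buffer.get());
            requests.push_back(std::move(request));
        }

        camera->requestCompleted.connect(requestComplete);

        camera->start();
        for (std::unique_ptr<Request> &request : requests)
            camera->queueRequest(request.get());
    }

The key point is the ownership hand-off: between the completion callback and the re-queue, the frame belongs entirely to the application, which is what makes the three-or-more-buffer cycling described above work.
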
[15:36:42] I haven't checked libcamera-apps so I don't know if there's a good example there. let me check
[15:38:50] Awesome, that makes sense - I think I would start by creating a test GUI with a section for pre-init (width, height, pixel format, number of buffers), and another for what Request::controls() makes available
[15:39:12] we have a CLI test application in libcamera itself, named cam
[15:39:15] in src/cam/
[15:39:21] and a GUI application in src/qcam/
[15:39:46] OK I will check them out
[15:39:52] qcam may be a better example
[15:40:43] actually cam is fine too
[15:40:49] it can write frames to files
[15:41:14] and it defers that to its own thread
[15:41:32] so it shows how all this works
[15:41:39] and it's probably easier to read than qcam
[15:41:46] Awesome
[15:41:47] except possibly if you're familiar with Qt :-)
[15:41:54] qcam has a lot more Qt-specific things in it
[15:42:17] cam has a very crude event loop implementation, I'd expect you will use the OF event loop in your case
[15:42:25] and the OF threading facilities
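
On the pre-init vs. runtime split for that test GUI, a small sketch of the two places control values can be supplied, using ExposureTime and AnalogueGain purely as examples (which controls a camera actually supports, and their valid ranges, are reported by Camera::controls(), so the values below are placeholders):

    #include <libcamera/libcamera.h>

    using namespace libcamera;

    /* Initial control values, passed when starting the camera. */
    void startWithControls(Camera *camera)
    {
        ControlList ctrls(camera->controls());
        ctrls.set(controls::ExposureTime, 10000);  /* microseconds, example */
        ctrls.set(controls::AnalogueGain, 2.0f);   /* example value */
        camera->start(&ctrls);
    }

    /*
     * Per-frame control values, attached to an individual request before it
     * is queued, for things that change while capturing.
     */
    void setPerFrameControls(Request *request)
    {
        request->controls().set(controls::ExposureTime, 5000);
    }

The ControlInfoMap returned by Camera::controls() carries minimum, maximum and default values per control, which is presumably what the runtime section of such a GUI would be built from.
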