{"id":66760,"date":"2018-12-04T12:54:53","date_gmt":"2018-12-04T12:54:53","guid":{"rendered":"http:\/\/www.sickgaming.net\/blog\/2018\/12\/04\/libcamera-aims-to-make-embedded-cameras-easier\/"},"modified":"2018-12-04T12:54:53","modified_gmt":"2018-12-04T12:54:53","slug":"libcamera-aims-to-make-embedded-cameras-easier","status":"publish","type":"post","link":"https:\/\/sickgaming.net\/blog\/2018\/12\/04\/libcamera-aims-to-make-embedded-cameras-easier\/","title":{"rendered":"Libcamera Aims to Make Embedded Cameras Easier"},"content":{"rendered":"<div><img decoding=\"async\" src=\"http:\/\/www.sickgaming.net\/blog\/wp-content\/uploads\/2018\/12\/libcamera-aims-to-make-embedded-cameras-easier.png\" class=\"ff-og-image-inserted\" \/><\/div>\n<p><span><span>The <\/span><a href=\"https:\/\/linuxtv.org\/downloads\/v4l-dvb-apis\/uapi\/v4l\/v4l2.html\"><span>V4L2<\/span><\/a><span> (Video for Linux 2) API has long offered an open source alternative to proprietary camera\/computer interfaces, but it\u2019s beginning to show its age. At the <\/span><a href=\"https:\/\/events.linuxfoundation.org\/events\/elc-openiot-europe-2018\/\"><span>Embedded Linux Conference Europe in October<\/span><\/a><span>, the V4L2 project unveiled a successor called libcamera. V4L2 co-creator and prolific Linux kernel contributor Laurent Pinchart outlined the early-stage libcamera project in a presentation called \u201cWhy Embedded Cameras are Difficult, and How to Make Them Easy.\u201d<\/span><\/span><\/p>\n<p><span><span>V4L and V4L2 were developed when camera-enabled embedded systems were far simpler. \u201cMaybe you had a camera sensor connected to a SoC, with maybe a scaler, and everything was exposed via the API,\u201d said Pinchart, who runs an embedded Linux firm called Ideas on Board and is currently working for Renesas. \u201cBut when hardware became more complex, we disposed of the traditional model. 
Instead of exposing a camera as a single device with a single API, we let userspace dive into the device and expose the technology to offer more fine-grained control.\u201d<\/span><\/span><\/p>\n<p><span><span>These improvements were extensively documented, enabling experienced developers to implement more use cases than before. Yet the spec placed much of the burden of controlling the complex API on developers, with few resources available to ease the learning curve. In other words, \u201cV4L2 became more complex for userspace,\u201d explained Pinchart.<\/span><\/span><\/p>\n<p><span><span>The project planned to add a layer called libv4l to address this. The libv4l userspace library was designed to mimic the V4L2 kernel API and expose it to apps \u201cso it could be completely transparent in tracking the code to libc,\u201d said Pinchart. \u201cThe plan was to have device specific plugins provided by the vendor and it would all be part of the libv4l file, but it never happened. Even if it had, it would not have been enough.\u201d<\/span><\/span><\/p>\n<p><span><span>Libcamera, which Pinchart describes as \u201cnot only a camera library but a full camera stack in user space,\u201d aims to ease embedded camera application development, improving on both V4L2 and libv4l. The core piece is a libcamera framework, written in C++, that exposes kernel driver APIs to userspace. On top of the framework are optional language bindings for languages such as C. <\/span><\/span><\/p>\n<p><span><span>The next layer up is a libcamera application layer that translates to existing camera APIs, including V4L2, GStreamer, and the Android Camera Framework, which Pinchart said would not contain the usual vendor specific Android HAL code. As for V4L2, \u201cwe will attempt to maintain compatibility as a best effort, but we won\u2019t implement every feature,\u201d said Pinchart. There will also be a native libcamera app format, as well as plans to support Chrome OS. 
<\/span><\/span><\/p>\n<p><span><span>Libcamera keeps the kernel level hidden from the upper layers. The framework is built around the concept of a camera device, \u201cwhich is what you would expect from a camera as an end user,\u201d said Pinchart. \u201cWe will want to implement each camera\u2019s capabilities, and we\u2019ll also have a concept of profiles, which is a higher view of features. For example, you could choose a video or point-and-shoot profile.\u201d<\/span><\/span><\/p>\n<p><span><span>Libcamera will support multiple video streams from a single camera. \u201cIn videoconferencing, for example, you might want a different resolution and stream than what you encode over the network,\u201d said Pinchart. \u201cYou may want to display the live stream on the screen and, at the same time, capture stills or record video, perhaps at different resolutions.\u201d<\/span><\/span><\/p>\n<h3><span><span>Per-frame controls and a 3A API<\/span><\/span><\/h3>\n<p><span><span>One major new feature is per-frame controls. \u201cCameras provide controls for things like video stabilization, flash, or exposure time which may change under different lighting conditions,\u201d said Pinchart. \u201cV4L2 supports most of these controls but with one big limitation. Because you\u2019re capturing a video stream with one frame after another, if you want to increase exposure time you never know precisely at what frame that will take effect. If you want to take a still image capture with flash, you don\u2019t want to activate a flash and receive an image that is either before or after the flash.\u201d<\/span><\/span><\/p>\n<p><span><span>With libcamera\u2019s per-frame controls, you can be more precise. \u201cIf you want to ensure you always have the right brightness and exposure time, you need to control those features in a way that is tied to the video stream,\u201d explained Pinchart. 
\u201cWith per-frame controls you can modify all the frames that are being captured in a way that is synchronized with the stream.\u201d<\/span><\/span><\/p>\n<p><span><span>Libcamera also offers a novel approach to a given camera\u2019s 3A controls, such as auto exposure, autofocus, and auto white balance. To provide a 3A control loop, \u201cyou can have a simple implementation with 100 lines of code that will give you barely usable results or an implementation based on two or three years of development by device vendors where they really try to optimize the image quality,\u201d said Pinchart. Because most SoC vendors refuse to release the 3A algorithms that run in their ISPs with an open source license, \u201cwe want to create a framework and ecosystem in which open source re-implementations of proprietary 3A algorithms will be possible,\u201d said Pinchart. <\/span><\/span><\/p>\n<p><span><span>Libcamera will provide a 3A API that will translate between standard camera code and a vendor specific component. \u201cThe camera needs to communicate with kernel drivers, which is a security risk if the image processing code is closed source,\u201d said Pinchart. \u201cYou\u2019re running untrusted 3A vendor code, and even if they\u2019re not doing something behind your back, it can be hacked. So we want to be able to isolate the closed source component and make it operate within a sandbox. The API can be marshaled and unmarshaled over IPC. We can limit the system calls that are available and prevent the sandboxed component from directly accessing the kernel driver. Sandboxing will ensure that all the controls will have to go through our API.\u201d<\/span><\/span><\/p>\n<p><span><span>The 3A API, combined with libcamera\u2019s sandboxing approach, may encourage more SoC vendors to further expose their ISPs, just as some have begun to open up their GPUs. 
\u201cWe want the vendors to publish open source camera drivers that expose and document every control on the device,\u201d he said. \u201cWhen you are interacting with a camera, a large part of that code is device agnostic. Vendors implement a completely closed source camera HAL and supply their own buffer management and memory location and other tasks that don\u2019t add any value. It\u2019s a waste of resources. We want as much code as possible that can be reused and shared with vendors.\u201d<\/span><\/span><\/p>\n<p><span><span>Pinchart went on to describe libcamera\u2019s cam device manager, which will support hot plugging and unplugging of cameras. He also explained libcamera\u2019s pipeline handler, which controls memory buffering and communications between MIPI-CSI or other camera receiver interfaces and the camera\u2019s ISP. <\/span><\/span><\/p>\n<p><span><span>\u201cOur pipeline handler takes care of the details so the application doesn\u2019t have to,\u201d said Pinchart. \u201cIt handles scheduling, configuration, signal routing, the number of streams, and locating and passing buffers.\u201d The pipeline handler is flexible enough to support an ISP with an integrated CSI receiver (and without a buffer pool) or other complicated ISPs that can have a direct pipeline to memory.<\/span><\/span><\/p>\n<p><span><span>Watch Pinchart\u2019s entire ELC talk below:<\/span><\/span><\/p>\n<p>[youtube https:\/\/www.youtube.com\/watch?v=GIhV7tiUji0]<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The V4L2 (Video for Linux 2) API has long offered an open source alternative to proprietary camera\/computer interfaces, but it\u2019s beginning to show its age. At the Embedded Linux Conference Europe in October, the V4L2 project unveiled a successor called libcamera. 
V4L2 co-creator and prolific Linux kernel contributor Laurent Pinchart outlined the early-stage libcamera project [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":66761,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[40],"tags":[],"class_list":["post-66760","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-linux-freebsd-unix"],"_links":{"self":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts\/66760","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/comments?post=66760"}],"version-history":[{"count":0,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/posts\/66760\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/media\/66761"}],"wp:attachment":[{"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/media?parent=66760"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/categories?post=66760"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/sickgaming.net\/blog\/wp-json\/wp\/v2\/tags?post=66760"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}