Cameras!

Today, I am announcing a new release of Plasma Camera, a camera application for Plasma Mobile (though it can also be used on desktop!). This release ports the application to use libcamera as the backend for interfacing with cameras, finally allowing it to be used on Linux mobile devices (such as the OnePlus 6).
The main porting work was done by my friend Andrew (koitu) a couple of months ago. The port then stalled on some issues, so I picked it up over the past week to complete it and finish the application. Here is a link, which has more technical details!
Background
Cameras have been a long-neglected area in Plasma Mobile ever since the focus shifted from Halium to mainline devices. For mainline devices, libcamera drivers have been developed, allowing cameras to be used in applications over PipeWire (e.g. GNOME Snapshot, Firefox, Chromium).
Plasma Camera was originally created in 2019 with Halium devices in mind, using the official Qt Camera library as a backend for interfacing with cameras. This library allows the app to work on Android and on desktop with USB webcams. Unfortunately, Qt Camera does not currently support using PipeWire or libcamera directly as a backend, and so it is unable to interface with the cameras on the OnePlus 6 and Pixel 3a.
Porting Plasma Camera
Qt Camera is a fairly high-level API designed to abstract over many different platforms, beyond Linux. Since our focus is on Linux, we decided to take this chance to port Plasma Camera to use libcamera directly, for the best control over the camera pipeline and features. Note that this approach differs from some other camera applications, which use PipeWire (which in turn has a backend that communicates with libcamera).
The API for libcamera is fairly comprehensive.
To implement the viewfinder (camera preview), we create a worker thread that is responsible for polling the camera for frames. We create a series of “requests”, each with a framebuffer allocated to it, and cycle through them when polling. libcamera then completes each request with a frame, which we send to our application thread to display.
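The request-cycling pattern can be sketched roughly as follows. This is a simplified, hypothetical simulation (stdlib only, no libcamera calls; frame numbers stand in for real image data), not Plasma Camera's actual code:

```cpp
#include <cassert>
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

// Simulate the viewfinder loop: a worker thread cycles through a fixed
// pool of "requests" (one preallocated framebuffer each), completes one
// frame per request, and hands each completed frame to the application
// thread through a queue.
std::vector<int> pollFrames(int poolSize, int totalFrames) {
    std::queue<int> completed;
    std::mutex m;
    std::condition_variable cv;

    std::thread worker([&] {
        for (int frame = 0; frame < totalFrames; ++frame) {
            int requestSlot = frame % poolSize; // cycle through the request pool
            (void)requestSlot; // real code: queue the request, wait for libcamera
            std::lock_guard<std::mutex> lock(m);
            completed.push(frame);
            cv.notify_one();
        }
    });

    // "Application thread": drain frames in order, as if displaying them.
    std::vector<int> shown;
    for (int i = 0; i < totalFrames; ++i) {
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [&] { return !completed.empty(); });
        shown.push_back(completed.front());
        completed.pop();
    }
    worker.join();
    return shown;
}
```

Because the pool is fixed, memory use stays bounded no matter how long the viewfinder runs; a slow consumer simply reuses buffers more slowly.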
For simplicity, Qt Multimedia was used for media processing. Frames from libcamera are wrapped in QImages and sent to a QVideoSink to be displayed in the UI. Any transformations needed (such as rotation correction due to how sensors are mounted on phones, or mirroring for front-facing cameras) are done before the frame is added to the sink. For taking photos and videos, we reuse the viewfinder’s frames.
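The transformation step amounts to deciding a rotation and mirror flag per frame before it reaches the sink. A minimal sketch, with hypothetical names (the real rotation comes from libcamera's orientation property):

```cpp
#include <cassert>

// What to apply to a frame before handing it to the video sink.
struct FrameTransform {
    int rotateDegrees; // clockwise rotation to apply: 0, 90, 180, or 270
    bool mirror;       // horizontal flip for front-facing cameras
};

// sensorRotation: how the sensor is mounted relative to the device,
// in degrees (may be negative or >= 360 depending on the source).
// frontFacing: whether the preview should be mirrored, selfie-style.
FrameTransform transformFor(int sensorRotation, bool frontFacing) {
    // Normalize to 0..359 so e.g. -90 becomes 270.
    int r = ((sensorRotation % 360) + 360) % 360;
    return {r, frontFacing};
}
```

Doing this once, before the frame enters the sink, means photos, videos, and the preview all see the same corrected image.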
For photos, we simply write the QImage to the disk.
Videos are much trickier. Using Qt Multimedia, we can build a video processing pipeline. We create a QMediaCaptureSession to facilitate all of the inputs and outputs needed, then attach a QMediaRecorder for writing the video, an audio input (QAudioInput), and a video input (QVideoFrameInput). A separate polling timer polls at the framerate of the video (which can differ from the framerate of the viewfinder), copying frames one by one into the QVideoFrameInput instance (more on this later) to be encoded by QMediaRecorder.
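Because the recording framerate is decoupled from the viewfinder's, each recorded frame needs its own timestamp derived from the recording rate, not from when the viewfinder happened to produce it. A tiny sketch of that mapping (hypothetical helper, not the actual code):

```cpp
#include <cassert>
#include <cstdint>

// Timestamp, in microseconds, for the n-th recorded frame at a given
// recording frame rate. The polling timer fires at this rate and stamps
// each frame copy so the encoder places it correctly, regardless of the
// viewfinder's own frame rate.
int64_t frameTimestampUs(int64_t frameIndex, int fps) {
    return frameIndex * 1000000LL / fps;
}
```

For example, at 25 FPS consecutive frames land 40 ms apart, even if the viewfinder is running at 30 or 60 FPS.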
In the future, it may make sense to investigate whether we could benefit from porting to GStreamer directly for media processing. We currently use Qt Multimedia with its FFmpeg backend. While Qt Multimedia does have a GStreamer backend, it has some limitations and is no longer the default backend as a result.
UI work
I also took the liberty of substantially refactoring and reworking the UI code. We dropped some camera settings for the initial port of the application, to be restored later. However, some new features were introduced.
The application has these features:
- Photo capture
- Video capture
- Audio recording toggle for video capture
- EV setting (exposure value)
- Captured photo/video preview
- Video recording settings (codec, resolution, FPS, quality)
- Timer before taking a photo
- Warnings for when the encoder is detected to not be keeping up with the video stream
- Settings persistence

Results
With USB webcams, both photo capture and video recording work.


It also sort of works on phones. I tested on the OnePlus 6 and Pixel 3a. I suspect that most of the issues are simply due to the camera drivers not yet being mature enough, as I can replicate most of them in other camera applications. The photo quality and colours are not optimal, and there appears to be a fixed focal length, so faraway things look blurry.
The viewfinder stream is fine on my OnePlus 6 and looks smooth. However, on my Pixel 3a, the frames start flashing light and dark when I point the camera at any bright light source. I suspect the camera driver may be overcompensating for exposure, but I'm not sure.
Photo capture works on both devices, outputting the frame from the viewfinder at full resolution to the disk almost instantly. Though the quality of the pictures is reminiscent of early 2000s phone photography.


The video recording experience, however, isn't quite usable yet, unfortunately; the video encoder does not appear to be able to keep up.
Limitations
Video recording issues on phones
The main barrier to video recording seems to be the performance of the video encoder. I've noticed on both phones that many attempts to send frames to QVideoFrameInput fail because QMediaRecorder's queue is simply full and cannot keep up with the number of frames coming in. This can be mitigated somewhat by playing with the video recording settings. I've generally found the MPEG2 codec to be substantially faster on these devices, though it gives very ugly artifacting at low quality and sometimes gives an error. Of course, lowering the resolution and FPS can also help.
For each frame given to QVideoFrameInput, I also set its timestamp to ensure that the encoder places it at the correct position in the video. However, when we start dropping frames because the encoder is full, we end up with gaps in the video where a frame should be, which I suspect is what causes the pixelated “corrupt video” effect (though it only happens with H264 encoding, not MPEG2?). We cannot really queue frames for the encoder ourselves, because we would very quickly run out of memory. I have an open issue about this, since I am not really sure how to address it yet.
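The dropped-frame gap can be illustrated with a small simulation (hypothetical and simplified; `fullAt` stands in for the moments when the encoder's queue rejects a frame). Because each frame keeps its original timestamp, dropping one leaves a hole in the output rather than shifting later frames earlier:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

struct EncodedFrame { int64_t timestampUs; };

// Submit totalFrames frames at the given fps; any frame index listed in
// fullAt is rejected (the encoder queue was full) and simply lost.
// Surviving frames keep their original timestamps, so a dropped frame
// produces a gap in the timeline instead of a re-paced stream.
std::vector<EncodedFrame> submitFrames(int totalFrames, int fps,
                                       const std::vector<int> &fullAt) {
    std::vector<EncodedFrame> encoded;
    for (int i = 0; i < totalFrames; ++i) {
        bool queueFull = false;
        for (int f : fullAt)
            if (f == i) queueFull = true;
        if (queueFull)
            continue; // submission failed; this frame is never encoded
        encoded.push_back({i * 1000000LL / fps});
    }
    return encoded;
}
```

At 25 FPS, dropping frame 2 leaves consecutive encoded timestamps of 40000 µs and 120000 µs, an 80 ms hole the encoder has to paper over somehow.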

Rotation issues
Device rotation can be a bit of a problem in the application right now. We already account for the camera's orientation relative to the screen, which libcamera reports as a property.
The viewfinder, however, can be a problem when the display rotation differs from the device's natural orientation (e.g. rotated 90, 180, or 270 degrees). This rotation is done by the compositor (e.g. KWin); the application only sees that the window size has changed. That means the viewfinder gets rotated as well! We are able to adjust for this in captured photos and videos by reading the rotation sensor data (with QOrientationSensor/iio-sensor-proxy), but we cannot do the same for the viewfinder, because we don't know which orientation the compositor has placed the application in, which could differ from the sensor reading due to rotation lock and manual settings.
I recommend keeping an orientation lock on “portrait” mode when using the application on a phone until we find a fix, that way the viewfinder does not get mismatched from what you see. We are tracking this issue here: https://invent.kde.org/plasma-mobile/plasma-camera/-/issues/14

Missing camera controls
The drivers for the OnePlus 6 and Pixel 3a seem to be missing almost all of the libcamera controls. At least, calling camera->controls() (doc) returns only the Contrast control. There are other controls I would like to implement once they become available, such as focus windows.
Once these are implemented in the driver (or if it’s fixed as an issue on our side) and support is added in the application, we will have a lot more camera features to play with!
Conclusion
We finally have a base to build on for the camera stack on mobile Linux. I hope the application continues to improve as drivers and camera support on these devices mature over time.
So, give it a try! And feel free to come join us in the Plasma Mobile Matrix channel to talk about it!