I'm working on an app to take photos using #gstreamer and #libcamera on #postmarketos running mainline Linux. This is an early success. The focus motor for the "wide" camera lens I took this with has no kernel support, so it only works on close focus. That worked out well in this picture though
Autofocus with #libcamera is fun. You are attempting to tune AF to do something, and while doing that you realize that a) AE does not really work, and b) there's a memory leak of GPU memory or something, so if you keep playing with focus for too long, the system is hosed in a way an application restart does not cure. #librem5
**Stream video from Raspberry Pi Zero W camera in 2026**

TL;DR

I wanted to revive my old Raspberry Pi Zero W camera and stream it to Home Assistant.

I made it work via Picamera2 (MJPEG stream). Compared with an ESP32-CAM, its resolution and fps are better, but it's more complex to set up.

Long story

The RPi camera used to work when I bought it (2018), streaming via uv4l, but now (in 2026), with a new version of Raspbian OS, it doesn't.

Streaming tools I tried but couldn't get working:

  • Motion
  • uv4l
  • mjpeg-streamer [1]
  • GStreamer

Picamera2 and a Python MJPEG streaming server

Finally I found the Picamera2 Python code [2] (mjpeg_server_2.py [3]) and it works: I can stream MJPEG to Home Assistant. The code:

```python
#!/usr/bin/python3

# This is the same as mjpeg_server.py, but uses the h/w MJPEG encoder.

import io
import logging
import socketserver
from http import server
from threading import Condition

from picamera2 import Picamera2
from picamera2.encoders import MJPEGEncoder
from picamera2.outputs import FileOutput

PAGE = """\
<html>
<head>
<title>picamera2 MJPEG streaming demo</title>
</head>
<body>
<h1>Picamera2 MJPEG Streaming Demo</h1>
<img src="stream.mjpg" width="640" height="480" />
</body>
</html>
"""


class StreamingOutput(io.BufferedIOBase):
    def __init__(self):
        self.frame = None
        self.condition = Condition()

    def write(self, buf):
        with self.condition:
            self.frame = buf
            self.condition.notify_all()


class StreamingHandler(server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/':
            self.send_response(301)
            self.send_header('Location', '/index.html')
            self.end_headers()
        elif self.path == '/index.html':
            content = PAGE.encode('utf-8')
            self.send_response(200)
            self.send_header('Content-Type', 'text/html')
            self.send_header('Content-Length', len(content))
            self.end_headers()
            self.wfile.write(content)
        elif self.path == '/stream.mjpg':
            self.send_response(200)
            self.send_header('Age', 0)
            self.send_header('Cache-Control', 'no-cache, private')
            self.send_header('Pragma', 'no-cache')
            self.send_header('Content-Type', 'multipart/x-mixed-replace; boundary=FRAME')
            self.end_headers()
            try:
                while True:
                    with output.condition:
                        output.condition.wait()
                        frame = output.frame
                    self.wfile.write(b'--FRAME\r\n')
                    self.send_header('Content-Type', 'image/jpeg')
                    self.send_header('Content-Length', len(frame))
                    self.end_headers()
                    self.wfile.write(frame)
                    self.wfile.write(b'\r\n')
            except Exception as e:
                logging.warning(
                    'Removed streaming client %s: %s',
                    self.client_address, str(e))
        else:
            self.send_error(404)
            self.end_headers()


class StreamingServer(socketserver.ThreadingMixIn, server.HTTPServer):
    allow_reuse_address = True
    daemon_threads = True


picam2 = Picamera2()
picam2.video_configuration.controls.FrameRate = 5.0
picam2.configure(picam2.create_video_configuration(main={"size": (800, 600)}))
output = StreamingOutput()
picam2.start_recording(MJPEGEncoder(), FileOutput(output))

try:
    address = ('', 8000)
    server = StreamingServer(address, StreamingHandler)
    server.serve_forever()
finally:
    picam2.stop_recording()
```
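The key piece of the server is `StreamingOutput`: the encoder calls `write()` with each new JPEG frame, and HTTP handler threads wait on the condition variable so they always send the latest frame. Here is a standalone sketch of just that handoff pattern (not part of the server; the camera is replaced by a plain loop writing fake frames):

```python
import io
import threading
import time


class StreamingOutput(io.BufferedIOBase):
    """Latest-frame mailbox: a writer stores each new frame and
    wakes every thread waiting on the condition."""
    def __init__(self):
        self.frame = None
        self.condition = threading.Condition()

    def write(self, buf):
        with self.condition:
            self.frame = buf
            self.condition.notify_all()


output = StreamingOutput()
received = []


def reader():
    # Mimics the HTTP handler loop: wait for a frame, then use it.
    for _ in range(3):
        with output.condition:
            output.condition.wait()
            received.append(output.frame)


t = threading.Thread(target=reader)
t.start()
for i in range(3):
    time.sleep(0.1)                 # give the reader time to re-enter wait()
    output.write(b"frame-%d" % i)   # "encoder" delivers a fake frame
t.join()
```

Because only the newest frame is kept, a slow client simply skips frames instead of building an unbounded backlog.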

Starting the program above with:

python mystreamer.py
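With the server running, Home Assistant can pull the stream via its MJPEG IP Camera integration. A sketch of the `configuration.yaml` entry, assuming the Pi sits at a placeholder address of 192.168.1.50 (the port and path come from the script above):

```yaml
# configuration.yaml -- MJPEG IP Camera integration
camera:
  - platform: mjpeg
    name: aquarium_cam
    mjpeg_url: http://192.168.1.50:8000/stream.mjpg
```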

Now I can finally see what is happening in my aquarium:

It works OK, but when my SSH connection closes, the Python program terminates and stops streaming.

So I made it as a systemd service:

sudo nano /etc/systemd/system/mystreamer.service

mystreamer.service looks like:

```ini
[Unit]
# Human readable name of the unit
Description=Python MJPEG web streamer
After=network-online.target
Wants=network-online.target

[Service]
Type=simple
User=tomi
WorkingDirectory=/home/tomi/mymjpeg
ExecStart=/usr/bin/python /home/tomi/mymjpeg/mystreamer.py
Restart=on-failure
RestartSec=10
StartLimitIntervalSec=60
StartLimitBurst=3
Environment=PYTHONUNBUFFERED=1
StandardOutput=journal
StandardError=journal
CPUQuota=70%
MemoryMax=300M
Nice=10

[Install]
WantedBy=multi-user.target
```

Reload systemd so it picks up the new unit file, then start the service:

sudo systemctl daemon-reload
sudo systemctl start mystreamer.service

Status?

sudo systemctl status mystreamer.service

See logs:

sudo journalctl -u mystreamer.service -f

Enable it (autostart at boot):

sudo systemctl enable mystreamer.service

Check if enabled:

systemctl is-enabled mystreamer.service

And finally, reboot and see if it starts:

sudo systemctl reboot


Then I modified the .py code and made a mistake. The Pi booted, but the service restarted every few seconds and I couldn't even SSH to the Pi.

What now?

I took the SD card out of the Pi, put it in my computer, fixed the faulty .py code, put it back into the Pi, booted, and I could SSH again.

Bottom line: streaming now works, but the poor Pi's CPU is at 100% (60% Python streaming server, 30% glances). I found out I can set my streamer service to CPUQuota=50% and it still works fine.
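One way to lower the quota without editing the unit file itself is a systemd drop-in, e.g. via `sudo systemctl edit mystreamer.service`, which creates an override like this (a sketch; the path follows the standard drop-in convention):

```ini
# /etc/systemd/system/mystreamer.service.d/override.conf
# Drop-in: values here override the ones in mystreamer.service.
[Service]
CPUQuota=50%
```

After a `sudo systemctl daemon-reload` and a restart of the service, the new quota applies.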

Streaming H.264?

The Raspberry Pi camera can also stream H.264 video, but I didn't know how to integrate it into Home Assistant. It looks like the Camera integration doesn't support raw H.264 over TCP (it supports RTSP streams).

The command is:

libcamera-vid -t 0 --inline --listen -o tcp://0.0.0.0:8888

If anyone knows how to put this stream to HA, please let me know.
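One possible route, untested here and offered only as a sketch: run mediamtx on the Pi, which has built-in Raspberry Pi camera support and re-exposes the camera as an RTSP stream that Home Assistant's Generic Camera integration can consume. An assumed `mediamtx.yml` fragment:

```yaml
# mediamtx.yml -- expose the Pi camera as RTSP
paths:
  cam:
    source: rpiCamera
```

Home Assistant's Generic Camera would then point at rtsp://&lt;pi-ip&gt;:8554/cam. Whether the rpiCamera source works on the Zero W's ARMv6 build is something to verify first.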

Footnotes

  1. https://krystof.io/mjpg-streamer-on-a-raspberry-pi-zero-w-with-a-usb-webcam-streaming-setup/
  2. https://github.com/raspberrypi/picamera2
  3. https://github.com/raspberrypi/picamera2/blob/main/examples/mjpeg_server_2.py

https://blog.rozman.info/stream-video-from-raspberry-pi-zero-w-camera-in-2026/ #Homeassistant #Libcamera #Raspberrypi
    @Pavel Machek has rebased his #libcamera autofocus experimental branch onto the mainline v0.7.0 tag during the 6th #MobileLinux Hackday organized by @okias yesterday (thanks to #SUSE for hosting), and I have built it on my #oneplus6 #Qualcomm #sdm845 phone running my build of the 6.19.0-rc4-next-20260106-sdm845-gdc7b19cffd9e kernel and #mobian. Then I tested it with Pavel's mcam and, probably for the first time ever, with the GNOME/Phosh Snapshot application on a complete pipewire-libcamera stack. The simple software AF prototype is enabled by adding an "- Af:" line to /usr/share/libcamera/ipa/simple/uncalibrated.yaml.

    On the occasion, I had the chance to use the result to document an actual event: the president of the Czech Republic, Petr Pavel, speaking at the #Ukraine support meeting on #Prague Old Town Square today. The autofocus algorithm is quite unstable, periodically seeking a sharp image while the view is blurred in between. The same goes for the uncalibrated colors. But that could/should be solved in the longer term, see FOSDEM talk1 and talk2. In general, the day when we can run at least older devices under real user control is ever closer.

    It now depends on the @EUCommission whether the future for new devices looks like 1984 or not. If it insists on Chat Control, which requires eliminating user control over what software runs on the device even while the rhetoric supports sideloading for user control and installing your own builds of applications, then the government- and corporation-controlled botnet would be abused by the mighty to control society as a whole.

    Back to today's photos and the demonstration of goodwill to help the attacked neighboring country survive the imperial war. The original 1080p photo with far focus is there, and the short-distance shot of the paper is there. Both were taken with an IMX519 camera connected over MIPI C-PHY.
    Speaking of, I really wonder what the present situation is with #libcamera in #waydroid, and how #furios / #furiphone got the camera to work in Android apps
    #libcamera 0.7 really improved my camera experience on the OnePlus 6 running #postmarketOS edge.

    Not only do the results look better, there's also no more 'take a picture, end up in crashdump' happening, which is almost as important.

    (I am using Plasma Camera with @nekocwd's cameractrl.)
    Sent the fix upstream, and megi said he's making the change in his kernel. So at this point I think the #pinephonePro rear camera driver is as complete as it can be, and all that remains is tuning and improvements in the camera stacks. #libcamera will need some tuning and the addition of autofocus, for example. There's nothing on the kernel side that should be stopping it anymore.
    📱Yet another #libcamera hacking 🐱📷

    Now it's
    #autofocus with Sobel operator and wippy MSV-like ring-buffer based model

    #mainlining #linuxonmobile #shotonmainline

    Well, I'm on a camera bringup streak it seems! 🚀

    As of tonight the Fairphone 3 cameras (front and rear) are working on postmarketOS with mainline Linux!

    And if you saw, only last week I got the Fairphone 4 cameras to work.

    A bright future is ahead for mobile Linux! 📸

    (Brought to you by a night train 🚂)

    #postmarketOS #LinuxMobile #MobileLinux #libcamera #Fairphone #Fairphone3

    Camera support on Fairphone 4 with postmarketOS is coming to you very soon!

    All 3 cameras are supported, main rear camera, ultra-wide camera and front camera.

    And all with a completely open source software stack of course!

    #postmarketOS #LinuxMobile #MobileLinux #libcamera #Fairphone #Fairphone4 #Cat #Caturday