TL;DR
I wanted to revive my old Raspberry Pi Zero W camera and stream it to Home Assistant.
I made it work via Picamera2 (an MJPEG stream). Compared with an ESP32-CAM, its resolution and frame rate are better, but it's more complex to set up.
Long story
The RPi camera used to work when I bought it in 2018 (streaming via uv4l), but now (in 2026), with the new version of Raspberry Pi OS, it doesn't.
Streaming tools I tried without managing to get them to work:
Picamera2 and a Python MJPEG streaming server
Finally, I found the Picamera2 example code (mjpeg_server2.py) and it works: I can stream MJPEG to Home Assistant. The code:
#!/usr/bin/python3
# This is the same as mjpeg_server.py, but uses the h/w MJPEG encoder.

import io
import logging
import socketserver
from http import server
from threading import Condition

from picamera2 import Picamera2
from picamera2.encoders import MJPEGEncoder
from picamera2.outputs import FileOutput

PAGE = """\
<html>
<head>
<title>picamera2 MJPEG streaming demo</title>
</head>
<body>
<h1>Picamera2 MJPEG Streaming Demo</h1>
<img src="stream.mjpg" width="640" height="480" />
</body>
</html>
"""


class StreamingOutput(io.BufferedIOBase):
    def __init__(self):
        self.frame = None
        self.condition = Condition()

    def write(self, buf):
        with self.condition:
            self.frame = buf
            self.condition.notify_all()


class StreamingHandler(server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/':
            self.send_response(301)
            self.send_header('Location', '/index.html')
            self.end_headers()
        elif self.path == '/index.html':
            content = PAGE.encode('utf-8')
            self.send_response(200)
            self.send_header('Content-Type', 'text/html')
            self.send_header('Content-Length', len(content))
            self.end_headers()
            self.wfile.write(content)
        elif self.path == '/stream.mjpg':
            self.send_response(200)
            self.send_header('Age', 0)
            self.send_header('Cache-Control', 'no-cache, private')
            self.send_header('Pragma', 'no-cache')
            self.send_header('Content-Type', 'multipart/x-mixed-replace; boundary=FRAME')
            self.end_headers()
            try:
                while True:
                    with output.condition:
                        output.condition.wait()
                        frame = output.frame
                    self.wfile.write(b'--FRAME\r\n')
                    self.send_header('Content-Type', 'image/jpeg')
                    self.send_header('Content-Length', len(frame))
                    self.end_headers()
                    self.wfile.write(frame)
                    self.wfile.write(b'\r\n')
            except Exception as e:
                logging.warning(
                    'Removed streaming client %s: %s',
                    self.client_address, str(e))
        else:
            self.send_error(404)
            self.end_headers()


class StreamingServer(socketserver.ThreadingMixIn, server.HTTPServer):
    allow_reuse_address = True
    daemon_threads = True


picam2 = Picamera2()
# Pass FrameRate inside the configuration: setting it on
# picam2.video_configuration would have no effect here, because configure()
# uses the fresh object returned by create_video_configuration().
picam2.configure(picam2.create_video_configuration(
    main={"size": (800, 600)}, controls={"FrameRate": 5.0}))
output = StreamingOutput()
picam2.start_recording(MJPEGEncoder(), FileOutput(output))

try:
    address = ('', 8000)
    server = StreamingServer(address, StreamingHandler)
    server.serve_forever()
finally:
    picam2.stop_recording()
Start the program with:
python mystreamer.py
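The trick that lets several HTTP clients share one camera is the threading.Condition in StreamingOutput: the encoder thread calls write(), and every client blocked in wait() is woken with the new frame. Stripped of the camera, the hand-off looks roughly like this (FrameBox and the Event-based synchronisation are illustrative, not part of the server above):

```python
import threading

# Minimal sketch (hypothetical names) of the StreamingOutput hand-off:
# one producer publishes frames, waiting consumers are woken via a Condition.
class FrameBox:
    def __init__(self):
        self.frame = None
        self.condition = threading.Condition()

    def write(self, buf):
        with self.condition:
            self.frame = buf
            self.condition.notify_all()  # wake every client blocked in wait()

box = FrameBox()
received = []
ready = threading.Event()

def consumer():
    for _ in range(3):
        with box.condition:
            ready.set()               # signal while holding the lock...
            box.condition.wait()      # ...which is only released inside wait()
            received.append(box.frame)

t = threading.Thread(target=consumer)
t.start()
for i in range(3):
    ready.wait()                      # wait until the consumer is about to block
    ready.clear()
    box.write(b"frame-%d" % i)        # blocks on the lock until wait() releases it
t.join()
print(received)                       # → [b'frame-0', b'frame-1', b'frame-2']
```

In the real server each connected browser runs its own copy of the consumer loop inside do_GET(), which is why notify_all() (not notify()) is used.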
Now I can finally see what is happening in my aquarium:
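On the Home Assistant side, a stream like this can be consumed with the MJPEG IP Camera integration; a configuration.yaml sketch (the hostname and name are placeholders, adjust to your Pi):

```yaml
# Hypothetical Home Assistant configuration.yaml entry for this stream;
# replace raspberrypi.local with your Pi's hostname or IP address.
camera:
  - platform: mjpeg
    name: Aquarium camera
    mjpeg_url: http://raspberrypi.local:8000/stream.mjpg
```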
It works OK, but when my SSH connection closes, the Python program terminates and streaming stops.
So I turned it into a systemd service:
sudo nano /etc/systemd/system/mystreamer.service
mystreamer.service looks like:
[Unit]
# Human readable name of the unit
Description=Python MJPEG web streamer
After=network-online.target
Wants=network-online.target
# Rate-limit restarts (these are [Unit] options, not [Service] options)
StartLimitIntervalSec=60
StartLimitBurst=3
[Service]
Type=simple
User=tomi
WorkingDirectory=/home/tomi/mymjpeg
ExecStart=/usr/bin/python3 /home/tomi/mymjpeg/mystreamer.py
Restart=on-failure
RestartSec=10
Environment=PYTHONUNBUFFERED=1
StandardOutput=journal
StandardError=journal
CPUQuota=70%
MemoryMax=300M
Nice=10
[Install]
WantedBy=multi-user.target
Start the service:
sudo systemctl start mystreamer.service
Status?
sudo systemctl status mystreamer.service
See logs:
sudo journalctl -u mystreamer.service -f
Enable it (autostart at boot):
sudo systemctl enable mystreamer.service
Check if it's enabled:
systemctl is-enabled mystreamer.service
And finally, reboot and see if it starts:
sudo systemctl reboot
Then I modified the .py code and made a mistake. The Pi booted, but the service restarted every few seconds, and I couldn't even SSH into the Pi.
What now?
I took the SD card out of the Pi, put it in my computer, fixed the faulty .py code, put it back into the Pi, booted, and I could SSH in again.
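A guard that would have prevented this: syntax-check the script before systemd (re)starts the service. A sketch (paths are illustrative; /tmp/demo.py stands in for the real script):

```shell
# Sketch: refuse to (re)start the service if the script has a syntax error.
# /tmp/demo.py stands in for /home/tomi/mymjpeg/mystreamer.py here.
cat > /tmp/demo.py <<'EOF'
print("hello")
EOF
if python3 -m py_compile /tmp/demo.py; then
    echo "syntax OK - safe to restart"
else
    echo "syntax error - leave the old version running"
fi
```

The same check can run automatically by adding ExecStartPre=/usr/bin/python3 -m py_compile /home/tomi/mymjpeg/mystreamer.py to the [Service] section: py_compile exits non-zero on a syntax error, so the unit fails fast instead of crash-looping.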
Bottom line: streaming now works, but the poor Pi's CPU is at 100% (60% for the Python streaming server, 30% for glances). I found out I can set the streamer service to CPUQuota=50% and it still works fine.
Streaming H.264?
The Raspberry Pi camera can also stream H.264 video, but I didn't know how to integrate that into Home Assistant. It looks like the Camera integration doesn't support a raw H.264 TCP stream (it expects an RTSP stream).
The command is:
libcamera-vid -t 0 --inline --listen -o tcp://0.0.0.0:8888
If anyone knows how to get this stream into HA, please let me know.
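One idea I haven't tried: wrap the H.264 output in RTSP using a separate RTSP server such as MediaMTX, then point Home Assistant's Generic Camera integration at the resulting rtsp:// URL. An untested mediamtx.yml sketch (the path name and pipeline are assumptions, not a verified setup):

```yaml
# Untested sketch: have MediaMTX launch the camera pipeline itself and
# republish it as rtsp://<pi>:8554/aquarium for Home Assistant.
paths:
  aquarium:
    runOnInit: >
      libcamera-vid -t 0 --inline --width 800 --height 600 -o - |
      ffmpeg -i - -c:v copy -f rtsp rtsp://localhost:8554/aquarium
    runOnInitRestart: yes
```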