My new article is out, this time about internet-connected cameras, mostly marketed as spy cameras. While the cameras themselves are very different, the common factor is the LookCam app used to manage them.

There is already a considerable body of research on these and similar P2P cameras, so it shouldn’t be a surprise that their security is nothing short of horrible. Still, how the developers managed to make all the wrong choices here on every level (firmware, communication protocol, cloud functionality) is quite something.

https://palant.info/2025/09/08/a-look-at-a-p2p-camera-lookcam-app/

#infosec #iot #lookcam #security #vulnerability

A look at a P2P camera (LookCam app)

I’ve got my hands on an internet-connected camera and decided to take a closer look, having already read about security issues with similar cameras. What I found far exceeded my expectations: fake access controls, bogus protocol encryption, completely unprotected cloud uploads and firmware riddled with security flaws. One could even say that these cameras are Murphy’s Law turned solid: everything that could be done wrong has been done wrong here.

Almost Secure

By the way, you are welcome to post your suggestions here about what “financial-grade encryption scheme” means in the context of their cloud service or where it stands in comparison to “military-grade encryption.”

Edit: I checked and the text hasn’t been mistranslated. It is just as repetitive and incomplete in Chinese as it is in English.

One of the articles covering my findings had the following to add:

“However, it should be noted that this vulnerability could also be used for good. Spy cameras are sometimes used to film people illegally and without consent. Law enforcement agencies can take advantage of this vulnerability to trace criminals who use spy cameras for nefarious purposes.”

This is wrong on so many levels… I mean: you really want to trust law enforcement with access to footage of people being filmed without consent? Is it even legal for law enforcement to acquire this kind of access without knowing up front who the camera belongs to and what it is recording? Isn’t it way more likely that this access will be abused by criminals? And if law enforcement is willing to ignore people’s privacy and laws in order to “protect people,” isn’t it in itself proof that they cannot be trusted?

Another article uses my research to advertise a Wyze camera instead, which is supposedly much better. The fun thing is: while the Wyze app doesn’t seem to use the PPPP protocol, their cameras also use the P2P approach. And the respective library contains the string “Charlie is the designer of P2P!!” (padded with exclamation marks to make 32 bytes). You might have guessed it: it’s some kind of hardcoded key.
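For illustration only, here is a minimal sketch of what such a key derivation amounts to. This is not the vendor’s actual code; the function name and everything except the string itself are my assumptions, based purely on the observation that the 30-character string is padded with exclamation marks to 32 bytes:

```python
# Hypothetical sketch: turning a short ASCII string into a fixed-size
# "key" by padding it with exclamation marks. Not the library's real code.
def padded_key(s: str, size: int = 32) -> bytes:
    data = s.encode("ascii")
    if len(data) > size:
        raise ValueError("string longer than key size")
    # Append '!' bytes until the desired key length is reached.
    return data + b"!" * (size - len(data))

key = padded_key("Charlie is the designer of P2P")
assert key == b"Charlie is the designer of P2P!!"
```

A fixed, human-readable string baked into every copy of the library is of course not a secret in any meaningful sense, which is the whole problem.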

You ask who Charlie is? He seems to be the designer of the PPPP protocol; a bunch of CS2 Networks documents are signed by him. That algorithm in the Wyze app that uses the hardcoded key? While I don’t recognize it from the PPPP protocol, it’s still the same kind of number mangling, drawn out long enough to make things look complicated. The same “let’s throw in some prime numbers, this can never be wrong” attitude.

So it looks like Wyze is using PPPP v2 or whatever they call it. From the little I’ve seen I wouldn’t bet on it being any better than the original.

Edit: This is apparently called the ThroughTek Kalay P2P SDK which has already amassed an interesting collection of CVEs. And the function in question seems to be their packet scrambling. So at least they recognize that it isn’t encryption.

Got reminded of something that I meant to mention in the article but – well, there is only so much that fits in. My article only looks at two networks of PPPP-based cameras, but there are more, and there used to be a lot more. I’ve seen some apps list hundreds of different networks, and almost all of them seem defunct.

So there is this other consequence of cameras insisting on connecting to their servers even though they could technically do without: when these servers inevitably go down, all the cameras will immediately turn to junk.

A commenter on this blog post just added one more trivial scenario in which these cameras could end up being compromised. Apparently, the vendor reuses device IDs. No idea why or how, but this means that whenever you buy such a camera, there is a chance that the previous owner of your newly acquired device ID still has it connected in their LookCam app. Wow…
@WPalant given they are all in favour of the likes of Amazon giving warrantless access to Ring camera footage.
@WPalant
In China there are several standardized encryption algorithms that together form the ShangMi (商密, lit. “commercial cryptography”) standard. Three of them (the public-key algorithm SM2, the hash SM3 and the symmetric cipher SM4) are open standards; the others are confidential.
Government entities and important professions (non-confidential) must use ShangMi, according to Chinese law.
“Military-grade encryption” (protecting state secrets), though, is never opened at all. Depending on the level of confidentiality needed, there are 普通密码 (lit. “normal cryptography”) and 核心密码 (lit. “core cryptography”) for different uses. (I don’t think there are official English translations for these terms, sorry.)
@WPalant "We rolled our own crypto. Trust us."
@Ichinin Yes, but where did they roll it? It’s plain unencrypted HTTP. 😅
@WPalant Financial-grade encryption means a 4 digit PIN

@WPalant

I hear double encryption is twice as secure as single encryption!

@krinkle @WPalant especially if it's ROT13!
@fraggle @krinkle Now things finally start to make sense. Financial-grade double ROT13 encryption… 🤔
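For anyone who missed the joke: ROT13 is an involution, so applying it twice returns the plaintext, and “double ROT13 encryption” encrypts exactly nothing. A quick Python demonstration (the example string is mine):

```python
import codecs

# ROT13 shifts each letter 13 places; with a 26-letter alphabet the
# second application undoes the first, so "double ROT13" is a no-op.
msg = "financial-grade encryption"
once = codecs.encode(msg, "rot13")
twice = codecs.encode(once, "rot13")
assert once == "svanapvny-tenqr rapelcgvba"
assert twice == msg
```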