@ttuegel @halcy Current tech yes, but there is no physics reason you can't have each pixel sample the spectrum from IR to UV with, say, 1000 samples and store that. And that's what you'd want if you really cared about colour rendering.
But yes, you are very unlikely to do that before encountering species with different eyes, and probably starting to staff crews with them.
I guess you might start doing it if people start bio engineering replacement eyes with exotic receptors.
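A minimal sketch of what that per-pixel spectral sampling buys you, with made-up Gaussian curves standing in for real receptor sensitivities: once you store the full spectrum, rendering for any given eye is just an integration against that species' response curves.

```python
import numpy as np

# Hypothetical setup: each pixel stores a spectrum sampled at 1000 points
# from near-UV (300 nm) up to near-IR (1000 nm). Reducing it to channel
# values for some eye is a matrix product with that eye's response curves.
wavelengths = np.linspace(300, 1000, 1000)  # nm

def toy_sensitivity(peak_nm, width_nm=60):
    """Toy Gaussian receptor curve (real cone fundamentals are messier)."""
    return np.exp(-0.5 * ((wavelengths - peak_nm) / width_nm) ** 2)

# Three human-ish receptors; an alien eye would just swap in other curves.
receptors = np.stack([
    toy_sensitivity(570),  # long-wavelength ("red-ish")
    toy_sensitivity(540),  # medium ("green-ish")
    toy_sensitivity(445),  # short ("blue-ish")
])

def perceive(spectrum):
    """Reduce a stored spectrum to one value per receptor channel."""
    return receptors @ spectrum

print(perceive(np.ones(1000)))  # flat spectrum -> three channel responses
```

All curve shapes and peak wavelengths here are placeholders; the point is only that the spectrum-to-channels step is a simple linear reduction you can redo per species.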
@ttuegel @halcy What if you didn't use RGB? Maybe they've all got full-spectrum cameras and can transmit the spectrum data. Or I wonder if there's some fancy signal processing way to produce a synthetic spectrum from an RGB-channels signal so you can convert it to a different-primaries color space.
This'd be a problem even on earth! Wolves don't have the same color receptors as humans, so if I ever somehow got to transition (I won't :<) I'd have issues with every existing display.
@ttuegel @halcy ... Apparently purple is weird. But it sounds like most everything else can be mapped to a wavelength.
I wonder how accurate such a mapping would be. I bet it'd fail for a bunch of stuff, because most things don't reflect one single wavelength, just a mix that humans can't distinguish from that single wavelength (a metamer). Other critters with different eyes would be terribly confused.
Exactly.
"I wonder if there's some fancy signal processing way to produce a synthetic spectrum from an RGB-channels signal."
Can't be done, for mathematical reasons. If you pick only four different wavelengths out of the optical spectrum and amplitude-modulate each independently of the other three, you already have a whole extra dimension of options beyond what the three RGB sensors can distinguish.
Next, use 100 different wavelengths...
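That dimensionality argument can be made concrete with a toy model: build a random 3-receptor sensor matrix over 100 wavelength bins, take any spectrum, and add a component from the sensor matrix's null space. The two spectra are physically different but produce identical channel readings, so no signal processing on the channels can tell them apart. All the numbers here are made up; real receptor curves would only change which spectra collide.

```python
import numpy as np

rng = np.random.default_rng(0)
n_wavelengths = 100
# Toy 3-receptor sensor matrix over 100 wavelength bins (made-up values).
sensors = rng.random((3, n_wavelengths))

spectrum_a = rng.random(n_wavelengths)

# Rows of Vh beyond the rank (3) span the sensors' null space: directions
# in spectrum-space that the three channels are completely blind to.
null_space_basis = np.linalg.svd(sensors)[2][3:]
spectrum_b = spectrum_a + 0.1 * null_space_basis[0]

# Two different spectra, identical 3-channel readings: a metamer pair.
print(np.allclose(sensors @ spectrum_a, sensors @ spectrum_b))  # True
print(np.allclose(spectrum_a, spectrum_b))                      # False
```

With 100 bins and 3 channels there's a 97-dimensional space of invisible spectral differences, which is the "whole extra dimension" point in linear-algebra form.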
But is that truly the case? Couldn't RGB map comprehensively onto any trichromatic system? (Cf. space telescopes that work outside the terrestrial visual range, but can nevertheless be made to produce imagery that makes sense to humans?)
Also, there are many cross-modality mappings that produce interesting & useful results. Print-to-speech being only one example.
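Partly, yes, and a toy fit shows both halves of the answer. A least-squares 3x3 matrix maps one trichromatic system's readings onto another's about as well as possible, but unless the target's receptor curves are exact linear combinations of the source's, a residual error remains: that's the metamer problem again. Everything below uses random stand-in curves, not real eyes.

```python
import numpy as np

rng = np.random.default_rng(1)
n_wl, n_samples = 100, 500
human = rng.random((3, n_wl))          # toy human receptor curves
alien = rng.random((3, n_wl))          # toy alien receptor curves
spectra = rng.random((n_wl, n_samples))  # random sample spectra

H = human @ spectra  # what human channels report for each sample
A = alien @ spectra  # what alien channels report for each sample

# Best-fit 3x3 conversion matrix M such that M @ H approximates A.
M, *_ = np.linalg.lstsq(H.T, A.T, rcond=None)
M = M.T

# Nonzero unless the alien curves lie in the span of the human curves:
# the conversion is an approximation, never an exact recoding.
residual = np.abs(M @ H - A).mean()
print(residual)
```

So an RGB-to-alien-primaries matrix can be "good enough" in the way false-colour telescope imagery is, while still being lossy in exactly the metamer-shaped ways discussed above.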
@halcy Maybe the systems are just banking on everyone having figured out some product of primes and the systems sling a ton of zeroes back and forth until they figure out a common one and then things get all weird for a few frames until it can figure out the light and dark sums of channels as things move around in the hi-res signals and then try to figure out the rest based on how many channels there are.
Meeting a new species is understood as everyone waving their arms around a bit and being weird and purple for a few seconds and then everyone just pretending it never happened. XD
I would love to see the VFX implied by this. Some skiffy production should really have a go at it. It'd be a marvelous opportunity for some stealth math education.
(Why does Darmok suddenly leap to mind?)
My reference to Darmok is less about the story element of the language decoding than the way the story is structured to bring the audience along while the characters work out the puzzle.
Structuring a story to display the process you describe in the 1st paragraph of the reply I responded to could be a lot of fun for at least one episode. (Though I imagine subsequently it'd just be implicit, in the way the transporter was first introduced, & then later just assumed.) >
There are a lot of critiques that could be (& have been) made about the primary conceit in Darmok.
But as a piece of story-telling that also carries the viewer along through the experience of solving the puzzle, I found it to be a delight.
Oh grumble. Now I really really wanna see somebody do that.
I mean, I love the comms hook-up conceit, but it'd be so much fun from a production standpoint to play with that.
In Star Trek, the Universal Translator translates speech between aliens with entirely different frames of reference (e.g. humans and sentient gas clouds). Presumably it’s also doing some false-colour magic to video streams that come from species with totally different visual perception. Next to that, decoding some data stream that you already know is a video feed seems quite easy.
Okay, this legit made me LOL. I'm surprised I didn't scare the guinea pigs.
See also: "The beautiful thing about standards is that there are so many of them!"