Mars.

Processed, leveled, cropped MCZ_LEFT, FL: 110mm
Sol: 880, RMC: 43.0000, LMST: 12:07:17
Original: https://mars.nasa.gov/mars2020-raw-images/pub/ods/surface/sol/00880/ids/edr/browse/zcam/ZL0_0880_0745060589_769EBY_N0430000ZCAM08882_1100LMJ01.png
Credit: #NASA/JPL-Caltech/ASU/65dBnoise

#Perseverance #Mars2020 #Solarocks #Space #crackedRocks

"There's more than one way to skin a cat"

The first image is part of an Arizona State University panorama composed from images captured on #Mars2020 Sol 4, "approximately simulating the colors of the scene that we would see if we were there viewing it ourselves."
https://mastcamz.asu.edu/galleries/mastcam-zs-first-high-resolution-mosaic-sol-4/?back=%2Fmars-images%2Fpanoramas-mosaics%2F%3Fitem_type%3D360-panoramas

The second image is the same scene, but filtered for viewers with reduced sensitivity to green (deuteranomaly).

When it comes to color perception or other sensory experience, things don't necessarily follow our preconceptions 😀

#GIMP

@65dBnoise

Colour perception varies more than most people realise.

I guess the folk that set up some of the M20 cameras and their image pipelines appreciate the raw images on the mission server more than others.

#cantpleaseeveryone

@PaulHammond51
Indeed, those people definitely know better. And knowing better leads them to present a number of different approaches to viewing, rather than sticking to a single one for every purpose.
@65dBnoise @PaulHammond51 I agree that color perception can vary quite a lot between people. However, there is a well-established process for digital cameras here on Earth to produce natural-looking colors.
NASA doesn't apply any of those established processes to the Mastcam images.
And what they do instead is no substitute for that process; from my understanding, it's plainly wrong to call the "natural color images" true to what the human eye would see.
@65dBnoise @PaulHammond51 First of all, they completely ignore white balancing. White balancing matters when the color of the light in the scene doesn't match the environment in which the image is viewed. Our eyes constantly adapt to the local light conditions, so when we display a non-white-balanced image on a screen, our eyes won't adjust to the colors of the image, because they are already adapted to our surroundings.
@65dBnoise @PaulHammond51 So while their approach of showing the images this red may not be completely wrong, in the sense that the light on Mars is overall much redder than on Earth, our eyes can't adapt to the colors like they would if we were standing on Mars.
@65dBnoise @PaulHammond51 Therefore it is important to white-balance an image to simulate the adaptation of our eyes to the normal white point of our displays.
White balancing is not a perfect method, though; our eyes don't adapt linearly to different lighting conditions. But it's at least a better approximation than showing the unbalanced image.
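As a toy illustration of the idea (this is not the Mastcam pipeline, and the pixel values are made up), a minimal "gray-world" white balance can be sketched in a few lines: scale each channel so its mean matches the overall mean, pulling a uniformly tinted scene back toward neutral.

```python
import numpy as np

def gray_world_white_balance(img):
    """Gray-world white balance: scale each channel so the channel
    means become equal. `img` is a float RGB array in [0, 1],
    assumed to be in linear light (not gamma-encoded)."""
    means = img.reshape(-1, 3).mean(axis=0)   # per-channel mean
    gains = means.mean() / means              # scale toward neutral gray
    return np.clip(img * gains, 0.0, 1.0)

# A hypothetical reddish-lit patch: the red channel dominates everywhere.
reddish = np.full((4, 4, 3), [0.8, 0.5, 0.4])
balanced = gray_world_white_balance(reddish)
# The uniformly tinted patch comes out as neutral gray.
```

This is the crudest possible chromatic-adaptation model; it only stands in for the per-channel scaling step that the "natural color" products appear to skip.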
@65dBnoise @PaulHammond51 That said, there is another big issue with the natural color images: their calibration completely ignores color spaces.
A color space defines how much real-life color saturation an RGB triplet corresponds to. This depends on the capabilities of the display you're viewing the image on, and also on the sensitivity curves of the camera sensor.
@65dBnoise @PaulHammond51 If you don't account for that and just interpret the raw color values as sRGB, an image will usually look less saturated than the real scene.
This also applies to the "natural color" images from Mastcam. So even if those images showed the reddish lighting conditions on Mars correctly, they wouldn't show the saturation (and therefore the color separation) correctly.
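The missing step is usually a 3x3 color correction matrix. A minimal sketch, using an invented matrix (the real Mastcam-Z sensitivities are not reproduced here): the negative off-diagonal terms subtract the crosstalk caused by overlapping sensor channels, which is exactly what restores saturation.

```python
import numpy as np

# Hypothetical camera-to-sRGB correction matrix (rows sum to 1,
# so white stays white; the values are illustrative, not Mastcam's).
CCM = np.array([
    [ 1.6, -0.4, -0.2],
    [-0.3,  1.5, -0.2],
    [-0.1, -0.4,  1.5],
])

def apply_ccm(rgb_linear):
    """Map linear camera RGB to linear sRGB primaries via a 3x3 matrix.
    Skipping this (treating raw values as sRGB) leaves the image
    desaturated, because the sensor channels overlap spectrally."""
    return np.clip(rgb_linear @ CCM.T, 0.0, 1.0)

# A muted camera-space red becomes more saturated after correction:
camera_red = np.array([0.6, 0.3, 0.25])
corrected = apply_ccm(camera_red)
```

Real matrices are derived from the measured sensor sensitivity curves (or from shots of a color chart), but the mechanism is the same.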
@65dBnoise @PaulHammond51 Producing images that show what our adjusted eyes would see on Mars is certainly not easy. I'm trying my best to do that with my calibrated images, but it's definitely not perfect.
The real answer to this question will likely only come once we eventually visit there ourselves.
What I'm sure of however is that the Mastcam calibration (as described here: https://www.hou.usra.edu/meetings/lpsc2023/pdf/2504.pdf) is not accurate in this way.

@65dBnoise @PaulHammond51 In contrast, here is the color calibration for Insight, which is doing things a lot better and in a similar way to my own process.
https://doi.org/10.1029/2020EA001336

I described my own process here: https://twitter.com/stim3on/status/1649000065350893568
Once I find the time I will do a better writeup on this.

@65dBnoise @PaulHammond51 Oh, and I think there is another issue with the natural color images. I'm not completely sure about this, since the calibration (edit: the documentation of it) is not very detailed.
The issue concerns gamma. Our eyes don't respond to light linearly, but roughly logarithmically. That means we are more sensitive to differences in the dark parts of a scene than in the bright ones.
@65dBnoise @PaulHammond51 To account for this, sRGB images apply a "gamma curve", i.e. a nonlinear (power-law) brightness encoding that maps more of the 8-bit range to dark values than to bright ones.
Calculations usually happen in linear brightness, so if you want to create an image that will display correctly, you need to apply a gamma curve to it.
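For reference, the standard sRGB encoding curve looks like this (this is the published piecewise formula, applied here to assumed linear input values):

```python
def srgb_encode(x):
    """Linear light -> sRGB-encoded value, both in [0, 1].
    Above a small linear toe, the curve is a 1/2.4 power law:
    it spends more of the 8-bit code range on dark tones,
    roughly matching our nonlinear brightness perception."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

# 18% linear reflectance ("middle gray") encodes to about 0.46,
# i.e. near the middle of the display range rather than near black:
mid_gray = srgb_encode(0.18)
```

If a pipeline skips this step, everything below middle gray gets crushed toward black on a normal display, which is what a missing gamma curve looks like in practice.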
It's not a very intuitive topic, but the video "Computer Color is Broken" explains it well: https://www.youtube.com/watch?v=LKnqECcg6Gw
@65dBnoise @PaulHammond51 It is my understanding that the "natural color" images don't apply this gamma curve after the calibration. Therefore the brightness values are not at all true to our perception of brightness; rather, these images look like what we would see if we had linear brightness perception. 😬

@stim3on @PaulHammond51
My concern here is visual perception of Martian terrain and how a viewer can make the most out of it. So I take a "high altitude" view on the subject, caring most for the end result.

Anyone who has ever dived deeper than 10m (or has seen pictures/movies) knows that everything at that depth looks blue-green, despite fish and corals still being colorful. If one needs to understand what goes on at the bottom of the sea, presenting one ...

1/4

with an accurately calibrated picture does not help; most creatures, plants and other details will go unnoticed in a homogeneous color.

Same with Mars. I've seen numerous attempts to process Martian images "as one would see them if one were on Mars", most of them based on a preconception of a "red planet" giving a red/orange tint to everything. I doubt most such attempts are anywhere close to what one would actually *perceive* when being/living there. Instead, most such attempts reduce ...

2/4

visual perception significantly: details get washed out in the amplified orange/red colors, overall perceived contrast is reduced, and things look completely unnatural, alien. The animal inside the human perceives a world artificially worsened for the sake of a preconception of how things "could|should|might" appear.

While doing that might be useful to a movie director, it is certainly NOT good if one tries to understand what goes on there. It's not an accident that almost all instruments ...

3/4

scientists (and everyone else) use are made to improve perception of the object/process being observed. The objective is always to augment human perception with details hard to discern otherwise.

I believe that NASA follows the above as closely as possible. When the task is to preserve color fidelity, they do so using calibration targets for each instrument, and other procedures pertinent to the task.

It's inconceivable to me that they don't make use of those facilities.

4/4

@65dBnoise I agree that showing Mars in this overly red depiction doesn't help, for exactly the reasons you said.
Because our color perception adapts to the reddish light on Mars, it would not even appear as red as shown in these images. Here is an example of a similarly red lighting situation I've seen on Earth. The left image shows how yellow it was relative to normal daylight, and the right one how I actually perceived it:
https://fosstodon.org/@stim3on/110535700701027285
Simeon Schmauß (@[email protected]):
"Yeah, pretty crazy with all the wildfires! I'm curious though, if you have seen the smoke yourself, did you notice your eyes adjusting to the overall reddish lighting? We had lots of Sahara dust blown over to Germany a while ago, and it also tinted the sky very orange. However, after a while of being outside my eyes adjusted to it and it nearly looked like a 'normal' overcast day. The left image is with daylight white balance and the right one roughly how I perceived it."

@65dBnoise And yeah, most such attempts are basically just "winging" it and are not based on physical properties.

While the NASA images are technically based on the physical properties of the cameras, they completely ignore the correct mapping of those values to a display. In that way they are not much more useful for creating an authentic view than the approximated hobbyist attempts.

@65dBnoise Even though the rovers have color calibration targets, I have not seen a single paper that uses these to calculate a correction for accurate colors. In principle they could be used for this, but they are far from optimal because there are too few targets with too little variation.

That said, they are actually used for calibration processes, but none that concern color accuracy. They are only used to calibrate radiometric data, i.e. how much light energy is entering the camera.

@65dBnoise NASA/ASU also produces "enhanced color" images, which I think are more similar to the processing you are aiming for. These are processed not to reflect how our eyes work, but for better color separation. That can certainly be useful for interpretation, but it won't produce authentic colors.

@65dBnoise The "natural color" images can also be useful for some things, but they don't help with creating a view that is useful for visual interpretation. Also, calling them "how the human eye perceives it" is objectively and physically wrong.

In that sense, there seems to be a misconception among certain NASA folks about how our color vision works.

@stim3on
Instead of enhanced color, I rather aim for familiar colors, and have the impression that such familiarity helps untrained or casual viewers see and understand better what goes on on Mars, the shape and texture of rocks, similarities with what one sees on Earth, clouds, frost, etc, by eliminating the reservation created by something that looks too alien. And when a dust storm hits, the reddish haze carries a true (and very practical) meaning 😀

@stim3on
Hm, I'm afraid I wouldn't know, since I'm by no means an expert.

I would expect the PDS to have images calibrated with well-established procedures tailored to the purpose they are used for, since that is the most important criterion worth their time to meet. There are other areas where similar criteria apply, and that might make it appear as if they didn't care.

Not sure if the purpose of the (~live) pipeline, where we get the raw images from, is to provide calibrated images.

@65dBnoise the PDS data are calibrated for radiometric accuracy, and as far as I know they do this really well.
The problem seems to start when people outside (and the above-mentioned LPSC abstract) expect the "natural color" images (which are basically contrast-enhanced radiometric images) to also show correct colors, which they definitely don't.

There are no color calibrated products for Mastcam mentioned in the PDS, as far as I know proper ones don't exist at all.

@65dBnoise the raw-images pipeline is by design completely uncalibrated; the colors there depend on the physical properties of the cameras used and don't represent accurate colors at all.
I'm always a little annoyed when NASA press releases use these uncalibrated, often greenish-looking images.

@stim3on
Oh yeah, I'm also annoyed by green Mars, and especially by sinking horizons, as I've made sure we all know by now 🙃 😆

But having managed projects with a clearly defined scope of work and deliverables, on which one has to spend one's time, resources and effort, and where time is of the essence, I very well understand why such annoyances (to others) may happen. :)

@stim3on
What would those color calibrated images be used for, had they existed? (hobbyist question 😀)

@stim3on
Yes, I remember that post of yours.

I'm sure when the time comes for humans to go to Mars, or rather long before that time comes, a lot of research is going to go into such matters for very practical reasons, as the outcome will directly affect the life and work of those that will make the trip. Sure, we'll also have their first impressions about colors, but if naturally filtered color proves to be a problem to visual perception, that will change with the use of visors etc.

@65dBnoise I highly doubt that it will be a problem, simply because our eyes are so capable of adapting to different lighting situations.
On a clear day, the light on Mars isn't even that much redder than on Earth. After all, it's the same Sun that illuminates us here on Earth.
Again, this paper compares the two:
https://doi.org/10.1029/2020EA001336

It's only during heavy dust storms, when the sun is partially hidden, that color perception may get a little funky.

@65dBnoise Interesting. I wonder what caused that rock to split?

@mightyspaceman
Thermal stress would be a common cause, considering Jezero sees a ~60°C temperature swing from day to night. But IANAG™, so let's wait and see what the geologists have to say.

Meanwhile, I have a collection of other tagged #crackedRocks seen around Jezero Crater.

IANAG™ = I Am Not A Geologist 🙂

@65dBnoise Looks like the bottom of an ocean
@RobotPoet54
Or like a beach at low tide