Why you can't color calibrate deep space photos

(maurycyz.com)

129 points | by LorenDB 9 hours ago

16 comments

  • klysm 7 hours ago
    Recently I've been on a bit of a deep dive regarding human color vision and cameras. This left me with the general impression that RGB Bayer filters are vastly over-utilized (mostly due to market share), and they are usually not great for tasks other than mimicking human vision! For example, if you have a stationary scene, why not put a whole bunch of filters in front of a mono camera and get much more frequency information?
    • nothacking_ 6 hours ago
      That's common in high end astrophotography, and it's almost exclusively what's used at professional observatories. However, scientists like filters that are "rectangular", with a flat passband and sharp falloff, very unlike human color vision.
      • rachofsunshine 6 hours ago
        Assuming the bands are narrow, that should allow approximately true-color images, shouldn't it?

        Human S cone channel = sum over bands of (intensity in that band) * (human S-cone sensitivity in that band)

        and similarly for the M and L cone channels, which goes to the integral representing true color in the limit.
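
        As a minimal sketch of that sum (made-up band intensities, and Gaussian stand-ins for the real tabulated cone fundamentals):

            import numpy as np

            # Hypothetical narrowband measurements: center wavelength (nm) -> intensity
            bands = {450: 0.8, 500: 0.3, 550: 0.6, 600: 0.9, 650: 0.4}

            # Toy Gaussian approximations of the S/M/L cone sensitivities
            def cone(peak_nm, width_nm=45.0):
                return lambda wl: np.exp(-0.5 * ((wl - peak_nm) / width_nm) ** 2)

            S, M, L = cone(445.0), cone(545.0), cone(565.0)

            # Riemann-sum version of the integral above: each band contributes
            # (intensity in that band) * (cone sensitivity at the band's center)
            s = sum(i * S(wl) for wl, i in bands.items())
            m = sum(i * M(wl) for wl, i in bands.items())
            l = sum(i * L(wl) for wl, i in bands.items())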

        Are the bands too wide for this to work?

        • nothacking_ 6 hours ago
          > Are the bands too wide for this to work?

          For wideband filters used for stars and galaxies, yes. Sometimes the filters are wider than the entire visible spectrum.

          For narrowband filters used to isolate emission from a particular element, no. If you have just the Oxygen-III signal isolated from everything else, you can composite it as a perfect turquoise color.
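
          As a sketch of that compositing step (the RGB weights below are illustrative, not colorimetrically exact):

              import numpy as np

              # Hypothetical calibrated monochrome O-III frame, values in 0..1
              oiii = np.random.rand(256, 256)

              # Approximate display color of the 500.7 nm O-III line (a turquoise)
              OIII_RGB = np.array([0.0, 0.78, 0.62])

              # Tint the single channel: each pixel keeps its intensity, colored turquoise
              rgb = oiii[..., None] * OIII_RGB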

      • jofer 7 hours ago
        In case you weren't already aware, that last bit basically describes most optical scientific imaging (e.g. satellite imaging or spectroscopy in general).
        • adornKey 2 hours ago
          And don't forget about polarization! There's more information out there than just frequency.
          • chaboud 5 hours ago
            I think you want a push broom setup:

            https://www.adept.net.au/news/newsletter/202001-jan/pushbroo...

            Hyperspectral imaging is a really fun space. You can do a lot with some pretty basic filters and temporal trickery. However, once you’re out of hot mirror territory (near IR and IR filtering done on most cameras), things have to get pretty specialized.

            But grab a cold mirror (a visible-light-cutting IR-pass filter) and a night-vision camera for a real party on the cheap.
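
            To make the push broom idea concrete, here's a rough sketch of how the cube gets assembled (read_line_scan is a hypothetical stand-in for whatever grabs one slit exposure):

                import numpy as np

                def read_line_scan():
                    # Hypothetical: one exposure = one spatial line x full spectrum
                    return np.random.rand(640, 128)

                # Sweeping the slit across the scene stacks the lines into a
                # (lines, pixels, bands) hyperspectral cube
                cube = np.stack([read_line_scan() for _ in range(480)])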

          • strogonoff 1 hour ago
            It's not just in space that nothing is lit by a uniform light source at a uniform brightness; the same is true for many casual photos you'd take on this planet.

            Outside of a set of scenarios like “daylight” or “cloudy”, and especially if you shoot with a mix of disparate existing artificial light sources at night, you have a very similar problem. Shooting raw somewhat moves the problem to the development stage, but it remains a challenge: balance for one light source and the others look weird. Yet (and this is a paradox not present in deep space photography) the same scene can, astoundingly, look beautiful to the human eye!

            In the end, it is always a subjective creative job that concerns your interpretation of light and what you want people to see.
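
            The "balance for one, the others look weird" tradeoff is easy to see in a simple diagonal (von Kries-style) correction, sketched here on linear RGB data:

                import numpy as np

                def white_balance(img, illuminant_rgb):
                    # Scale channels so the chosen illuminant maps to neutral gray.
                    # Anything lit by a *different* source is pushed off-neutral,
                    # which is the mixed-lighting problem described above.
                    gains = np.mean(illuminant_rgb) / np.asarray(illuminant_rgb)
                    return np.clip(img * gains, 0.0, 1.0)

                # e.g. balance for a warm tungsten-ish source (illustrative numbers)
                balanced = white_balance(np.random.rand(64, 64, 3), [1.0, 0.8, 0.55])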

            • HPsquared 1 hour ago
              I suppose the human visual system is already adapted to deal with the same problem.
            • cyb_ 5 hours ago
              Having dabbled a bit in astrophotography, I would suggest that color is best used to bring out the structure (and beauty) of the object. Trying to faithfully match the human eye would, unfortunately, cause a lot of that data to be harder to see/understand. This is especially true in narrowband.
              • ekunazanu 58 minutes ago
                > Because there’s a lot of overlap between the red and green cones, our brain subtracts some green from red, yielding this spectral response:

                No, cones do not produce a negative response. The graph shows the intensity of the primaries required to recreate the spectral colour at each wavelength. A negative value means that primary had to be added to the spectral colour itself to complete the match, rather than being mixed with the other primaries.

                https://en.wikipedia.org/wiki/CIE_1931_color_space#Color_mat...
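
                One way to see where the negative lobes come from is to solve the three-primary matching equations for a monochromatic test light; when no all-positive mix exists, the solution puts a primary on the test side. (Toy numbers, not real colorimetric data.)

                    import numpy as np

                    # Rows: toy S, M, L cone responses; columns: R, G, B primaries
                    P = np.array([[0.02, 0.10, 0.90],
                                  [0.30, 0.80, 0.10],
                                  [0.90, 0.30, 0.01]])
                    # Toy cone response of a monochromatic cyan-ish test light
                    t = np.array([0.55, 0.75, 0.05])

                    w = np.linalg.solve(P, t)  # ~[-0.28, 0.98, 0.51]
                    # The negative red weight means red must be added to the
                    # *test* light to complete the match, which is exactly why
                    # the CIE RGB color matching functions dip below zero.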

                • rf15 40 minutes ago
                  > No, cones do not produce a negative response.

                  not what was claimed at all...

                • Retr0id 8 hours ago
                  The next space mission should be to leave a colour calibration chart on the moon.
                • jofer 7 hours ago
                  These same things apply to satellite images of the Earth as well. Even when you have optical bands that roughly correspond to human eye sensitivity, they have quite a different response pattern. You're also often not working with those wavelength bands in the visualizations you make.

                  Scientific sensors want as "square" a spectral response as possible. That's quite different than human eye response. Getting a realistic RGB visualization from a sensor is very much an artform.

                  • mystraline 6 hours ago
                    The proper color of an image would be a multispectral spectrograph for each point, similar to a waterfall plot. Each FFT bin would be 100 GHz wide, and the range would span over 1000 THz. In a way, that's what a color sensor is doing at the CCD level too: collapsing and averaging the radiant energy it's susceptible to into a specific color.
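
                    Working out the numbers above, assuming the stated span and bin width:

                        # ~1000 THz of span at 100 GHz per bin
                        span_hz, bin_hz = 1000e12, 100e9
                        print(int(span_hz / bin_hz))  # 10000 spectral bins per pixel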
                  • jpizagno 1 hour ago
                    As a former astronomer, this was a great post. (The website can use some post-90s styling however :> )
                    • hliyan 6 hours ago
                      I still haven't forgiven whoever made Voyager's first images of Jupiter's moon Io bright red and yellow, and the Saturnian moon Enceladus green.
                      • ianburrell 5 hours ago
                        Neptune was shown as deep blue for a long time, but it is really a similar color to Uranus, a pale greenish-blue.
                      • kurthr 8 hours ago
                        What's the white point? Is it D65? Not when the sun isn't out.
                        • klysm 7 hours ago
                          I've always been confused by what the white point actually _means_. Since we are dealing with strictly emissive sources here, and not reflected sunlight, does the whitepoint even mean anything?
                          • esafak 7 hours ago
                            In a scene lit overwhelmingly by one approximately Planckian light source, the white point is the color of the closest Planckian light source.

                            If the light source is not approximately Planckian, or if multiple illuminants have different temperatures, a white point is not defined.
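
                            For reference, the candidate white points here are the colors of ideal blackbody radiators, which follow Planck's law; a sketch (units are arbitrary):

                                import numpy as np

                                def planck(wl_nm, T):
                                    # Blackbody spectral radiance, Planck's law
                                    h, c, k = 6.626e-34, 2.998e8, 1.381e-23
                                    wl = wl_nm * 1e-9
                                    return (2 * h * c**2 / wl**5) / np.expm1(h * c / (wl * k * T))

                                # Spectrum of a ~6500 K radiator across the visible range;
                                # integrating it against the eye/camera responses gives
                                # that illuminant's color, i.e. the white point
                                wl = np.linspace(380.0, 780.0, 81)
                                spd = planck(wl, 6500.0)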

                            • klysm 5 hours ago
                              So in this case there is no sensible white point since there is no illuminant right?
                              • esafak 4 hours ago
                                I'm not sure what case we're talking about, but if it emits visible light it is an illuminant.
                        • bhouston 8 hours ago
                          > Many other cameras, particularly those with aggressive UV-IR cut filters, under-respond to H-a, resulting in dim and blueish nebulae. Often people rip out those filters (astro-modification), but this usually results in the camera over-responding instead.

                          Hmm... astrophotographers do not use cameras with UV-IR cut filters at all. For example, I owned a few of these:

                          https://www.zwoastro.com/product-category/cameras/dso_cooled...

                          They also generally do not use sensors that have Bayer filters. This also screws things up.

                          Instead they use monochromatic sensors with narrowband filters (either one band or multiple) over them keyed to specific celestial emissions. The reason for this is that it gets rid of light pollution that is extensive and bumps up the signal to noise for the celestial items, especially the small faint details. Stuff like this:

                          https://telescopescanada.ca/products/zwo-4-piece-31mm-ha-sii...

                          https://telescopescanada.ca/products/zwo-duo-band-filter

                          Often these are combined with a true color capture (or individual R, G, B and luminance captures) just to get the stars coloured properly.

                          Almost everything you see in high end astrophotography is false color, because they map these individual narrowband captures on the monochrome sensors to interesting colours and often spend a lot of time manipulating the individual channels.

                          This is done at the medium to high end using the PixInsight software - including by NASA for the recent James Webb images: https://www.pbs.org/video/new-eye-on-the-universe-zvzqn1/

                          The James Webb telescope has a set of 29 narrowband filters for its main sensor: https://jwst-docs.stsci.edu/jwst-near-infrared-camera/nircam...

                          Hubble pictures were famously coloured in a particular way that has a formal name:

                          https://www.astronomymark.com/hubble_palette.htm

                          (My shots: https://app.astrobin.com/u/bhouston#gallery)

                          • recipe19 7 hours ago
                            What you're describing is the domain of a very, very small number of hobbyists with very deep pockets (plus various govt-funded entities).

                            The vast majority of hobby astrophotography is done pretty much as the webpage describes it, with a single camera. You can even buy high-end Canon cameras with IR filters factory-removed specifically for astrophotography. It's big enough of a market that the camera manufacturer accommodates it.

                            • bhouston 7 hours ago
                              > What you're describing is the domain of a very, very small number of hobbyists with very deep pockets

                              Sort of. The telescope used for the Dumbbell Nebula captures featured in the article is worth around $1000, and the mount is probably $500. A beginner cooled monochrome astrophotography camera is around $700, and if you want filters and a controller, that's another $500.

                              There are quite a few people in the world doing this, upwards of 100K:

                              https://app.astrobin.com/search

                              Various PixInsight videos have +100K views: https://youtu.be/XCotRiUIWtg?si=RpkU-sECLusPM1j-&utm_source=...

                              Intro to narrowband also has 100K+ views: https://youtu.be/0Fp2SlhlprU?si=oqWrATDDwhmMguIl&utm_source=...

                              • looofooo0 4 hours ago
                                Some even scratch off the Bayer pattern of old cameras.
                              • tecleandor 5 hours ago
                                You don't need very big pockets for that.

                                Today you can find very affordable monochromatic astrophotography cameras, and you can also modify cheap DSLR or even compact cameras to remove their IR/UV/low-pass filters. You can even insert a different semi-permanent internal filter afterwards (like an IR or UV band-pass).

                                I've done a Nikon D70 DSLR and a Canon Ixus/Elph compact.

                                Some cameras are very easy, some very difficult, so it's better to check some tutorials before buying a camera. And there are companies that will do the conversion for you for a few hundred dollars (probably 300 or 400).

                                • looofooo0 4 hours ago
                                  You can even do the conversion diy.
                                  • tecleandor 44 minutes ago
                                    Yep. I did both myself, as I was using old cameras I had hanging around, and sending them out for conversion would have cost more than the cameras themselves.

                                    Conversions done in places like Kolari or Spencer run about $300-500 depending on the camera model.

                                    If I were to buy a brand new A7 IV or something like that, I would of course ask one of those shops to do it for me.

                                • tomrod 6 hours ago
                                  And the entire earth observation industry, which doesn't look the same way but uses the same base tech stack.
                                • verandaguy 8 hours ago

                                      > astrophotographers do not use cameras with UV-IR cut filters at all
                                  
                                  I'll be pedantic here and say that the author's probably talking to people who use DSLRs with adapter rings for telescopes. I've been interested in doing this for a while (just unable to financially justify it), and I think this is actually something people in this niche do.

                                  Then there are things like the Nikon D810A, which remove the UV-IR filter from the factory (but IIRC retain the Bayer filter).

                                  • bhouston 7 hours ago
                                    My recommendation, as someone who started with a DSLR and then modded it to remove the UV-IR filter: it would have been better to skip straight to a beginner cooled mono astrophotography camera, like the ASI533MM Pro. It is a night and day difference in terms of quality at roughly the same cost, and it automates much better.

                                    A high end DSLR is a huge waste of money in astrophotography. Spend the same amount on a dedicated astrophotography camera and you’ll do much better.

                                    • schoen 6 hours ago
                                      > It is a night and day difference

                                      Particularly high praise in astronomy!

                                      • verandaguy 7 hours ago
                                        How do you recover colour from a mono astro camera? Just run it for 3 exposures behind a gel of each of the R/G/B colours, then comp?
                                        • gibybo 6 hours ago
                                          Yes, and you would almost certainly want to automate it with a filter wheel that changes the filters for you on a schedule. However, a key advantage of a mono camera is that you don't have to limit yourself to RGB filters. You can use some other set of filters better suited for the object you are capturing and map them back to RGB in software. This is most commonly done with narrowband filters for Hydrogen, Sulfur and Oxygen which allow you to see more detail in many deep space objects and cut out most of the light pollution that would otherwise get in your way.
                                  • dheera 7 hours ago
                                    It's worth noting that many NASA images use the "HSO" palette which is false color imagery. In particular the sulfur (S) and hydrogen (H) lines are both red to the human eye, so NASA assigns them to different colors (hydrogen->red, sulfur->green, oxygen->blue) for interpretability.
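
                                    A sketch of that channel assignment, assuming three aligned, calibrated narrowband stacks from a mono camera:

                                        import numpy as np

                                        # Hypothetical narrowband frames, values in 0..1
                                        ha, sii, oiii = (np.random.rand(512, 512) for _ in range(3))

                                        # "HSO" assignment described above: hydrogen -> red,
                                        # sulfur -> green, oxygen -> blue
                                        rgb = np.dstack([ha, sii, oiii])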
                                    • bhickey 8 hours ago
                                      The tiniest of corrections: Ha is 656.28nm not 565.
                                      • execat 7 hours ago
                                        At the risk of going off-topic: when I see comments like these, I wonder how the author comes up with the correction (I cross-checked; the comment is in fact correct.)

                                        Did you have the number memorized or did you do a fact check on each of the numbers?

                                        • kragen 6 hours ago
                                          I didn't know the number was wrong, but something about the statement seemed very wrong, because the 565nm number is only 10nm away from 555nm, conventionally considered the wavelength of peak human visual sensitivity (683 lm/W). And you can see that in the photopic sensitivity curves in the rest of the article: both red and green cones respond strongly to light all around that wavelength. So it seemed implausible that 565nm would be nearly invisible.

                                          But I didn't know whether Ha was actually highly visible or just had a different wavelength. I didn't know 683lm/W either, and I wasn't exactly sure that 555nm was the peak, but I knew it was somewhere in the mid-500s. If I'd been less of a lazy bitch I would have fact-checked that statement to see where the error was.

                                          • kragen 2 hours ago
                                            I see that there's a [dead] reply by the kind of person who thinks "tryhard" is an insult and has applied it to me.

                                            When I compare people I know about who tried hard to the people I know about who didn't try hard, literally every single person I would want to be like is one of the people who tried hard. I'm unable to imagine what it would be like to want to be like the other group.

                                            I mean, I don't want to be like Michael Jordan, but I can imagine wanting to be like him, and in part this is because specifically what he's famous for is succeeding at something very difficult that he had to try unbelievably hard at.

                                            So I'm delighted to declare myself a tryhard, or at least an aspiring tryhard.

                                            Completely by coincidence, when I saw the tryhard comment, I happened to be reading https://www.scattered-thoughts.net/writing/things-unlearned/:

                                            > People don't really say this [that intelligence trumps expertise] explicitly, but it's conveyed by all the folk tales of the young college dropout prodigies revolutionizing everything they touch. They have some magic juice that makes them good at everything.

                                            > If I think that's how the world works, then it's easy to completely fail to learn. Whatever the mainstream is doing is ancient history, whatever they're working on I could do it in a weekend, and there's no point listening to anyone with more than 3 years experience because they're out of touch and lost in the past.

                                            > Similarly for programmers who go into other fields expecting to revolutionize everything with the application of software, without needing to spend any time learning about the actual problem or listening to the needs of the people who have been pushing the boulder up the hill for the last half century.

                                            > This error dovetails neatly with many of the previous errors above eg [sic] no point learning how existing query planners work if I'm smart enough to arrive at a better answer from a standing start, no point learning to use a debugger if I'm smart enough to find the bug in my head.

                                             > But a decade of mistakes later I find that I arrived at more or less the point that I could have started at if I was willing to believe that the accumulated wisdom of tens of thousands of programmers over half a century was worth paying attention to.

                                            > And the older I get, the more I notice that the people who actually make progress are the ones who are keenly aware of the bounds of their own knowledge, are intensely curious about the gaps and are willing to learn from others and from the past. One exemplar of this is Julia Evans, whose blog archives are a clear demonstration of how curiosity and lack of ego is a fast path to expertise.

                                            • cindyllm 5 hours ago
                                              [dead]
                                            • bhickey 6 hours ago
                                              In this case I coincidentally spent a few hundred hours of hobby time over the last year designing hydrogen alpha telescopes.
                                            • nothacking_ 6 hours ago
                                              Fixed.
                                            • vFunct 7 hours ago
                                                You can if you use hyperspectral imaging...
                                              • nothacking_ 6 hours ago
                                                The problem with hyperspectral imaging is that it ends up throwing away 99.9% of all the light that hits your camera. It's been done for the sun and some very bright nebulae, but really isn't practical for most of the stuff in space.
                                                • choonway 7 hours ago
                                                  Probably will come out within the next 5 iPhone generations.

                                                  POC already out...

                                                  https://pmc.ncbi.nlm.nih.gov/articles/PMC8404918/

                                                  • kragen 6 hours ago
                                                    People have been making production hyperspectral sensors for decades, including hobbyists in garages; we're well beyond the proof-of-concept stage.
                                                • monkeyelite 6 hours ago
                                                  Disappointing that most space photos are made by mapping an analog input onto a gradient and that this isn’t stated more directly.
                                                  • system2 8 hours ago
                                                    Isn't this why they always use the term "artist's impression" when they are colored?
                                                    • nwallin 7 hours ago
                                                      No.

                                                      When you see "artist's impression" in a news article about space, what you're looking at is a painting or drawing created from whole cloth by an artist.

                                                       This article is about how sensors turn signals into images. When you take pictures with a 'normal' camera, we've designed them so that if you take certain steps, the image on your screen looks the same as the scene would look in real life with no camera or monitor. This article is saying that with the cameras and filters used on telescopes, that same process doesn't really work. We use special filters to measure specific spectral properties of an astronomical object. This gives good scientific information; however, it means that in many cases it's impossible to reconstruct what an astronomical object would really look like if our eyes were more sensitive and we looked at it.

                                                      • okanat 7 hours ago
                                                         There are different reasons for that. Things like black holes are really hard to observe even in other parts of the spectrum. The same goes for objects like planets. So the drawings are hypothetical renderings based on simulations rather than direct observations.

                                                         Many observations come from scientific cameras rather than the visible-spectrum cameras discussed in TFA. Those aren't artist's impressions like the first case, but they capture a completely different view of the object, so any visible-light rendering involves some guesswork; the final picture is not 100% what you would see.

                                                        • recipe19 8 hours ago
                                                          I think that term is reserved mostly for actual artwork (renderings, paintings, etc).

                                                          Some deep-space astronomy pictures are in completely made-up color, often because they're taken at wavelengths different than visible light and then color-mapped to look pretty.

                                                          But the point here is even if you're taking images with a regular camera pointed at the sky, it's pretty much impossible to match "reality".