Wed, 04/15/2020 - 07:32
I shall soon start imaging with a QHY5III 178C CMOS color camera. One issue that is unclear to me is how to properly color-calibrate the images so that objects appear as true to their actual color as possible.
The basic procedure seems to be to choose standard color stars and adjust the R, G, B channel gains so that these stars appear "white" in the image. The question that arises is: should I choose solar-type G stars (the standard daylight 5000-6000 K photographic color temperatures) with B-V ~ +0.6 as "pure white", or use Vega-type A stars with B-V ~ 0.0 as "pure white"?
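In code, I imagine that gain adjustment would amount to something like this minimal Python sketch (the star counts are placeholder numbers, not measurements from my camera):

```python
import numpy as np

# Placeholder, background-subtracted aperture sums of the chosen "white"
# reference star in each channel (illustrative numbers only).
star_r, star_g, star_b = 1520.0, 2010.0, 1740.0

# Gains that make the reference star come out neutral relative to green.
gain_r = star_g / star_r
gain_b = star_g / star_b

def white_balance(rgb_image: np.ndarray) -> np.ndarray:
    """Apply the per-channel gains to an (H, W, 3) image."""
    balanced = rgb_image.astype(float).copy()
    balanced[..., 0] *= gain_r   # red plane
    balanced[..., 2] *= gain_b   # blue plane
    return balanced
```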
Which would give the truest real color in the images? Thanks,
Mike
I have the same question for the ZWO ASI 183 MC Pro. From my reading of the guides, this seems to be a hard problem. An exact match between perceived Bayer-matrix color and logged star color information -- e.g., intensity of transmission through Johnson-type filters on a monochrome sensor -- is not easy, because the band passes are in different places in the spectrum for the Bayer and Johnson color models. Traditional white balance in terms of degrees K is not expressive enough; it just balances between blue and red without involving the full spectrum of the star.
(One of my more exotic pieces of equipment is a three-axis RGB white-balance meter that -- unfortunately -- is intended for terrestrial rather than astro-photography, and can be asked about the Red-Green and Green-Blue balances, respectively. In practice it is used to correct film photography for weird fluorescent light sources.)
That being said, in the CMOS DSLR guide there is a mention of the ability to derive at least partially useful photometric information from the intensity of the camera's green channel, via a method beyond the scope of the guide.
I am also a newbie to photometry, but have an extensive background in technical photography, and this is an interesting question!
If I had to do this, I would stack Johnson filters on top of the color CMOS and take R, G, B exposures through each of the Johnson-B and Johnson-V filters and with no filter. Then I would use the response curves of the Johnson filters, the response curves of the Bayer matrix, and the normalized Johnson-B and Johnson-V intensities of reference stars to do a best fit of the Bayer intensities to the measured responses through the filter stacks (combining the response curves into one curve for each Bayer/Johnson stack). This would give approximate R, G, and B values for each pixel.
This is complex, laborious, and approximate, and I am sure experts will have a better approach.
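A very rough numerical sketch of that best-fit step, collapsing the response-curve combination into a single linear map fitted directly on reference stars (the arrays below are placeholder shapes, not real data):

```python
import numpy as np

# N reference stars: background-subtracted intensities in the camera's
# R, G, B channels, and their catalog Johnson B, V fluxes (placeholders).
bayer = np.random.rand(20, 3)      # N x 3 instrumental intensities
johnson = np.random.rand(20, 2)    # N x 2 catalog Johnson B, V

# Least-squares linear map from Bayer space to Johnson space:
# johnson ~= bayer @ M, where M is a 3 x 2 matrix.
M, residuals, rank, _ = np.linalg.lstsq(bayer, johnson, rcond=None)

# Apply the fitted map to a new star's Bayer intensities.
new_star = np.array([0.4, 0.8, 0.5])
b_est, v_est = new_star @ M
```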
(1) Vega at B-V ~ 0 is blue, not white, at least as humans perceive color.
(2) For photometry, there's no "pure white". If instead you assign each extracted Bayer signal to a standard passband (e.g., green -> Johnson V) and then apply a separately determined transform (e.g., green -> V from catalog B & V mags), then you're accomplishing very much the same thing the expensive-filter photometrists are doing. If your transforms have small absolute values, then you know that your Bayer signal tracks its assigned standard passband well (and its transform makes it better), and you should do fine. I would forget about "pure white".
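For example, a minimal sketch of such a green -> V transform with a color term (the comp-star numbers below are placeholders, not catalog values):

```python
import numpy as np

# Comp-star data (illustrative): instrumental green-channel magnitudes,
# catalog Johnson V, and catalog (B-V).
g_inst = np.array([12.10, 11.45, 13.02, 12.66])
v_cat  = np.array([11.95, 11.30, 12.88, 12.50])
bv_cat = np.array([0.45, 0.62, 0.31, 0.80])

# Fit V - g = z + t * (B-V): zero point z and color-transform coefficient t.
A = np.column_stack([np.ones_like(bv_cat), bv_cat])
(z, t), *_ = np.linalg.lstsq(A, v_cat - g_inst, rcond=None)

# Transformed V estimate for a target with instrumental green mag g_t
# and known (or assumed) color (B-V)_t.
g_t, bv_t = 12.34, 0.55
v_target = g_t + z + t * bv_t
```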
Hi Mike,
Probably you could use the same methodology that DSLR photometry (and, in general, filtered CCD photometry) uses: obtain transformation coefficients from standard stars and apply them to the data.
See, for example, Chapter 6: Photometry – From Measurement to Magnitude of http://www.aavso.org/sites/default/files/publications_files/dslr_manual/AAVSO_DSLR_Observing_Manual_V1.4.pdf (https://aavso.org/dslr-camera-photometry-guide)
Best regards,
Max
The answer differs depending on whether you are interested in "pretty pictures" or in photometry.
For photometry, in addition to the transformation techniques described above, you can also just work with the green channel, pretend it's Johnson V (e.g., use the V reference magnitudes), and report the results as "Tri-Color Green with V zero point" (TG).
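A minimal sketch of that TG reduction, assuming one comparison star and background-subtracted green-channel counts (all numbers are placeholders):

```python
import numpy as np

# Background-subtracted green-channel aperture sums (placeholder values).
target_counts = 48230.0
comp_counts   = 112540.0
comp_v        = 10.85    # comparison star's catalog V magnitude

# Differential magnitude, using the green channel as if it were V.
tg_mag = comp_v - 2.5 * np.log10(target_counts / comp_counts)
print(f"TG magnitude (V zero point): {tg_mag:.3f}")
```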
For pretty pictures: yes, selecting a G-type star and white-balancing it to look white is indeed often done, and I have seen astrophotography tools with a built-in workflow to support exactly that: the software asks you to select a sun-like star in the image and then adjusts the colors accordingly. I cannot remember which software that was, though, but it's a legitimate workflow... Again, for pretty pictures, not photometry.
On the other hand... why bother replicating the response of this particular second-class sensor (the human eye)? Your camera registers photons out to about 680 nm at the red end. Other astro cameras are sensitive even to 900-1000 nm in the near infrared, and there the question of "true color" becomes entirely moot. There's nothing wrong with "false-color" images in astrophotography if that helps to bring out details.
CS
HB
Heinz-Bernd,
Excuse me if my ignorance of the particular camera leads me astray, but when I'm trying to take purrrrrty pictures, I just set the white balance of my 5DS R to AWB (automatic white balancing). That seems to work pretty well. And if I am not satisfied with the result, I can always tweak the white balance in GIMP or Photoshop. So what's the big deal?
CS.
Stephen
There are some issues with that. The white-balance setting of your DSLR (no matter whether it's an automatic mode or a fixed color temperature) only influences the JPEG images generated by the DSLR's image processor. But for best results in astrophotography, you want to work with the RAW images, which are... well, raw as read from the sensor, and not influenced by the white-balance setting at all. And even for the JPEGs, the camera's automatic white balancing will most likely not be optimized for astrophotography; if you take an image of a nebula that is almost exclusively red, it's hard for the camera to "automagically" know this.
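To illustrate that the RAW pixel data are untouched by the white-balance setting, here is a rough Python sketch using the rawpy package (a LibRaw wrapper); the file name is a placeholder and the parameter set is from memory, so check the documentation:

```python
import rawpy

with rawpy.imread("IMG_0001.CR2") as raw:      # placeholder file name
    bayer = raw.raw_image_visible.copy()       # sensor counts, no WB applied

    # Demosaic linearly, ignoring the camera's stored white-balance tags.
    rgb = raw.postprocess(
        use_camera_wb=False,
        use_auto_wb=False,
        no_auto_bright=True,
        gamma=(1, 1),        # keep the data linear
        output_bps=16,
    )

print(bayer.dtype, bayer.shape, rgb.shape)
```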
CS
HB
Interesting; I should have known that. I always shoot raw, but I am pretty sure Adobe Digital Professional does AWB when bringing images into Photoshop. Sometimes, if I am forced to shoot with incandescent light, I have to tweak the white balance, but of course I don't do pretty picture astrophotography. Don't really understand why anyone would, given all the wonderful Hubble images out there for the taking, and the real pros like Rogelio Bernal Andreo with the WFOV stuff.
Yeah, the AWB setting is stored in the metadata of the RAW file. The pixel data are not changed by it; it just records the setting at the time the image was taken, so it's kind of inconsequential.
> I don't do pretty picture astrophotography. Don't really understand why anyone would, given all the wonderful Hubble images out there for the taking, and the real pros like Rogelio Bernal Andreo with the WFOV stuff.
Indeed. Then again, I must admit that sometimes... when waiting for some photometry target to rise, I do pretty pictures as well, e.g., comets or galaxies with a SN in them, something more transient.
CS
HB
As usual this forum is very thought-provoking and saves a lot of time.
Thank you for all the informative replies so far! Well yes, it's hard to take "puuurdy pictures" of variables alone, unless they are in a cluster or some other interesting or colorful asterism! So, one of my intentions here is to make scientifically useful "pretty pictures" of comets, as a primary example. Determining the "true color" of the coma and of the dust and gas tails yields very useful information. So it is important to have a color balance correct to the human eye for examination of these images.
I don't think using the camera/software manufacturer's AWB is a good idea. As we all experience in daily life when calling our banks, pharmacies, or any large business, the "AI" systems of today are still as dumb as can be! Relying on such systems to "automatically" color balance seems like the worst alternative.
As Eric mentions, the simplest way, choosing the "closest to white" star(s) in the image, may work, and the G stars may seem whiter than A stars like Vega or Sirius. But I am unsure which star color index (B-V) most closely appears as true white to the human eye; this was the main point of my question.
And yes, I do plan to use the green channel for standard TG photometry of variables, and I think the procedures for those transformations are pretty well documented in the course materials. No problems there.
Mike
Sun color in B-V is +0.63.
Major camera manufacturers' automatic white balance is very far from "dumb as can be." It works extraordinarily well for low-contrast photographs. Unfortunately, astronomical images are about as high-contrast as images get. In any case, technical images should *always* be taken in RAW format, which has no AWB applied.
Yes, I plan to use RAW format and adjust the white balance manually using the best color-index stars. What would those be? Just visually comparing an A star like Vega, an F star like Procyon, and a "sun-like" G star like Capella, the latter seems a bit too off-white, while Vega seems a bit too blue, so my feeling is that an F star, at B-V ~ +0.3, may be the closest visual match to "white". But that is just a subjective opinion.
You don't need to reinvent the wheel. Image-processing software packages for astrophotography ('pretty pictures', not photometry) have routines for colour correction. I use PixInsight. One approach using this software is described at:
https://pixinsight.com/tutorials/PCC/
PixInsight has other processes for doing the same thing, not so strictly scientifically based, but using, for example, the 'average' colour of many non-saturated stars in an image as 'white', or the average colour of a galaxy as 'white'.
The point here is not the detail of what can be done (I'm sure there could be lots of discussion about the previous sentence, but I am not going to go there). The point is that there are procedures for colour correction in existing software that may help you achieve your aim.
Roy
It's very unclear whether the questions asked in this Photometry thread are really about aesthetic astrophotography or actually about scientific photometry. But one must know this from the outset, because the goals of these two disciplines have NOTHING to do with each other. Nothing. You might well use the same camera and same software and some overlap in experience to pursue these, and the images may sometimes look the same. But the intents and data handling and results of astrophotography and photometry could not differ more. If nothing else, photometric data management is rigorously linear; astrophotography data handling is almost always non-linear, by design.
Now, this thread was launched under Photometry, and as AAVSO is not really in the aesthetic astrophotography business, I've been answering the questions as though photometry were the goal. So, to summarize for Bayer-matrix astro photometry: (1) Forget white balance. It doesn't matter--at all. If the color balance is too uneven, consider a filter to boost the fainter colors relatively, and/or bracket exposures if your detector pixels don't bloom. (2) Use RAW format (or the equivalent) rather than compressed images. (3) Prove that your detector is linear with your current settings (not always easy for CMOS), apply dark and flat calibrations, and then relate (extracted) color signals (aperture or PSF) to magnitudes in standard passbands, probably by comparison to in-frame comp stars, and very preferably including color transforms.
That's it. Any means that demonstrably accomplishes the above is credible as astro photometry. But dragging in astrophotography aesthetic criteria only confuses the matter. Whether photometric image colors look pleasant or "realistic" is absolutely irrelevant.
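For concreteness, here is a compact sketch of step (3) above, assuming already dark- and flat-calibrated, linear frames and a single in-frame comp star (pure numpy, circular aperture with a sky annulus; positions, radii, and magnitudes are placeholders):

```python
import numpy as np

def aperture_sum(image, x0, y0, r, r_in, r_out):
    """Background-subtracted circular-aperture sum centered at (x0, y0)."""
    yy, xx = np.indices(image.shape)
    dist = np.hypot(xx - x0, yy - y0)
    sky = np.median(image[(dist >= r_in) & (dist < r_out)])  # annulus sky level
    ap = dist <= r
    return image[ap].sum() - sky * ap.sum()

# calibrated = (raw - dark) / flat, done beforehand; placeholder frame here.
calibrated = np.random.rand(512, 512)

target = aperture_sum(calibrated, 250.3, 261.8, r=6, r_in=10, r_out=15)
comp   = aperture_sum(calibrated, 310.1, 140.2, r=6, r_in=10, r_out=15)
comp_v = 11.42   # comp star's catalog V magnitude (placeholder)

# Untransformed differential magnitude; apply a color transform afterwards.
v_untransformed = comp_v - 2.5 * np.log10(target / comp)
```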
Edit, to a specific question: outdoor photographic white is typically taken as solar white. After all, it is the color you see reflected from a perfectly white outdoor surface. The sun is a G2V star, B-V +0.63. In terrestrial shadow, the color is usually bluer because more incident light has been Rayleigh scattered. But this is all essentially irrelevant to astro photometry.
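As a numerical aside on that B-V figure, the Ballesteros (2012) approximation relates B-V to effective temperature (illustrative only; it is an approximation, not a calibration for your camera):

```python
def bv_to_teff(bv):
    """Ballesteros (2012) approximation: effective temperature from B-V."""
    return 4600.0 * (1.0 / (0.92 * bv + 1.7) + 1.0 / (0.92 * bv + 0.62))

print(round(bv_to_teff(0.63)))   # ~5850 K, roughly solar
print(round(bv_to_teff(0.00)))   # ~10100 K from this formula (an early-A star)
```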
Roy: How can I trust that commercial software will properly color-balance images for scientific use? Yes, they have several methods for achieving pleasant-looking "pretty pictures", but are these calibrated to yield correct colors? I seriously doubt it, because using galaxies for "white" would be a poor choice: the different stellar populations that make them up, of different types and ages, result in different galaxy colors. "Averaging" field stars to get a "white" seems pretty crude, sort of like ensemble photometry but lacking a rigorous theory behind it. What I am looking for is some objective criterion or methodology, based on color theory in astronomical images, to determine what star type and color index would give pure white as a baseline.
Eric: Absolutely not interested in aesthetic astrophotography! My purpose is to be able to identify the elemental compositions of extended objects like comets, meteors, stars within nebulae, etc. WITHOUT resorting to spectroscopy. This can be done to some degree of confidence by the "color flame test" such as is done in chemistry labs. Many elements or compounds can be identified by a relatively simple color test, without the need for a large and expensive spectrographic system. But, you need the true color in the image to be able to properly compare to the known color of the elements in the flame test. So, there is definitely a scientific purpose behind my question here.
Mike
Mike,
Check out the link in my last message.
Roy
I remain very interested in this question. I am not an astro-photographer. I am not even interested in pretty pictures. I live in a very light polluted environment. I am stuck in my backyard for now due to the virus. To be able to continue observing in a Bortle 8 environment -- with my regular observing haunt at ATMoB shut down -- I have become an Electronically Assisted Astronomy user. Having accurate star colors in an EAA display is not just a matter of pretty pictures. It is my only exposure (pun intended) to the star color at all, especially at 9th magnitude and above. It would be really nice if the colors weren't fictional.
Mike, in post #11 you wrote:
"Determining the "true color" of the coma, and dust and gas tails yields very useful information. So, it is important to have a correct color balance visible to the human eye, for examination of these images."
If you can achieve images (of comet tails, to use your example) with correct colour balance to the human eye, how are you going to study them, or what observations do you plan to make from them?
Roy