Fri, 12/18/2020 - 14:33
I've come to the conclusion that you can't do proper photometry with a DSLR despite what it says in the AAVSO guide.
I've just started doing this, so I'm still very much a learner. The AAVSO guide for photometry with a DSLR says you have to do the measurements in the green wavelengths. The problem is that, because of the Bayer matrix in the DSLR sensor, you won't count the green photons from a star which fall on the red and blue pixels, so the flux measurements are going to be wrong. I've tried it using debayered images and I get results which look about right, but the green flux values in the non-green pixels are not real counts, just estimates.
Steve,
You just have to use the correct approach. Yes, it is necessary to debayer DSLR RAW images, but debayering is just the process of constructing a full colour image from the mosaic of red, green and blue pixels. The next critical step, and the one that makes accurate DSLR photometry achievable, is channel separation. Various software packages allow you to extract any or all of the R, G and B channels. When this is done, you will have your original colour image (red, green, blue) and your newly created image(s), one for each channel. Some software extracts two green channels, as there are twice as many green pixels as blue or red.
You can do accurate photometry with the green channel data alone, provided that your comparison stars have colour indices very close to that of the variable. Alternatively, if you extract (say) green and blue channels, transformed magnitudes can be calculated from comparison stars of various colours, but you need to obtain transformation coefficients for your camera/lens or camera/telescope setup first.
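To make the channel-separation step concrete, here is a minimal Python sketch. It assumes the RAW frame has already been loaded as a 2D numpy array called `mosaic` (for example via the rawpy package) and that the sensor uses an RGGB Bayer layout; the array name and the pattern are assumptions you would need to check against your own camera.

```python
import numpy as np

def split_rggb(mosaic):
    """Split a 2D Bayer mosaic (assumed RGGB layout) into its colour planes.

    Each returned plane is half the mosaic size in each dimension and
    contains only real, measured pixel values (no interpolation).
    """
    r  = mosaic[0::2, 0::2].astype(float)   # red photosites
    g1 = mosaic[0::2, 1::2].astype(float)   # first green photosite of each 2x2 cell
    g2 = mosaic[1::2, 0::2].astype(float)   # second green photosite
    b  = mosaic[1::2, 1::2].astype(float)   # blue photosites
    return r, g1, g2, b

if __name__ == "__main__":
    # Stand-in for a real RAW frame; replace with your own data.
    mosaic = np.random.randint(0, 16383, size=(100, 100))
    r, g1, g2, b = split_rggb(mosaic)
    tg = (g1 + g2) / 2.0   # average the two green planes into one "TG" image
    print(tg.shape)
```

Averaging the two green planes at the end mirrors what some packages do when they report a single green (TG) channel.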
If you need more detail on procedures, message me offline.
Roy
Steve,
You raise a valid point about the distribution of a star's light across the colored pixels on the sensor. As you say, if the star image on the sensor is small, on the order of the individual color pixels, the star's brightness measured with, say, just the green pixels will strongly depend on which color pixel most of the star's light is concentrated on. The important lesson here is that you don't want to focus too sharply. DSLR pixels are tiny (a few microns), but with good-quality short focal length lenses it's still possible to "over" focus. Ideally, you want to defocus so that the stars are a bit "mushy", to ensure they're spread over several "superpixels" (the 2x2 set of red, green, and blue pixels). Besides spreading the light out so that no one pixel gets the bulk of the light, you can now also take longer exposures to reduce the impact of atmospheric twinkling. Short exposures (5-10 seconds) show a lot of noise because you're capturing the star's "twinkling". Longer exposures help average out this effect, with the extra benefit of allowing you to collect more photons so that your measurements are more accurate.
Shawn (DKS)
Thank you Roy and Shawn,
I'm aware of the need to defocus slightly but I find it quite hard to get it right. If you do it too much the star images turn to donuts. It's easy to see if they are heavily defocussed but sometimes you can't see it unless you magnify the image. I did some a few nights ago and thought it was ok but when I started processing the images I could just see slight donut effects. This is something I need to work on but I know where I'm going.
I found a couple of packages which allow me to split the channels out. One is AIP4WIN and the other is AIJ itself. But I'm still not sure about the debayering. I found I can split the colours without debayering, but if I look at the images I can see, for example in the green channel, that every other pixel in a star image is bright and all the rest are black. Similar for red and blue, except the pattern is different of course. I can debayer the raw image using various algorithms and then split the colour channels. In this case, if I look at the green channel, all of the pixels in a star image are bright, but I guess only half of them will be true measured values; all the rest will be estimated values based on the debayering algorithm. I can do photometry on either picture and get similar results so long as the star image covers a significant number of pixels.
While donuts in most cases indicate you overdid the defocus, that by no means invalidates the photometry; you can still use the images (but expect a somewhat worse signal-to-noise ratio). If you are doing aperture photometry, it's all about counting the photons from a given star within the measurement aperture, and it doesn't matter so much whether the photons form a circle, donut, crescent, cross, smiley or whatever. It's just not ideal, because spreading the photons over more pixels than necessary will increase the noise, and might cause blending with nearby stars. Also, some software might get confused by the unusual shape of the star image ("point spread function" in photometry-speak) when auto-adjusting the aperture size, so you might want to manually check/set the aperture size.
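For what it's worth, here is a minimal aperture-photometry sketch using the photutils package, just to illustrate that the measurement is a sum over an aperture and does not care about the PSF shape. The star position, aperture radius and sky annulus radii below are made-up placeholder values you would set for your own frames.

```python
import numpy as np
from photutils.aperture import (CircularAperture, CircularAnnulus,
                                aperture_photometry, ApertureStats)

# 'green' would be your extracted green-channel image (2D numpy array);
# here it is replaced by synthetic noise so the sketch runs on its own.
green = np.random.poisson(100.0, size=(200, 200)).astype(float)
star_xy = [(105.3, 98.7)]                              # placeholder star position

aperture = CircularAperture(star_xy, r=8.0)            # large enough for a defocused star
annulus  = CircularAnnulus(star_xy, r_in=12.0, r_out=18.0)

# Median sky level per pixel from the annulus, subtracted from the aperture sum.
sky_per_pixel = ApertureStats(green, annulus).median
phot = aperture_photometry(green, aperture)
net_counts = phot['aperture_sum'] - sky_per_pixel * aperture.area

instrumental_mag = -2.5 * np.log10(net_counts)
print(float(instrumental_mag[0]))
```

Whether the counts inside the aperture come from a tight disc or a donut, the sum (and hence the instrumental magnitude) is the same apart from the extra noise from the additional pixels.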
Clear skies
HB
Hi,
I use Muniwin to process ~300 files one after another: https://sourceforge.net/projects/c-munipack/
It calculates the mean of the two green channels, (TG_1 + TG_2) / 2. I think AIP4Win does this also?
I am getting very nearly the same results in TG as others report in Johnson V.
Barbara Harris said in her video: https://www.youtube.com/watch?v=7S-doKrQJxc
that the transformation for her DSLR affects only the second digit after the decimal point (if I remember correctly...).
Also, with the ensemble option (using multiple comp stars) I am getting, under good clear skies, 4-5 mmag per data point (with the 4-inch achromat or the 8" Newtonian).
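To show what the ensemble option does in principle, here is a small sketch, assuming you already have sky-subtracted green-channel counts for the variable and for a few comp stars with known catalogue magnitudes; all the numbers below are invented.

```python
import numpy as np

# Invented example values: net counts measured on one frame,
# and the catalogue magnitudes of the comparison stars.
var_counts  = 51234.0
comp_counts = np.array([83210.0, 40120.0, 120450.0])
comp_mags   = np.array([8.95, 9.72, 8.52])      # e.g. V magnitudes from the chart

# Differential magnitude against each comp star, then the ensemble average.
per_comp = comp_mags - 2.5 * np.log10(var_counts / comp_counts)
ensemble_mag = per_comp.mean()
scatter = per_comp.std(ddof=1)                  # rough check on comp-star consistency

print(f"TG = {ensemble_mag:.3f} +/- {scatter:.3f}")
```

The scatter between the per-comp estimates is a useful sanity check: if one comp star pulls the result by half a magnitude, it probably has the wrong catalogue value or is saturated.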
De-Focus:
It is mandatory to check the first frame of a sequence before you run the whole sequence, for:
1) A good SNR: roughly 70%-75% of the full ADU range should be reached; for 14 bit that is 16383. Minus the ~2000 ADU Canon offset, well... 12,000 to 14,000 ADU for the variable will be good (so you will also be in the linearity range). The comp + check stars should not be saturated.
2) I have found that with different comp or check stars, sometimes even with the ensemble option, the magnitude of the light curve can vary by 0.5 or even 1 mag up or down! So I guess that's the main reason for not getting the same results as others. (You would have to have the same telescope + camera setup, seeing conditions, and comp + check stars to get similar results... but that's what we are facing here. I've read that the magnitude difference should be no greater than 2.5 mag when selecting comp or check stars, but sometimes one just has to select the available comp stars...)
3) Check the defocus with the ADU profile function, making sure it does not have horns (or too-big horns). Currently I am experimenting with the APT Focusing Aid tool:
https://www.astrophotography.app/usersguide/focusing_aid.htm?ms=AAA%3D&q=Zm9jdXNpbmcgYWlk&st=MQ%3D%3D&sct=MA%3D%3D&mw=MjQw#
I want to have a proper FWHM for my telescope focal length & pixel setup:
https://astronomy.tools/calculators/ccd_suitability
I think keeping the FWHM as low as possible, ~1, is a good starting point?
(But I've read somewhere that an FWHM of 2-4 is good?) Well, that's what I am experimenting with now, if it weren't for the overcast skies since October... (the sketch just below shows the pixel-scale arithmetic).
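A quick sketch of the pixel-scale arithmetic behind that calculator, in Python; the focal length, pixel size and seeing values are placeholders for your own setup.

```python
# Rough sampling check, same arithmetic as the astronomy.tools calculator.
focal_length_mm = 800.0    # placeholder: e.g. an 8" f/4 Newtonian
pixel_size_um   = 4.3      # placeholder: typical DSLR photosite size
seeing_arcsec   = 3.0      # placeholder: typical seeing at the site

# Plate scale in arcsec per pixel: 206265 * pixel size / focal length (same units).
pixel_scale = 206.265 * pixel_size_um / focal_length_mm
fwhm_pixels = seeing_arcsec / pixel_scale

print(f"pixel scale: {pixel_scale:.2f} arcsec/pixel")
print(f"seeing-limited FWHM: {fwhm_pixels:.1f} pixels "
      "(defocus further until the star covers several superpixels)")
```

This only tells you how many pixels the seeing disc would cover in focus; for photometry you then defocus on top of that so the star spreads over many superpixels.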
clear skies,
Bernhard
PS: take your time, after one year you'll get a better routine for most of the things here (-;
PPS: The feature of Muniwin I like most is the live function: you do all the photometry steps for the first image, then you hit the "Ampel" (traffic-light) button ("process every new image in the directory the same way..."), and you can watch the light curve evolving in real time!
Steve,
All of the replies have made good points about getting accurate photometry with a DSLR. I agree that you can get accurate photometry with a DSLR. The DSLR manual is a great tool to help you do this. I will summarize some of the key points:
1. Adequate exposure (not too short, and do not saturate your target and comp stars). I try to keep my exposure length a minimum of 15 secs. If I have to go shorter because the target is too bright, I will stack enough exposures so that I have a total of 60 sec of integration (see p.24 of the DSLR manual). Know the saturation and linearity range of your camera to avoid saturation (see appendix E of the DSLR manual). When I advise those wanting to do DSLR photometry, I recommend doing appendix A (DSLR camera testing) and appendix E (Linearity Testing) of the DSLR manual as a start.
2. De-focus your star so that the FWHM of your target is at least 8-10 pixels (see p.57 of the DSLR manual). This assures that the starlight is spread over many pixels. This can be a frustrating part of obtaining images. I use an 80mm refractor with a Canon 40D setup that has an electronic focuser. I will do an autofocus to get perfect focus and then back off from perfect focus to get an FWHM of 8-10 pixels. I also use a Canon 70D with a lens for brighter targets. This is a little more frustrating when trying to get the correct amount of de-focus. I usually end up de-focusing just a little and then going to the computer to check the amount of de-focus before I end up taking my images (one way to measure the FWHM is sketched below this list).
3. Color separation can be done either by debayering/demosaicing (converting the image to a color image and then splitting it into a tricolor R, G, B image) or by color extraction (extracting both green channels and averaging them together to form one green image). I use MaximDL. I extract the two green images and average them together. AIP4Win performs a bi-linear interpolation (debayer) and then extracts the green image (see p.28 and p.56 of the DSLR manual). There is also a PDF manual of software tutorials for IRIS, Muniwin, AIP4Win, and MaximDL here https://www.aavso.org/sites/default/files/publications_files/dslr_manua…
4. Calibrate your images (Bias, Dark, Flat). The order of calibration (before or after extraction of the color channel) depends on what software you are using (a minimal calibration sketch follows further down).
5. I only use the TG green channel because it is closest to Johnson V. I rarely report TB or TR because I feel that they are too far away from the photometric B and R unless I transform them (see Spectral Response of DSLR color channels p.70 DSLR manual)
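On the FWHM check in point 2, here is a rough sketch of how the FWHM of a defocused star could be measured with astropy, assuming you have a small cut-out around the star in a 2D array. A strongly donut-shaped star will not be well described by a Gaussian, so treat the number as a guide only; the synthetic star at the bottom is just there so the sketch runs by itself.

```python
import numpy as np
from astropy.modeling import models, fitting

def measure_fwhm(cutout):
    """Fit a 2D Gaussian to a small star cut-out and return the mean FWHM in pixels."""
    yy, xx = np.mgrid[0:cutout.shape[0], 0:cutout.shape[1]]
    background = np.median(cutout)
    g_init = models.Gaussian2D(amplitude=cutout.max() - background,
                               x_mean=cutout.shape[1] / 2,
                               y_mean=cutout.shape[0] / 2,
                               x_stddev=3.0, y_stddev=3.0)
    fitter = fitting.LevMarLSQFitter()
    g = fitter(g_init, xx, yy, cutout - background)
    sigma = 0.5 * (abs(g.x_stddev.value) + abs(g.y_stddev.value))
    return 2.355 * sigma            # FWHM = 2*sqrt(2*ln 2) * sigma

# Synthetic star for illustration; replace with a cut-out from your own frame.
yy, xx = np.mgrid[0:41, 0:41]
fake_star = 5000.0 * np.exp(-((xx - 20) ** 2 + (yy - 20) ** 2) / (2 * 4.0 ** 2)) + 300.0
print(f"FWHM ~ {measure_fwhm(fake_star):.1f} pixels")   # expect about 2.355 * 4 = 9.4
```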
Anyway, following the guidelines in the DSLR manual should give you good photometry with your DSLR.
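And for point 4, a minimal calibration sketch, assuming you have already built master bias, dark and flat frames as 2D numpy arrays and that the master dark matches the exposure time and temperature of the lights; the placeholder arrays are only there so the sketch runs, and the order of steps should be adapted to whatever your own software expects.

```python
import numpy as np

def calibrate(light, master_bias, master_dark, master_flat):
    """Basic frame calibration: subtract dark, divide by a normalised flat.

    Assumes the master dark still contains the bias level (so the bias is
    not subtracted twice from the light frame).
    """
    dark_corrected = light - master_dark                  # removes bias + thermal signal
    flat_norm = master_flat - master_bias                 # bias-subtract the flat
    flat_norm = flat_norm / np.median(flat_norm)          # normalise flat to ~1.0
    return dark_corrected / flat_norm

# Placeholder arrays standing in for real master frames.
shape  = (100, 100)
light  = np.full(shape, 12000.0)
bias   = np.full(shape, 2048.0)
dark   = np.full(shape, 2100.0)
flat   = np.full(shape, 30000.0)
print(calibrate(light, bias, dark, flat).mean())
```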
Barbara
Thanks for your advice everyone.
Cheers
Steve
As others correctly point out, the Bayer matrix problem is addressed by defocusing somewhat and then allowing the software to sum over the correct pixels. But there are additional problems that are generally overlooked. First among them is the unfortunate fact that the spectral response of DSLRs varies from make to make and model to model. That necessitates calibration of transformation coefficients and use of comparison stars that are spectrally closely matched to the target, if reasonably accurate (ca. 30 mmag) measurements are to be made. But determination of transformation coefficients requires bright standard stars, and unfortunately there are not very many of those. So unless you are willing to use a really big aperture and forego ensemble photometry techniques, DSLR photometry is rarely worth the effort. (Who wants to spend 15 minutes to get a 2- or 3-fold improvement over what they could do in 3, visually?)
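For anyone curious what the transformation-coefficient step actually involves, in its simplest form (ignoring extinction) it is a straight-line fit of (V - v_instrumental) against colour index for a set of standard stars on one frame. Here is a sketch with entirely made-up numbers using numpy's polyfit; the star values and the target's B-V are illustrative placeholders.

```python
import numpy as np

# Made-up standard-star data for one frame: catalogue V, catalogue B-V,
# and the instrumental green-channel magnitudes measured on the frame.
V_cat  = np.array([7.85, 8.42, 9.10, 9.67, 10.21])
BV_cat = np.array([0.35, 0.62, 0.98, 1.24, 0.51])
v_inst = np.array([-6.12, -5.49, -4.72, -4.08, -3.71])

# Fit (V - v) = epsilon * (B - V) + zero_point.
epsilon, zero_point = np.polyfit(BV_cat, V_cat - v_inst, 1)
print(f"colour coefficient epsilon = {epsilon:.3f}, zero point = {zero_point:.3f}")

# Transformed magnitude of a target measured on the same frame
# (its B-V has to come from a second channel or from the chart).
v_target, BV_target = -4.95, 0.80
V_target = v_target + epsilon * BV_target + zero_point
print(f"transformed V = {V_target:.2f}")
```

The practical catch raised above is that the fit needs standard stars bright enough for a DSLR, and close enough in colour to the target for the linear correction to hold.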