What to expect from telescope images
How big are stars in astronomical images? Let us try a thought experiment, and place a
"typical star" very nearby. Let us assume our star is the same size as the Sun, i.e.
\(R_{*} = 1\,R_{\odot} = 7 \times 10^{8}\,{\rm m}\). The nearest stars are at a distance of
around 1 parsec, i.e. \(d = 1\) pc \(\approx 3 \times 10^{16}\,{\rm m}\). This hypothetical
star would therefore subtend an angular size \(\alpha_{\rm act} = R_{*}/d \approx 2 \times
10^{-8}\) radians \(\approx 0.005\) arc seconds. This would be the radius of the star in our
image, if diffraction and seeing were not an issue.
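As a quick numerical check of this small-angle arithmetic, here is a minimal Python sketch; the radius and distance are the round values quoted above.

```python
import math

# Minimal sketch: angular radius of a Sun-like star at 1 parsec,
# using the small-angle approximation alpha_act = R_*/d (in radians).
R_star = 7e8   # stellar radius in metres (1 R_sun, value from the text)
d = 3e16       # distance in metres (~1 parsec, value from the text)

alpha_rad = R_star / d                         # angular radius in radians
alpha_arcsec = math.degrees(alpha_rad) * 3600  # radians -> arc seconds

print(f"alpha_act = {alpha_rad:.1e} rad = {alpha_arcsec:.4f} arcsec")
# -> alpha_act = 2.3e-08 rad = 0.0048 arcsec, i.e. roughly 0.005 arcsec
```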
We have seen that, in the absence of seeing, the resolution limit of a telescope of diameter
\(D\) is \(\alpha_{\rm resol} = 1.22 \lambda / D\). Using this formula, the diffraction
limit of a 12-cm diameter telescope at optical wavelengths is around 1 arc second. If we use
a small telescope, then instead of the perfect star image we would hope for, we will see a
diffraction pattern, with a central spot of radius 1 arc second. Larger telescopes have
higher resolving powers: a world-leading 12-m diameter telescope has a diffraction limit of
around 0.01 arc seconds. Therefore, even if we ignore seeing, the very best telescopes in
the world do not resolve nearby stars.
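To make the scaling with aperture concrete, here is a small Python sketch of the Rayleigh criterion. The wavelength of 550 nm is an assumption standing in for "optical wavelengths"; it is not specified in the text.

```python
import math

def diffraction_limit_arcsec(wavelength_m, diameter_m):
    """Rayleigh criterion, alpha_resol = 1.22 * lambda / D, in arc seconds."""
    alpha_rad = 1.22 * wavelength_m / diameter_m
    return math.degrees(alpha_rad) * 3600

wavelength = 550e-9  # assumed visible-light wavelength (~550 nm)

print(diffraction_limit_arcsec(wavelength, 0.12))  # 12-cm telescope: ~1.15 arcsec
print(diffraction_limit_arcsec(wavelength, 12.0))  # 12-m telescope:  ~0.0115 arcsec
```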
We cannot, of course, ignore seeing. Recall that typical seeing at a good site is 0.5-1 arc
seconds. Suppose the seeing is 0.5". For our 12-cm telescope this is smaller than the
diffraction pattern, and so the blurring caused by diffraction would dominate. Such an image
would be diffraction limited: the stars would look like Airy discs, with radii of around
1 arc second.
For our large telescope, the seeing is much worse than the diffraction, so we would be
seeing limited, and would see images in which all the stellar profiles were Gaussians, with
a FWHM set by the seeing.
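The logic of the last two paragraphs can be summarised in a few lines: compare the seeing to the diffraction limit, and whichever is larger sets the shape of the stellar images. A minimal sketch (the function name image_regime is just for illustration):

```python
def image_regime(seeing_arcsec, diffraction_arcsec):
    """Whichever blurring scale is larger dominates the stellar profile."""
    if diffraction_arcsec > seeing_arcsec:
        return "diffraction limited: Airy disc profiles"
    return "seeing limited: Gaussian profiles, FWHM set by the seeing"

seeing = 0.5  # arc seconds, as in the example above

print("12-cm telescope:", image_regime(seeing, 1.15))    # diffraction limited
print("12-m telescope: ", image_regime(seeing, 0.0115))  # seeing limited
```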