
More photons on a pixel


Guest peepshow

I have been thinking again. Well, trying to. :)

Around 2 to 5 arc seconds per pixel seems to be about right for imaging DSOs.

If one increased this many times, would not any particular pixel, over this wider view, have more photons land on it?

Normally there may not be sufficient light energy landing on a pixel within the exposure time to give a signal. A wider view (more arc seconds per pixel) might then collect sufficient photons to give an output.

YES, I DO realise that this would result in less sharp images, but getting even deeper into space with the equipment we have seems a goer, accepting that the image will be less sharp.

When I get my own gear running I shall try this out, but has anyone tried it to capture a DSO that one may not normally be able to image? ..........or should I stop thinking? :D
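
To put some rough numbers on the idea, here is a minimal Python sketch. The pixel size and focal lengths are hypothetical, chosen only so the two scales come out near 2 and 10 arc seconds per pixel, and the scaling assumes an extended target of uniform surface brightness with aperture and exposure held fixed.

```python
# Rough sketch: how photons per pixel scale with image scale for an extended
# (surface-brightness) target. All numbers below are illustrative assumptions.

def plate_scale(pixel_um: float, focal_length_mm: float) -> float:
    """Image scale in arcsec/pixel for a given pixel size and focal length."""
    return 206.265 * pixel_um / focal_length_mm

def relative_photons(scale_arcsec: float, reference_scale: float = 2.0) -> float:
    """Photon rate per pixel relative to a reference scale.

    For an extended object each pixel collects light from a patch of sky
    proportional to (arcsec/pixel)^2, so the rate scales with the square of
    the image scale (aperture and exposure held fixed).
    """
    return (scale_arcsec / reference_scale) ** 2

if __name__ == "__main__":
    # Hypothetical setup: 7.4 um pixels behind two different focal lengths.
    for focal_length in (750, 150):  # mm
        scale = plate_scale(7.4, focal_length)
        print(f"{focal_length} mm: {scale:.1f} arcsec/pixel, "
              f"{relative_photons(scale):.0f}x the photons per pixel of a 2 arcsec/pixel setup")
```

Going from roughly 2 to roughly 10 arc seconds per pixel (here by shortening the focal length) puts about 25 times as many photons from an extended target on each pixel, at the cost of resolution, which is exactly the trade-off being accepted above.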


 



Guest Tweedledum

Hmmm... from deepest memory and particle physics, is it not more dependent on the number of photons per second arriving at the sensor? Their arrival is quite random and follows a Poisson distribution, which in turn becomes the grail of CCD or CMOS sensor design.

Quick search and mercilessly copied below:

Cheers

Photons

There can be no understanding of cameras without some understanding of photons.
The modern idea of what a photon might be was developed by Albert Einstein from 1905-1917. The name "photon" comes from the Greek word for light, and was coined in 1926 by the physical chemist Gilbert N. Lewis as physicists were struggling to get a handle on this slippery fish. A photon is an electromagnetic "wave packet" and can behave as a wave or a particle depending on the experiment. It is fair to say that no one truly understands photons; nevertheless we are able to model their behaviour, and harness them as if we did.

For the purposes of this section a photon is a particle of light. Photons are discrete particles, (ie you can't have half a photon), and are the embodiment of energy in electromagnetic form. Their energy is precisely quantised, (thanks to the quantum theory), and this energy determines the colour of the light. The higher energies are blue, the lower red, and green is in the middle. Light energies extend beyond the visible spectrum but we don't need to cover that here.

  • Photons of visible light are very small: a green photon has a wavelength of 555 nm. If there were 20 waves in the packet it would be about 1/100 of a millimetre across. Not that you should try to visualise them; they don't follow reality as we know it.
  • They travel VERY fast: 300,000 km/s, the fastest speed possible for physical objects (the laws of physics travel faster).
  • They contain a VERY small amount of energy. Their energy is proportional to their frequency according to Planck's formula. Green photons have an energy of 3.58×10⁻¹⁹ J.
  • There are a LOT of them: this is just as well, since they carry so little energy. A square metre of the earth's upper atmosphere receives around 1,300 W of solar radiation when the sun is directly above it (not all of this reaches the ground). That's 3.6×10²¹ photons per second! (A quick sanity check of these two numbers follows below.)
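
A quick sanity check of the photon energy and the photon count quoted above, assuming monochromatic 555 nm light and using only Planck's relation E = hc/λ:

```python
# Sanity check of the photon-energy and photon-count figures quoted above,
# assuming monochromatic 555 nm (green) light for the solar-flux estimate.

h = 6.626e-34        # Planck constant, J s
c = 2.998e8          # speed of light, m/s
wavelength = 555e-9  # m

photon_energy = h * c / wavelength          # J per photon
photons_per_second = 1300 / photon_energy   # for 1300 W falling on 1 m^2

print(f"Energy of a 555 nm photon: {photon_energy:.2e} J")                 # ~3.58e-19 J
print(f"Photons per second on 1 m^2 at 1300 W: {photons_per_second:.1e}")  # ~3.6e21
```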

 

Photon Flux

Now let's tie this back to cameras and establish some benchmarks with respect to photon flux for a range of real world photographic situations involving different pixel pitches and light conditions.

 

Maximum Photon Flux per Pixel (1/60 sec Exposure)

 

| | Starlight | Tropical Full Moon | Living Room | Dark Storm | European Office | Outdoors Overcast | Bright Day Shade | Full Sun |
|---|---|---|---|---|---|---|---|---|
| Illuminance (lux) | 0.0001 | 1 | 50 | 100 | 500 | 1,000 | 15,000 | 80,000 |
| Irradiance (W/m²) | 2.0×10⁻⁷ | 2.0×10⁻³ | 0.10 | 0.20 | 1.0 | 2.0 | 30 | 160 |
| Power per µm² (W) | 2.0×10⁻¹⁹ | 2.0×10⁻¹⁵ | 1.0×10⁻¹³ | 2.0×10⁻¹³ | 1.0×10⁻¹² | 2.0×10⁻¹² | 3.0×10⁻¹¹ | 1.6×10⁻¹⁰ |
| Energy per µm² per 1/60 sec (J) | 3.3×10⁻²¹ | 3.3×10⁻¹⁷ | 1.7×10⁻¹⁵ | 3.3×10⁻¹⁵ | 1.7×10⁻¹⁴ | 3.3×10⁻¹⁴ | 5.0×10⁻¹³ | 2.7×10⁻¹² |
| Photons per µm² per 1/60 sec | 9.3×10⁻³ | 93 | 4,700 | 9,300 | 47,000 | 93,000 | 1.4×10⁶ | 7.4×10⁶ |
| Photons per Pixel - 70 µm² Pixel Pitch | 0.65 | 6,500 | 330,000 | 650,000 | 3.3×10⁶ | 6.5×10⁶ | 9.8×10⁷ | 5.2×10⁸ |
| Photons per Pixel - 35 µm² Pixel Pitch | 0.33 | 3,300 | 160,000 | 330,000 | 1.6×10⁶ | 3.3×10⁶ | 4.9×10⁷ | 2.6×10⁸ |
| Photons per Pixel - 8 µm² Pixel Pitch | 0.074 | 740 | 37,000 | 74,000 | 370,000 | 740,000 | 1.1×10⁷ | 6.0×10⁷ |
| Photons per Pixel - 3 µm² Pixel Pitch | 0.028 | 280 | 14,000 | 28,000 | 140,000 | 280,000 | 4.2×10⁶ | 2.2×10⁷ |
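
The table's numbers can be reproduced approximately from the illuminance alone. A minimal sketch, assuming the table's implied conversion of roughly 500 lumens per watt and a mean photon energy of 3.58×10⁻¹⁹ J (both are assumptions, since the original footnotes are not reproduced here):

```python
# Reproduce one column of the table above from the illuminance.
# Assumptions (the original footnotes are missing from this copy):
#   * ~500 lm/W luminous efficacy, i.e. 1 lux ~= 2e-3 W/m^2
#   * every photon carries the energy of a 555 nm photon, 3.58e-19 J

LUMENS_PER_WATT = 500.0
PHOTON_ENERGY_J = 3.58e-19
EXPOSURE_S = 1.0 / 60.0

def photons_per_pixel(lux: float, pixel_area_um2: float) -> float:
    irradiance_w_m2 = lux / LUMENS_PER_WATT      # W/m^2
    power_per_um2 = irradiance_w_m2 * 1e-12      # 1 m^2 = 1e12 um^2
    energy_per_um2 = power_per_um2 * EXPOSURE_S  # J per um^2 per exposure
    return energy_per_um2 / PHOTON_ENERGY_J * pixel_area_um2

# Living Room column (50 lux):
for area in (70, 35, 8, 3):
    print(f"{area:>2} um^2 pixel: {photons_per_pixel(50, area):,.0f} photons per 1/60 sec")
# ~330,000 / 160,000 / 37,000 / 14,000 -- matching the table to rounding.
```
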
Enough Photons For Photography?

From this table we can see that yes, full sun means FAR more photons than a night-time living room, and yes, a 70 µm² pixel pitch sensor as used by the Canon 5D mk1 gets a lot more photons than the pixels of your typical crap compact of today boasting a 3 µm² pitch (roughly 20 times more); that's all as expected. But look at the actual numbers: an SLR shooting RAW stores 14 bits of dynamic range, that's 16,384 different levels. Obviously the levels must go up in increments of at least 1 photon, so you might think we just need 16,384 photons per pixel in the bright spots to get a perfectly clean image. (Or we would if the human eye exhibited linear perception. More on this later.) Laughing! Even in the living room we have far more than we need. As for the little compact: it only has a 12-bit ADC, so it has a maximum dynamic range of 4,096 levels. Laughing again! It easily makes the bar for the living room case. "FANTASTIC!" you say. All these cameras will take perfect shots right down to your dim suburban living room. ... Won't they?

Sadly, no! Light just doesn't work that way. But remember these benchmark figures, they will be important later. In the meantime make yourself a coffee and strap yourself in, because what comes next is rather complicated.

Poisson Distribution

As a general rule, photons impinging on a surface arrive in a random fashion, much like people ringing a call centre. Although the average number arriving per second is constant, the actual number in any given second varies considerably. This form of behaviour is modelled by the Poisson distribution; it is a great bugbear of sensor design and imposes one of the critical limits on image resolution and camera miniaturisation.

 

 

[Figure: photon_probability.jpg - Poisson Distribution in a nutshell]

 

This graph shows the quantum behaviour of light at low intensities. This is a simulation of what a single pixel sees when focused on a white object in low light. The mean photon count is a measure of the intensity of the light, it is the average, (or expected), number of photons striking the pixel in a given exposure.

The "Photon Count Probability" graph uses the Poisson distribution to give the probability of the pixel receiving a given number of photons for a given expectation (mean). The blue bars are the probabilities when 1 photon is expected, the burgundy when 2 and the cream when 4. Note how the Poisson distribution approaches a normal distribution as the expected number of hits gets to 4 and beyond.

Poisson Noise

In order to get an accurate image of the subject, each pixel must register the correct colour and intensity. For that to happen each photosite must be struck by the expected number of photons. Now here is the problem: Looking at the graph we can see that when just 1 photon is expected, there is only a 36% chance of getting that outcome, there is an equal chance of getting nothing and an almost equal chance of getting more than 1. If 0 photons corresponds to black, 1 to ¼ tone grey, 2 to ½ tone, 3 to ¾ tone and 4 to white, then your image, which is supposed to be a uniform ¼ grey, will be composed of random dots, 36% black, 36% ¼ tone, 18% ½ tone, 6% ¾ tone and 4% white. A speckly random mess! I have greatly reduced the dynamic range in this example to make the point. In the real world, in your shadows, the speckles won't be at maximum, but they will stand out all the same. Also remember that your pixels are not monochromatic, they are composed of 4 separate photosites (1 each for Red and Blue and 2 for Green), and each one is subject to the Poisson distribution, so you get random colours, as well as random intensities in your noise speckles.

You get the worst speckles when the actual photon count is a long way from the expected. The worst case occurs when you are expecting only 1 photon. As the light intensity rises, the number of expected photons rises, and although there is still variation in how many each photosite receives, the variation is relatively less. By the time you get to an expectation of 4 photons per exposure there is a 90% chance of getting between 1 and 7 photons. This sounds like a bigger variation than the previous example but relatively it is less, because you have widened the pixel definition. If 2-6 photons corresponds to ¼ tone grey and 0-2 photons to black, then 6-10 corresponds to ½ tone, 10-14 to ¾ tone and 14+ white. From the graph you can see that you will now have roughly 17% black, 65% ¼ tone, 17% ½ tone, 1% ¾ tone and almost nothing white. You still have speckles but this is a big improvement.
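
The percentages quoted above can be checked with a short simulation. A minimal sketch using numpy (illustrative only; many pixels all looking at the same uniform patch):

```python
# Simulate the "speckle" statistics described above: many pixels all looking at
# the same uniform patch, each receiving a Poisson-distributed photon count.
import numpy as np

rng = np.random.default_rng(0)
n_pixels = 1_000_000

for mean_photons in (1, 4):
    counts = rng.poisson(mean_photons, n_pixels)
    # Fraction of pixels receiving exactly 0, 1, 2, 3 and 4-or-more photons
    fractions = [np.mean(counts == k) for k in range(4)] + [np.mean(counts >= 4)]
    labels = ["0", "1", "2", "3", ">=4"]
    summary = ", ".join(f"{lab}: {frac:.1%}" for lab, frac in zip(labels, fractions))
    print(f"mean = {mean_photons}: {summary}")
# For a mean of 1 this gives roughly 37%, 37%, 18%, 6% and 2% -- the speckly
# mess described above; for a mean of 4 the spread is relatively much tighter.
```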

 

 

 

 

 



What you are describing sounds like "binning" to me. Increased sensitivity to light, but decreased resolution.

You need ibbo to comment.

James
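
For reference, the simplest form of software binning just sums neighbouring pixel counts after readout, trading resolution for signal. A minimal sketch in numpy (illustrative only; real cameras may also bin on-chip before readout):

```python
# Minimal illustration of 2x2 software binning: sum each 2x2 block of pixels,
# trading resolution for signal.
import numpy as np

def bin2x2(image: np.ndarray) -> np.ndarray:
    """Sum 2x2 blocks of a 2D image (height and width must be even)."""
    h, w = image.shape
    return image.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

rng = np.random.default_rng(1)
# Fake frame: mean of 0.5 photons per pixel, Poisson noise, 4x4 pixels
frame = rng.poisson(0.5, size=(4, 4))
print(frame)
print(bin2x2(frame))  # 2x2 output, each value the sum of a 2x2 block (~2 photons)
```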


Guest peepshow

Well Damian, thank you very much for troubling to answer me in such detail. Very complicated, and I am still thinking it over.

Others have mentioned binning, but as I understand it that is not what I was suggesting.

I was considering a single pixel exposed to part of a DSO, not 'joining' pixels in software AFTER the event of photons landing on them, which I understand is what binning is. (?)

In my probably ignorant thinking, I visualised two pixels side by side, each one getting such a very low exposure that there is no image to form. Now wouldn't binning give no image either, because in this case there is nothing to bin on either pixel? No signal.

But if we change the physical viewing conditions from, say, 2 arc secs per pixel to 10, then one pixel gets more exposure and so may now give an output, as it is now exposed to 25 times the sky area (the ratio of 2x2 to 10x10). Again, accepting that sharpness is surrendered in doing this.
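
Putting illustrative numbers on that reasoning (the 0.2 photons-per-pixel figure below is made up purely for the example):

```python
# Quantifying the example above: expected photons per pixel when the image
# scale changes from 2 to 10 arcsec/pixel (made-up flux, extended target).
photons_at_2_arcsec = 0.2      # hypothetical mean photons/pixel/exposure at 2"/pixel
scale_ratio = (10 / 2) ** 2    # each pixel now sees 25x the sky area
photons_at_10_arcsec = photons_at_2_arcsec * scale_ratio
print(photons_at_10_arcsec)    # 5.0 -- comfortably more than "nothing", as suggested
```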


 


 


I don't wish to waste anyone's time on this, if it is all nonsense......

..........just call the men in the white coats. :D


Guest peepshow

Thanks for that link, Pete. What with reading up on your link and Damian's write-up, they should keep me quiet for a while...........but don't bet on it! ;)


 



  • 4 weeks later...
Guest ollypenrice

Surely you can get this effect by using an extremely short focal length on the pixels you already have? For example you could use a large pixel sensor such as the 9 micron Kodak found in an Atik 11000 and couple this to a camera lens of short FL and fast F ratio. I've done this with an 85mm lens and 7.2 micron pixels. The result does go deep but is very 'blocky' (a stage you might call pre-pixelated) and is only acceptable when the image is presented well below full size. One way round this is to make a mosaic using this coarse pixel scale, such that any one panel of the mosaic is never seen anywhere near full size but the final image is a decent size. This is a 6 panel with the Atik 4000 at 85mm. As long as you don't enlarge it it looks OK.


 


http://ollypenrice.smugmug.com/Photography/Widefield-images-including/i-pNjc6CL/0/X3/ORION%20FIN%20V3%20WEB-X3.jpg


 


Olly



Guest peepshow
ollypenrice said: "...The result does go deep but is very 'blocky' (a stage you might call pre-pixelated) and is only acceptable when the image is presented well below full size. One way round this is to make a mosaic using this coarse pixel scale, such that any one panel of the mosaic is never seen anywhere near full size but the final image is a decent size. This is a 6 panel with the Atik 4000 at 85mm. As long as you don't enlarge it it looks OK."

I hate to mention my tiddly SPC900 in the same breath as your Atik 11000. :)

Chalk and cheese doesn't come anywhere near to it. :D

But here goes.......

So I guess one could make a big mosaic of many panels using the SPC900, as in your 6-panel example, providing the result is not enlarged and scrutinised too closely?

Link to comment
Share on other sites

Guest ollypenrice

peepshow said: "I hate to mention my tiddly SPC900 in the same breath as your Atik 11000. :) Chalk and cheese doesn't come anywhere near to it. :D But here goes....... So I guess one could make a big mosaic of many panels using the SPC900, as in your 6-panel example, providing the result is not enlarged and scrutinised too closely?"

 

I would say so, yes. People make lunar mosaics, for instance.

Olly

Link to comment
Share on other sites
