DSLR and infrared

Archived from groups: rec.photo.digital.slr-systems

Is it possible to remove all visible colour wavelengths
in software post-processing, leaving you with only the infrared image,
rather than use an infrared filter? I figure you could
use a narrow enough aperture on the lens to accommodate
the infrared and visible light focus points.
-Rich
  1. Archived from groups: rec.photo.digital.slr-systems

    RichA wrote:
    > Is it possible to remove all visible colour wavelengths
    > in software post-processing, leaving you with only the infrared image,
    > rather than use an infrared filter? I figure you could
    > use a narrow enough aperture on the lens to accommodate
    > the infrared and visible light focus points.


    The photosites are filtered to record RGB (CYM in some cameras). Once
    the light is filtered and counted, it is just data; the camera, the file
    and the PS s/w have no idea what contribution specific wavelengths made
    to the 'count' in any photosite. (On playback, it is simply recreated on
    the computer monitor or printer as a color.)

    Whatever IR is recorded is where the sensitivity of those channels
    (mostly the red channel I suppose) spills into the IR part of the
    spectrum. Then, by adding an IR friendly filter to the camera, you
    block as much of the visible wavelengths as possible leaving IR only.
    That IR can be 'counted' in any channel (mostly the red, I believe).

    At that point the gain of a channel in IR is very low (because the
    channel's filter passband is not centered on IR), so a longer exposure
    is required to get anything. To make it worse, some cameras have an
    IR-blocking filter over the sensor, so an even longer exposure is
    required to record the IR from a scene.

    http://www.electro-optical.com/bb_rad/bb_rad.htm

    Since the camera or Photoshop does not *know* that the recording was IR,
    you have to separate it out optically, before capture. Again, I assume
    it's mostly in the red channel as that is closest to IR.

    In effect, PS believes whatever 'count' is in an R, G or B channel
    represents red, green or blue, not IR.
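
    A rough numerical sketch of why the count alone can't be unscrambled
    (the sensitivity and scene curves below are invented for illustration,
    not measured from any real camera):

        import numpy as np

        # Wavelength grid from 400 nm (blue) to 1000 nm (near-IR).
        wl = np.linspace(400, 1000, 601)

        # Invented red-channel sensitivity: a visible peak near 600 nm plus
        # a small 'leak' that stays open beyond 700 nm into the near-IR.
        red_sens = np.exp(-((wl - 600) / 40) ** 2) + 0.05 * (wl > 700)

        # Two very different scenes: one lit by visible light only, one by
        # near-IR only, scaled so both give the same photosite 'count'.
        spec_vis = np.exp(-((wl - 600) / 30) ** 2)
        spec_ir = 1.0 * (wl > 720) * (wl < 900)
        spec_ir *= np.trapz(red_sens * spec_vis, wl) / np.trapz(red_sens * spec_ir, wl)

        print(np.trapz(red_sens * spec_vis, wl))  # identical counts, so the
        print(np.trapz(red_sens * spec_ir, wl))   # file can't tell them apart

    Once that integration has happened, the single number is all the file
    keeps, which is the whole problem.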

    Cheers,
    Alan.


    --
    -- r.p.e.35mm user resource: http://www.aliasimages.com/rpe35mmur.htm
    -- r.p.d.slr-systems: http://www.aliasimages.com/rpdslrsysur.htm
    -- [SI] gallery & rulz: http://www.pbase.com/shootin
    -- e-meil: Remove FreeLunch.
  2. Archived from groups: rec.photo.digital.slr-systems

    RichA <none@none.com> writes:
    > Is it possible to remove all visible colour wavelengths in software
    > post-processing, leaving you with only the infrared image, rather
    > than use an infrared filter?

    No.

    Think about it: There are no "infrared" photosites in your camera,
    only monochrome photosites masked by red, green and blue filters
    (which to some extent are also transparent to near-IR light).
    When looking at the raw sensor data, you don't know which bits
    were generated by light in the near-infrared band, or which bits
    were generated by light in the visible band.

    It is possible to fake the infrared film "look" of Kodak HIE pretty
    well in Photoshop (boost green channel, desaturate, blur, add grain),
    but it has nothing to do with a real near-IR image.
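
    For what it's worth, that recipe can be roughed out with numpy and PIL;
    the file name and the amounts of weighting, blur and grain below are
    only guesses to play with, not anyone's calibrated settings:

        import numpy as np
        from PIL import Image, ImageFilter

        rgb = np.asarray(Image.open("scene.jpg").convert("RGB"), dtype=np.float32)

        # Desaturate with the green channel weighted heavily, so foliage
        # comes out bright the way it does on HIE.
        mono = rgb @ np.array([0.2, 0.7, 0.1])
        mono = 255 * mono / mono.max()

        # Blur for the glowing highlights, then add grain.
        img = Image.fromarray(mono.astype(np.uint8), mode="L")
        img = img.filter(ImageFilter.GaussianBlur(radius=1.5))
        grain = np.random.normal(0.0, 8.0, (img.height, img.width))
        faked = np.clip(np.asarray(img, dtype=np.float32) + grain, 0, 255)
        Image.fromarray(faked.astype(np.uint8), mode="L").save("fake_hie.jpg")

    It still only remixes visible light; nothing in it recovers real
    near-IR.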
    --
    - gisle hannemyr [ gisle{at}hannemyr.no - http://folk.uio.no/gisle/ ]
    ------------------------------------------------------------------------
    Kodak DCS460, Canon Powershot G5, Olympus 2020Z
    ------------------------------------------------------------------------
  3. Archived from groups: rec.photo.digital.slr-systems

    Alan Browne <alan.browne@freelunchVideotron.ca> writes:
    > Whatever IR is recorded is where the sensitivity of those channels
    > (mostly the red channel I suppose) spills into the IR part of the
    > spectrum. Then, by adding an IR friendly filter to the camera, you
    > block as much of the visible wavelengths as possible leaving IR
    > only. That IR can be 'counted' in any channel (mostly the red, I
    > believe).

    Not necessarily. On my Oly 2020Z, all three channels seem to
    be equally sensitive to IR. On my DCS460, it is the blue channel
    that is most sensitive, and the red is the least sensitive.

    How coloured dyes (such as those used in Bayer filters) respond
    to near-IR wavelengths is quite unpredictable.

    > http://www.electro-optical.com/bb_rad/bb_rad.htm

    That page is totally irrelevant if we are talking about the type of image
    you shoot with standard cameras and Hoya R72 filters.

    This guy talks about blackbody radiation, which is the infrared
    energy radiated by an object due to its temperature. Capturing
    blackbody radiation requires a very special camera, and is
    known as "thermal photography".

    What hobbyists call "infrared photography" should really be called
    "near-infrared photography". It is about taking photographs of
    /reflected/ infrared light in a narrow spectrum just above visible
    light - from 700 nm up to about 950 nm.
    --
    - gisle hannemyr [ gisle{at}hannemyr.no - http://folk.uio.no/gisle/ ]
    ------------------------------------------------------------------------
    Kodak DCS460, Canon Powershot G5, Olympus 2020Z
    ------------------------------------------------------------------------
  4. Archived from groups: rec.photo.digital.slr-systems

    Gisle Hannemyr wrote:

    > Alan Browne <alan.browne@freelunchVideotron.ca> writes:
    >
    >>Whatever IR is recorded is where the sensitivity of those channels
    >>(mostly the red channel I suppose) spills into the IR part of the
    >>spectrum. Then, by adding an IR friendly filter to the camera, you
    >>block as much of the visible wavelengths as possible leaving IR
    >>only. That IR can be 'counted' in any channel (mostly the red, I
    >>believe).
    >
    >
    > Not necessarily. On my Oly 2020Z, all three channels seem to
    > be equally sensitive to IR. On my DCS460, it is the blue channel
    > that is most sensitive, and the red is the least sensitive.

    That indicates that the out-of-visible filtering of the RGB color
    filters is not controlled, e.g. the blue filter blocks green and red but
    out there beyond the visible it is no longer filtering.

    >
    > How coloured dyes (such as those used in Bayer filters) respond
    > to near-IR wavelengths is quite unpredictable.

    It appears to be uncontrolled, rather than unpredictable.

    >
    >
    >>http://www.electro-optical.com/bb_rad/bb_rad.htm
    >
    >
    > That page is totally irrelevant if we are talking about the type of image
    > you shoot with standard cameras and Hoya R72 filters.

    I only posted the link to show the relationship of the IR to the red
    channel.

    <SNP>
    > What hobbyists call "infrared photography" should really be called
    > "near-infrared photography". It is about taking photographs of
    > /reflected/ infrared light in a narrow spectrum just above visible
    > light - from 700 nm up to about 950 nm.

    Probably. There is so little authoritative information on the subject
    with respect to what wavelengths are passing the RGB filters, the sensor
    filters and being counted by the photosites. I haven't seen a gain
    chart for the whole chain (in segments). Do you know where there is one?


    --
    -- r.p.e.35mm user resource: http://www.aliasimages.com/rpe35mmur.htm
    -- r.p.d.slr-systems: http://www.aliasimages.com/rpdslrsysur.htm
    -- [SI] gallery & rulz: http://www.pbase.com/shootin
    -- e-meil: Remove FreeLunch.
  5. Archived from groups: rec.photo.digital.slr-systems

    > Whatever IR is recorded is where the sensitivity of those channels (mostly
    > the red channel I suppose) spills into the IR part of the spectrum. Then,
    > by adding an IR friendly filter to the camera, you block as much of the
    > visible wavelengths as possible leaving IR only. That IR can be 'counted'
    > in any channel (mostly the red, I believe).

    I've noticed that my S3 records the near-IR in the blue channel - I thought
    I must be mistaken until I read Gisle's post on this subject!

    While tweaking the RAW image in the HyperUtility software, if I pull the
    middle of the red curve down and push the blue curve up, it increases the IR
    effect: the areas of reflected IR are pale blue and those with the least are
    deep red.

    Strange.

    I'm happy with the result though which is what matters really isn't it!
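
    For anyone without HyperUtility, the same sort of tweak can be tried on
    an exported RGB file; the file name and curve exponents here are just
    arbitrary starting points, not Fuji's own curves:

        import numpy as np
        from PIL import Image

        rgb = np.asarray(Image.open("ir_shot.tif").convert("RGB"),
                         dtype=np.float32) / 255.0
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]

        # Pull the middle of the red curve down and push the blue curve up.
        r = r ** 1.6   # gamma > 1 darkens the red mid-tones
        b = b ** 0.6   # gamma < 1 lifts the blue mid-tones

        out = np.clip(np.stack([r, g, b], axis=-1), 0, 1)
        Image.fromarray((out * 255).astype(np.uint8)).save("ir_effect.jpg")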

    Craig.
  6. Archived from groups: rec.photo.digital.slr-systems

    Craig Marston wrote:

    > middle of the red curve down and push the blue curve up, it increases the IR
    > effect: the areas of reflected IR are pale blue and those with the least are
    > deep red.
    >
    > Strange.
    >
    > I'm happy with the result though which is what matters really isn't it!

    Certainly. The 'typical' presentation of IR photos is in B&W, but they
    can be presented in any color you like since there is no 'color' at all
    in the human sense.

    --
    -- r.p.e.35mm user resource: http://www.aliasimages.com/rpe35mmur.htm
    -- r.p.d.slr-systems: http://www.aliasimages.com/rpdslrsysur.htm
    -- [SI] gallery & rulz: http://www.pbase.com/shootin
    -- e-meil: Remove FreeLunch.
  7. Archived from groups: rec.photo.digital.slr-systems

    In article <d6ngej$2sr$1@inews.gazeta.pl>,
    Alan Browne <alan.browne@freelunchVideotron.ca> wrote:
    >Gisle Hannemyr wrote:

    [ ... ]

    ><SNP>
    >> What hobbyists call "infrared photography" should really be called
    >> "near-infrared photography". It is about taking photographs of
    >> /reflected/ infrared light in a narrow spectrum just above visible
    >> light - from 700 nm up to about 950 nm.
    >
    >Probably. There is so little authoritative information on the subject
    >with respect to what wavelengths are passing the RGB filters, the sensor
    >filters and being counted by the photosites. I haven't seen a gain
    >chart for the whole chain (in segments). Do you know where there is one?

    It isn't just the filters and the sensor which control things
    when you get to the far IR regions. There are only two bands which pass
    the atmosphere cleanly. (They were 3-5 and 9-14 µm IIRC.) All other
    "colors" are blocked to one degree or another.

    Sensors for the longer wavelengths are also strange compared to
    the visible light ones. I have seen (and used) CCD scanned sensors, but
    the actual sensor material is something esoteric like a
    lead-tin-telluride alloy, not a silicon photovoltaic sensor. Now,
    consider the problems of cleaning such a sensor array given that the
    alloy is a lot softer than lead-tin solder, and any deformation destroys
    the crystal structure. But then -- these did not have interchangeable
    lenses, and were typically in sealed enclosures which were dry-nitrogen
    purged on assembly (in a clean room).

    And the sensors are commonly used at liquid nitrogen
    temperatures to minimize resistive electrical noise.

    Probably the most widely seen images through such systems are
    the ones put on television during the Gulf War and the more recent one
    in Iraq, as the targeting cameras for the aircraft-fired missiles were
    far-IR systems.

    They are strange looking beasts, with lenses (totally opaque in
    the visible spectrum) made of germanium or silicon.

    And the only references to which I had access were not public
    documents at the time, so I can't post a link to them. (They were not
    even on the web back then, though they may well be by now.)

    Enjoy,
    DoN.
    --
    Email: <dnichols@d-and-d.com> | Voice (all times): (703) 938-4564
    (too) near Washington D.C. | http://www.d-and-d.com/dnichols/DoN.html
    --- Black Holes are where God is dividing by zero ---
  8. Archived from groups: rec.photo.digital.slr-systems

    In article <d6njl3$djr$1@inews.gazeta.pl>,
    Alan Browne <alan.browne@freelunchVideotron.ca> wrote:
    >Craig Marston wrote:
    >
    >> middle of the red curve down and push the blue curve up, it increases the IR
    >> effect: the areas of reflected IR are pale blue and those with the least are
    >> deep red.
    >>
    >> Strange.
    >>
    >> I'm happy with the result though which is what matters really isn't it!
    >
    >Certainly. The 'typical' presentation of IR photos is in B&W, but they
    >can be presented in any color you like since there is no 'color' at all
    >in the human sense.

    Or -- you can make a multi-band sensor array, and display the
    near-IR in one color, the 3-5 µm in another, and the 7-14 µm in the
    third for an interesting false-color effect.

    Of course, one problem is that the longer the wavelength, the
    greater the diffraction which causes loss of resolution. So I would
    probably put the near IR in the color to which the human eye is most
    sensitive, so it would carry the most detail.
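
    As a sketch of that mapping (three registered single-band frames are
    assumed, with made-up file names; near-IR goes to green because that is
    where the eye resolves the most detail):

        import numpy as np
        from PIL import Image

        def load_band(path):
            # Each band is assumed to be an already-registered grayscale frame.
            return np.asarray(Image.open(path).convert("L"), dtype=np.float32)

        near_ir = load_band("band_near_ir.png")   # sharpest band
        mid_ir = load_band("band_3_5um.png")      # 3-5 µm
        far_ir = load_band("band_7_14um.png")     # 7-14 µm, most diffraction-blurred

        # Near-IR into green, the longer bands into red and blue.
        false_color = np.stack([mid_ir, near_ir, far_ir], axis=-1)
        false_color = 255 * false_color / false_color.max()
        Image.fromarray(false_color.astype(np.uint8)).save("false_color.png")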

    Enjoy,
    DoN.

    --
    Email: <dnichols@d-and-d.com> | Voice (all times): (703) 938-4564
    (too) near Washington D.C. | http://www.d-and-d.com/dnichols/DoN.html
    --- Black Holes are where God is dividing by zero ---
  9. Archived from groups: rec.photo.digital.slr-systems

    "Alan Browne" <alan.browne@freelunchVideotron.ca> wrote in message
    news:d6ngej$2sr$1@inews.gazeta.pl...
    > Gisle Hannemyr wrote:
    SNIP
    >> How coloured dyes (such as those used in Bayer filters) respond to
    >> near-IR wavelengths is quite unpredictable.
    >
    > It appears to be uncontrolled, rather than unpredictable.

    In fact, it is predictable in the sense that these dyes are
    transparent to IR. What complicates the predictability is the IR
    filter that can have different cut-off points. The IR filter could be
    omitted, or it is part of an assembly with the Anti-Aliasing filter
    (e.g. Canon's assembly looks like
    http://www.canon.com/technology/detail/digi_35mm/lo_filter/index.html,
    and incorporates both a dichroic mirror *and* an IR-filter).

    In some cameras (e.g. the Canon 20Da) the characteristics for
    extended-red/near-infrared filtration are changed, but those changes
    will cause difficulties in regular pictorial photography.

    And then there is the variable amount of reflected Near-IR light ...,
    predictability is the issue.

    Bart
  10. Archived from groups: rec.photo.digital.slr-systems

    Bart van der Wolf wrote:

    >
    > "Alan Browne" <alan.browne@freelunchVideotron.ca> wrote in message
    > news:d6ngej$2sr$1@inews.gazeta.pl...
    >
    >> Gisle Hannemyr wrote:
    >
    > SNIP
    >
    >>> How coloured dyes (such as those used in Bayer filters) respond to
    >>> near-IR wavelengths is quite unpredictable.
    >>
    >>
    >> It appears to be uncontrolled, rather than unpredictable.
    >
    >
    > In fact, it is predictable in the sense that these dyes are transparent
    > to IR. What complicates the predictability is the IR filter that can
    > have different cut-off points. The IR filter could be omitted, or it is
    > part of an assembly with the Anti-Aliasing filter (e.g. Canon's assembly
    > looks like
    > http://www.canon.com/technology/detail/digi_35mm/lo_filter/index.html,
    > and incorporates both a dichroic mirror *and* an IR-filter).
    >
    > In some cameras (e.g. the Canon 20Da) the characteristics for
    > extended-red/near-infrared filtration are changed, but those changes
    > will cause difficulties in regular pictorial photography.
    >
    > And then there is the variable amount of reflected Near-IR light ...,
    > predictability is the issue.

    Unless you have the gain-over-wavelength curves for every bit of glass
    in the chain and include the sensitivity of the sensor at those
    wavelengths, it is unpredictable (which is not the same as saying it's
    not repeatable). Even where the manufacturers attempt some degree of IR
    suppression, enough gets through that a longer exposure will record it
    (if the visible wavelengths are suppressed by filtering and don't swamp
    out the IR).
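
    Put in numbers, the chain is just a product of curves; the curves below
    are invented placeholders, so the result only shows how the pieces would
    combine, not what any particular body actually does:

        import numpy as np

        wl = np.linspace(400, 1100, 701)   # nm

        # Placeholder curves: a Bayer red dye that leaks IR, the sensor's
        # IR-suppression ('hot mirror') filter, silicon sensitivity, and an
        # external R72-style visible-blocking filter.
        dye_red = np.exp(-((wl - 600) / 45) ** 2) + 0.6 * (wl > 700)
        hot_mirror = 1 / (1 + np.exp((wl - 680) / 15))
        silicon = np.clip(1.1 - wl / 1100, 0, 1)
        r72 = 1 / (1 + np.exp(-(wl - 720) / 10))

        normal = dye_red * hot_mirror * silicon          # everyday shooting
        ir_only = normal * r72                           # with the R72 fitted

        flat = np.ones_like(wl)                          # flat test illumination
        factor = np.trapz(normal * flat, wl) / np.trapz(ir_only * flat, wl)
        print(f"roughly {factor:.0f}x more exposure needed")  # only as good as the guesses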

    In the case of the 20Da, they appear to want the IR in order to help the
    overall image exposure; it is intended for astro work, after all.

    Cheers,
    Alan

    --
    -- r.p.e.35mm user resource: http://www.aliasimages.com/rpe35mmur.htm
    -- r.p.d.slr-systems: http://www.aliasimages.com/rpdslrsysur.htm
    -- [SI] gallery & rulz: http://www.pbase.com/shootin
    -- e-meil: Remove FreeLunch.
  11. Archived from groups: rec.photo.digital.slr-systems

    DoN. Nichols wrote:

    > In article <d6ngej$2sr$1@inews.gazeta.pl>,
    > Alan Browne <alan.browne@freelunchVideotron.ca> wrote:
    >
    >>Gisle Hannemyr wrote:
    >
    >
    > [ ... ]
    >
    >
    >><SNP>
    >>
    >>>What hobbyists call "infrared photography" should really be called
    >>>"near-infrared photography". It is about taking photographs of
    >>>/reflected/ infrared light in a narrow spectrum just above visible
    >>>light - from 700 nm up to about 950 nm.
    >>
    >>Probably. There is so little authoritative information on the subject
    >>with respect to what wavelengths are passing the RGB filters, the sensor
    >>filters and being counted by the photosites. I haven't seen a gain
    >>chart for the whole chain (in segments). Do you know where there is one?
    >
    >
    > It isn't just the filters and the sensor which control things
    > when you get to the far IR regions. There are only two bands which pass
    > the atmosphere cleanly. (They were 3-5 and 9-14 µm IIRC.) All other
    > "colors" are blocked to one degree or another.

    I'm talking about the light that manages to reach the camera. Ideally,
    you would want the Red filter to filter everything except red
    wavelengths. And so on for G and B. What apparently is happening is
    that the filters have another higher gain region outside the visible
    region. So some of the OEMs slap on another filter over the sensor to
    block that. And that filter is not 100% effective, so IR photography is
    possible by longer exposures and by blocking the visible spectrum.

    Cheers,
    Alan.

    --
    -- r.p.e.35mm user resource: http://www.aliasimages.com/rpe35mmur.htm
    -- r.p.d.slr-systems: http://www.aliasimages.com/rpdslrsysur.htm
    -- [SI] gallery & rulz: http://www.pbase.com/shootin
    -- e-meil: Remove FreeLunch.
  12. Archived from groups: rec.photo.digital.slr-systems

    In article <d6qk11$jm0$1@inews.gazeta.pl>,
    Alan Browne <alan.browne@freelunchVideotron.ca> wrote:
    >DoN. Nichols wrote:
    >
    >> In article <d6ngej$2sr$1@inews.gazeta.pl>,
    >> Alan Browne <alan.browne@freelunchVideotron.ca> wrote:
    >>
    >>>Gisle Hannemyr wrote:
    >>
    >>
    >> [ ... ]
    >>
    >>
    >>><SNP>
    >>>
    >>>>What hobbyists call "infrared photography" should really be called
    >>>>"near-infrared photography". It is about taking photographs of
    >>>>/reflected/ infrared light in a narrow spectrum just above visible
    >>>>light - from 700 nm up to about 950 nm.
    >>>
    >>>Probably. There is so little authoritative information on the subject
    >>>with respect to what wavelengths are passing the RGB filters, the sensor
    >>>filters and being counted by the photosites. I haven't seen a gain
    >>>chart for the whole chain (in segments). Do you know where there is one?
    >>
    >>
    >> It isn't just the filters and the sensor which control things
    >> when you get to the far IR regions. There are only two bands which pass
    >> the atmosphere cleanly. (They were 3-5 and 9-14 µm IIRC.) All other
    >> "colors" are blocked to one degree or another.
    >
    >I'm talking about the light that manages to reach the camera.

    Well ... the 3-5 and 9-14 µm would reach the *camera*, but are
    unlikely to make it past the first piece of glass. :-)

    > Ideally,
    >you would want the Red filter to filter everything except red
    >wavelengths. And so on for G and B. What apparently is happening is
    >that the filters have another higher gain region outside the visible
    >region. So some of the OEMs slap on another filter over the sensor to
    >block that. And that filter is not 100% effective, so IR photography is
    >possible by longer exposures and by blocking the visible spectrum.

    Note that it is pretty difficult to make a really wide-spectrum
    filter, so most are made to cover only the part of the spectrum needed
    in a given situation.

    I've worked with scientific filters which were three layers:

    1) A basic high-pass filter to block everything with a wavelength
    longer (or shorter) than the general region of interest.

    2) A more general color filter to select a region of what the
    high-pass lets through.

    3) An interference filter (which gives a very narrow bandpass or
    cutoff) to sharpen the cutoff at the edge of response as much as
    possible.

    Having seen and worked with these, I'm not going to expect the
    filter areas in the Bayer filter to do more than what is absolutely
    needed, and this makes the separate IR filter in the D70 make lots of
    sense compared to trying to block everything in that deposited pattern
    filter on the sensor itself.
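
    The same stacking can be mimicked with invented transmission curves,
    just to show how multiplying the three stages narrows and steepens the
    passband (none of these numbers describe a real filter):

        import numpy as np

        wl = np.linspace(300, 800, 501)   # nm

        def logistic(x):
            return 1 / (1 + np.exp(-x))

        blocker = logistic(-(wl - 700) / 20)         # 1) coarse block beyond ~700 nm
        colour = np.exp(-((wl - 550) / 60) ** 2)     # 2) broad colour filter near 550 nm
        interference = logistic((wl - 530) / 2) * logistic(-(wl - 570) / 2)  # 3) steep bandpass

        stack = blocker * colour * interference      # the three layers multiply

        def width_at_half_max(t):
            passband = wl[t > 0.5 * t.max()]
            return passband.max() - passband.min()

        print("colour filter alone:", width_at_half_max(colour), "nm")
        print("three-layer stack:  ", width_at_half_max(stack), "nm")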

    Enjoy,
    DoN.
    --
    Email: <dnichols@d-and-d.com> | Voice (all times): (703) 938-4564
    (too) near Washington D.C. | http://www.d-and-d.com/dnichols/DoN.html
    --- Black Holes are where God is dividing by zero ---