
Researchers Develop Single-Photon Gate

Source: MIT | 21 comments

For the first time, researchers were able to build a system in which a single photon could be transmitted while all others were blocked.

The research, conducted at MIT, is believed to be a milestone in quantum computing: the scientists were able to convert a laser beam into a stream of photons, send it into a cloud of atoms, and allow only one photon to emerge on the other side. The achievement is remarkable because photons typically interact only weakly with atoms. In their research, however, the scientists showed that it is possible for one photon to interact strongly with the atoms, changing their state so that the cloud will not allow a second photon to pass through.

MIT said that the result was achieved by cooling rubidium atoms down to about 40 microkelvins, 40 millionths of a degree above absolute zero (-459.67 degrees Fahrenheit or -273.15 degrees Celsius). In this state, the cloud is opaque and does not allow photons to pass. However, a second laser is used to facilitate an effect called electromagnetically induced transparency (EIT). A photon that enters while the second laser is on elevates the atoms to an excited state and passes through the cloud at a slow speed. A second photon, which no longer meets the EIT condition, is unable to pass through the cloud.

"So whenever a single photon enters, it passes through the temporarily transparent medium," MIT said. "When two or more enter, the gas becomes opaque again, blocking all but the first photon."

There is hope that the discovery could lead to the development of a single-photon switch and, in turn, to quantum logic gates for quantum computing systems. MIT said that such systems "could be immune from eavesdropping when used for communication, and could also allow much more efficient processing of certain kinds of computation tasks."

 

Contact Us for News Tips, Corrections and Feedback

This thread is closed for comments
Comments
  • -8
    freggo , September 16, 2012 1:16 AM
    That's bad news.
    Once we can control the smallest amount of mass/energy we are at a point where we can no longer improve performance and/or storage density.

    We have to start research to see if we can find something smaller than a photon and while we are at it whatever we find should better be faster than the speed of light.

    (Yes, I know that this is asking the impossible as per our current understanding of subatomic physics)


    Einstein just started rotating in his coffin :-)
  • 6
    walter87 , September 16, 2012 2:13 AM
    freggo: That's bad news. Once we can control the smallest amount of mass/energy we are at a point where we can no longer improve performance and/or storage density. We have to start research to see if we can find something smaller than a photon and while we are at it whatever we find should better be faster than the speed of light. (Yes, I know that this is asking the impossible as per our current understanding if sub atomic physics) Einstein just started rotating in his coffin :-)


    You have no clue what quantum computing is or just how large scale performance gains are compared to traditional computers...

    Using single photon-gates will replace transistors.
  • 10
    jhansonxi , September 16, 2012 2:21 AM
    walter87: You have no clue what quantum computing is or just how large scale performance gains are compared to traditional computers... Using single photon-gates will replace transistors.
    I think you missed the sarcasm in freggo's post.
  • 0
    alidan , September 16, 2012 2:35 AM
    freggo: That's bad news. Once we can control the smallest amount of mass/energy we are at a point where we can no longer improve performance and/or storage density. We have to start research to see if we can find something smaller than a photon and while we are at it whatever we find should better be faster than the speed of light. (Yes, I know that this is asking the impossible as per our current understanding if sub atomic physics) Einstein just started rotating in his coffin :-)


    stop thinking 2D and think 3D. You could possibly pile the RAM and crap that takes up space on one layer, and put the crap that does stuff on another, bringing the two closer together, further reducing the distance the data travels, increasing performance. By how much, I don't know, but that's a concept at least.
  • 3
    jupiter optimus maximus , September 16, 2012 2:39 AM
    freggo: That's bad news. Once we can control the smallest amount of mass/energy we are at a point where we can no longer improve performance and/or storage density. We have to start research to see if we can find something smaller than a photon and while we are at it whatever we find should better be faster than the speed of light. (Yes, I know that this is asking the impossible as per our current understanding if sub atomic physics) Einstein just started rotating in his coffin :-)

    Well... there is something called Higgs theory that beats your point, and it is being tested now at the Large Hadron Collider in France.
  • 6
    azraa , September 16, 2012 3:11 AM
    freggo was being sarcastic. Hence the phrase about Einstein at the end.

    And anyway, why is everyone who has read about the LHC now a freakin quantum physicist? Most of us are scientific-type people, engineers most probably, but nearly none of us has a deep understanding of particle physics. Even I, who was a Chemistry student years ago, can hardly claim to be knowledgeable, for my understanding of quantum physics is only basic (solved Schrodinger's equations, along with other photon-related mathematical analysis). Everything beyond that is merely knowing what the media tells us this is all about.

    @topic: my mind is blown. With my limited knowledge I find this amazing, considering how little we know of actually CONTROLLING sub-atomic particles. This, as someone stated before, could be the discovery behind a new era of electrical nano-devices, considering a photon as a 1 and no-photon as a 0. Imagine the possibilities.
    Great article, Tom's, I love to see this kind of news, keep digging up stuff like this!
  • 6
    A Bad Day , September 16, 2012 3:22 AM
    Quote:
    considering a photon as a 1 and no-photon as a 0. Imagine the possibilities.


    I recall reading an article about the development of a processor that uses more values than 0 and 1, with silicon transistors. If they can increase precision enough to fit in additional values, imagine the performance if software could take advantage of it.

    The only issue would be OCing. Standard processors can tolerate some voltage variation as long as the 0s don't get reported as 1s and 1s as 0s...
  • 0
    azraa , September 16, 2012 3:31 AM
    I agree. And even then, we need to get our feet back to the ground: this happens near zero kelvin.
    If this process develops thermal resistance (the system is tweaked in a way that makes this possible closer to ambient temperature) then the OC shouldn't be much of a problem, because it's just that, increasing the cycles per clock. And I doubt this supposedly new gen of transistors, based on minute photons, could generate MORE heat than a system where a current is dissipated (elec. resistance) within a transistor.
  • 3
    bustapr , September 16, 2012 4:05 AM
    Dudes, if this process is ever implemented into circuits, there would really be no more need for OCing. This would be pretty much as fast as it can get wherever it can be implemented.
  • 1
    spectrewind , September 16, 2012 4:06 AM
    A Bad Day: I recall reading an article about the development of a processor that uses more values than 0 and 1, with silicon transistors. If they can increase precision enough to fit in additional values, imagine the performance if software could take advantage of it. The only issue would be OCing. Standard processors can tolerate some voltage variation as long as the 0s don't get reported as 1s and 1s as 0s...


    Link? Usage?
    With there being 10 kinds of people in the world... those which read binary, and those that don't, which are you? ;o)
  • -3
    blazorthon , September 16, 2012 4:11 AM
    Second Paragraph of Article: However, a second laser is uses to facilitate an effect called electromagnetically induced transparency (EIT).


    Last Paragraph of Article: MIT said that' such systems "could be immune from eavesdropping when used for communication, and could also allow much more efficient processing of certain kinds of computation tasks."


    Come on, Tom's, proof-reading shouldn't be so difficult that it seems no article is free of errors.
  • 4
    bustapr , September 16, 2012 4:13 AM
    blazorthon: Com on Tom's, proof-reading shouldn't be so difficult that it seems that no article is free of errors.

    please notice the blue text hyperlink immediately below the last sentence of the article.
  • -2
    blazorthon , September 16, 2012 4:22 AM
    Quote:
    please notice the blue text hyperlink immediately below the last sentence of the article.


    I know about the edit function; my screw-up there. I would've noticed it and fixed it anyway even if the next comment after mine wasn't referring to it, since Tom's doesn't seem to fix mistakes like that all too often (especially on minor articles) even after they are mentioned.
  • 1
    jkflipflop98 , September 16, 2012 4:30 AM
    This is a pretty big deal. Individual photonic control will open up inroads in the computing industry we haven't even imagined yet.
  • -2
    fuzzion , September 16, 2012 6:14 AM
    2028 is the year we shall see quantum-based computing. Yes, I am a time traveller from the future. And yes, Obama will be re-elected. And no, there will be no major world war until 2052.
  • 0
    alyoshka , September 16, 2012 6:29 AM
    I agree with the "More Efficient Computing" part..... the rest of the setup just seems like eons away from becoming something that could save the planet..... more so the journey to reach that sort of computing itself is going to cost us the planet and all its resources.... I really have been finding it more and more difficult lately to come to terms with the cost that we are paying today for the technology of the future..... seems like a really vicious cycle....
    It's not that I am anti progress or something of that sort.... but geez..... "-273.15 degrees Celsius" plus the other stuff..... really..... they could change the climate of the Sahara with that sort of funding.....
  • 0
    doive1231 , September 16, 2012 8:17 AM
    And you thought bouncers only began in 70's discotheques.
  • 1
    john_4 , September 16, 2012 12:56 PM
    Call me when they get it down to quark size.
  • 0
    bombebomb , September 16, 2012 3:44 PM
    azraa: I agree. And even then, we need to get our feet back to the ground: this happens near zero kelvin. If this process develops thermal resistance (the system is tweaked in a way that makes this possible closer to atm temperature) then the OC shouldnt be much of a problem, because its just that, increasing the cicles per clock. And I doubt this supposedly new gen of transistors, based on diminute photons, could generate MORE heat than a system where a current is dissipated (elec. resistance) within a transistor.

    I read it also, I think it was on the IBM news post on this website. About quantum computing at that, talking about bits being half on, or half off, etc. etc.
  • 4
    photonboy , September 17, 2012 2:50 AM
    *Quantum Computing is NOT around the corner.*

    It's one thing to manipulate a single Atom or Photon with lots of hardware. It's a completely different issue to actually build a computer comprised of millions of these.

    How do the parts CONNECT together?
    Where are the bottlenecks?

    Don't expect anything in Quantum Computing to affect real-world computers in the next ten years. In fact, it's possible that computers will simply get faster through the normal, evolutionary update cycle and never, ever benefit from Quantum Computing research.