Researchers Develop Single-Photon Gate
For the first time, researchers were able to build a system in which a single photon could be transmitted while all others were blocked.
The research, conducted at MIT, is believed to be a milestone in quantum computing: the scientists converted a laser beam into a stream of photons, sent it into a cloud of atoms, and allowed only a single photon to emerge on the other side. The achievement is remarkable because photons typically interact only weakly with atoms. In their experiment, however, the scientists showed that a single photon can interact strongly with the atoms and change their state so that the cloud will not let a second photon pass through.
MIT said that the result was achieved by cooling rubidium atoms down to about 40 microkelvins, 40 millionths of a degree above absolute zero (-459.67 degrees Fahrenheit, or -273.15 degrees Celsius). In this state, the cloud is opaque and does not allow photons to pass. However, a second laser is used to produce an effect called electromagnetically induced transparency (EIT). A single photon entering the cloud while the second laser is on shifts the atoms into an excited state and passes through the cloud at a slow speed. A second photon arriving while the atoms are in this state does not experience the EIT and cannot pass through the cloud.
"So whenever a single photon enters, it passes through the temporarily transparent medium," MIT said. "When two or more enter, the gas becomes opaque again, blocking all but the first photon."
The hope is that the discovery could lead to a single-photon switch, and from there to quantum logic gates for quantum computing systems. MIT said that such systems "could be immune from eavesdropping when used for communication, and could also allow much more efficient processing of certain kinds of computation tasks."

Once we can control the smallest amount of mass/energy, we are at a point where we can no longer improve performance and/or storage density.
We have to start research to see if we can find something smaller than a photon, and while we are at it, whatever we find had better be faster than the speed of light.
(Yes, I know that this is asking the impossible as per our current understanding of sub-atomic physics.)
Einstein just started rotating in his coffin :-)
You have no clue what quantum computing is or just how large the performance gains are compared to traditional computers...
Using single-photon gates will replace transistors.
Stop thinking 2D and think 3D. You could possibly pile the RAM and the crap that takes up space on one layer, and put the crap that does stuff on another, bringing the two closer together, further reducing the distance the data travels and increasing performance. By how much, I don't know, but that's a concept at least.
Well... there is something called Higgs theory that beats your point, and it is being tested now at the Large Hadron Collider in France.
And anyway, why is everyone who has read about the LHC now a freakin quantum physicist? Most of us are scientific-type people, engineers most probably, but nearly none of us has a deep understanding of particle physics. Even I, who was a chemistry student years ago, can hardly say that I am knowledgeable, for my understanding of quantum physics is only basic (I solved Schrödinger's equations, along with other photon-related mathematical analysis). Everything below that is merely knowing what the media tells us this is all about.
@topic: my mind is blown. With my limited knowledge I find this amazing, considering how little we know of actually CONTROLLING sub-atomic particles. This, as someone stated before, could be the discovery behind a new era of electrical nano-devices, considering a photon as a 1 and no-photon as a 0. Imagine the possibilities.
Great article, Tom's, I love to see this kind of news, keep digging up stuff like this!
I recall reading an article about the development of a processor that uses more values than 0 and 1, with silicon transistors. If they can increase precision enough to fit in additional values, imagine the performance if software could take advantage of it.
The only issue would be OCing. Standard processors can tolerate some voltage variation as long as the 0s don't get reported as 1s and 1s as 0s...
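To make the noise-margin point in the comment above concrete, here is a small, purely hypothetical sketch; the voltages, thresholds, and noise figure are made up for illustration. With four evenly spaced levels instead of two, each symbol gets a narrower band, so the same voltage wobble from aggressive overclocking is more likely to push a value into the neighboring symbol:

```python
# Hypothetical illustration of multi-level signalling vs. noise margin.
# Voltages, thresholds, and the noise figure below are made up for the example.

def decode(voltage, levels):
    """Map a voltage in [0, 1] V to the nearest of `levels` evenly spaced symbols."""
    step = 1.0 / (levels - 1)
    symbol = round(voltage / step)
    return max(0, min(levels - 1, symbol))

noise = 0.2  # volts of wobble, e.g. from aggressive overclocking

# Binary: a '1' driven at 1.0 V still decodes correctly despite the wobble.
print(decode(1.0 - noise, levels=2))   # -> 1

# Four-level cell: the same wobble pushes the top symbol down a level.
print(decode(1.0 - noise, levels=4))   # -> 2 instead of the intended 3
```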
If this process develops thermal resistance (i.e., the system is tweaked in a way that makes it work closer to ambient temperature), then OC shouldn't be much of a problem, because that's all it is, increasing the clock speed. And I doubt this supposedly new generation of transistors, based on tiny photons, could generate MORE heat than a system where current is dissipated (electrical resistance) within a transistor.
Link? Usage?
With there being 10 kinds of people in the world... those who read binary and those who don't, which are you? ;o)
Come on, Tom's, proof-reading shouldn't be so difficult, yet it seems no article is free of errors.
please notice the blue text hyperlink immediately below the last sentence of the article.
I know about the edit function, my screw-up there. I would've noticed it and fixed it anyway, even if the next comment after mine wasn't referring to it, since Tom's doesn't seem to fix mistakes like that all too often (especially on minor articles) even after they are mentioned.
It's not that I am anti-progress or something of that sort... but geez... "-273.15 degrees Celsius" plus the other stuff... really... they could change the climate of the Sahara with that sort of funding...
I read it also, I think it was on the IBM news post on this website. About quantum computing at that, talking about bits being half on, or half off, etc. etc.
It's one thing to manipulate a single atom or photon with lots of hardware. It's a completely different issue to actually build a computer composed of millions of these.
How do the parts CONNECT together?
Where are the bottlenecks?
Don't expect anything in Quantum Computing to affect real-world computers in the next ten years. In fact, it's possible that computers will simply get faster through the normal, evolutionary update cycle and never, ever benefit from Quantum Computing research.