NASA and USAF Looking for a Next-Gen Space Processor
NASA and the US Air Force Research Laboratory are looking for two to four companies to perform an evaluation of "advanced space-based applications" for the 2020-to-2030 decade.
In the search for an advanced next-generation processor to power the spacecraft computing needs of the future, NASA and the US Air Force Research Laboratory have launched the Air Force Next Generation Processor Analysis Program (AFNGPAP).
The program offers a $2 million contract (with an option for a further $20 million in funding) for between two and four companies to perform a year-long evaluation of advanced space-based applications that would use spaceflight processors in the decade of 2020 to 2030.
The hope is that this research will help yield future spacecraft processors that are powerful enough to perform tasks such as autonomous pinpoint landing with hazard detection and avoidance, real-time segmented-mirror control on telescopes, onboard real-time analysis of hyperspectral images, autonomous situational analysis, real-time mission planning, and real-time model-based fault protection for spacecraft.
“Computer processors and applications aboard spacecraft will need to transform dramatically to take advantage of computational leaps in technology and new mission needs,” said Michael Gazarik, associate administrator for NASA's Space Technology Mission Directorate at the agency's headquarters in Washington. “NASA's Space Technology Program is teaming with the Air Force to develop the next generation spaceflight processor requirements and propose solutions to meet future high performance space computing needs in the upcoming decades.”
As kunzite already pointed out, processors for space-based and radiation-hardened applications have a completely different set of requirements than their commercial terrestrial counterparts.
Radiation wreaks absolute havoc on unhardened electronics, and even a single error in computing can spell disaster in a mission-critical system.
Extreme ECC usage, exotic manufacturing techniques and robust design principles for rad-hard chips virtually ensure that a chip of any reasonable size will not offer performance comparable to commercial counterparts.
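The "extreme ECC usage" mentioned above comes down to error-correcting codes; a classic example is the Hamming(7,4) code, which stores 4 data bits in 7 bits and can correct any single flipped bit. This is just a toy software sketch of the idea (real rad-hard parts implement ECC in hardware, and flight systems use stronger codes):

```python
# Illustrative Hamming(7,4) single-error-correcting code.
# Not flight code; hardened memories do this in dedicated logic.

def encode(nibble):
    """Encode 4 data bits (int 0-15) into a 7-bit Hamming codeword."""
    d = [(nibble >> i) & 1 for i in range(4)]        # d0..d3
    p1 = d[0] ^ d[1] ^ d[3]                          # covers positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]                          # covers positions 2,3,6,7
    p4 = d[1] ^ d[2] ^ d[3]                          # covers positions 4,5,6,7
    bits = [p1, p2, d[0], p4, d[1], d[2], d[3]]      # positions 1..7
    word = 0
    for i, b in enumerate(bits):
        word |= b << i
    return word

def decode(word):
    """Correct up to one flipped bit and return the 4 data bits."""
    bits = [(word >> i) & 1 for i in range(7)]       # positions 1..7
    s1 = bits[0] ^ bits[2] ^ bits[4] ^ bits[6]
    s2 = bits[1] ^ bits[2] ^ bits[5] ^ bits[6]
    s3 = bits[3] ^ bits[4] ^ bits[5] ^ bits[6]
    syndrome = s1 | (s2 << 1) | (s3 << 2)            # 1-based error position
    if syndrome:
        bits[syndrome - 1] ^= 1                      # flip the bad bit back
    d = [bits[2], bits[4], bits[5], bits[6]]
    return d[0] | (d[1] << 1) | (d[2] << 2) | (d[3] << 3)
```

Flipping any one bit of a codeword (a simulated single-event upset) still decodes to the original data, which is exactly the property a rad-hard memory needs.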
Cooling is also much more complex in space: you have to deal with extreme temperature swings, and there is no external moving medium to conduct heat away from the spacecraft.
To remove excess heat from the system, you need to either use an inefficient radiative system or carry a quickly depleted store of cooling gas/fluid onboard.
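To see why radiative cooling is such a constraint, you can size a radiator panel with the Stefan-Boltzmann law, P = εσAT⁴. The numbers below (a 30 W board, a 320 K panel, emissivity 0.85) are assumed example values, not mission data, and the model ignores absorbed sunlight:

```python
# Back-of-envelope radiator sizing via the Stefan-Boltzmann law.
# All inputs are illustrative assumptions, not real spacecraft figures.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area(power_w, temp_k, emissivity=0.85):
    """Panel area (m^2) needed to radiate `power_w` watts at `temp_k`,
    ignoring absorbed sunlight and the cold-space background."""
    return power_w / (emissivity * SIGMA * temp_k ** 4)

# A hypothetical 30 W processor board radiating at 320 K:
print(f"{radiator_area(30.0, 320.0):.3f} m^2")
```

Even this tiny heat load needs a panel on the order of several hundred square centimeters, and the T⁴ dependence means cooler components require disproportionately more radiator area.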
Increasing the performance of a chip generally increases its power usage and cooling requirements, both of which are hard constraints that need to be carefully balanced in the system.
Additionally, QC needs to be extremely high, to the point of shipping only 100% defect-free chips (probably to the effect of 90%+ scrap rates on less-than-perfect parts).
Some redundancy should be built in to deal with the inevitable damages that occur from long usage in hostile environments, further increasing chip complexity and reducing die area for pure performance gains.
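A common form of that built-in redundancy is triple modular redundancy (TMR): run three copies of the logic and majority-vote the results, so a single radiation-induced upset in one copy is masked. A minimal sketch of the voting idea (names are illustrative, and real TMR is done in hardware):

```python
# Sketch of triple-modular-redundancy voting: three redundant results
# go in, the majority value comes out, masking a single upset unit.

from collections import Counter

def tmr_vote(results):
    """Return the majority value among three redundant results."""
    value, count = Counter(results).most_common(1)[0]
    if count < 2:
        raise RuntimeError("no majority: multiple units disagree")
    return value

# One unit suffers a bit flip; the vote masks it.
print(tmr_vote([42, 42, 43]))  # -> 42
```

The cost is the one the comment describes: triple the logic (and power) for the same nominal throughput, which is die area that cannot be spent on raw performance.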
en.wikipedia.org/wiki/Radiation_hardening
science.nasa.gov/science-news/science-at-nasa/2001/ast21mar_1/
Come on, people... Did everyone forget that space is a different beast than Earth?
You just can't "throw" everyday computer hardware into space and "hope" it lasts...
Nope... Vacuum = heat not transferring (efficiently) into its environment.
Now it can still cool down via radiated heat (like the sun heating the Earth), but that's a highly ineffective way of cooling our CPUs, as all our cooling designs are based on transferring heat from one mass to another (heatsink to air).
In a vacuum, to put it simply, if you're going to let the CPU live, you're going to have to do the exact opposite of overclocking.
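The "opposite of overclocking" pays off quickly because dynamic switching power scales roughly as P = C·V²·f. A rough sketch with assumed numbers (the capacitance, voltages, and clocks below are made-up illustrative values):

```python
# Rough dynamic-power model, P = C * V^2 * f, showing why underclocking
# and undervolting help in a vacuum. All inputs are assumed values.

def dynamic_power(cap_f, volts, freq_hz):
    """Approximate switching power in watts for effective capacitance `cap_f`."""
    return cap_f * volts ** 2 * freq_hz

nominal = dynamic_power(1e-9, 1.2, 3.0e9)   # 1 nF, 1.2 V, 3 GHz
derated = dynamic_power(1e-9, 0.9, 1.5e9)   # undervolted and underclocked
print(nominal, derated)
```

Halving the clock and dropping the voltage from 1.2 V to 0.9 V cuts switching power by more than two-thirds in this model, which is exactly the trade a radiatively cooled system wants.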
Not completely; they're typically processors that have been tested for quite a while, typically older designs that are known to have longevity. Case in point: when I was in college, my program was working with Motorola on their Iridium satellite network (this was the early '90s). We were irradiating their chips to simulate time in space. I asked one of the guys coming down every other week or so what the processors were, and he mentioned that they weren't a new design, just tweaks to an older design (the chips were originally designed in the mid-to-late '80s).
Typically what you will see is a processor with larger transistors (likely using processes like 65 nm or 90 nm, compared to present chips running at 32 nm and soon 22 nm). Because the transistors are larger, they aren't as likely to be harmed by the gamma rays and X-rays flying around in space, and they take longer to degrade than parts built on newer processes would.