Hi.
I am not sure which of the www.tomshardware.com forums this belongs in.
I have a few questions about quantum computing and how computers work.
I watched this video: https://www.facebook.com/NowThisNews/videos/1092652100824913/
What I have been wondering for a while is why computers only use zeros and ones (0s and 1s). Why not 0, 1, 2, 3, etc.? 0s and 1s seem so basic.
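To show what I mean, here is a small Python sketch I put together (the to_base helper is just something I made up for this post, not anything real computers use): the same number written with only 0s and 1s versus with more digit values per position.

```python
def to_base(n, base):
    """Return the digits of n in the given base, most significant digit first."""
    digits = []
    while n > 0:
        digits.append(n % base)
        n //= base
    return list(reversed(digits)) or [0]

# The value 13 never changes; only the symbols each digit may use change.
print(to_base(13, 2))   # [1, 1, 0, 1]  -> binary, digits 0 and 1 only
print(to_base(13, 3))   # [1, 1, 1]     -> ternary, digits 0 to 2
print(to_base(13, 4))   # [3, 1]        -> base 4, digits 0 to 3
```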
How long will it take before computers use 0, 1, 2 or 0, 1, 2, 3 in their code?
What kind of computer would it take to move beyond zeros and ones (0s and 1s)?
Is there such a thing as sub-code or sub-numbers? E.g. 0, 0-0, 0-1, 1, 1-0, 1-1, etc.? Or, in the future, 0, 0-0, 0-1, 0-2, 1, 1-0, 1-1, 1-2, 2, 2-0, 2-1, 2-2, etc.?
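Here is another little Python sketch of the "sub-number" idea as I picture it (again, this is only my attempt to write my own question down, not how any real computer names things): a pair like 1-0 is really just two binary digits grouped together, which works out to one base-4 digit.

```python
# Treat each "sub-number" pair as two binary digits grouped into one value.
pairs = ["0-0", "0-1", "1-0", "1-1"]
for p in pairs:
    hi, lo = p.split("-")
    value = int(hi) * 2 + int(lo)   # the pair read as a single base-4 digit
    print(p, "->", value)
# 0-0 -> 0, 0-1 -> 1, 1-0 -> 2, 1-1 -> 3
```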
Presumably all the mistakes in existing code could be fixed while keeping everything backwards compatible.
Thanks.