
I'm gonna get flamed to no end for this but...

January 10, 2006 3:19:13 AM

Hey everyone. I don't know if anyone remembers the question I posted on here a few months ago about a theory that I have. Basically, it was that I believe computers can exist without memory.

I want some input as to what I should do with my testing of this theory. I fully expect that if I do get any responses, I will be called stupid, told 'go home kid, this is not a place for homework', idiotic, etc. But I hope that maybe at least one reader will take me seriously.

I am currently putting to the test my theory that computers can indeed function without memory. What I am doing is getting very old computers and purposely overheating them with a heat gun. Why am I doing this? Well, I've seen video controllers overheat and produce artifacts on the screen. These artifacts are what you would consider a computer 'malfunction'. I propose that a computer malfunction, aside from software bugs, could possibly be the computer operating 'on its own'. A malfunction would be something that is not doing what it was intended to do.

Since computer circuitry that gets too hot is permanently damaged and will die prematurely, the key is finding the right balance of heat that can produce these artifacts without damaging the circuitry. This is proving to be very difficult, and since I don't know of anyone who has done something like this, I don't know how to do it yet either.

I know that this all sounds crazy and a big waste of perfectly good computers, but I feel as though this could lead to something big. Please post some constructive criticism, and not just 'you are an idiot.' I get that enough each day.


January 10, 2006 4:21:07 AM

I see what you are saying. Any random result could be considered the result of a higher order.
I read a story about a guy who randomized his life by the use of dice.
A little too out there for me, though.
January 10, 2006 4:22:09 AM

A human example of what you're talking about. Consider what kind of 'computing' you would be able to do if you couldn't remember more than 1 second of your life at a time. Perhaps you could do simple math, but you'd have no idea that the reason you were doing it was to file your tax return, finish a test, etc. You couldn't do it.

Take it one step further. You're computing the sum:
43 + 89. First of all, where do you store the numbers 43 and 89? Where do you store the fact that you want to add them? When you add 9 + 3 to get 11, where do you store the 1 in the ones place, and the 1 in the tens place?

Basically, I'm afraid you couldn't get very far like this. (In fact, strictly speaking, you couldn't get anywhere, per the example above. Processors themselves must store the instruction to be executed somewhere in order to process it.) Both computers and humans require context for almost all non-trivial applications. If your word processor has no memory, where is your term paper? If you don't have any memory, how could you even spell a word, let alone form a coherent thought?

In the end, lack of memory may work if the entire problem can be solved in a single step, but that is about it.

Sorry.
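
Edit: to make the carry problem concrete, here's a rough C sketch - just my own toy illustration, nothing official - of long addition with every value spelled out. Even this trivial program needs variables (i.e. memory) for the operands, the carry, and the result.

Code:
#include <stdio.h>

/* Toy illustration: adding 43 + 89 digit by digit.
   Every intermediate value (operands, carry, partial sums) has to be
   *stored* somewhere - in C that's variables, which the compiler maps
   to registers or RAM. */
int main(void)
{
    int a = 43, b = 89;                /* the operands must be held somewhere */

    int ones  = (a % 10) + (b % 10);   /* 3 + 9 = 12                          */
    int carry = ones / 10;             /* the carry must be remembered        */
    int tens  = (a / 10) + (b / 10) + carry;

    int sum = tens * 10 + (ones % 10); /* 132                                 */
    printf("%d + %d = %d\n", a, b, sum);
    return 0;
}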
January 10, 2006 5:03:44 AM

So you're proposing that computers have some degree of natural intelligence?

The thing is, artifacts and malfunctions are random by nature (take genetic diseases or retardation, for example). Malfunctions cannot be totally predicted by an algorithm or a mathematical solution, even in computers. (Some might argue against this opinion based on a book called "A New Kind of Science" written by Mathematica star Stephen Wolfram. Stephen believes there is in fact a mathematical constant in nature that explains things like the patterns on a tiger and even life itself. I do not, however, believe this to be true. Mathematics is great for measuring and quantifying the universe, but it cannot begin to explain the complexity of it. And yes, I read his book, so I can speak to it if anyone wants to argue!)

Anyway, I argue that malfunctions or artifacts are random reactions with a predictable outcome. But just because these reactions (malfunctions in this case) are random does not mean they are intelligent. (And until you prove otherwise, artifacts are indeed random side effects or malfunctions.) DNA itself isn't intelligent; it's simply a set of instructions that have evolved and changed based on (among other things) environmental conditions. Computers are, at a very basic level, no different from DNA. What you are trying to do is no different from what the effect on DNA might be if you tried the same thing: you are changing an environmental condition (in this case heat) for a device that has been programmed to operate within certain environmental conditions. We know something bad will happen, but what? The result is a random (though predictable) effect that shows up as artifacts or malfunctions. It is predictable in that we know malfunctions will appear if too high a temperature is reached; it is unpredictable in that we don't know exactly where or how the reactions (or malfunctions, if you will) will manifest themselves.

Now if you apply this experiment to human DNA, you'd get the same basic results as you would with a computer. We know that if we expose a DNA strand to an excessive amount of heat, malfunctions will appear (if, of course, the strand was replicated and cloned). The result can be predicted on a very general level (the clone would be unable to function correctly), but exactly HOW the clone would fail to function would be a completely random effect. Take this rule and apply it to a computer. We know something bad will happen to the computer, but exactly what, and what would the reactions be during the experiment? We cannot get the same result 100% of the time.

Perhaps this is a bad analogy for what you're trying to do. I think a better example would be applying direct heat to the brain. We know that the brain will eventually fry, shut down, and possibly die. However, we cannot predict how exactly the person will react (emotions and behavior are, in my opinion, simply pre-programmed reactions that manifest themselves based on things like upbringing, experiences, age, sex, etc., and hence have a random outcome). Certainly some people would die sooner than others under the same controlled settings. Others may scream out and cry while some may not make any noise. It is thus a completely random effect with a predictable outcome: death. Just like your chip experiment.

I do believe that computers will one day become self-aware and have the same type of natural intelligence that humans possess. The idea that computers are just machines and cannot ever have a conscience or "soul" is ridiculous. ("Souls" are nothing more than a belief, in my opinion, so calling a thing "soulless" is ignorant anyway.) After all, what are humans? If you take "God" out of the equation, we are essentially no different from computers (on a very basic operating level, of course). Additionally, computers are absolutely no different from amoebas or protozoa or even viruses. Computers will one day no doubt be able to design organic existence for themselves (assuming, of course, this is an advantage to living organically). Since we're all made of the same basic elements, no one can say for sure which is supreme: bags of water and carbon that use chemicals to achieve intelligence, or metal and silicon that use electricity to achieve intelligence. Both have their advantages and disadvantages, most of which are obvious, so I won't go into it.

I think looking for intelligence in the form of malfunctions caused by heat is the wrong place to start. True computer intelligence will almost certainly be purposely invented by humans and will eventually evolve on its own. But do not let me discourage you. Continue on with your quest.

If you are speaking of something else entirely, please ignore this reply. :-)

-mpjesse
January 10, 2006 5:11:57 AM

For memory, do you just mean RAM, or do you mean memory of any type?

E.g. processors can store data in registers, L1 cache, L2 cache, L3 cache, RAM, etc.
January 10, 2006 5:39:48 AM

Quote:
Hey everyone. I don't know if anyone remembers the question I posted on here a few months ago about a theory that I have. Basically, it was that I believe computers can exist without memory.

I want some input as to what I should do with my testing of this theory. I fully expect that if I do get any responses, I will be called stupid, told 'go home kid, this is not a place for homework', idiotic, etc. But I hope that maybe at least one reader will take me seriously.

I am currently putting to the test my theory that computers can indeed function without memory. What I am doing is getting very old computers and purposely overheating them with a heat gun. Why am I doing this? Well, I've seen video controllers overheat and produce artifacts on the screen. These artifacts are what you would consider a computer 'malfunction'. I propose that a computer malfunction, aside from software bugs, could possibly be the computer operating 'on its own'. A malfunction would be something that is not doing what it was intended to do.

Since computer circuitry that gets too hot is permanently damaged and will die prematurely, the key is finding the right balance of heat that can produce these artifacts without damaging the circuitry. This is proving to be very difficult, and since I don't know of anyone who has done something like this, I don't know how to do it yet either.

I know that this all sounds crazy and a big waste of perfectly good computers, but I feel as though this could lead to something big. Please post some constructive criticism, and not just 'you are an idiot.' I get that enough each day.


I think a more basic question is in order here... how does overheating a CPU prove that a computer can operate without memory? The machine obviously had memory in it for you to turn it on and get it running in the first place, did it not?

Secondly... a computer without any memory would be useless. It would be unable to process information because it couldn't store it. It would also be unable to return the results of any computation it was somehow able to make, because that result would be instantly lost.

I think you also need to consider rewording your theory a little bit. You state "computers can exist without memory". Well, yes they can, but gasoline-powered cars could exist without a gas tank. Neither would be useful, however.

You also need to define the term 'memory'. At the most basic level, the CPU spends most of its time moving data to and from various storage locations: things such as hard disks, system RAM, CPU cache, registers, GPU RAM, etc. All of these could be interpreted as 'memory'. So what do you mean by memory? Maybe you don't even know yourself?
January 10, 2006 6:46:23 AM

A computer can definitely exist without memory, there is no need to test that... but you can't do anything useful with it.

The artifacts you're referring to are a result of the silicon in the chip reaching a temperature where the electrons are no longer confined to the circuit paths created for them but instead move about randomly in the chip. So while you get a result (a garbled image), it's not necessarily the result of the chip working.
January 10, 2006 6:47:41 AM

When I say memory, I mean ALL forms of memory: HDD, RAM, ROM, cache, registers, everything. A computer is comprised of input, processing, memory, and output. Take out memory from that list and you would get input, processing, and output. I'd venture to say that this would make a pretty powerful computer indeed!

Again, this is merely a THEORY, and I am not attempting to 'prove' that computers can exist without memory by applying heat.

Like mpjesse said, we all know what the end result will be if too much heat is applied: the computer will die. But what I'm trying to find out is what happens in the moments before the computer dies. At what temperature, or maybe under other conditions, can you 'observe' a computer on the verge of death?

The reason I am starting to have a strong desire to pursue this seemingly mad quest is that I don't know what happens either. Neither does anyone I've told my idea to. I've told my college professors about my idea, and even they do not know the answer.

I agree that yes, today's computers would be of no use if they couldn't store data in memory. I'm not thinking in terms of current computing, and one person told me that, possibly in the (most likely VERY distant) future, the paradigm of computing could drastically change into something chaotic and possibly indomitable.
January 10, 2006 7:48:51 AM

You need to get out a bit more, mate!

What are computers thinking in the nanoseconds before you fry them with a hair dryer? Oh dear......
January 10, 2006 8:41:40 AM

You sir (and I really mean this) are an idiot.
January 10, 2006 9:01:27 AM

Quote:
When I say memory, I mean ALL forms of memory: HDD, RAM, ROM, cache, registers, everything. A computer is comprised of input, processing, memory, and output. Take out memory from that list and you would get input, processing, and output. I'd venture to say that this would make a pretty powerful computer indeed!

Current processors need a program to tell the CPU what to do ---> we need ROM to store the program in. If we used a chip with the program just printed into it... wow, for every change we made to the program we would need a new chip. Very economical indeed :lol:  It's too costly and hard to maintain, and the CPU would come like the old game consoles with program cartridges. 8)
By the way, computers today are not input, processing, memory, and output, but input, processing, and output; the memory and the processor are the heart of the machine and count as one, and all the other things are peripherals.
And your theory is not hard to realize, by putting the memory in the input and output devices. But it is a costly solution and would depend mostly on interfaces.
Future computers will have their memory in the processor itself, and that is what scientists are working very hard on today.
January 10, 2006 9:32:19 AM

Quote:
A computer is comprised of input, processing, memory, and output. Take out memory from that list and you would get input, processing, and output.
Yes, but the input and output all reside in some form of storage. You're saying no form of memory, absolutely none. That means there's no physical way for any input to arrive at the processor anyway. You just end up with wires with a small current going through them - or lightbulbs, as they're otherwise known.

Direct input from a camera? Then the physical medium the camera is staring at becomes a form of storage. Same for a human inputting data - the human's brain becomes the storage device. Ad infinitum. YOU have to draw us a line somewhere, otherwise we're back to the lightbulb.

And as others have said, you NEED some form of memory to tell it what to do with the data. Add a small header to the data to describe what to do? Well, it'll have to have some amount of memory somewhere on the chip to tell it what that header means... so that isn't going to work.

You cannot have a computer, in anything remotely resembling the current sense, without some storage medium.

And stop Bogarting. :tongue:
January 10, 2006 12:02:02 PM

Can people live, walk and talk without a brain?
I think so, and see this every day.
Take the case of the chicken that, with a severed head, ran around for days and was fed by a tube.
Or a frog leg jolted with current jumps. Give a dead frog enough current in the right spots and you could have it appear normal.
But that is not the frog on its own, it is something else causing this, and something else with a memory...
Take away the memory and what do you have? I forget.
January 10, 2006 12:39:09 PM

Quote:
These artifacts are what you would consider a computer 'malfunction'. I propose that a computer malfunction, aside from software bugs, could possibly be the computer operating 'on its own'. A malfunction would be something that is not doing what it was intended to do.


This is your mistake: you probably watched the movie "Pi" too many times, and that is just science fiction, kid.

The artifacts are just leaked currents, bits that flow the wrong way in a chip working under stress conditions (maybe right before something melts in it). You don't want that to happen with your little Prescott core, although maybe it is happening :D  and you don't know it :!: These Intel guys are very deceiving :roll: get an AMD :) 

I'm going to propose something REAL to you for a change: there is a scientist out there who is building chips by randomly connecting millions of transistors. Most of the chips do not work, but some patterns were found that were impossible to think of! What do you think about that?

And by the way, forget about the missing memory; it must exist in one form or another, otherwise it is like talking about a storm on the moon!!!
See, there can't be wind without air (or some sort of fluid).

Another thing: I'm impressed by how many people responded to this post. Maybe everybody dreams of a super intelligent CPU that has self-knowledge and is built by mistake (and costs $50, right?)
January 10, 2006 1:11:54 PM

When a transistor flips from one state to another and keeps that state for as short or as long a time as needed, that is memory. So your quest for an apparatus without memory is a bit vague, to say the least.
Even a broken light switch has a memory.

Maybe you are trying to figure out how to make an Absolute Random Information Generator, or ARIG for short. The legends tell that the people of planet Mantsala in the eastern part of our galaxy once tried that and managed to make one ARIG. They had invested their whole annual planetary budget in the project and they were very keen to sell it to somebody. Well, it turned out that nobody wanted anything as useless as an ARIG.
So they ran out of Coors, the whole humanoid population of planet Mantsala died in a global hangover, and the pissants took over.
January 10, 2006 1:34:40 PM

First off, what drugs are you on? Your thinking about the overheating is just 'out there'.

As someone already pointed out, a register within the cpu can be used as a form of memory - only 'one value per register' - but hopefully we can all agree that a register is part of the 'core cpu' and not like cache or other 'memory'. In other words, a register is an integral part of the cpu - and it is the simplest form of memory (one value held in the cpu as long as the cpu has external electrical power).

A cpu can count and do lots of things with only registers... no external memory... so yes, a CPU can work fine without "memory".

It could even write to video cards, etc.

the real issue is that software pretty much is written to expect system memory (RAM)... and how to get the cpu to RUN your instructions... they tend to be stored in RAM.

But no reason you couldn't create a custom CPU and write microcode (on-cpu programming) that did something without RAM...
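
For example, here's a rough C sketch of the kind of register-only work I mean - just my own illustration, with an obvious catch: a compiler may well keep these locals entirely in registers (no RAM traffic for the data), but the instructions themselves still have to be fetched from somewhere, and a register is arguably memory anyway, as others keep pointing out.

Code:
#include <stdio.h>

/* A tiny loop whose working values can live in CPU registers the
   whole time - no external data memory needed for the computation
   itself (only for the code and the final printout). */
static unsigned gcd(unsigned a, unsigned b)
{
    while (b != 0) {
        unsigned t = a % b;   /* a, b, t are candidates for registers */
        a = b;
        b = t;
    }
    return a;
}

int main(void)
{
    printf("gcd(252, 105) = %u\n", gcd(252, 105));   /* prints 21 */
    return 0;
}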
January 10, 2006 1:42:08 PM

Take a few more chemistry and engineering courses. College professors are not always the smartest people on earth.... more than a few are college professors because they can't make it in the real-world market. You will apparently fit that category as well.
Hopefully the extra classes will help you understand how things work.
January 10, 2006 1:44:39 PM

Well, the only thing I can say about this thread is that I think it's a commendable pursuit to try to think outside the box. I personally don't think what you are doing will work, but sometimes the journey to enlightenment is cracking several eggs repeatedly to understand the concept of a big friggin' mess, or the beginning of an omelet. I have a feeling this is indeed a huge friggin' mess, as has been stated repeatedly by people who I believe have a much better handle on what is actually going on than I do. But sometimes you only learn from the process... keep cracking those eggs. Maybe someday you will end up with an omelet and not yolk all over your face, which is how this particular pursuit will probably end..... man, I'm getting hungry. Good luck! :wink:
January 10, 2006 1:59:00 PM

Okay, about the chicken: it still has nerves, it still has power, it just has no processor, i.e. a mobo can still run the BIOS if no CPU is present.

Now, those artifacts you are finding are just random crap, like stuff flying around in a vacuum. (Though there is nothing that is truly random.)

But what you are attempting to do is use a static object to find intelligence, and stasis precludes intelligence. In other words, if you find a computer that can fix itself, then you can do what you are doing, but those are still only in development, and even then, software-wise, they still use memory. ALL intelligence has memory, ALL life has memory (DNA and RNA), so whatever you do, you need memory. What you are doing, unless I am mistaken, will not work. You need memory. Sorry to burst your bubble. :( 
January 10, 2006 2:02:51 PM

OK, he said WITHOUT REGISTERS.

Well... then I suppose we probably have millions of such "computers" around us all the time.

You are taking away a lot of what I would consider part of a "general purpose" or "programmable computer". Without memory of some type to store the program to execute... then it becomes a set of "hard-wired circuits" that does only a set of pre-defined tasks.

Can some hardware person help point this guy to the name of such "basic computing", such as "discrete logic" or other "non-programmable" "COMPUTING"?
January 10, 2006 2:33:04 PM

Quote:
Without memory of some type to store the program to execute... then it becomes a set of "hard-wired circuits" that does only a set of pre-defined tasks.


But it won't even do those "pre-defined tasks" without memory of some sort. Those tasks have to be stored somewhere in order to be executed... that means memory.

A computer won't function without memory... period. Even the most basic of computers (calculators and such) have some sort of memory. Even if a computer were intelligent... it would still require memory in order for it to perform any sort of function whatsoever.

So, a computer may exist without memory... but then it's not really a computer, as it wouldn't be able to perform any of the tasks that define a computer.
January 10, 2006 2:35:09 PM

Your brain has cells that act as registers (actually, synapses kinda do this), so doing what he wants would slow the computer like you can't believe. And then you would still need something like a register on the HDD.
January 10, 2006 3:32:49 PM

Quote:
As someone already pointed out, a register within the cpu can be used as a form of memory
A form of memory? A register *is* memory!
Quote:
but hopefully we can all agree that a register is part of the 'core cpu' and not like cache or other 'memory'.
Now you better define your concept of "memory".
Register no memory. RAM yes memory.
Quote:
so yes, a CPU can work fine without "memory".
How?
Quote:
It could even write to video cards, etc.
How? What would write there if it could? A purple dot?
Quote:
the real issue is that software pretty much is written to expect system memory (RAM)... and how to get the cpu to RUN your instructions... they tend to be stored in RAM.
You don't understand much about computers or any other kind of logical device, do you?
January 10, 2006 3:44:50 PM

What if memory could be made from a calculation, or the answer to a calculation were repeated over and over again while another processor reads the calculation... This means you would need a heck of a lot of processors to 'make' memory where there is none; thus the system is able to 'function' without memory even though you have 'memory'.

Ara
January 10, 2006 3:52:01 PM

The computer still needs instructions to perform the calculation... without memory, there is nowhere to store the instructions; therefore the calculation could not be performed.

Since a computer can be defined as a machine that performs calculations according to a set of instructions... you could NOT have a computer without memory. If you did, it wouldn't be a computer.
January 10, 2006 3:58:31 PM

Quote:
Pentium 4 630 3Ghz - 64 Bit Windows
:roll:
January 10, 2006 4:17:11 PM

Quote:
What if memory could be made from a calculation, or the answer to a calculation were repeated over and over again while another processor reads the calculation... This means you would need a heck of a lot of processors to 'make' memory where there is none; thus the system is able to 'function' without memory even though you have 'memory'.

Ara


WTF -- This qualifies under the category of "not even wrong".

Again, in order to do a calculation you need to have memory.
January 10, 2006 4:18:12 PM

Problem: Processors have memory, technically your FSB has memory (1=on, or a 0=off). Memory is the only way that anything on a computer works. The overheated VIDEO MEMORY is what causes artifacts, not usually an overheated GPU, so your entire theory is flawed. Sorry.
Anonymous
January 10, 2006 4:39:47 PM

Quote:
Can some hardware person help point this guy to the name of such "basic computing", such as "discrete logic" or other "non-programmable" "COMPUTING"?


Unfortunately I am not the foremost expert in digital design. However, I am familiar with a certain class of hardware which fits the input -> process -> output model: finite state machines. Finite state machines are the building blocks of digital controllers, and digital controllers are used extensively in modern processor designs. Guess what: the building blocks of finite state machines are flip-flops, the exact structure which is the basis of all computer memory. So... this was no loophole either. Memory is necessary for doing computation.

Oh, and BTW, any experienced technician with the proper equipment could eventually figure out the failure mechanisms for a given IC and predict the outcome of overheating. Of course, it would require figuring out the manufacturing idiosyncrasies of that particular chip, or at least their electrical byproducts. It's not magic being witnessed during this experiment but the expected failure, which shows up as seemingly random due to the imperfect manufacturing processes of silicon electronics.
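
If it helps, here's a bare-bones C sketch of the point about state machines - my own toy example, not from any textbook: a finite state machine that flags a rising edge in a bit stream. The single 'state' variable is playing the role of the flip-flop; take it away and the machine can only see the current input, so it can never recognize the 0 -> 1 transition.

Code:
#include <stdio.h>

/* Minimal FSM: detect a 0 followed by a 1 in a bit stream.
   The 'cur' variable is the one bit of memory (the flip-flop). */
enum state { SAW_ZERO, SAW_ONE };

int main(void)
{
    int input[] = { 0, 0, 1, 1, 0, 1 };
    enum state cur = SAW_ZERO;                 /* the flip-flop          */

    for (int i = 0; i < 6; i++) {
        if (input[i] == 1 && cur == SAW_ZERO)
            printf("rising edge at position %d\n", i);
        cur = input[i] ? SAW_ONE : SAW_ZERO;   /* flip-flop update       */
    }
    return 0;
}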
January 10, 2006 4:51:54 PM

A computer cannot function without memory, just as you cannot function without a brain!!! PERIOD!!!

THERE IS NO GHOST IN THE MACHINE... there is only programming (you can program ghosts though :-)

Ohhhh...... and before a computer dies... it gives up one :wink:

I really suggest that you drop this insanity.... otherwise it looks like you have too much time on your hands..... you need to get out more.
January 10, 2006 6:17:20 PM

Quote:
You are taking away a lot of what I would consider part of a "general purpose" or "programmable computer". Without memory of some type to store the program to execute... then it becomes a set of "hard-wired circuits" that does only a set of pre-defined tasks.


RoundSparrow seems to have hit the nail on the head with this one, oolceeoo. I know of many computers without 'memory': they are called light switches.

I still don't really see how heating up a computer to the point that it is destroyed relates to its use of memory. Obviously I'm completely missing the point of this. However, I am inclined, at this point, to agree wholeheartedly with Ned Flanders.
January 10, 2006 6:19:41 PM

Quote:
Take away the memory and what do you have? I forget.


LOL.

-mpjesse
January 10, 2006 6:25:31 PM

Now, you will agree silicon is a man-made material, where sand is superheated and then refined.
Humans are composed of organic material far more complex than simple silicon.
What happens when you overheat a CPU is this: CPUs run at a certain clock speed, meaning there is a certain amount of time it takes for the CPU to do a calculation. When heat is applied, this calculation takes more time to complete because the resistance in the CPU increases, causing two calculations to get mixed together. Hence artifacts, or even random outcomes. That's not to say anomalies do not happen at normal temperatures, because they do. To find out why these happen, one would need to analyze the millions of transistors that are in a microprocessor, which is not feasible. To do that you would need to repeat the same calculation over and over, come up with different results, then analyze which transistors behaved differently and why.
What happens when a microprocessor reaches what we will call the dying point is that the silicon transistors fuse together, creating a solid connection. This point of fusion depends on the material used, the quality of the material used, and the process by which it was manufactured.
January 10, 2006 6:31:28 PM

Try to ignore those who are calling you an idiot. They have no imagination.

After thinking on this topic further, I felt it necessary to elaborate further on your experiment.

Indeed, if you could create a computer with no memory you would have an extremely powerful system. Everything would run in real time. Memory creates latency. Latency = waiting. Thus your theory is indeed an interesting one. The problem with creating a computer with absolutely no memory is this: every single component would absolutely have to be in complete sync. Even a nanosecond of de-sync would cause the computer to crash. Memory exists so that components can operate out of sync - a necessary evil.

Furthermore, if you were to create a computer without memory, it would be impossible for the computer to evolve and become self-aware/intelligent. Our memory defines who we are. If we could not remember anything, what good would we be? We would not create, learn, or even survive. We'd all forget (or not know) that things like fire are dangerous.

Memory is probably the single most important capability that we humans have. It would be logical to assume the same of computers. Thus, while a computer without memory may be a powerful one, it's probably an impractical one. Its only use would be doing things like calculating pi, which would still require memory in some form or fashion.

-mpjesse
January 10, 2006 6:40:05 PM

You were right on the subject!!! FLAMER!! :tongue:
January 10, 2006 6:47:45 PM

First off, you're an idiot...

Quote:
input, processing, and output


How would the "processing" work without memory?

OK, let's put it this way: I wouldn't be able to type this msg without memory, 'cause each character is being stored... If I were to print this page it would go to a buffer (memory) and be sent to my printer, which has another buffer of memory... See the problem?
January 10, 2006 6:49:50 PM

If it looks like memory, tastes like memory or if it smells like memory, it IS memory...

The same cannot be said for chicken.
January 10, 2006 7:04:00 PM

Quote:
If it looks like memory, tastes like memory or if it smells like memory, it IS memory...

The same cannot be said for chicken.


So that's how you make sure it's compatible memory, I did wonder. :roll:
January 10, 2006 7:12:44 PM

At least I tried to help; you arrived late. And on my old Matsonic motherboard you could set the RAM frequency.
January 10, 2006 9:28:08 PM

OK, apparently my previous comment was removed. I apologize for calling anybody an idiot. I do, however, have quite a good imagination.

The original hypothesis states that "computers can run without memory".

Our distinguished and honorable oolceeoo is asking the community how he could test this hypothesis. So far the method he has chosen to test this hypothesis is to run old computers in a hot environment to the point of thermal breakdown. While I could certainly imagine a table full of steaming hot 386's speaking to each other in their best Turing meta-language... somehow pragmatism gets the best of me.

A computer without memory is like a square circle. It makes no sense, even conceptually. Think of the theoretical Turing machine. It still needs tape, doesn't it? Take the tape away and you don't have a computing machine anymore.
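
To illustrate (a toy sketch of my own in C, not a formal definition): even a trivial one-state Turing machine needs its tape. Delete the 'tape' array below and there is nothing left to read, write, or halt on.

Code:
#include <stdio.h>

/* A one-state Turing machine that walks right, inverting bits
   until it reaches the blank cell.  The tape *is* the memory. */
int main(void)
{
    char tape[] = "10110 ";          /* ' ' marks the blank cell       */
    int head = 0;

    while (tape[head] != ' ') {
        tape[head] = (tape[head] == '0') ? '1' : '0';   /* write       */
        head++;                                         /* move right  */
    }
    printf("tape after halting: %s\n", tape);
    return 0;
}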
January 10, 2006 10:10:50 PM

I do get out often enough. Please don't take any jabs at my character, keep it to my idea. I appreciate all those who offer constructive criticism to my theory. I'll read through each of your posts again since I didn't expect so many replies.

I can only defend a new idea so far, and as of right now it is nothing more than that. I'm using my imagination, something that I'm finding few people still use today.

No one I meet in real life is going to care about this idea except for my close friends or professors. It is extremely difficult to continue when everyone either doesn't understand, doesn't care, or just thinks it's crazy. But I believe in it, and this is just the beginning.
January 10, 2006 10:28:45 PM

If you do not already, smoke some reefer... you will be flooded with many such ideas... and if you do it in the company of some friends, you will find the majority thinking along with you at how deep and introspective you are. :?
January 10, 2006 10:33:21 PM

Quote:
But I hope that maybe at least one reader will take me seriously.


Hi, everyone!
(Just a remark: I'm new at this forum so I haven't read that much... which - partially - explains my ecstatic further statement: 'It almost cut off my breath!')
I find it one of the best threads I've been through so far, because it defies common-sense logic.
I'm not an expert in computing, although a very interested... non-expert (...)
Actually, the basic statement "computers can [exist], [work], [function] without memory" can be proven both ways, 'right' & 'wrong' simultaneously, without stepping into metaphysics. It's the modus operandi which is - definitely - wrong, according to the second law of thermodynamics (see 'entropy'). [Reductio ad absurdum: take the device's temperature down close to absolute zero and infer what happens then.] Perhaps surprisingly, it's the same 2nd law taken into the quantum realm that does not forbid both solutions, 'right' & 'wrong'.
It all comes down to electrons: taken as a current (a voltage differential at a transistor's gate), basically only their charge, their energy, and the path width on whatever medium matter. It doesn't take into account (yet!) their spins, for instance (see 'spintronics'). For a given amount of time, charge (current) can be stored on whatever medium, allowing output coherency with the voltage input submitted. Then you can say you have 'memory': a coherent, expected result. But you know nothing about each of the electrons in the process (it gets tougher with photons...).
Now (it's already happening...), suppose the transistor's gap width narrows in such a way that only one electron suffices to change the transistor's state. According to the uncertainty principle of quantum mechanics, even if you're able to store a single electron (and you are), you cannot rely on the output result, since it will not be coherent with the given input. The end result might be utterly unexpected, to say the least (of course, there's quantum computing, which has limitations too). Although everything is there, physically, can you now say that you have 'memory'?
After all, even randomness must be stored, somewhere & somewhen. But can we call it 'memory'?

I'm not proposing to give a quantum physics course here; I'm not a physicist, anyway.

[Here are some references, if you're interested: the Einstein-Podolsky-Rosen paper (the EPR paper); Mach's principle [Ernst Mach]; John S. Bell's inequalities; Alain Aspect's experiment (1978); and, of course, lots of quantum thermodynamics!]

As for a device working without memory, I don't know what the future will bring (I do have some thoughts, though...); as for your modus operandi, I think you should stick just with the concept and try to get some expert technical advice.

Hope this [rather long but not ended!] dissertation helped in any way.
January 10, 2006 11:12:04 PM

I'm afraid all the artifacts you mention amount to is the famous room with thousands of monkeys at typewriters - sooner or later one will turn out the complete works of William Shakespeare. I.e., sooner or later those graphics artifacts will, for a split second, resemble your current desktop, the Mona Lisa, or something meaningful. (Ignore that the paper the monkeys type on can be considered memory.)

As somebody mentioned earlier, I also appreciate people thinking "outside the box." People called some of our best-known inventors idiots (probably worse), but they persevered. While this particular implementation might be waaay beyond left field, keep on playing outside of the box with different ideas. Learn from the ones that don't work & press on.

As an aside, I've considered that the opposite of heat - cold, i.e. superconductivity (and scientists are coming up with new materials that superconduct at warmer and warmer temperatures all the time) - holds many answers to bringing computers to the next level, whatever that will be. :D 
January 10, 2006 11:31:20 PM

Quote:
When I say memory, I mean ALL forms of memory: HDD, RAM, ROM, cache, registers, everything. A computer is comprised of input, processing, memory, and output. Take out memory from that list and you would get input, processing, and output. I'd venture to say that this would make a pretty powerful computer indeed!


Dude, someone beat you to this idea LONG ago - they called it your *BASIC* electric typewriter (not the ones that had memory and stored docs and spell check).

And before that they even invented one that eliminated the *processing* - they called it (The Manual Typewriter).

THINK "real" hard about this for a little while, please.
January 10, 2006 11:48:58 PM

This is NOT to say you are stupid - but my thinking is maybe you are barking up the wrong tree

compared to what technology could really be.... We have enough dreamers out there (watch sci-fi television for a while) - a little Star Trek or somethin'.

A computer is a device - when it comes to the process of building a computer, it's designed for something very specific, regardless of the technology required to make it. I know everything is really small and it's cool to think about real AI - real anomalies being something significant -

but it's as simple as this - what you are doing is short-circuiting a very simple machine. CPUs aren't anything at all - they respond the way we tell them to. Even a Blue Screen of Death is something we are telling it to do - unintentional, yes, but anyway.

Shorting out a battery is a malfunction, but it doesn't mean anything besides a dead battery or a fire (in this case)....

The current technology isn't as advanced as your thinking.
January 11, 2006 12:06:23 AM

All of these responses give me a lot to contemplate. I think that I'm being misunderstood somewhat, or I'm not wording what I'm trying to do right. It's sort of like the man, whose name escapes me, who proposed that parallel lines could indeed intersect and invented a new geometry that led to the development of the atomic bomb.

I know there's no little magic man inside computers that is going to pop out. I'm going to try and see if I can explain this better after some thought.
January 11, 2006 1:08:01 AM

Ya know, this really sorta does have my interest piqued.
I've had to reread everything you've said so far several, several, several times to get my brain to latch onto your basic idea here (your fault :p  ).

I don't think removal of all memory is the key.

Since all computers have memory built into the CPU, the test you are trying to do isn't an exact science.

Removal of the BIOS might be an idea though, since that's the basis of all computers - then at least the test will be untainted by the motherboard telling the computer what to do. This way nothing will be recognized or operable - then apply variable heat/cold/warm to see what type of output you get - in your case the temperature being the input. Process is seriously missing from this equation, because with technology being what it is, this will be very hard to reproduce and get any kind of standard output.

But it could lead to something a lot more - maybe you understand my vagueness (is this a word?).
January 11, 2006 1:28:30 AM

Quote:
A human example of what you're talking about. Consider what kind of 'computing' you would be able to do if you couldn't remember more than 1 second of your life at a time. Perhaps you could do simple math, but you'd have no idea that the reason you were doing it was to file your tax return, finish a test, etc. You couldn't do it.

Take it one step further. You're computing the sum:
43 + 89. First of all, where do you store the numbers 43 and 89? Where do you store the fact that you want to add them? When you add 9 + 3 to get 11, where do you store the 1 in the ones place, and the 1 in the tens place?

Basically, I'm afraid you couldn't get very far like this. (In fact, strictly speaking, you couldn't get anywhere, per the example above. Processors themselves must store the instruction to be executed somewhere in order to process it.) Both computers and humans require context for almost all non-trivial applications. If your word processor has no memory, where is your term paper? If you don't have any memory, how could you even spell a word, let alone form a coherent thought?

In the end, lack of memory may work if the entire problem can be solved in a single step, but that is about it.

Sorry.


OMFG!!! Hey MacCleod... I can't believe someone else hasn't jumped all over this already.... I think you have to pull the heater away from your own memory module. I think your math processor is a little overheated. 9+3= ... Ummm, let me take my socks off.... well, it ain't 11!!!

:twisted:
January 11, 2006 5:20:28 AM

Being an electrical engineer, I would have to say that this is without a doubt the dumbest idea I've ever heard... are you serious?

This idea should go in the trash can next to some really bad Robin Williams movie... like Bicentennial Man.

I don't understand why people think computers can be brought to life...
My best guess is that it comes from their lack of understanding of the general principles on which computers work and, most of all, loneliness.

A computer is a machine, kinda like a car.
If you overheat the engine, the car will start burning oil and malfunctioning; nothing good will ever come of it. It will certainly not start thinking on its own.
It only does what we engineers have designed it to do... in the case of a processor there's a whole lot of logic gates that process commands that have a certain desirable outcome. It is all programmed.

I'm sure that with enough complicated circuits you could mimic human thought, but this wouldn't be a randomly created machine... it would take years if not decades to make something like this. It would also require far more advanced technology than we have today.

You could say that a million monkeys typing on a typewriter could write Shakespeare... I don't think that's gonna happen any time soon. Same with your idea.

Even if we could bring a computer to life... you should ask yourself why.
What good would this do?

Just the thought of a computer impersonating human emotions or thinking on its own makes me cringe... probably because I've watched way too many Terminator movies.

But seriously,
what would be the benefits of having a machine that acts like a human?
Humans make mistakes, machines don't; how do you account for that?
Would you want a machine that makes mistakes?
After all, that's the reason we made calculators... to not make mistakes and to do it faster.

Another thing to think about...
If you make a machine better than a human... wouldn't the theory of evolution state that the human would be weeded out? :?

This is why I really hate people who try to make computers think for themselves. In a sense you're trying to eliminate humanity, but you don't know it.

And if you were in a certain movie... I'd hunt you down with a double-barrel shotgun on a Harley w/ cool sunglasses. 8)