SoDNighthawk

Splendid
Nov 6, 2003
3,542
0
22,780
Heating your test computers up simply causes different AND gates and NAND gates (the small micro-switches in the computer chips on the circuit cards) to stick open or closed, preventing current from cycling through the overall parallel circuits. Data therefore halts and no longer responds to instructions sent by the CPU or other control chips, which are also failing due to mechanical failure.

200 cars are stopped on either side of a train bridge that has been lifted over the river so a ship can pass under it. None of the cars can cross the river because the gate is stuck open, and none of the data or people in the cars can reach their destination.

Your artefacts are stalled bits of data that cannot move electrically. Our eyes pick up images not unlike the way a computer creates its images with AND gates and NAND gates, or series of zeros and ones.

You're looking for a computer that can run without memory, and this cannot ever occur because of simple human and machine logic. However, as we should understand, a CPU gives instructions based on what it is asked to do by code input into it by a human or by automated software. We have all heard of cache memory in CPUs, and a few years ago we called this hit-or-miss cache because it did in fact sometimes miss when fetching data for the CPU.

How it works is like this: the CPU will call to the memory for information stored logically as 1s and 0s, and it will require it very quickly in order to perform the functions it is being asked to perform. So what engineers decided to do in order to improve CPU performance was to design an onboard cache for the CPU that performs an amazing sort of false intelligence, of the kind humans use every day of their lives.

The cache sits, schematically and electrically, between the hardware computer memory and the physical CPU. The cache's job is to anticipate, or GUESS, what data the CPU will next ask the memory for. It goes to the memory and requests that data ahead of time, waits for the CPU to bark an order or instruction set, and then dumps its cached contents directly to the CPU, saving the CPU from having to go and search the memory for the data it needed. It does this based on previous system calls the CPU has made at various clock cycles: from the sequence of events it decides what data the CPU might call for next and obtains it from the memory ahead of time. All of this happens faster than you can think, of course.
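The guess-ahead behaviour described above can be sketched as a toy simulation. This is a hypothetical simplification: real caches work on cache lines, sets, and associativity, and real prefetchers use much smarter heuristics than "grab the next address".

```python
# Toy model of a sequential prefetcher sitting between the CPU and memory.
# Hypothetical simplification: real hardware prefetches whole cache lines.

class PrefetchCache:
    def __init__(self, memory):
        self.memory = memory   # backing store: address -> value (the "RAM")
        self.cache = {}        # small, fast store in front of it
        self.hits = 0
        self.misses = 0

    def read(self, addr):
        if addr in self.cache:
            self.hits += 1            # the guess paid off: no trip to memory
            value = self.cache[addr]
        else:
            self.misses += 1          # wrong guess (or cold start): go to memory
            value = self.memory[addr]
        # GUESS that the CPU will ask for the next address, and fetch it early.
        nxt = addr + 1
        if nxt in self.memory:
            self.cache[nxt] = self.memory[nxt]
        return value

memory = {a: a * 2 for a in range(8)}
c = PrefetchCache(memory)
values = [c.read(a) for a in range(8)]   # sequential access: prefetch pays off
print(values, c.hits, c.misses)          # only the first access misses
```

With a sequential access pattern the guess is right every time after the first read; with a random access pattern the same cache would miss constantly, which is exactly the "hit or miss" behaviour mentioned above.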

This lowly servant, or workhorse, called the cache is bound directly by what the CPU is doing and by what the RAM can provide, so thinking that it uses a form of artificial intelligence is not correct. It is not thinking on its own; it is reacting to instructions it receives from other sources, and sometimes even then it gets the wrong data.

If you could, say, provide a robot that had one million cache diodes within CPUs (therefore one million CPUs, each with one or perhaps two caches, say a level 1 and a level 2 cache), you might get a functioning computer that could start to understand how to program itself at a very basic level. However, you would need to develop a software environment for that hardware to function in, and then provide enough memory space for the caches and CPUs to function, say around a million teraquads of hardware memory. But then you run into the most basic flaw of any mechanically based system.

Hardware AND-gate and NAND-gate failure on a massive level, preventing the system from functioning.

As an example, the starship Voyager in the TV series uses bio-metallic gel or something like that to function as part of the ship's working computer. This lets large amounts of data be stored and recalled without any moving mechanical parts, simply by electrical frequency or by the light frequency we use now in fibre optics, which can carry 2,400 phone calls and data streams back and forth on one strand of a 25,000-strand cable, sitting next to 25 more 25,000-strand fibre-optic cables.
 

RichPLS

Champion
I know it was your idea, but is it morally right to reveal your ideas if they are not yours, just because the light from the energy burst hit your retina before it actually happened???
**grabs cig, even tho they all taste stale and harsh as the smoke wafts across my larynx, down the trachea, and into my lungs**
 

Rabidpeanut

Distinguished
Dec 14, 2005
922
0
18,980
One of you nicotine lovers just gave me a nasty headache; now I have to walk around in bright areas wearing 8). Unfortunately, 8( is not an emoticon on this forum.
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
Since I'm bored, here's a summary for you Ned:

Heating up 'pooter = BAD;
Memory=Necessary;
Artifacts=Data That has nowhere to go;
SoD=Greatness :trophy:

[The Rest]=Waffle/Filler - par for the course.

:wink:
 

oolceeoo

Distinguished
Jan 25, 2004
57
0
18,630
I understand what you are saying about cache.

In today's computers, more cache=faster processing.

Cache is static RAM. It is more expensive, due to its need for more transistors per memory cell (flip-flops), but much faster than dynamic RAM because the cells do not need their charge refreshed in a capacitor. CPUs need fast RAM for the reasons you stated above.
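The flip-flop vs. capacitor contrast can be illustrated with a toy model. This is purely hypothetical pedagogy: the classes, the `lifetime` counter, and the tick/refresh mechanics are invented stand-ins for the real physics of leakage and refresh cycles.

```python
# Toy contrast between an SRAM-style cell and a DRAM-style cell.
# Hypothetical model: 'lifetime' stands in for capacitor charge leakage.

class SramCell:
    """A flip-flop holds its bit as long as power is on; no upkeep needed."""
    def __init__(self, bit):
        self.bit = bit

    def read(self):
        return self.bit                      # stable, always readable

class DramCell:
    """A capacitor leaks charge, so the bit must be refreshed periodically."""
    def __init__(self, bit, lifetime=3):
        self.bit = bit
        self.charge = lifetime               # ticks until the charge leaks away

    def tick(self):
        self.charge -= 1                     # time passing drains the capacitor

    def refresh(self):
        self.charge = 3                      # rewriting the cell restores charge

    def read(self):
        return self.bit if self.charge > 0 else None   # leaked = bit is lost

s, d = SramCell(1), DramCell(1)
for _ in range(5):
    d.tick()                                 # no refresh: the DRAM bit decays
print(s.read(), d.read())                    # 1 None

d2 = DramCell(1)
for _ in range(5):
    d2.tick()
    d2.refresh()                             # refresh circuitry keeps it alive
print(d2.read())                             # 1
```

The refresh work is the overhead that makes DRAM slower per access, while SRAM's extra transistors are the cost that makes it more expensive per bit.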

I understand what memory IS in a modern computer. I don't even like science fiction. I've never seen the movie Pi, and I've never seen Star Wars or Star Trek. The only things I like that you might consider science fiction are books on Albert Einstein deemed 'fiction'.

Much of your explanation tends to get entirely too complex, and the more complex you make an explanation, the less you truly understand the concept. All of the replies, save a few, pretty much said the same thing. Different words, some more or less complex, but the same thing: computers cannot exist without memory.

I have openly stated that I do not fully understand my concept, so how can one be criticized for something he doesn't fully understand, and believe that the people reading his idea understand it? Isn't most new technology or science barely understood when it first emerges?

As to how this relates to artifacts, I do not understand. But I am curious to see what artifacts look like on different computers. Are they even different on each computer? Are they permanent? Do different colors, shapes, lines, etc. appear? These are questions I want the answers to, and unless I see it I won't know for certain.
 
As to how this relates to artifacts, I do not understand. But I am curious to see what artifacts look like on different computers. Are they even different on each computer? Are they permanent? Do different colors, shapes, lines, etc. appear? These are questions I want the answers to, and unless I see it I won't know for certain.

It's random. Of course the artifacts aren't going to be exactly the same on every computer. Whether they are permanent or not depends on the damage you've done to the memory / GPU. If the damage is permanent, the artifacts will quite obviously be permanent. Now, the same artifacts may not appear all the time with permanent damage, but you will be guaranteed to get artifacts.

Much of your explanation tends to get entirely too complex, and the more complex you make an explanation, the less you truly understand the concept. All of the replies, save a few, pretty much said the same thing. Different words, some more or less complex, but the same thing: computers cannot exist without memory.

I have openly stated that I do not fully understand my concept, so how can one be criticized for something he doesn't fully understand, and believe that the people reading his idea understand it? Isn't most new technology or science barely understood when it first emerges?

Well, you see, it's quite easy for us to understand. Because you don't seem to understand, people try to beat it into your thick skull by trying to explain the same basic point many different ways:

Computers as we define them simply cannot exist without memory.

If you carefully analyze what a computer is and what it does, you'll see that your "concept" is physically impossible. In order to process instructions, computers must have a place to store the instructions for processing and a place to store the results after processing. This means... you guessed it... memory.
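The store-instructions, store-results argument can be made concrete with a minimal fetch-decode-execute sketch. The three-opcode instruction set here (`LOAD`/`ADD`/`STORE`) is entirely hypothetical, invented only to show that both the program and its results have to live in memory.

```python
# Minimal fetch-decode-execute loop over a hypothetical toy instruction set.
# Both the program and the data are memory; remove either and nothing runs.

def run(program, data):
    pc = 0                        # program counter: an address into memory
    acc = 0                       # accumulator register
    while pc < len(program):
        op, arg = program[pc]     # FETCH the instruction from memory
        if op == "LOAD":          # DECODE and EXECUTE it
            acc = data[arg]
        elif op == "ADD":
            acc += data[arg]
        elif op == "STORE":
            data[arg] = acc       # results must be stored back to memory
        pc += 1
    return data

data = {"x": 2, "y": 3, "out": 0}
program = [("LOAD", "x"), ("ADD", "y"), ("STORE", "out")]
print(run(program, data)["out"])  # 5
```

Strip out `program` and there is nothing to fetch; strip out `data` and there is nowhere to read operands from or write results to. That is the whole point being made above.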

I understand what memory IS in a modern computer. I don't even like science fiction. I've never seen the movie Pi, and I've never seen Star Wars or Star Trek. The only things I like that you might consider science fiction are books on Albert Einstein deemed 'fiction'.

Don't like science fiction? What kind of geek are you? :p

Even in sci-fi... we'll use Star Trek: TNG here... Lt. Commander Data. He is an android... completely artificially intelligent and recognized as a sentient being. Now, even he requires memory in order to function. He is the most advanced computer ever conceived up to that point... and yet he still needs memory.

Do you see it yet?
 

SoDNighthawk

Splendid
Nov 6, 2003
3,542
0
22,780
joset, that site you sent us to is an information-gathering site, not an explanation site.

It's probably linked directly to worldwide terrorist websites, and they gather or pick unsuspecting people's brains about various topics and coalesce all that information into a bomb someplace once a month.

That's how I feel even if I can't prove it. :p
 

oolceeoo

Distinguished
Jan 25, 2004
57
0
18,630
So artifacts are random? I get two different points of view on this. Some say artifacts are random, and some say that they are not. Which one is it?
 
If the damage is permanent, then the artifacts could be a constant... but that would depend on the data. If the data is constant, so are the artifacts... but if you have random data... you get random artifacts.
 

oolceeoo

Distinguished
Jan 25, 2004
57
0
18,630
I thought that computers couldn't produce truly random data. Seemingly 'random' numbers are based upon the time on the internal clock. So again I ask: are artifacts, regardless of what program or data is being processed, random or not?
 

Schmide

Distinguished
Aug 2, 2001
1,442
0
19,280
Wait, I found a terrorist link. It seems this poster bombed New York in 1968. It was purported to be none other than the owner of that site.
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
So again I ask: are artifacts, regardless of what program or data is being processed, random or not?
Like anything involving modern computers, no. They are as random as the data used to generate them.

A particular computer will produce the same stuff under the same conditions, including artifacts. Given exactly the same data, at the same clock speeds and the same temperatures, it will always produce the same result (not allowing for the degradation of components, of course). The 'randomness' all comes from differences in these few things. No two cards will necessarily be the same, of course, but it's still pseudo-random at best.
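The same-conditions-same-output point is easy to demonstrate with a software pseudo-random generator: feed it the same seed (the "same conditions") and the sequence repeats exactly.

```python
# Pseudo-randomness in practice: identical starting conditions (the seed)
# always yield an identical sequence; only a changed condition changes it.
import random

def sequence(seed, n=5):
    rng = random.Random(seed)           # isolated generator with a fixed seed
    return [rng.randint(0, 99) for _ in range(n)]

a = sequence(42)
b = sequence(42)                        # same seed: same "random" numbers
c = sequence(43)                        # a slightly different starting condition
print(a == b)                           # True
print(c)                                # a different sequence
```

Hardware artifacts behave analogously: the "seed" is the combination of input data, clock speed, and temperature, which is why no run is truly random, just hard to reproduce exactly.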
 

joset

Distinguished
Dec 18, 2005
890
0
18,980
The Edge Annual Question — 2006

WHAT IS YOUR DANGEROUS IDEA?

The history of science is replete with discoveries that were considered socially, morally, or emotionally dangerous in their time; the Copernican and Darwinian revolutions are the most obvious. What is your dangerous idea? An idea you think about (not necessarily one you originated) that is dangerous not because it is assumed to be false, but because it might be true?

This was my intent when I forwarded the link. As you can see, neither Copernicus nor Darwin were terrorists, in the literal sense of the word. At best, they were considered dangerous because their IDEAS were revolutionary. And yes, if you take the time to go through some of the scientific articles/interviews, they're more than just... informative. It'll be up to you to dare search further...
As for terrorism, well, most of the authors (the late Brockman included) are American; so, are you implying anything? Hmm? Here in Europe, terrorism has been part of our daily lives for ages. I can even affirm - without much error - that, POTENTIALLY, each of us is a terrorist (just think about it, if you wish). But that would lead us to another matter, wouldn't it?
My main aim was to give some perspective to Mr. "oolceeoo" regarding "dangerous ideas", and not to establish who was/is/will be a terrorist.

Cheers!
 
The data fed into a computer is always random... it's what's done with that data that's constant. For example, playing a 3D game, the scenes are going to be rendered differently each time, because you are not moving in exactly the same way each time. Now, 3DMark might produce some constant results... but once an error occurs, the end result may be completely random.
 

joset

Distinguished
Dec 18, 2005
890
0
18,980
The data fed into a computer is always random... it's what's done with that data that's constant.


Hello you all, once again!

(It's a little fresh in this Northern Hemisphere, is it not?)

Time to heat up (no pun intended).

No, "data fed into a computer" is not "always random". That's why you need a compiler first: to give some order to... data (among other compiler-role features). And no, it's not "what's done with (...) data that's constant." From the "back end" to the "front end" (or vice versa...) of a CPU/GPU, you have 'linear' operations (arithmetic), 'non-linear' operations (floating point), vectorial ones... and different approaches to the execution of that data (out-of-order, predication, and a lot of others...).

Anyway, I 'see' your point; but that's not THE point here.
Random behaviours are actually studied in several branches of physics, known as "non-linear chaotic phenomena" and such. Ever heard of "Mandelbrot sets", for example? Well, there you have it!


Cheers!
 

joset

Distinguished
Dec 18, 2005
890
0
18,980
Better.


But only for the understanding of what you were trying to say in your previous post. Wrong again, because different, varied [data] input is not the same as [data] randomness. And [data] processing AND execution (two different issues: see the "back end" & "front end" CPU stage units) are also different and 'varied', if you like.
As far as my understanding goes, you merely stated that a (conventional) computing device can be fed with different [data] input (that's true AND a "constant"!), that processing the [data] input is a "constant", and that the [data] output varies as a function of the input... another "constancy"!

Don't take me the wrong way: all I'm trying to clear up here is that random phenomena work more like this: you give the input and you can no longer predict (with certainty) what's going to happen next (the 'processing' part), let alone the output! It's as if the 'process' keeps feeding itself, out of control and with unpredictable results.

In this branch of research, it's very common to cite a famous cliché, known as the Butterfly Effect: "a butterfly flapping its wings in Peking can trigger an eventual storm in New York"...
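The butterfly-effect sensitivity can be demonstrated in a few lines with the logistic map, a textbook example of deterministic chaos (iterating x → r·x·(1−x) with r = 4). The starting value and perturbation size here are arbitrary choices for illustration.

```python
# Sensitive dependence on initial conditions via the logistic map (r = 4).
# Two starting points differing by one part in ten billion end up far apart.

def trajectory(x, steps, r=4.0):
    out = []
    for _ in range(steps):
        x = r * x * (1 - x)     # deterministic rule, no randomness anywhere
        out.append(x)
    return out

a = trajectory(0.2, 60)
b = trajectory(0.2 + 1e-10, 60)  # the "butterfly flap" of a difference
max_gap = max(abs(p - q) for p, q in zip(a, b))
print(max_gap > 0.1)             # True: the tiny difference grows exponentially
```

Note that each run is perfectly reproducible (same input, same trajectory); the unpredictability joset describes comes from never knowing the input exactly, which is precisely what separates chaos from true randomness.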

There's a good (non technical) book on the subject: "Chaos", by James Gleick (1987).

Cheers, again!