Tuesday 7 October 2014

IT'S ALL ABOUT PC 2

1.2 PERSONAL COMPUTER HISTORY
In order to properly understand and appreciate the progress we have made and to
anticipate the continued evolution of the industry, let's look at the progress of the
computer. What precisely is a computer? Machines that helped people do computation
have been around for almost 150 years (Brigham Young invented a device to calculate the
number of miles a wagon traveled by counting the number of wheel rotations). There
have been all types of machines built to compute or measure various things (there’s even
one that will compute a logarithm).
Most of these machines are “analog,” or value-based, so they can represent any value between zero and one as easily as any value between zero and a million. An example of an analog device is the odometer on your car (granted, modern odometers may not be truly analog, but the concept still holds). Whether you move the car one inch or one thousand miles makes little difference: your car still records the distance, thus further depreciating its value.
There was another type of machine that used a magnet-powered switch: when the electromagnet was turned on, the switch closed (this kind of switch is a “relay”). The telegraph used crude relays. The advantage of using switches (either “on” or “off,” called “digital”) is that the results are always predictable: the value is always zero or one. Analog devices, by contrast, always have to be tuned (just try putting a different-sized tire on your car). The problem with relays was that the power required and the delay experienced were too great to make them into a computational device. Early computers went a different route by using electron (or vacuum) tubes.
Vacuum tubes had been used as power amplifiers, but they could also be used as switches, and they functioned many times faster than relays. The idea was pretty simple: the tube had three plates. The first plate was the power source, the second was the destination, and the third was the “switch.” Electrons would gather at the source but could not reach the destination unless power was applied to the “switch” plate. Think of it like scuffing your feet on the floor to generate static electricity and then getting close to something (or someone) you want to zap, only they’re still too far away; you need something to close the gap. That’s roughly what the “switch” plate does.
Memory and calculations were held and carried out by turning thousands of these switches on and off.
However, vacuum tubes still required rooms full of equipment and tremendous amounts of power. The proverbial add/subtract/multiply/divide calculator on your wrist once took up an entire building floor and consumed well over a hundred kilowatts.
In the mid-1950s, a special little switch, invented at Bell Labs in 1947, began reshaping our history: the transistor. The power (no pun, honestly!) of this little thing was its size (less than 0.1 inch, compared to 3-4 inches for a vacuum tube) and its power consumption (much less than a watt, compared to 5-10 watts).
Modern computers are composed of millions and millions of these transistor switches. Like the vacuum tubes, the transistors are arranged in arrays to accomplish what we ask of them. Your computer’s memory alone has millions of transistors: one megabyte holds one million bytes, or eight million bits. That’s more than 8,000,000 transistors!
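To see how that arithmetic works out, here is a quick back-of-the-envelope sketch in C. It assumes, as the text does, one transistor per bit; real memory cells use anywhere from one to six transistors per bit, so treat the result as a lower bound.

    #include <stdio.h>

    int main(void) {
        /* Figures from the text: 1 megabyte = 1,000,000 bytes,
           8 bits per byte, and (assumed) one transistor per bit. */
        long megabytes   = 1;
        long bytes       = megabytes * 1000000L;
        long bits        = bytes * 8;
        long transistors = bits;   /* one transistor per bit (assumption) */

        printf("%ld MB = %ld bytes = %ld bits -> at least %ld transistors\n",
               megabytes, bytes, bits, transistors);
        return 0;
    }

Running it prints the same figure the text arrives at: 8,000,000 transistors for a single megabyte.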
Personal computers have been around since the mid-1970s. The companies involved included Apple, Commodore, Atari, and Sinclair, to name a few. The first chips for these computers had only 10-50 thousand transistors. At the time, personal computers were not taken seriously and were infrequently found in the workplace.
The first attempt to make a business-directed personal computer came from Apple with the Lisa™. It was a failure, mostly due to the $7,000 price tag. The next was the Macintosh™, which was better received. The growth of personal computers did not really take off until IBM entered the market. From IBM’s open architecture, hundreds, even thousands, of computer companies sprang up. All the while, technology advanced at a tremendous rate, and, due to advances in chip manufacturing, prices plummeted.
1.2.1 MICROPROCESSOR REVOLUTION/EVOLUTION
The microprocessor, from the onset of personal computers, has been the driving force behind hardware and software technology. There are a couple of competing claims to the first microprocessor, but the concept of placing all the computing power on a single chip was literally revolutionary. In fact, many of the first microprocessors (as old as 25 years!) are still being manufactured and used as simple control units in various appliances and machines. As the processor became more powerful, the supporting hardware and software became more powerful and complex. Simply put, a processor takes commands from memory and does things with them. Think of it like your math teacher telling you each step to solve a problem. These steps are repeated over and over; the computer does not learn, but rather has to follow each command issued to it, the same as the day before. Here we need to clarify a few things: processor families and clock speeds.
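To make that fetch-and-execute cycle concrete, here is a toy interpreter in C. The “instruction set” (LOAD, ADD, PRINT, HALT) and the six-word program are invented for this sketch; no real processor works exactly this way, but the loop (fetch a command, do something with it, repeat) is the essential idea.

    #include <stdio.h>

    /* Invented opcodes for a toy processor; not a real instruction set. */
    enum { LOAD, ADD, PRINT, HALT };

    int main(void) {
        /* "Memory" holds the commands (and their operands) in order. */
        int memory[] = { LOAD, 5, ADD, 3, PRINT, HALT };
        int pc  = 0;   /* program counter: which command comes next */
        int acc = 0;   /* accumulator: the processor's working value */

        for (;;) {                      /* over and over, never learning */
            int command = memory[pc++]; /* fetch the next command */
            switch (command) {          /* ...and carry it out */
                case LOAD:  acc  = memory[pc++]; break;
                case ADD:   acc += memory[pc++]; break;
                case PRINT: printf("%d\n", acc); break;
                case HALT:  return 0;
            }
        }
    }

Run it and it prints 8: it loaded 5, added 3, and printed the result, step by step, exactly as told.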
1.2.1.1 Processor Families
Think of a blender in the kitchen. The original model had a dial to select the speed: either blend or liquefy. Then a new model of the same blender came out with buttons. Now the modern model has “flash” or “chop” modes that run only while you hold the button. These extra capabilities, which may not seem all that significant, are called “features.” The sequence of blenders descending from the original design is called a “product family.” Microprocessors have families as well; in fact, they are far more crucial to the computer industry than blender families are to theirs. Because software was written for the old processor, the industry does not want to rewrite it for the new one, so each new family member is designed to run its predecessors’ software. This is called “backwards compatibility.”
1.2.1.2 Clock Speeds
The clock is like the rhythm of a song: each word is sung on a beat, and the faster the beat, the faster we can complete the song. Generally, the processor obeys (“executes”) each command in a certain number of clock beats (called “clock ticks”). The faster the clock, the faster things get done.
“Wait a minute, why is the next family member faster than the first at the same clock rate?” you might hear (e.g., the Pentium/66 is twice as fast as a 486/66). It is the aim of each processor generation to do things faster and better than the generation before. The 80286 processor required about 50 clock ticks to complete a multiplication; the next generation (the 80386) took only about 10 clock ticks! Again, the more you can do within a clock tick, and the faster the clock, the more that can be done in less time.
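As a rough sketch of that arithmetic, the time one command takes is its clock ticks divided by the clock rate. The C snippet below uses the multiply figures quoted above and, purely for illustration, assumes both chips run at 16MHz (real 80286 and 80386 parts shipped at various clock rates).

    #include <stdio.h>

    int main(void) {
        /* Ticks per multiply, as quoted in the text. */
        double ticks_286 = 50.0;
        double ticks_386 = 10.0;
        double clock_hz  = 16e6;   /* assumed: both chips at 16 MHz */

        /* time per command = clock ticks / clock rate */
        double t_286 = ticks_286 / clock_hz;
        double t_386 = ticks_386 / clock_hz;

        printf("80286 multiply: %.3f microseconds\n", t_286 * 1e6);
        printf("80386 multiply: %.3f microseconds\n", t_386 * 1e6);
        printf("At the same clock, the 80386 multiplies %.0fx faster.\n",
               ticks_286 / ticks_386);
        return 0;
    }

At the same clock rate, the 80386 multiplies five times faster (0.625 vs. 3.125 microseconds), which is exactly the “more per tick” advantage described above.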
You may think that naming processors with numbers is rather sterile and boring. Well, it is. In fact, until recently, processors and chips were given numbers for names (e.g., Z80, 6502, 68000, etc.). Now we have the Pentium and the Pentium Pro (this last one was “quite original”). Most of the time the numbers follow a sequence: the bigger the number within a family, the faster/better/more complex the chip. Then, as time passed, individual processors could run at varying speeds (the 80386 entered the market running at a dazzling 16MHz; about six months later Intel introduced the 20 and 25MHz versions, 25% and 56% faster, respectively).
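Those percentages are just clock ratios; a quick check in C confirms them.

    #include <stdio.h>

    int main(void) {
        double base = 16.0;                /* the 80386's debut clock, MHz */
        double faster[] = { 20.0, 25.0 };  /* the later speed grades */

        for (int i = 0; i < 2; i++)
            printf("%.0fMHz is %.0f%% faster than %.0fMHz\n",
                   faster[i], (faster[i] / base - 1.0) * 100.0, base);
        return 0;
    }

It prints 25% and 56% (56.25%, rounded), matching the figures above.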
1.2.1.3 Faster is Better
In 1985, the primary competing processors were the 8088/8086, the 68000, and the Z80 (these had only 20-50 thousand transistors). These processors were considered “state of the art.” However, considering the sheer processing speed we have now, these chips were plain dog slow. For example, the Pentium/100 is easily 200x faster than the first IBM PC. That means the Pentium processor can run the same software as the first IBM PC but blazingly faster. The latest processor (the Pentium Pro) has 5.5 million transistors for the CPU alone.
Lastly, processor prices have continued to drop. To fully appreciate this, consider: when the Pentium/60 (the 60MHz version of the Pentium) was introduced, it retailed for about $800 per chip. Now (if you can find one), they sell for less than $50 per chip. So getting the “latest and greatest” does not always make sense. Sometimes (no, often) one- to two-year-old technology is the best priced, the most reasonable, and all that we really need [this is my opinion, of course].
