Intel museum spreads geek joy

SANTA CLARA - If ever there was a mecca for geeks, surely the Intel museum would be it.
Now, to answer the first question - yes, there actually is an Intel museum. And to answer the second question - yes, its primary purpose is to document the history of the microprocessors that drive the world's computers.
But the museum is far more interesting than it sounds. It doesn't appeal to just the uber-nerds, but to visitors keen on learning the history of personal computers, and the subsequent, continuing technological revolution they spawned.
Situated on the ground floor of Intel's spacious Santa Clara headquarters, about an hour south of San Francisco, the museum sees its fair share of geek pilgrimages. About 80,000 people visit each year, Intel officials say. The museum is particularly popular with Japanese and Korean tour groups, as well as local students.
Entrance is free. Visitors can wander round and look at the various exhibits by themselves, or they can arrange for a guide or take an audio tour on a handheld PDA; the tour is available in eight languages.
Larry, our tour group's guide, starts off by explaining that he works "upstairs" in a technical capacity. He, like all the guides, shows people round the museum on a purely voluntary basis.
Larry's father also worked for Intel, he explains, so the company has a special place in his heart. Apparently, he just plain likes talking about Intel.
We start off with a brief history lesson in front of a glass case displaying the company's early days. Intel was founded in 1968 by chemist Gordon Moore and physicist Robert Noyce, a co-inventor of the integrated circuit.
The original one-page business plan, as typed by Noyce himself, suggests that people outside California might someday be interested in the company's products. It's comedic, in retrospect.
Interestingly enough, these products started out as random access memory (RAM), used in devices ranging from calculators to traffic lights to the original, ridiculously large cellphones, which are also on display. It wasn't until 1971 that the company made its first microprocessor, the 4004, and the rest, as they say, is history.
Larry introduces us to the first PC - a clunky-looking IBM box with a tiny monitor sitting atop it.
"It had a colour monitor, provided your favourite colour happened to be green," Larry jokes.
Launched in 1981, the IBM PC had two 5.25-inch floppy drives. Larry explains that in the thinking of the time, nobody could see why you would want to leave data sitting on one machine rather than carry it about on floppies - hence the absence of an internal hard drive.
I can't help but shudder at the irony, given the huge move to mobility that we are currently experiencing - we've obviously come full circle and are back to the thinking of 25 years ago.
Larry then drops a choice morsel of trivia, which I immediately file in my brain for later use in impressing people at cocktail parties. IBM engineers eventually saw the value of an internal drive, so they removed one of the floppies - the "B" drive - and replaced it with an internal hard disk. This is why most computers since have had an "A" drive and a "C" drive - the hard disk - but no "B".
Hey, if you like useless trivia, you'll love the Intel museum.
We're whizzed through the rest of the company's history, which from 1981 onward was mostly the continuing invention of faster processors - the application of co-founder Gordon Moore's "Moore's Law", which holds that the number of transistors on a chip doubles roughly every 18 months.
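For a sense of what that doubling actually means, here is a rough, back-of-the-envelope projection - my own illustration, not Intel's figures - starting from the 4004's roughly 2,300 transistors in 1971:

```python
# Rough illustration of Moore's Law as quoted above: transistor counts
# doubling every 18 months, starting from the Intel 4004's roughly
# 2,300 transistors in 1971. A ballpark sketch, not official Intel data.

def transistors(year, base_year=1971, base_count=2300, months_per_doubling=18):
    """Projected transistor count for a given year under the doubling rule."""
    doublings = (year - base_year) * 12 / months_per_doubling
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001):
    print(f"{year}: ~{transistors(year):,.0f} transistors")
```

Run it and the numbers climb from a few thousand to the hundreds of millions within three decades - which is the whole story of the museum's second half in four lines of arithmetic.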
Larry then brings us to the explanation of how chips are made, a topic he takes to with some zeal.
Silicon, he explains, is one of the most plentiful elements on Earth, but it must be refined to an extraordinary purity before it is any use in a chip. The purified silicon is melted and grown into large cylindrical ingots, which are then sliced super-thin to form round wafers.
The microprocessor circuits are then etched onto the wafers with ultraviolet light in a process called photolithography, not unlike developing a photograph. Hundreds of processors are put onto each wafer; they are later individually cut out with a diamond saw and packaged.
I can't help but wonder why, since the finished chips are square, they are put onto round silicon wafers. Isn't that a lot of waste, I ask, with a whole bunch of incomplete chips round the edges?
Larry explains: A square wafer is more likely to be damaged on its edges during the manufacturing and shipping process, whereas a round shape is much stronger. Some processors round the edges are naturally wasted because of the shape, but it's much safer and cheaper to make the wafers round.
More useless trivia, but I find it fascinating.
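For anyone who wants to put rough numbers on that edge waste, a quick back-of-the-envelope sketch - my own, not Intel's yield formula, and with purely illustrative wafer and die sizes - counts how many square dies fit wholly inside a round wafer:

```python
import math

def whole_dies(wafer_diameter_mm=300.0, die_side_mm=10.0):
    """Count square dies lying entirely inside a round wafer.

    Dies overlapping the circular edge are the waste Larry describes.
    Sizes are illustrative guesses, not real Intel dimensions.
    """
    radius = wafer_diameter_mm / 2
    steps = int(wafer_diameter_mm // die_side_mm)
    count = 0
    for i in range(steps):
        for j in range(steps):
            x0 = -radius + i * die_side_mm  # lower-left corner of this die
            y0 = -radius + j * die_side_mm
            corners = [(x0, y0),
                       (x0 + die_side_mm, y0),
                       (x0, y0 + die_side_mm),
                       (x0 + die_side_mm, y0 + die_side_mm)]
            # Keep the die only if all four corners fall inside the circle.
            if all(math.hypot(x, y) <= radius for x, y in corners):
                count += 1
    return count

dies = whole_dies()
ideal = math.pi * 150 ** 2 / 10 ** 2  # raw wafer area divided by die area
print(f"{dies} whole dies; roughly {ideal - dies:.0f} dies' worth lost at the edge")
```

The loss works out to a modest fraction of the wafer - far cheaper, as Larry says, than trying to ship fragile square slabs intact.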
With the brief tour over, Larry leaves and we are free to wander and explore the museum's numerous interactive exhibits.
A large crowd has gathered round a giant digital board that demonstrates how a microprocessor works in its simplest form.
Visitors push a few giant buttons nearby to work out the equation 2+3, and the board flashes a series of lights that replicate the flow of electrical information through a microprocessor.
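What those lights dramatise is binary addition through logic gates. For the curious, here is a minimal sketch of the same idea in code - a textbook ripple-carry adder built from full adders, not the museum's actual wiring - working out 2 + 3:

```python
# A minimal sketch of what the exhibit's light board shows: adding 2 + 3
# with nothing but logic gates, here a 3-bit ripple-carry adder.

def full_adder(a, b, carry_in):
    """One full adder: two input bits plus a carry in -> sum bit, carry out."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_bits(x, y, width=3):
    carry = 0
    result = 0
    for i in range(width):  # ripple the carry from least to most significant bit
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result

print(add_bits(2, 3))  # prints 5, just like the board's flashing lights
```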
I spend a good deal of time playing with the "What's a semiconductor" exhibit. Here, you take two metal electric-charge-emitting poking devices and touch various substances - wood, plastic, aluminium, silicon - to see how conductive they are, as displayed by a meter.
Silicon, apparently, sits in the sweet spot between conductor and insulator - an ideal semiconductor, which makes it perfect for use in microscopic electronics, such as computer processors.
The whole thing makes me feel like I'm back in school on a science trip. In fact, there are many schoolchildren here this day doing just that, and they are also enthralled by the simple semiconductor exhibit.
Obviously, they are the geeks of tomorrow.
* Peter Nowak visited the Intel Developer Forum in San Francisco as a guest of the company.