"Sure, it's a little like having bees live inside your head—but there they are." Firesign Theater: I Think We're All Bozos On This Bus
I try to live at the intersection of Technology and Art (but somebody stole the darn street signs). I will strive to share the unusual—OK—weird, and give a tall guy's perspective on what passes for reality at this nexus of the plexus—this major hub of the multiverse.
Monday, August 22, 2005
Cue Theremin and synthesizer dirge...
Robert Moog has passed away after battling a brain tumor for several months. He is generally considered the inventor of the analog synthesizer, heard in the late sixties on everything from Wendy Carlos's Switched-On Bach to the Beatles' Maxwell's Silver Hammer.
Though the analog synthesizer (complex, hard to tune and keep that way, limited sound palette) has mostly given way to the digital synthesizer (simpler, computer-controlled through MIDI, easy to change the sound by changing samples), there is still a demand for that "old school" analog sound as found on albums like Dark Side of the Moon by Pink Floyd—though the synthesizer used there was the British-made Putney VCS-3, which has a sound all its own. Check out the Moog Cookbook CDs (an acquired taste, but great stuff) for some entertaining aural landscapes.
It should also be mentioned that the synthesizer did not spring full-blown from Mr. Moog's mind like Athena from the head of Zeus; its antecedents go back at least as far as the electronic musical instrument known as the Theremin (as used on the Beach Boys' Good Vibrations—check out this excellent documentary). In fact, Moog's current company still makes them.
Another major influence on the Moog synthesizer was the electronic work of his one-time mentor, Raymond Scott (born Harry Warnow), a composer still practically unknown to the general public even though his music was licensed and adapted by Carl Stalling in countless Warner Bros. cartoons; his most famous composition, Powerhouse (MP3 sample clip), appears in more than 35 of them. Scott built synthesizers and sequencers that were primarily used for sound effects in commercials (MP3 sample clip), though he designed other instruments as well.
Another interesting bit of Raymond Scott trivia: Johnny Williams, his Quintette drummer—the drummer on the Powerhouse sample linked above—was the father of John Williams, the Star Wars composer. You can definitely hear Scott's influence, especially in tunes like the Cantina Band sequence from the first Star Wars film.
When I first saw this image over a friend's fireplace about 35 years ago, I was fascinated. In spite of its rather florid graphic style and overt sentimentality (maybe that's part of its appeal?), the play of light, the androgyny of the figures, and the overall composition intrigued me. It took quite a bit of research to find out more about the artist, Maxfield Parrish.
Nowadays, we have the Internet—and research like this is much easier. For example, it was not until researching this post that I discovered the print I saw and linked to at the beginning is actually supposed to look more like this.
Parrish's teacher was Howard Pyle, whose work was clearly influenced by the Pre-Raphaelite Brotherhood—though he may well have protested otherwise, I'm sure—and who in turn inspired a number of other illustrators and painters.
What does this have to do with technology? What got me musing was the process of rotogravure, which allowed newspapers to print their Sunday supplements in color starting around World War I and brought color illustrations to the masses on a weekly basis, often in an advertising context. Magazine and advertising illustrations brought the work of Parrish and N.C. Wyeth to a mass audience in new ways.
While I may be of a technical bent (yes, I'm bent—but not twisted, I swear!) this is not primarily a tech blog (that would be this one). Yes, I'm an electronic technician, computer teacher, and technical writer—but I'm usually much more interested in how technology impacts people. It's also fun to speculate about the future—for example, what will the long-term effects of buckyballs and nanotubes be on human culture?
I dropped out of engineering school in the late '60s because I felt the prevailing attitude was too much "people for the machines" and not "machines for the people"; if there was a poor fit, you bent the people to fit the machine, not the other way around. Part of my initial interest in computers was that I viewed them as a near-infinitely-malleable machine—potentially programmable to the Nth degree.
The present problem is that we have what has mostly become a computer monoculture of IBM PC-compatible machines running Microsoft Windows, to the point that most people wind up viewing the failings and flaws of Windows as problems with computers and the Internet in general—when this is (usually) not the case. Viruses, spyware, adware, and (to a lesser extent) worms are almost entirely a consequence of using the Windows operating system and not an inevitable part of the computer experience. People are being bent (and twisted) to the Windows Way of doing things and do not have as much control over their PCs as they should. Digital rights management (DRM) is a looming problem, as well. Once again, control is slipping out of the hands of the average user. How can we take it back?
Admittedly, a learning curve with computers is not entirely avoidable. Using a computer will never be like driving a car—it's more akin to learning a language. The English language is massively complex, yet we still manage to communicate effectively every day, for the most part. :-) Still, you don't normally have to refer to a dictionary or encyclopedia every thirty seconds to speak!
It shouldn't have to be this hard. To get around some of these problems, I mostly use Linux. I don't necessarily think that everyone should use Linux—yet—though for many folks it can be a useful (and free) alternative to Microsoft Windows 10, with its baked-in privacy issues. To my mind, the Apple Macintosh is not the ultimate solution either—though based on an open-source BSD variant, it's much too proprietary.
Frankly, I don't know what the ultimate solution to these problems will be—but I promise to share any useful gleanings I find along the way. Not all problems are solvable, but there may be ways of avoiding them or making them easier to live with.
In some ways, I have been pretty fortunate to be in the right places at the right times when it came to the development of personal computing.
It all started around March of 1974. At the time, I had been working as an electronic technician (TV/audio repair, sound systems) for about ten years. My friend Roger Gregory (scroll down to chapter six) showed me a book: Computer Lib/Dream Machines by Ted Nelson. Knowing Roger as the genius he is--combined with my knowledge that Intel had recently introduced the first primitive microprocessors--I realized that the first personal computers were probably less than 2 years away.
My knowledge of computers of any kind was pretty minimal at the time, but it was exciting--though I didn't really have a way to learn about them then, as I was only minimally employed (a recurring problem I have never fully resolved) and didn't have much of a research budget. I did read whatever I could lay my hands on, however; when the Altair 8800 computer kit was announced in the January 1975 issue of Popular Electronics magazine (but not shipped in any quantity until after the second article the following December), I knew, to paraphrase the immortal words of Sir Arthur Conan Doyle, that "the game was afoot."
I will be writing more on this later as a series of articles--including virtually-unknown information on the development of such well-known computers as the IBM PC and the Macintosh, as well as lesser-known computers such as the Amiga and Atari ST (some of the information in this article is not totally accurate)--which I will link to from here. Some of it will be moderately technical, but I will do my best to not leave useful nuggets in a dry heap. I intend to give the information a meaningful context so you can have some idea of "how we got here" in personal computing.
I will leave you with one teaser, though: the original Macintosh was never intended to be a computer--and leave it at that, for now.
Tradeoffs abound when making electronics for the mass market. These days, most of the effort goes to cost reduction. As soon as an electronic product is "good enough," most of the remaining effort goes into squeezing out the last nickel (or less). This trend homogenizes the products in the marketplace, especially since many of the products (and the parts that go into them) come out of the same Chinese contract-manufacturing factories. An early example of electronic cost reduction is the story of Earl "Madman" Muntz and his television sets (the article includes a discussion of the "science" of cost reduction).
Combined with the trends toward increased manufacturing capacity, integrated microprocessors, and containerized shipping and handling, this has led to some spectacular increases in the value-for-dollar of electronic equipment over time. This trend is covered in a more general sense in the article Time Well Spent--but I'm aiming for the relatively specific here.
To give you a handle on this trend: in 1952, a TV set cost between about $200 and $300, which got you a 20" diagonal black-and-white screen (what we would now call 19"). Adjusted for inflation, that's roughly $2,200 in today's money--which today would buy you a 40" color HDTV receiver with full stereo sound. Furthermore, the 1952 model used vacuum tubes--which were inefficient, wore out, and often had to be replaced, an expensive proposition.
Perhaps an even greater example of this trend is the VCR. In 1978, a big, bulky (so it could be assembled on a primitive robotic assembly line) Panasonic VHS VCR cost about $1,000--about $2,800 in today's dollars. I just bought a decent Funai VCR for $42, about 1.5% of that inflation-adjusted price--but it's also built much more cheaply. Now you see why I'm not a TV repairman anymore...
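For the curious, here is the back-of-the-envelope arithmetic in a few lines of Python--just a sketch, with consumer price index values I'm assuming for illustration (and a made-up in_todays_dollars helper), so the results will shift a bit depending on which year's index you plug in:

    # Back-of-the-envelope inflation adjustment using the CPI-ratio method.
    # The index values below are assumed approximations for illustration only;
    # substitute real figures from the Bureau of Labor Statistics to redo it.
    CPI = {
        1952: 26.5,       # assumed approximate annual average
        1978: 65.2,       # assumed approximate annual average
        "today": 195.0,   # assumed index for "today"; update as needed
    }

    def in_todays_dollars(price, year):
        # Scale a historical price into today's dollars by the ratio of the indexes.
        return price * CPI["today"] / CPI[year]

    tv_1952 = in_todays_dollars(250, 1952)     # midpoint of the $200-$300 range
    vcr_1978 = in_todays_dollars(1000, 1978)
    print(f"1952 TV set: about ${tv_1952:,.0f} in today's money")
    print(f"1978 VCR:    about ${vcr_1978:,.0f} in today's money")
    print(f"A $42 VCR is {42 / vcr_1978:.1%} of that adjusted price")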
The point of this post (you were hoping there was one, right?) is that it is now possible to put value back into some cheap electronics after the fact. For those of you who know how to wield a soldering iron, swapping out the cheap parts used in some equipment for better-quality components can bring big benefits. The article Tweaks for Geeks gives an excellent example: taking a $150 DVD player, adding $100 in parts, and getting the equivalent performance of a $1500+ player. There may be a use for my electronic skills yet!
My friend Rick Lieder just sent me a link to a website of his I had not seen before, Bug Dreams. I have always enjoyed his art, including his book covers and magazine illustrations. His photography and paintings are perhaps a bit off-beat--but certainly deep, evocative, and often highly-charged with emotion; the bug photography is surprisingly colorful, involving, and suffused with a sense of light and heightened awareness. A bug may not have much of a brain, but it's certainly "plugged into" its environment in ways our more-highly-evolved human consciousness has left behind on its way to the top of the evolutionary ladder--and you can feel it here.