The Boston Diaries

The ongoing saga of a programmer who doesn't live in Boston, nor does he even like Boston, but yet named his weblog/journal “The Boston Diaries.”

Go figure.

Thursday, August 14, 2008

Mass media, digital art, and talent

Since Bunny had seen neither “Willy Wonka and the Chocolate Factory” nor “Charlie and the Chocolate Factory,” I told her she should get both so we could watch them back to back and compare. That was last night.

We watched them in release order, “Willy Wonka &c” followed immediately by “Charlie &c.” It was during “Charlie &c” that we ended up having an extended discussion about artistic media and whether talent plays a role anymore. It came about because I found myself very annoyed with the CGI effects in “Charlie &c”—I found them far too noticeable, and that detracted from my enjoyment (especially during the opening credits).

I was distracted because it fell deep into the Uncanny Valley, which is odd, because no humans were actually rendered (at least during the opening credits—everything rendered was certainly well within a computer's bailiwick), but it just seemed that Burton & Co. didn't bother spending the time or the money on these shots. And it's not like it can't be done (“Jurassic Park,” for instance—the dinosaurs were incredibly well done; so was the CGI in “The Phantom Menace,” although the story and dialog left a lot to be desired). Perhaps I would have accepted it better had the opening credits been more cartoony.

Bunny felt that modern technology (read: the digital computer) has cheapened artistic endeavors to the point where talent is no longer really needed. Heck, music producers can pitch-correct singers, so even a talent for singing is no longer needed. And obviously, computers have advanced to the point where amateur filmmakers can do special effects on par with the pros, so there's nothing special there.

But I countered that talent still does matter. Even though Britney Spears is pitch-corrected, lip-syncs during her concerts, and is more a product than a person, talent still matters. In her case, a music producer saw that she was comfortable in front of a camera on “The Mickey Mouse Club,” could hit cues and follow choreographed dance moves, and maybe even had a passable voice (which really didn't matter that much—pitch correction and all that).

“But don't people get upset that the music is identical to the album?” asked Bunny.

“I've met people who get upset if the music doesn't match the album version,” I replied. Yes, such people exist (and to an extent, I'm one of them, but I rarely, if ever, attend concerts). But you go to a Britney Spears concert not exclusively for the music, but for a show.

And it's not like bands haven't been “created” before. 'N Sync. Backstreet Boys. New Kids on the Block. Menudo. The Monkees.

Years ago, I was hanging out with a friend who had set aside a portion of his basement as a small recording studio, albeit with consumer-grade equipment. One of the devices he had was a small box with a few controls that let you pick not only the tempo, but the style of drum beats and riffs it would play. And yes, the music that came out of the device sounded much like your run-of-the-mill techno-type music. But you could change the tempo and style as it played, and it would slowly shift to the new settings over half a minute or so.

My friend, as I explained to Bunny, had no music background (that I knew of), and yet here he was, creating a type of music he enjoyed. Still, she found the whole idea distasteful. A “dumbing down,” as she called it. “Where does talent fit in?” she asked. “Who needs talent anymore?”

I replied that talent still exists, but that those with the talent might not be known to the population at large. I bet not many people have heard of Buddy Rich, but those who have know the man had talent. And those who know are in the industry (the music industry, in this case). “Have you ever worked on a spreadsheet?” I asked Bunny.

“Yes,” she said.

“Then you've programmed a computer.”

“But it's nothing compared to what you can do.”

“That doesn't matter. You programmed a computer.” And it's true, despite what computer programmers might think—she was able to instruct the computer through a series of calculations to derive a result. Program. QED.

Spreadsheets have allowed people who would otherwise consider themselves “not a programmer” to program a computer. A simple language like PHP can enable someone to jazz up a website.

“But don't you hate PHP?” asked Bunny.

“Yes, I can't stand the language,” I said. “And the thought that I might have to maintain a program written by Joe Sixpack scares me to death, but still, PHP allows Joe Sixpack to program. However badly.” And it's not like those who are talented get lost in the crowd. Certainly, Richard Stallman, Donald Knuth, Guido van Rossum, Michael Abrash and John Carmack aren't household names (and Bunny had never heard of them), but within the Computer Industry, they're extremely well known, and known to have a lot of programming talent (now, whether you agree with or respect them is another matter).
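
To give a sense of what I mean (a made-up snippet, not anything a real Joe Sixpack actually wrote), something like the following is enough to make a static page feel “dynamic”:

    <?php
    // Greet the visitor based on the server's time of day.
    $hour = (int)date("G");            // hour of the day, 0 through 23
    if ($hour < 12) {
        $greeting = "Good morning";
    } elseif ($hour < 18) {
        $greeting = "Good afternoon";
    } else {
        $greeting = "Good evening";
    }

    echo "<p>$greeting, and welcome to my page!</p>\n";
    echo "<p>Today is " . date("l, F j, Y") . ".</p>\n";
    ?>

Rename an .html file to .php, paste that in, and the page changes with the clock. Is it good code? No. Is it programming? Absolutely.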

Get into any field, and soon enough, you'll learn who has real talent and who doesn't. I pulled out a random Uncle Scrooge comic book, opened it to a random page, and handed it to Bunny. I then spent a few minutes searching through the pile of comics for another Uncle Scrooge comic and handed that one to her, opened to a random page. “Now, of those two, which is better—don't read the words, just go by the artwork.”

She looked at the two comics for a minute or two. “I'm drawn,” she said, pun unintended, “to the first one.”

“Exactly,” I said. “That one was drawn by Carl Barks, the Uncle Scrooge artist at Disney. That one,” I said, pointing to the second comic, “was drawn by some two-bit hack.” And the reason it took me several minutes to find that one, the bad example, is that even as a 9-year-old kid reading comics, I came to “know” that some Uncle Scrooge comics were just inherently better than others, though at the time I couldn't say why. Now I can say why—Carl Barks. But that doesn't fully explain it. It's not as if Carl Barks' backgrounds were more realistic. Heck, I can't even say his Uncle Scrooge was more realistic, since no comic version of a duck looks remotely like a real duck.

It's hard to pin down why, but Bunny agreed—Carl Barks' art was just “better” than the other artist's (and I have no idea who that other artist was, for Disney never allowed their artists to sign their work, which makes it even more amazing that my 9-year-old self could recognize the work of a single unnamed artist). And even my crack about the other artist being a “two-bit hack” is a bit disingenuous—his artwork at least passed muster with the editors at Disney and got published, so he obviously had some “talent.”

Pitch changers, computer graphics, digital photography, MIDI, samples: it's all just new media. That's it. It's nothing to be afraid of, and in my opinion it certainly isn't “dumbing down” talent. Bunny slowly came to a similar conclusion—that all this new media is allowing more people to express themselves, however badly it might be done. “And while PHP might be a bad language, it at least lets them express themselves in code. And could it lead to better languages?” she asked.

“Yes, if the person takes the time to really learn, or finds herself hitting limitations in PHP, she can certainly find other, more expressive languages to use. At the very least, she will eventually learn to recognize real talent in whatever medium she might use, be it music, paint, programming, or film.”

“So,” said Bunny, pointing to Johnny Depp's deeply creepy Willy Wonka on the TV, “I should approach this film on its own merits and appreciate it for what it is.”

“Oh no,” I said, pointing to the TV, “that movie's crap.”

“Oh, thank God!” said Bunny. In the end, neither of us cared for Tim Burton's take on “Charlie and the Chocolate Factory.” Johnny Depp's Willy Wonka was more effeminately creepy than Gene Wilder's acerbic eccentric, even if Burton's version was closer to the book than the 1971 film.

“Personally, I wouldn't touch PHP with a ten-foot pole,” I said. “But just because I don't like it doesn't mean it's bad per se. It's just another means of expression. But remember Sturgeon's Law: ‘90% of everything is crap.’”

“It's just how you use it.”

“Yup.”
