And I don't mean your sci-fi nerd hat, although that will inadvertently be our topic for today. I mean to get your computer-geek nerd hat (or shirt) on. I've mentioned before that I'm really not a super extreme sci-fi guy. Have I seen all the Star Trek movies? Sure, who hasn't. And I recently got into Firefly and Farscape because their complete series were on Netflix, as was the remade Battlestar Galactica. I can even tell an X-Wing fighter from a TIE fighter. But if I were to go down this list I'd have twenty nope's for every one yup. I especially can't stand spinoff shows of spinoff shows: Star Trek, then Star Trek Next Generation, then Star Trek DS9, then Star Trek Voyager, then Star Trek Enterprise. To me, it's beating a dead horse.
So like I said, given that Firefly and Farscape are in the "Recently Watched" section of my Netflix account, it really didn't come as any surprise that Babylon 5 showed up in the "Recommendations for Ernie" section. Last night, eh, I decided to give one a try and within the first thirty seconds almost snorted milk out of my nose. It wasn't the cheesy acting -- all good sci-fi has cheesy acting -- it was the computer graphics man! They were fucking horrible! I stared slack-jawed for a couple of seconds -- the space scenes looked like they were rendered by some retarded 8th grader working on a school project. Then it hit me to check the date of when Babylon 5 was released -- 1993. HOLY SHIT man, I'm watching video technology that was cutting edge 18 fucking years ago. I could picture myself watching this when I was in the dorms and saying, "WOW! THIS LOOKS SO FUCKING REAL!" And so I decided to do a little probing around and see what I could dig up on the technology behind the visual effects of a science fiction show that at the time, evidently sucked hind tit next to Star Trek Next Generation.
Per Babylon 5's Wikipedia article, the two-hour pilot film premiered on February 22, 1993, while the regular series premiered January 26, 1994 and ran for five full seasons, with the final episode airing November 25, 1998. And after some more Googling I found this article which explained, "The B5 effects teams, both at Foundation and at NDI, use Lightwave 3D by NewTek and specialized software to design and render the visual effects. For the pilot, the effects were rendered on a network of Amiga computers; later, Foundation used 12 Pentium PCs and 5 DEC Alpha workstations for 3D rendering and design, and 3 Macintoshes for piecing together on-set computer displays."
What really caught my attention is the mention of Lightwave 3D. I had heard that term before, but couldn't put my finger on where. Another Wikipedia search and I get this: "In 1988, Allen Hastings created a rendering and animation program called Videoscape, and his friend Stuart Ferguson created a complementary 3D modeling program called Modeler, both sold by Aegis Software. NewTek planned to incorporate Videoscape and Modeler into its video editing suite, Video Toaster." AH HA! That's where I heard it. The VIDEO TOASTER -- that's the shit my friend Greg Short used to fuck with back when we were in high school. He was the only kid I've ever known who, after school, would spend his time animating commercials for the local Ford dealership. That's right, I made my money by stealing shit from Chase-Pitkin; Greg made his with his computer prowess. Deal with it. He's now Greg Short, the Trucker, aka Creative Director of Interactive Media for a company called Crystal Pix.
Anyway, with this early age of computer animation as a point of reference, let's take a minute to reflect upon Moore's Law:
Moore's law describes a long-term trend in the history of computing hardware. The number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years. This trend has continued for more than half a century and is expected to continue until 2015 or 2020 or later. The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, sensors and even the number and size of pixels in digital cameras. All of these are improving at exponential rates as well. This exponential improvement has dramatically enhanced the impact of digital electronics in nearly every segment of the world economy. Moore's law describes a driving force of technological and social change in the late 20th and early 21st centuries. The law is named after Intel co-founder Gordon E. Moore, who described the trend in his 1965 paper.
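To see what "doubles approximately every two years" actually does to a number, here's a quick back-of-the-napkin sketch. It's just the doubling rule applied blindly; the 3.1 million transistor baseline is my own assumption (the original Pentium's ballpark count), not a figure from anything quoted above:

```python
def moores_law(start_value, start_year, year, doubling_period=2.0):
    """Project a quantity forward assuming it doubles every
    `doubling_period` years -- the classic Moore's law trend."""
    return start_value * 2 ** ((year - start_year) / doubling_period)

# Assumed baseline: ~3.1 million transistors on a chip in 1993.
baseline_1993 = 3.1e6
for year in (1993, 2003, 2011):
    print(year, round(moores_law(baseline_1993, 1993, year)))
```

Nine doublings between 1993 and 2011 is a factor of 512 -- which is exactly why 18-year-old CGI looks the way it does.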
And yes, Moore's Law did indeed take effect as, "In 1993, NewTek released the Video Toaster Screamer, a parallel extension to the Toaster, with four motherboards each with a MIPS R4400 CPU running at 150 MHz and 64 MB of RAM. The Screamer accelerated the rendering of animations developed using the Toaster's bundled Lightwave 3D software. The Screamer could produce as many frames of animation a day as 16 of the fastest Amigas. Screamers were used to render CGI for many different science fiction shows. Later generations of the product run on Windows PCs." Here's a Newtek Video Toaster 4000 commercial from 1994, featuring none other than our very own Wesley Crusher. Swoon.
"Later generations of the product run on Windows PCs." -- Now I would assume this coincides with Babylon 5's info, "later, Foundation used 12 Pentium PCs." -- But neither one specified what speed Pentiums or exactly when the changeover from Amiga to Pentium took place. So I'll split the difference between 1994 and 1998 -- 1996 -- and look for what Pentium processors were released then. Voila: Intel's Pentium 200 MHz processor was released on 10 June 1996, so given a television show probably had the budget to buy the latest and greatest technology, we'll use the P5-200 processor for our comparison.
But first a quick tangent. Without getting into too much detail, there are two basic architectures for computer processors: Complex Instruction Set Computer (CISC) and Reduced Instruction Set Computer (RISC). Your standard run-of-the-mill desktop is CISC based. Now clockspeed-for-clockspeed (megahertz-for-megahertz), RISC processors are more powerful than CISC processors. Kind of the same way that displacement-for-displacement, a two-stroke engine is more powerful than a four-stroke engine. But being task oriented and thus more expensive, RISC processors were usually reserved for high end workstations or for purpose-built computers like those used in video rendering. Now normally processors are rated by how many Instructions-Per-Second they can perform, but since we're comparing apples to oranges (RISC to CISC), a more accurate comparison is to benchmark Floating-Point-Operations-Per-Second (FLOPS). Which is just a nerdtastic way of saying how many complex calculations per second.
Okay, the aforementioned MIPS R4400 processor was released in January of 1993, and was rated at 22 million floating-point operations per second (22 MFLOPS). The Pentium 200 is rated at 31.95 MFLOPS. So when the crew for Babylon 5 were rendering with, "four motherboards each with a MIPS R4400 CPU," they were working at (22 x 4) 88 MFLOPS. Think about that. 88,000,000 complex math calculations per second. Wow. And when they jumped up to, "12 Pentium PCs," they increased their computing power to (31.95 x 12) 383.4 MFLOPS -- roughly a 336% jump. Pretty fucking sweet, eh?
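For the skeptics, here's the back-of-the-napkin math in a few lines of Python. The per-chip ratings are the quoted figures from the paragraph above, nothing I measured myself:

```python
# Quoted per-chip ratings, in MFLOPS
r4400 = 22.0       # MIPS R4400 @ 150 MHz
p5_200 = 31.95     # Intel Pentium 200

screamer = 4 * r4400         # four R4400 boards in the Screamer
pentium_farm = 12 * p5_200   # Foundation's 12 Pentium PCs

print(screamer)                                    # 88.0 MFLOPS
print(round(pentium_farm, 1))                      # 383.4 MFLOPS
print(round((pentium_farm / screamer - 1) * 100))  # ~336 percent jump
```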
Well, with that in mind, let's bring that Moore's law shit into play again and compare mid-1990's processors to what I can roll down and pick up from Best Buy today, shall we? Verizon's new Droid X2 smartphone with its dual core Tegra-2 processor -- you know, just a fucking handheld phone -- can do 36.18 MFLOPS. The new iPad 2s that Steve Jobs touts on stage? They get 170.9 MFLOPS -- easily capable of rendering the entire first season of Babylon 5 in almost half the time (88 MFLOPS vs 170.9 MFLOPS) it took the four R4400s inside the Video Toaster Screamer.
And for desktop computing power? As of 2010, the fastest PC processor (Intel Core i7 980 XE) has a peak performance of 107.55 GFLOPS. Point: that's GIGAFLOPS, not MEGAFLOPS. And that's just the central processor, which isn't shit when compared to what the Graphics Processing Unit (GPU aka your video card) can do. Because of the complex mathematical computations they have to perform when rendering games, decoding High-Definition Television, etc., GPUs are considerably more powerful than your computer's CPU. For example, on March 8th, 2011 AMD released the AMD Radeon™ HD 6990 graphics card, capable of 1.27 TFLOPS in double precision calculations -- or a staggering 5.10 TFLOPS in single precision. That fucking XBOX-360 that you get drunk and play Call of Duty on all weekend? The CPU alone can do 115 GFLOPS, with its GPU adding another 240 GFLOPS -- that's 355 GFLOPS combined, roughly 925 times more powerful than all 12 of the Babylon 5 editing team's fastest computers strung together.
But who's the big daddy of them all? In June 2011 Japan slipped back into the fast lane -- in the world of supercomputing, that is -- after its "K computer" sped into the top spot of the TOP500 list at the International Supercomputing Conference. The prizewinning machine is, unsurprisingly, a bit of a beast. Comprising 672 computer racks equipped with a current total of 68,544 CPUs, it churns out a performance of 8.162 petaflops -- that's 8.162 quadrillion floating-point operations per second. If I were to chart that kind of power on a linear scale, it would look like this. So to make leaps and bounds like that even somewhat understandable, I have to use a 10x logarithmic scale (1, 10, 100, 1000, 10000, etc) and it looks like this. Think about that the next time you're sipping a Starbucks, mindlessly tapping away at Plants Vs Zombies on your iPad and wondering why you can't get a job after law school.
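If you want to see why a linear chart is hopeless here, this little sketch draws a crappy ASCII bar chart where each '#' is one power of ten. The peak figures are the ones quoted above (converted to MFLOPS so everything shares one unit); no plotting library, no real benchmark runs, just log math:

```python
import math

# Peak figures quoted above, converted to MFLOPS for one common unit
machines = {
    "Screamer (4x R4400)": 88.0,
    "12 Pentium 200s":     383.4,
    "iPad 2":              170.9,
    "Core i7 980 XE":      107_550.0,       # 107.55 GFLOPS
    "K computer":          8.162e15 / 1e6,  # 8.162 petaflops
}

for name, mflops in machines.items():
    # On a log10 scale, each '#' is one power of ten
    bar = "#" * int(math.log10(mflops))
    print(f"{name:22s} {bar}")
```

The K computer's bar is only about five times longer than the Screamer's -- which is the whole point of a log scale, because the actual ratio between them is about a hundred million to one.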
CRANK IT UP TO 11: And without such awesome video editing abilities, we would be left without two of my favorite staples: awesome mockumentaries and sexy videos of sports hotties.
What happens when the space shuttle retires? This: SpaceX Dragon. Greg
Ernie... I sent you a link to my Star Trek car a few years ago but you never posted it. Mike
I dunno man, I just don't get a warm fuzzy feeling about having to depend upon parachutes again. It just seems like a step backwards to me. When the Shuttle program came along, NASA said that was the new hotness, while parachutes were old and busted. Now it's the opposite? It's kind of like when Dodge invented the minivan which in turn killed off the station wagon, and then they tried to sell us the Dodge Magnum as an alternative to the minivan. Riiiiiiight. More of what's old and busted: a hailstorm destroys a backyard in Georgia. What's the new hotness: a hailstorm (and high winds!) destroy a backyard in Oklahoma.
Still think that pharmacist "murdered" that cretin [link]? Patrick
Yep, I most certainly do. The only thing those two events have in common is they're both robberies. That's it. Everything else unfolded completely differently so it's like comparing.... well... RISC to CISC.
QUOTE: "I've been working out from, like, 5 a.m. to 7 p.m. for two months now," she told reporters in Las Vegas, "I've been working out really hard because I had this pool party and I was like, I have to be in shape ... I was actually a lot overweight ... It was the most I've ever been because I've kind of been in hiding eating pie with my husband and puppies, so I needed to get back in shape." Now weighing 103 pounds, the 5'2" blonde who is 24 years old, added, "I've been running a lot, and I've been doing weights ... When you work out, you boost your metabolism, so you have to (make sure you eat enough)." Who am I talking about? Why none other than our very own KNOCKERS MONTAG! Yay! Welcome back, Knockers! I've missed you so much!
Oh, and it's Bruce Fucking Campbell's birthday. Hail to the King, baby.
how to paint your automobile (circa 1922)
drop in at the nyny las vegas hotel & casino - top 10 underground walks of the world