Tuesday, November 08, 2005

History's Worst Software Bugs

Story location: http://www.wired.com/news/technology/bugs/0,2924,69355,00.html


Last month automaker Toyota announced a recall of 160,000 of its Prius hybrid vehicles following reports of vehicle warning lights illuminating for no reason, and cars' gasoline engines stalling unexpectedly. But unlike the large-scale auto recalls of years past, the root of the Prius issue wasn't a hardware problem -- it was a programming error in the smart car's embedded code. The Prius had a software bug.

With that recall, the Prius joined the ranks of the buggy computer -- a club that began in 1947 when engineers found a moth in Panel F, Relay #70 of the Harvard Mark II system. The computer was running a test of its multiplier and adder when the engineers noticed something was wrong. The moth was trapped, removed and taped into the computer's logbook with the words: "first actual case of a bug being found."

Sixty years later, computer bugs are still with us, and show no sign of going extinct. As the line between software and hardware blurs, coding errors are increasingly playing tricks on our daily lives. Bugs don't just inhabit our operating systems and applications -- today they lurk within our cell phones and our pacemakers, our power plants and medical equipment. And now, in our cars.

But which are the worst?

It's all too easy to come up with a list of bugs that have wreaked havoc. It's harder to rate their severity. Which is worse -- a security vulnerability that's exploited by a computer worm to shut down the internet for a few days or a typo that triggers a day-long crash of the nation's phone system? The answer depends on whether you want to make a phone call or check your e-mail.

Many people believe the worst bugs are those that cause fatalities. To be sure, there haven't been many, but cases like the Therac-25 are widely seen as warnings against the widespread deployment of software in safety critical applications. Experts who study such systems, though, warn that even though the software might kill a few people, focusing on these fatalities risks inhibiting the migration of technology into areas where smarter processing is sorely needed. In the end, they say, the lack of software might kill more people than the inevitable bugs.

What seems certain is that bugs are here to stay. Here, in chronological order, is the Wired News list of the 10 worst software bugs of all time … so far.

July 28, 1962 -- Mariner I space probe. A bug in the flight software for the Mariner 1 causes the rocket to divert from its intended path on launch. Mission control destroys the rocket over the Atlantic Ocean. The investigation into the accident discovers that a formula written on paper in pencil was improperly transcribed into computer code, causing the computer to miscalculate the rocket's trajectory.

1982 -- Soviet gas pipeline. Operatives working for the U.S. Central Intelligence Agency allegedly plant a bug in a Canadian computer system purchased to control the trans-Siberian gas pipeline. The Soviets had obtained the system as part of a wide-ranging effort to covertly purchase or steal sensitive U.S. technology. The CIA reportedly found out about the program and decided to make it backfire with equipment that would pass Soviet inspection and then fail once in operation. The resulting event is reportedly the largest non-nuclear explosion in the planet's history.

1985-1987 -- Therac-25 medical accelerator. A radiation therapy device malfunctions and delivers lethal radiation doses at several medical facilities. Based upon a previous design, the Therac-25 was an "improved" therapy system that could deliver two different kinds of radiation: either a low-power electron beam (beta particles) or X-rays. The Therac-25's X-rays were generated by smashing high-power electrons into a metal target positioned between the electron gun and the patient. A second "improvement" was the replacement of the older Therac-20's electromechanical safety interlocks with software control, a decision made because software was perceived to be more reliable.

What engineers didn't know was that both the 20 and the 25 were built upon an operating system that had been kludged together by a programmer with no formal training. Because of a subtle bug called a "race condition," a quick-fingered typist could accidentally configure the Therac-25 so the electron beam would fire in high-power mode but with the metal X-ray target out of position. At least five patients die; others are seriously injured.
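
To illustrate the class of bug involved, here is a deliberately simplified C sketch of a check-then-act race condition. It is not the Therac-25's actual PDP-11 code; the names, values and timing are invented. The point is only that when two activities share unlocked state, a fast edit can leave the beam power and the target position describing two different modes.

```c
/* Deliberately simplified sketch of a "check-then-act" race condition.
 * This is NOT the Therac-25's actual PDP-11 code; names, values and timing
 * are invented.  It only shows how unsynchronized shared state can leave
 * beam power and target position describing two different modes.
 */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

enum mode { ELECTRON_LOW, XRAY_HIGH };

static volatile enum mode selected_mode = XRAY_HIGH; /* shared, no lock */
static volatile int beam_power_high = 0;
static volatile int target_in_place = 0;

static void *setup_task(void *arg)
{
    (void)arg;
    /* Step 1: set beam power for the mode we see *now*. */
    beam_power_high = (selected_mode == XRAY_HIGH);
    usleep(100000);                     /* slow turntable/magnet movement */
    /* Step 2: position the target for the mode we see *later*. */
    target_in_place = (selected_mode == XRAY_HIGH);
    return NULL;
}

int main(void)
{
    pthread_t t;
    pthread_create(&t, NULL, setup_task, NULL);
    usleep(50000);                      /* a quick-fingered edit mid-setup */
    selected_mode = ELECTRON_LOW;       /* no lock taken, no re-check done */
    pthread_join(t, NULL);

    if (beam_power_high && !target_in_place)
        puts("UNSAFE: high-power beam with the target out of position");
    else
        puts("state happens to be consistent on this run");
    return 0;
}
```

Taking a lock around both steps, or re-reading the mode atomically before firing, closes the window; the Therac-25's software did neither.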

1988 -- Buffer overflow in Berkeley Unix finger daemon. The first internet worm (the so-called Morris Worm) infects between 2,000 and 6,000 computers in less than a day by taking advantage of a buffer overflow. The specific culprit is a standard input/output library routine called gets(), designed to get a line of text over the network. Unfortunately, gets() has no provision to limit its input, and an overly large input allows the worm to take over any machine to which it can connect.

Programmers respond by attempting to stamp out the gets() function in working code, but they refuse to remove it from the C programming language's standard input/output library, where it remains to this day.
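
For readers who want to see the mechanics, here is a minimal C sketch of why an unbounded gets() call is dangerous and what the bounded alternative looks like. The buffer size and messages are arbitrary; this illustrates the general overflow pattern rather than the fingerd code the worm attacked.

```c
/* Minimal illustration of the gets() problem -- not the fingerd source.
 * Anything past the buffer's 63 usable bytes silently overwrites adjacent
 * stack memory, which is what the Morris worm exploited.
 */
#include <stdio.h>
#include <string.h>

char *gets(char *s);    /* modern toolchains deprecate or omit gets();
                           declared here only so the sketch compiles */

void vulnerable(void)
{
    char line[64];
    gets(line);                         /* no length limit at all */
    printf("read %zu bytes\n", strlen(line));
}

void safer(void)
{
    char line[64];
    if (fgets(line, sizeof line, stdin) != NULL)   /* bounded read */
        printf("read %zu bytes\n", strlen(line));
}

int main(void)
{
    safer();    /* call the bounded version; vulnerable() shows the bug */
    return 0;
}
```

Because gets() has no way to learn how big the destination buffer is, no caller can use it safely; fgets() takes the size explicitly, which is the whole fix.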

1988-1996 -- Kerberos Random Number Generator. The authors of the Kerberos security system neglect to properly "seed" the program's random number generator with a truly random seed. As a result, for eight years it is possible to trivially break into any computer that relies on Kerberos for authentication. It is unknown if this bug was ever actually exploited.
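
A rough C sketch of the difference between a guessable seed and a proper one follows. It is not Kerberos 4's key-generation code, just an illustration of why seeding a PRNG with values such as the time of day and process ID leaves only a tiny space of possible keys for an attacker to search.

```c
/* Illustrative sketch only -- not Kerberos 4's actual key generation.
 * If a session key is derived from a PRNG seeded with predictable values,
 * an attacker can enumerate the small seed space and reproduce the key.
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <unistd.h>

/* Weak: the whole "key" is determined by a guessable seed. */
unsigned int weak_session_key(void)
{
    srand((unsigned)time(NULL) ^ (unsigned)getpid());  /* tiny search space */
    return (unsigned)rand();
}

/* Better: pull key material from the kernel's entropy pool. */
int strong_session_key(unsigned char *key, size_t len)
{
    FILE *f = fopen("/dev/urandom", "rb");
    if (f == NULL)
        return -1;
    size_t got = fread(key, 1, len, f);
    fclose(f);
    return got == len ? 0 : -1;
}

int main(void)
{
    unsigned char key[8];
    printf("weak key:   %08x\n", weak_session_key());
    if (strong_session_key(key, sizeof key) == 0)
        printf("strong key: %02x%02x...\n", key[0], key[1]);
    return 0;
}
```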

January 15, 1990 -- AT&T Network Outage. A bug in a new release of the software that controls AT&T's #4ESS long distance switches causes these mammoth computers to crash when they receive a specific message from one of their neighboring machines -- a message that the neighbors send out when they recover from a crash.

One day a switch in New York crashes and reboots, causing its neighboring switches to crash, then their neighbors' neighbors, and so on. Soon, 114 switches are crashing and rebooting every six seconds, leaving an estimated 60 thousand people without long distance service for nine hours. The fix: engineers load the previous software release.
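
The control-flow slip behind the outage is commonly reconstructed as a misplaced break statement inside a C switch. The sketch below is that reconstruction, not AT&T's actual 4ESS source, and the names and structure are invented; it shows how a break can exit more than intended and leave state uncleaned for the next message.

```c
/* A commonly cited reconstruction of the control-flow slip -- not AT&T's
 * actual 4ESS source, and the names here are invented.  The break that was
 * meant to end only the busy-buffer handling exits the entire switch, so
 * the clean-up below it never runs and the switch's internal state is left
 * wrong for the next message that arrives.
 */
#include <stdio.h>

enum msg { NEIGHBOR_RECOVERED, OTHER };

static int state_dirty = 0;

static void handle(enum msg m, int buffer_busy)
{
    switch (m) {
    case NEIGHBOR_RECOVERED:
        state_dirty = 1;            /* start updating internal state */
        if (buffer_busy) {
            /* queue the work for later ... */
            break;                  /* exits the switch, not just the if */
        }
        /* ... normal processing ... */
        state_dirty = 0;            /* skipped whenever buffer_busy is set */
        break;
    case OTHER:
        break;
    }
}

int main(void)
{
    handle(NEIGHBOR_RECOVERED, 1);
    printf("state_dirty = %d  (clean-up was skipped)\n", state_dirty);
    return 0;
}
```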

1993 -- Intel Pentium floating point divide. A silicon error causes Intel's highly promoted Pentium chip to make mistakes when dividing floating-point numbers that occur within a specific range. For example, dividing 4195835.0/3145727.0 yields 1.33374 instead of 1.33382, an error of 0.006 percent. Although the bug affects few users, it becomes a public relations nightmare. With an estimated 3 million to 5 million defective chips in circulation, at first Intel only offers to replace Pentium chips for consumers who can prove they need high accuracy; eventually the company relents and agrees to replace the chips for anyone who complains. The bug ultimately costs Intel $475 million.
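
The cited division is easy to check on any working FPU. The short C program below reproduces the article's numbers, with the flawed chip's result hard-coded purely for comparison.

```c
/* The division the article cites, checked on a correct FPU.  The "flawed"
 * value is the result reported from buggy Pentiums, hard-coded here only
 * for comparison.
 */
#include <stdio.h>

int main(void)
{
    double correct = 4195835.0 / 3145727.0;  /* ~1.333820 on a good divider */
    double flawed  = 1.333739;               /* roughly what flawed chips gave */
    double rel_err = (correct - flawed) / correct * 100.0;

    printf("correct result: %.6f\n", correct);
    printf("flawed result:  %.6f\n", flawed);
    printf("relative error: %.3f%%\n", rel_err);   /* about 0.006 percent */
    return 0;
}
```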

1995/1996 -- The Ping of Death. A lack of sanity checks and error handling in the IP fragmentation reassembly code makes it possible to crash a wide variety of operating systems by sending a malformed "ping" packet from anywhere on the internet. Most obviously affected are computers running Windows, which lock up and display the so-called "blue screen of death" when they receive these packets. The attack also affects many Macintosh and Unix systems.
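
A hedged C sketch of the missing sanity check follows. It is not any particular kernel's reassembly code, but it shows how an attacker-chosen fragment offset plus length can exceed the 65,535-byte datagram limit and overrun a fixed-size reassembly buffer unless the code checks for it.

```c
/* Sketch of the reassembly flaw -- not any particular kernel's code.  An IP
 * datagram may arrive as fragments, each carrying an offset.  The largest
 * legal datagram is 65,535 bytes, but nothing stops a sender from crafting
 * a final fragment whose offset plus length exceeds that, so a reassembler
 * that trusts the header writes past its fixed-size buffer.
 */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define MAX_DATAGRAM 65535

static uint8_t reassembly_buf[MAX_DATAGRAM];

/* Vulnerable: no check that the fragment actually fits. */
void copy_fragment_unsafe(const uint8_t *data, uint32_t offset, uint32_t len)
{
    memcpy(reassembly_buf + offset, data, len);    /* can write past the end */
}

/* Fixed: reject any fragment that would exceed the maximum datagram size. */
int copy_fragment_safe(const uint8_t *data, uint32_t offset, uint32_t len)
{
    if (offset > MAX_DATAGRAM || len > MAX_DATAGRAM - offset)
        return -1;                              /* malformed "ping of death" */
    memcpy(reassembly_buf + offset, data, len);
    return 0;
}

int main(void)
{
    uint8_t payload[8] = {0};
    /* A crafted last fragment: offset 65528 plus 8 bytes totals 65,536. */
    if (copy_fragment_safe(payload, 65528, sizeof payload) < 0)
        puts("oversized datagram rejected");
    return 0;
}
```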

June 4, 1996 -- Ariane 5 Flight 501. Working code for the Ariane 4 rocket is reused in the Ariane 5, but the Ariane 5's faster engines trigger a bug in an arithmetic routine inside the rocket's flight computer. The error is in the code that converts a 64-bit floating-point number to a 16-bit signed integer. The faster engines cause the 64-bit numbers to be larger in the Ariane 5 than in the Ariane 4, triggering an overflow condition that results in the flight computer crashing.

First, Flight 501's backup computer crashes, followed 0.05 seconds later by a crash of the primary computer. As a result of these crashed computers, the rocket's primary processor overpowers the rocket's engines and causes the rocket to disintegrate 40 seconds after launch.
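
The failing operation is a narrowing conversion, which is easy to sketch. The C program below is an illustration, not the actual Ada flight code (where the unhandled overflow raised an exception); the variable name and value are invented, but the arithmetic is the same: a 64-bit floating-point value above 32,767 cannot survive conversion to a 16-bit signed integer.

```c
/* Illustration only -- not the actual Ada flight code, where the unhandled
 * overflow raised an exception that halted both inertial reference units.
 * The variable name and value here are invented.
 */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    double horizontal_bias = 50000.0;  /* fine on Ariane 4, too big on Ariane 5 */

    /* Unchecked narrowing: wraps silently on typical compilers. */
    int16_t raw = (int16_t)(int32_t)horizontal_bias;
    printf("converted value: %d (expected %.0f)\n", raw, horizontal_bias);

    /* Defensive version: range-check before narrowing. */
    if (horizontal_bias > INT16_MAX || horizontal_bias < INT16_MIN)
        puts("out of range: report an error instead of converting");
    return 0;
}
```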

November 2000 -- National Cancer Institute, Panama City. In a series of accidents, therapy planning software created by Multidata Systems International, a U.S. firm, miscalculates the proper dosage of radiation for patients undergoing radiation therapy.

Multidata's software allows a radiation therapist to draw on a computer screen the placement of metal shields called "blocks" designed to protect healthy tissue from the radiation. But the software will only allow technicians to use four shielding blocks, and the Panamanian doctors wish to use five.

The doctors discover that they can trick the software by drawing all five blocks as a single large block with a hole in the middle. What the doctors don't realize is that the Multidata software gives different answers in this configuration depending on how the hole is drawn: draw it in one direction and the correct dose is calculated, draw in another direction and the software recommends twice the necessary exposure.
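
We don't know Multidata's actual algorithm, but here is one hypothetical way a drawing-direction dependency like this can arise: if open area is computed as a signed ("shoelace") sum, a hole traced opposite to the outer boundary subtracts from the area, while the same hole traced in the other direction adds to it. The C sketch below uses made-up coordinates purely to show the effect.

```c
/* Hypothetical illustration only -- not Multidata's algorithm.  A signed
 * polygon area flips sign with drawing direction, so a composite shape's
 * "open area" (and anything derived from it) can change with direction alone.
 */
#include <stdio.h>

struct pt { double x, y; };

/* Signed area: positive for counterclockwise, negative for clockwise. */
static double signed_area(const struct pt *p, int n)
{
    double a = 0.0;
    for (int i = 0; i < n; i++) {
        int j = (i + 1) % n;
        a += p[i].x * p[j].y - p[j].x * p[i].y;
    }
    return a / 2.0;
}

int main(void)
{
    struct pt outer[4]    = {{0,0}, {10,0}, {10,10}, {0,10}};  /* CCW, area 100 */
    struct pt hole_cw[4]  = {{4,4}, {4,6}, {6,6}, {6,4}};      /* CW,  area  -4 */
    struct pt hole_ccw[4] = {{4,4}, {6,4}, {6,6}, {4,6}};      /* CCW, area  +4 */

    double one_way   = signed_area(outer, 4) + signed_area(hole_cw, 4);   /*  96 */
    double other_way = signed_area(outer, 4) + signed_area(hole_ccw, 4);  /* 104 */

    printf("hole drawn one way:   open area = %.0f\n", one_way);
    printf("hole drawn the other: open area = %.0f\n", other_way);
    return 0;
}
```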

At least eight patients die, while another 20 receive overdoses likely to cause significant health problems. The physicians, who were legally required to double-check the computer's calculations by hand, are indicted for murder.



GeForce 6800 GS: The New Price/Performance King?

It's been some time since we ran our last GPU price/performance shootout. Despite nine months having passed, not a whole lot has changed in the landscape. Nvidia introduced the GeForce 7800 GTX and GeForce 7800 GT, while ATI started shipping its Radeon X1800 XL, with the top-tier X1800 XT soon to follow. These are all very expensive, high-end cards, and they're great performers, but they'll never lead the price/performance charts.

Though there's been some general price depression all around, our conclusions from the previous "best GPU bang for the buck" shootout still hold true: GeForce 6600GT and Radeon X700 Pro cards are good for the $150 range, but while their price/performance ratio is high, they don't offer the kind of raw speed necessary to let you really enjoy games like F.E.A.R. and Quake 4 the way they're meant to be played. The real deals are the GeForce 6800 GT and Radeon X800 XL, especially since both have now fallen in price to under $300. Either card will give you all the features and performance necessary to play all of today's games without turning all the detail levels down.

Is there a better option? Nvidia thinks there should be, so the company is adding a new SKU just in time for the holiday season, ahead of ATI's new Radeon X1600 cards (expected to ship at the end of this month). The GeForce 6800 GS aims to offer the best performance at the most reasonable price. Is this our new price/performance champ?
[Image: GeForce 6800 GS]

The sweet spot

We are often asked, "Which video card should I buy?" We always answer with "Well, how much do you want to spend?" The inevitable reply is that everyone wants to run all the latest graphics-heavy games at high resolutions with all the features enabled, but they only want to spend $100 to $150 to do so. Sorry to say, but that's just not going to happen.

The real sweet spot for graphics is in the $250 to $300 price range. Sure, some of those sub-$200 cards deliver more frames per second per dollar spent, but they're generally just not fast enough to run really graphics-intensive games at decent resolutions, unless you're willing to go into your game options menu and turn the details down to "medium."


                          GeForce 6800 GS       GeForce 6800 GT       Radeon X800 XL
Estimated price           $249                  $279-$299             $279-$299
Core clock speed          425MHz                350MHz                400MHz
Memory clock speed        1.0GHz (500MHz DDR)   1.0GHz (500MHz DDR)   1.0GHz (500MHz DDR)
Amount of RAM             256MB GDDR3           256MB GDDR3           256MB GDDR3
Pixel shader processors   12                    16                    16
Vertex shader processors  5                     6                     6

If $250 to $300 is the best price range to buy a graphics card that won't be obsolete within a year, what are your choices? The best two existing cards in that price range are the GeForce 6800 GT and Radeon X800 XL, both of which come in 256MB flavors at the upper end of that price range. We should note that the GeForce 6800 GT launched last year at $400, but has fallen in price considerably since then. The more recent Radeon X800 XL launched with an average street price of around $340 or so, and if you shop around you can find it for maybe $50 less these days.

Nvidia's new GeForce 6800 GS aims to compete with these cards, with an estimated MSRP around $30 cheaper but impressive specs nonetheless. Note that it has the same memory bandwidth as the slightly more expensive 6800 GT and X800 XL. It's got 12 pixel pipelines (like the standard GeForce 6800) and five vertex shader pipes, but it still compares well with the 16-pipe 6800 GT thanks to a much higher core clock speed.

In all other ways, the 6800 GS is comparable to the other GeForce 6 series cards. It's got the same PureVideo processor capabilities, and exactly the same architecture, with the same Shader Model 3.0 support.

Benchmark and Testbed Setup

There's really not much of a story to tell here. It's simply a matter of "there's a new GeForce 6 series SKU, how does it perform?"

We ran through a score of benchmarks on the following system.

Processor: Athlon 64 FX-55

Motherboard and chipset: ASUS A8N SLI Deluxe (nForce 4 SLI chipset)

Memory: 2 x 512MB DDR400 (CAS 2-2-2-5)

Hard drive: Seagate 7200.7 160GB SATA drive

Optical drive: ATAPI DVD-ROM drive

Audio: Sound Blaster Audigy 2

Operating system: Windows XP Professional with SP2

We used the latest Catalyst 5.10 drivers for our ATI card and the recently released beta ForceWare 81.87 drivers on the Nvidia card.

Our GeForce 6800 GS card is a reference model, running at reference clock speeds. If history is any indication, several manufacturers will offer overclocked models, sometimes at the same or lower price as the general estimated MSRP quoted here ($249).

We use a variety of applications to test 3D performance:

3DMark05: Futuremark's latest synthetic graphics benchmark is a very forward-looking test. It's heavy on DX9 shaders, and runs optimized code for either Shader Model 2.0 or 3.0 depending on your card's capabilities.

Far Cry: Crytek's shooter Far Cry is one of the most graphically demanding games on the market. It will run optimized code paths for shader model 3.0 or 2.0b if your card supports it. Our test runs the four demos included in the 1.3 patch, and takes their geometric mean to produce a final score.

Doom 3: How can you review a graphics card without testing the latest id Software game? Nvidia almost always scores better on this title, whether because the developers optimized more for Nvidia's architecture or because Nvidia designed its hardware with id's engine in mind.

Half-Life 2: We run a couple of custom-recorded demos of single player play, and then take their geometric mean to produce an overall score. It's a great example of a high-end, very efficient and optimized game engine.

Call of Duty 2 Demo: Infinity Ward has dropped the Quake engine in favor of their own new DX9 code. We recorded our own custom timedemo and ran through it with all the visual quality settings cranked up. We look forward to benchmarking the full game when it is released.

Splinter Cell: Chaos Theory: Ubisoft's latest Sam Fisher adventure uses an all-new engine, rich with detail and pretty heavy on shaders. It can operate in a Shader Model 1.1 mode or a Shader Model 3.0 mode. We run both, and enable the optional high dynamic range rendering mode with SM 3.0 or 2.0. Because HDR and antialiasing only work together in this game on ATI's X1000 series cards, we disable AA on our tests using Shader Model 3.0 or 2.0.

F.E.A.R.: Monolith's new shooter is one of the prettiest, grittiest, and most graphically demanding games ever. We patch the game to version 1.01 and use the built-in performance test to measure average frame rate, with all settings turned up to the max. There is one exception: the Soft Shadows option doesn't work properly with antialiasing, so we disable it for all testing.

3DMark05 results

Futuremark's latest 3D benchmark is the only synthetic test we use to judge video card performance. That is, it's the only test we use that isn't an actual game. Why? Simply put, we believe it's a pretty good look at future game graphics. The tests are well designed to mimic the type of code and content we'll see in games coming over the next year or so, the kind that utilize DirectX 9 shaders heavily.
[Chart: 3DMark05 results]

Given that the GeForce 6800 GS has a suggested retail price of $30 to $50 less than the 6800 GT or Radeon X800 XL, we're very pleased to see it score basically equal to those cards in 3DMark. This is at stock speeds, no less: Overclocked GeForce cards are quite common from Nvidia's board partners.

Half-Life 2 and Doom 3

Valve's Half-Life 2 is one of the more graphically advanced games available today, making fairly heavy use of shaders in some areas. It is well engineered and optimized to make the most out of almost any graphics card.
[Chart: Half-Life 2 Performance]

Just as Doom 3 is "Nvidia's game" on the previous generation chips, Half-Life 2 is "ATI's game," though the gap is smaller here. The 6800 GS actually outperforms the more expensive 6800 GT on this one, though only by the smallest amount.
[Chart: Doom 3 Performance]

Nvidia holds a sizeable lead in Doom 3 benchmarks; this should come as no surprise by now. The new X1000 architecture cards have a driver that takes advantage of the programmable memory controller to improve performance in Doom 3 and Quake 4, but you'll see no such benefit in the X800 cards.

Far Cry and Splinter Cell: Chaos Theory

With vast outdoor scenes flush with vegetation, complex water, and a fair amount of normal mapping, Far Cry pushes most graphics cards to the limit. The 1.3 patch ships with four built-in demos. We run all four and take their geometric mean to produce a final score.
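
As an aside, the composite score is simple arithmetic: the geometric mean is the nth root of the product of the individual frame rates, which keeps one unusually fast demo from dominating the average. A minimal C sketch with made-up numbers (not our measured results):

```c
/* How a composite score from several demo runs is formed.  The geometric
 * mean is computed via logs; the four values are hypothetical placeholders.
 */
#include <math.h>
#include <stdio.h>

int main(void)
{
    double fps[4] = {62.1, 48.7, 55.3, 71.9};   /* hypothetical demo scores */
    double log_sum = 0.0;

    for (int i = 0; i < 4; i++)
        log_sum += log(fps[i]);                 /* sum of logs, then average */

    printf("geometric mean: %.1f fps\n", exp(log_sum / 4.0));
    return 0;
}
```
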
[Chart: Far Cry Performance]

Again the $249 (estimated) 6800 GS appears to be a pretty good match for slightly more expensive cards. This is really good news for those trying to get a good graphics card without spending insane sums of money.

Splinter Cell: Chaos Theory can run in a Shader Model 1.1 mode that is compatible with all modern video cards or a Shader Model 3.0 mode only available to Nvidia cards. The latter allows the use of some limited high dynamic range (HDR) effects. We run the benchmark in both SM 1.1 mode and SM 3.0 with HDR. Antialiasing is incompatible with HDR rendering on Nvidia's hardware, so we run our two test resolutions with and without anisotropic filtering instead to keep the playing field level. There's a SM 2.0 path that we use for the Radeon X800 XL, since that card does not support SM 3.0.
[Chart: Splinter Cell: Shader Model 1.1]
[Chart: Splinter Cell: Shader Model 3.0]

In this price range, ATI and Nvidia run Splinter Cell: Chaos Theory about equally. ATI is pulling ahead where AA and AF are enabled (always a strong suit of theirs), but remember that the X800 XL is a bit more expensive than the 6800 GS, too. That, and our 6800 GS is running at stock clock speeds, while we have heard from some vendors that they plan to offer overclocked cards.

Call of Duty 2 and F.E.A.R.

We use our own custom timedemo recording in the Call of Duty 2 demo to measure performance.
[Chart: Call of Duty 2]

We have no idea what the heck is going on here. Nvidia scores far lower than expected in our Call of Duty 2 timedemo. We've seen it run much faster than this on Nvidia cards before, so we can only guess that this is due to some bug in the 81.87 drivers. We tried reinstalling the drivers, running the demo in safe mode and reconfiguring it; we even uninstalled the demo, deleted the directory entirely, and reinstalled it. We simply can't figure out why the scores are abnormally low, but the result was repeatable.

Especially odd is the way the GeForce 6800 GT dropped to 10 frames per second at 1280x1024 with AA and AF enabled. We're presenting these numbers because we were able to reproduce them over and over, even after spending too much time troubleshooting, but we wouldn't put too much faith in them. We know from experience that Nvidia cards run the game better than this; we just couldn't figure out what was wrong in time.

With spectacular real-time dynamic lighting, tons of normal mapping, lots of physics, and particle systems galore, F.E.A.R. will give your computer a real workout. We disable Soft Shadows in all our testing, since it is incompatible with antialiasing. Note that F.E.A.R. gives a warning if you turn the texture detail up beyond Medium with a 128MB card. Though it recommends that setting only be used with cards that have 256MB or more, we run it this way anyway. If a 128MB card scores poorly, that will only serve as a good example of why you'd want to buy one with more video memory. It's important to apply the 1.01 patch that was released simultaneously with the game, as it can drastically improve performance in some cases.
[Chart: F.E.A.R. Performance]

Nvidia's sub-$300 graphics cards outperform ATI's in Monolith's impressive new shooter when AA and AF are disabled, but ATI catches up with those features turned on. Having four fewer pixel pipelines hurts the 6800 GS's performance with AA and AF a bit, but it still holds up well against these slightly more expensive competitors.

Final Thoughts

We're pretty impressed with this new midrange SKU from Nvidia. At a suggested retail price of $249, with a core clock of 425MHz and a memory clock of 500MHz, it's a very good performer. On the whole, it matches the performance of a GeForce 6800 GT, a card that debuted at $399 and has since fallen to the $279 to $299 level. It also lines up quite nicely with ATI's $279 to $299 card, the Radeon X800 XL. That matchup is a bit less even, with the ATI part faster in a few games and slower in a few others.

Though we know there will be some cards with these clock speeds at a $249 price, we really can't in good faith call this a review. What we tested was a reference card from Nvidia, and we know companies like BFG and XFX plan to offer overclocked cards. We don't have exact speeds or prices from Nvidia's partners just yet, but if history is any guide, there will be overclocked cards available for $249 and cards at the stock clock speeds that you can find easily online for $20 to $30 less. This makes for an impressive bargain and a huge step up from the generic GeForce 6800.


The big question: How will this fare against ATI's similarly priced X1000 series card, the Radeon X1600 XT? In short, we don't know. X1600 cards aren't even supposed to ship until the end of this month, and the recent launch of the X1800 and X1300 cards has us doubting when you'll actually be able to buy them and what the price will be. It wouldn't make sense for us to test that card now, when nobody can buy it, the price and real availability are undetermined, and there will surely be drivers in the coming weeks that change its performance characteristics. What we do know is this: With the GeForce 6800 GS, Nvidia has set a difficult bar to leap, and in doing so has injected a lot of value into the most important price segment.

Product: Nvidia GeForce 6800 GS

Company: www.nvidia.com

Price: $249 (est., will vary depending on OEM.)

Pros: Higher clock rates and near-6800GT performance in a lower cost card.

Cons: No AGP version yet.

Summary: Nvidia may have a winner on its hands with the 6800 GS, but we'll have to see how it fares against the similarly priced ATI Radeon X1600.



While OSWD.org is down

OSWD.org is offline, apparently because of hosting problems. But there are still plenty of OSWD templates to find on other sites. Here is a list of designer sites where you can download some of the top designs from OSWD:

If you know of any OSWD designers who have posted their designs on some other site, let me know and I will add them here! There is also a thread in the Sitepoint forum where the current situation (and the future) of OSWD.org is discussed. The site admins have posted an official announcement in the thread, and it appears as if the site will be back online soon.

source:http://andreasviklund.com/blog/webdesign/while-oswd-is-down/

