Veteran animator Steve Bowler got pretty angry when he bought Doom 3. And he’s still a mite agitated…
Has it really been 12 years since we first laid eyes on the original, the dark new 3D world that was Doom? Even before that, a select few of us recall with wonder the revival of one of our favorite gaming franchises, in a bold new direction, when Wolfenstein 3D hit the shelves.
For a dozen years Id has been the top dog, the guy to beat, the paterfamilias of the first-person shooter. It can look back on a legacy of six games, each one an unstoppable sales juggernaut and a technological milestone. You didn’t need to know the review score for an Id title. You only knew that you needed to buy it.
But one day, the industry changed. The consumer changed. It’s hard to put one’s finger on it. Maybe it was Counter-Strike. Maybe Unreal Tournament. Something happened to the genre between Quake III and Doom 3, and Id somehow didn’t take it into account. Call it braggadocio, or hubris, but Doom 3 is no longer the top dog in the FPS market.
Yes, it’s upsetting. I tried not to admit it either. But it’s undeniably true.
Some have even argued that Doom 3 is a step backwards in FPS gaming, that even when it hit the shelves we were already years past where it hoped to position itself.
The problem, it seems, lies at the core of where Doom came from, and the hopes we had for Doom 3. It was a tale of gameplay, graphics, and mistakes.
Zombie shuffle
We’re all familiar with the helter-skelter, breakneck, balls-to-the-wall pace that the original Doom set. So where is it in Doom 3? I can appreciate the slow zombie shuffle as much as the next guy, but when Halo’s swarming Flood existed years before Doom’s sequel, one has to ask why exactly we’re experiencing only one or two imps at a time.
Obviously, there’s a reason why we don’t have a dozen imps chasing us down a corridor, and I’m inclined to say that it’s because of the graphics engine. So much attention has been paid to rendering a realistic environment that there just isn’t a lot of room left for that many bad guys. This left the guys at Id with a bit of a conundrum: How could they still make the game tense and as terrifying as the originals?
The answer, evidently, is to have shit jump out of the dark at you.
Yes, I jumped. I was scared. And then I got tired. Tired of having secret panels open behind me after I’d already cleared the room of any possible beasts from hell, only to get clawed in the back. Who knew demons were capable of such stealth and chicanery? Hey, maybe I’ll open this door and--surprise!--here’s yet another instant 25 hit points shaved off my health because an imp was waiting patiently on the other side. This isn’t gaming. This isn’t the Id I know. This is scripted nonsense.
And yet, in the face of such scripted trickery, the A.I. proceeds to fall flat on its face when given an empty room and a box to hide behind. If it doesn’t have a gun, the A.I. just comes straight at you, trying to claw your eyes out. If it does have a gun, it hides behind corners and boxes, but since the game lacks headshots--something that has become so common in FPSs now that it’s no longer a boastable feature--it takes an implausible amount of time to dispatch them.
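To put the shallowness in programmer’s terms, the entire behavioral repertoire I just described fits in a two-branch state machine. This is my caricature of it, of course, not Id’s actual code:

```python
# A caricature of the enemy behavior described above -- my reading of the
# game, not Id's actual code. The whole repertoire fits in two branches.
from dataclasses import dataclass

@dataclass
class Enemy:
    has_gun: bool
    in_cover: bool = False

    def think(self) -> str:
        """Pick an action for this AI tick."""
        if not self.has_gun:
            # Melee monsters ignore cover entirely and beeline for you.
            return "charge_player"
        if not self.in_cover:
            # Armed enemies duck behind the nearest box or corner...
            self.in_cover = True
            return "take_cover"
        # ...then alternate between hiding and firing until whittled down.
        self.in_cover = False
        return "pop_out_and_fire"

imp = Enemy(has_gun=False)
commando = Enemy(has_gun=True)
print(imp.think())       # charge_player
print(commando.think())  # take_cover
print(commando.think())  # pop_out_and_fire
```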
Maybe I’m crazy, but I recall levels in the original Doom where you were downright encouraged to trick the A.I. into fighting itself. Yes, it was a primitive A.I., but I recall being impressed by it. Hell, even the famed Reaperbot for the original Quake is still 10 times more entertaining than fighting drones in Doom 3.
I guess what it all boils down to is the fact that the gameplay is just too simplified for the graphics. It’s hard to stomach having to shoot a zombie in the head the same number of times as in the body (six rounds from a pistol, thanks for asking) to dispatch it, when you can shoot a light fixture and watch how realistically light dances around the room.
And don’t try justifying it with "well, the zombies obviously aren’t powered by brains, because there’s some zombies without heads," because you can still kill them by shooting them in the jaw six times. I’m sure they’d also die if I shot them in the foot six times, but honestly, I’m just too bored with the game to even try.
Fight mechanics
The problem here is that you can’t justifiably give us Doom 3 fight mechanics (shoot at the sprite till it’s taken enough damage to die) and yet put us in a world that’s nearly as well rendered as our own reality. I’m willing to let the missing run-and-gun feel of the original Doom slide. I (along with the rest of the gaming community, evidently) just want to see some realistic damage models, and when you have competition like Half-Life 2, Far Cry, and the upcoming Unreal Tournament 3, you need to bring your A game and deliver.
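To make the damage-model complaint concrete, here’s a sketch of the difference. Every number except the six-pistol-rounds figure from above is invented:

```python
# Flat damage pool vs. locational damage. The hit points and multiplier
# are invented; the six-round pistol kill is the figure quoted above.

PISTOL_DAMAGE = 14
ZOMBIE_HP = 80  # hypothetical: six pistol rounds (6 * 14 = 84) finishes it

def rounds_to_kill_flat(hp: int, dmg: int) -> int:
    """Doom 3 style: head, jaw, or foot, every hit is worth the same."""
    return -(-hp // dmg)  # ceiling division

def rounds_to_kill_locational(hp: int, dmg: int, multiplier: float) -> int:
    """What the competition does: scale damage by hit zone."""
    return -(-hp // int(dmg * multiplier))

print(rounds_to_kill_flat(ZOMBIE_HP, PISTOL_DAMAGE))             # 6, anywhere
print(rounds_to_kill_locational(ZOMBIE_HP, PISTOL_DAMAGE, 3.0))  # 2, headshots rewarded
```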
With every iteration of the Doom/Quake series, Carmack and company have managed to set the bar for the industry, both graphically and in netplay. Yet half a year after release, Doom 3 is being surpassed in both. Its online game didn’t make even the tiniest ripple in the Internet pond. Ironically, these two items have always set the precedent for Id, and Carmack has even gone on record in the past as stating, “Who cares about single-player, it’s all about the multiplayer.” It used to be all about selling the engine, too, and now even that hope looks grim: Unreal Engine 3 is winning awards and accolades for its ease of use, and dominating the press coverage of who’s using what for their next-gen titles. No one talks about Id’s Doom 3 engine, and in years past, Id’s has been the engine to have.
I could go even further and discuss how banal the whole metal + hell aspect of it is. One can only take so much future-goth architecture overflowing with demons. It’s sufficiently played out. I’ve played through every Id title to date, and after 12 years of the same thing packaged in a new box, suddenly, I just don’t give a damn about hell anymore.
It’s sad, but in the end, Doom 3 will no doubt be forgotten in the annals of first-person shooters. Will Quake IV be able to pull Id out of the slump and restore its King of the Hill status? Not if Doom 3 is any indication…
Steve Bowler has been working in the videogame industry on and off for the past 10 years, currently working at Midway Games as a lead animator in Chicago. While his main passion is animation, occasionally he spouts opinions about games. Quite often, people disagree with him.
# posted by dark master : 7/11/2005 01:22:00 PM
Feds Fear Air Broadband Terror
Federal law enforcement officials, fearful that terrorists will exploit emerging in-flight broadband services to remotely activate bombs or coordinate hijackings, are asking regulators for the power to begin eavesdropping on any passenger's internet use within 10 minutes of obtaining court authorization.
In joint comments filed with the FCC last Tuesday, the Justice Department, the FBI and the Department of Homeland Security warned that a terrorist could use on-board internet access to communicate with confederates on other planes, on the ground or in different sections of the same plane -- all from the comfort of an aisle seat.
"There is a short window of opportunity in which action can be taken to thwart a suicidal terrorist hijacking or remedy other crisis situations on board an aircraft, and law enforcement needs to maximize its ability to respond to these potentially lethal situations," the filing reads.
The Justice Department hopes to do that with an FCC ruling that satellite-based in-flight broadband services are bound by the 1994 Communications Assistance for Law Enforcement Act, the federal law that required telephone companies to modify their networks to be wiretap-friendly for the FBI.
CALEA was originally passed to preserve the Bureau's ability to eavesdrop on telephone calls in the digital age. But last year the FBI and Justice Department persuaded the FCC to interpret the law so it would apply to internet traffic over cable modems and DSL lines. The FCC has already expressed the view that in-flight broadband would likely be covered as well.
The Justice Department is asking the commission to require that air-to-ground internet taps be equipped "forthwith, but in no circumstance more than 10 minutes" after the FBI requests them.
The filing comes as the FCC considers implementing a licensing scheme that would encourage more companies to enter the satellite-based in-flight broadband market. Currently, only Boeing is licensed to provide such services.
Boeing's Connexion system lets passengers plug in to a wired ethernet jack or connect wirelessly over 802.11b, and is available on select flights on a handful of international carriers, including Lufthansa, Singapore Airlines and Korean Air. No U.S. carrier has announced plans to offer the service.
In addition to seeking the rapid-tap technology, the Justice Department filing asks the FCC to require carriers to maintain fine-grained control over their airborne broadband links. This would include the ability to quickly and automatically identify every internet user by name and seat number, remotely cut off a passenger's internet access, cut off all passengers' access without affecting the flight crew's access, or redirect communications to and from the aircraft in the event of a crisis.
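In engineering terms, the filing asks carriers to maintain something like a per-seat session table with kill switches. Here is a purely hypothetical sketch of that control surface; every name and structure below is invented for illustration, not drawn from the filing:

```python
# Hypothetical sketch of the per-seat control the filing describes.
# All names and structure are invented for illustration.

class CabinBroadbandControl:
    def __init__(self) -> None:
        self.sessions: dict[str, str] = {}  # seat number -> passenger name
        self.crew_seats: set[str] = set()

    def register(self, seat: str, name: str, is_crew: bool = False) -> None:
        """Tie every internet user to a name and seat number."""
        self.sessions[seat] = name
        if is_crew:
            self.crew_seats.add(seat)

    def identify(self, seat: str) -> str:
        return self.sessions[seat]

    def cut_off_seat(self, seat: str) -> None:
        """Remotely terminate a single passenger's access."""
        self.sessions.pop(seat, None)

    def cut_off_passengers(self) -> None:
        """Kill all passenger links without touching the flight crew's."""
        self.sessions = {seat: name for seat, name in self.sessions.items()
                         if seat in self.crew_seats}
```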
Officials also expressed concern that terrorists might use in-flight broadband to remotely trigger a bomb hidden on a plane. They asked the FCC to keep such services from being accessible from the cargo hold of an aircraft.
"The ability to turn on a broadband-enabled communications device located on board an aircraft ... presents the possibility that either a passenger or someone on the ground could reliably remotely activate a broadband-enabled communications device in flight and use that device as an RCIED (remote-controlled improvised explosive device)," the filing says.
Forrester Research analyst Brownlee Thomas supports the Justice Department's proposal, but acknowledges it would raise the barrier to entry for companies wanting to enter the in-flight broadband market.
"It does favor the largest players in this space," says Thomas. "I would go so far as to suggest that I think it is the Justice Department's intention to ensure that the doors are not open too wide on this, for the requirement of national security.... That actually makes perfect sense."
Despite their safety concerns, federal agencies are generally bullish on airborne broadband, lauding its potential to enhance communications between the air and the ground during a crisis.
# posted by dark master : 7/11/2005 01:11:00 PM
Big Screen Viewing Effect For Mobile Phone Videos
France Telecom’s wireless unit, Orange SA, will soon roll out a new mobile video service that will let cellular phone subscribers view TV, movies, photos and broadband Internet content with a big-screen viewing effect using Kopin®-enabled video eyewear from U.S.-based MicroOptical Corp. Kopin Corp. (Nasdaq: KOPN), the largest U.S. manufacturer of microdisplays for mobile consumer electronics and military applications, has received an order for CyberDisplay® 230K microdisplays from MicroOptical for this application.
Orange SA, one of the world’s leading wireless companies with 52 million customers in 16 countries, will bundle MicroOptical’s binocular video eyewear with Samsung’s SGH-D600 cell phone as part of its new “Orange World” wireless multimedia service. The bundled package, unveiled in June 2005 at the European Research and Innovation Exhibition in Paris, is scheduled to be available to Orange subscribers in October 2005.
MicroOptical’s video eyewear contains two of Kopin’s full-color, QVGA-resolution (320 x 240) CyberDisplay 230K microdisplays. The sleek eyewear allows users to privately view large-size video or pictures equivalent to a 12-inch screen as seen from three feet away, yet simultaneously view their surroundings thanks to the small size of the frame and MicroOptical’s patented optics which allow the user to see around the screen. Europe’s AFP news wire service called the bundled technology “a sure fire hit,” saying that the eyewear’s “big screen effect” is stunning, especially when combined with built-in stereo earpieces.
“Kopin CyberDisplays are becoming the standard microdisplays of choice for mobile video applications thanks to their ability to provide the highest video quality in the smallest footprint and with very low power consumption,” said Dr. Mark Spitzer, MicroOptical’s founder and CEO. “We are very happy with our partnership with Kopin and really excited about being a part of Orange’s multimedia wireless service. We are ramping up the production to meet the initial customer demand.”
“The mobile video revolution is unfolding in the cellular phone market as we speak,” said Dr. John C.C. Fan, Kopin’s president and CEO. “Consumers want to be able to watch movies, music videos and TV, browse the Web and check their e-mail on their cell phones on the go. But the phone’s small screen has inhibited widespread consumer adoption. MicroOptical’s innovative video eyewear is enabling the big screen capabilities that consumers demand, and yet is very lightweight and similar to eyeglasses.”
The Kopin CyberDisplay 230K’s tiny size (0.24-inch diagonal) enabled MicroOptical to design a featherweight (2.5 oz.), comfortable and stylish video eyewear solution for Orange SA. MicroOptical’s binocular video eyewear delivers crisp, full-color video with a 17-degree field of view. The eyewear is connected to a cell phone through a thin cable, and allows up to five hours of video with three AAA batteries. Since it accepts composite video input (NTSC or PAL), the eyewear can be plugged into other devices with composite video outputs such as portable DVD players.
Built with nanotechnology, the CyberDisplay 230K with approximately 230,000 pixel dots in 0.24″ diagonal is the highest resolution transmissive display of its size. In addition to displaying standard text and graphics, the display operates at traditional video speeds and consumes only five milliwatts of power. Kopin’s power-efficient CyberDisplay 230K is ideal for a range of portable consumer and industrial applications such as video eyewear and viewfinders for digital cameras and camcorders.
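The headline numbers are easy to sanity-check: “230,000 pixel dots” is simply QVGA resolution counted in red, green and blue subpixels, and the 12-inch-screen-at-three-feet claim is in the same ballpark as the quoted 17-degree field of view. A back-of-envelope check:

```python
import math

# "230K" = QVGA resolution counted in color dots (R, G, B per pixel).
print(320 * 240 * 3)  # 230400 -- the "approximately 230,000 pixel dots"

# Apparent size: the angle subtended by a 12-inch diagonal seen from 3 feet.
diag_in, dist_in = 12, 36
angle = 2 * math.degrees(math.atan((diag_in / 2) / dist_in))
print(f"{angle:.1f} degrees")  # ~18.9 degrees, near the quoted 17-degree FOV
```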
# posted by dark master : 7/11/2005 11:40:00 AM
Inside the big switch: the iPod and the future of Apple Computer
If you've been following the Apple-to-Intel transition, you're going to want to read this whole article. Why? Because I'm going to do something that I almost never do: spill insider information from unnamed sources that I can confirm are in a position to know the score. Note that this isn't the start of some kind of new trend for me. It's just that all this information that I've been sitting on is about to become dated, so it's time to get it out there.
As I said in my previous post on the 970MP and FX unveiling, the new PowerPC processor announcements from IBM raise a number of questions about timing: When will these parts be available? How long has IBM been sitting on them? Why the apparently sudden leap in performance per watt on the same process, after a year with so little improvement?
The announcements also raise serious questions about why, if these great parts were just around the bend, did Apple really jump ship for Intel? Was it performance, or performance per watt, as Jobs claimed in his keynote speech, or were there other, unmentioned factors at work?
I have some answers to those questions, and I'll pass them along below. However, those answers come complete with their own vested interests, so feel free to interpret them as you will.
First, let's talk about the broken 3GHz promise. It's apparent in hindsight that 3GHz on the 970 was never going to happen on a 90nm process without lengthening the 970's pipeline, which is a fairly significant change. Who knows why IBM promised Jobs 3GHz? All I do know is that despite the objections of some within IBM the company tried to hit that target without the needed pipeline change, and missed it.
The laptop G5, which is the long-rumored and now-announced 970FX, has supposedly been ready to go into an Apple laptop since at least early last month. And for what it's worth, yes, Apple was offered the Cell and other game console-derived chips. In fact, IBM routinely discloses its entire PowerPC road map to Apple, so pretty much anything PPC that IBM puts out is not only not a surprise to Apple, but it's potentially available for Apple's use.
So why didn't Apple take any of these offers? Was it performance, as Jobs claimed in his keynote? Here's something that may blow your mind. When Apple compiles OS X on the 970, they use -Os. That's right: they optimize for size, not for performance. So even though Apple talked a lot of smack about having a first-class 64-bit RISC workstation chip under the hood of their towers, in the end they were more concerned about OS X's bulging memory requirements than they were about The Snappy(TM).
One of the major factors in the switch was something that's often been discussed here at Ars and elsewhere: Apple's mercurial and high-handed relationship with its chip suppliers. I've been told that the following user post on Groklaw is a fairly accurate reflection of the bind that Apple put itself in with IBM:
I've worked with Apple
Authored by: overshoot on Sunday, June 12 2005 @ 08:56 PM EDT
and I can tell you, there's a very good chance that they outsmarted themselves into a "no bid" response from IBM.
Part of Apple's longstanding complaint against IBM was that Apple would announce a new computer with a new IBM processor, sales would skyrocket, and IBM wouldn't have adequate supply. We've all heard the story. Here's my take:
Apple negotiates for a new processor chip. Being Apple, they want "most favored customer" treatment, with fab-fill margins for the vendor. What's more, they want this for what amounts to a custom processor chip, so any oversupply will just sit on the shelf until Apple decides they want them, and sometimes Apple will let them sit a while to see if they can get a price break -- it always pays to remind the world that one is, after all, the Steve Jobs.
With terms like that, custom chip vendors only start as many lots as the customer contracts to accept right off the line. Apple, not exactly rolling in cash, isn't going to highball that estimate. In fact, they play it conservative and only order a small startup batch. The rest follows, of course: the product sells, Apple orders more to cover the demand, and IBM tells them that processors have a 6-month lead time.
Apple complains publicly about IBM (does this sound like anyone we know?). IBM, being grown-ups, doesn't say anything that might be perceived as negative about a customer.
Lather, rinse, repeat.
Well, time goes by and IBM has other customers who actually pay up front for custom designs and who don't insist on having IBM tailor their product roadmap around a few million units a year. Apple again demands that IBM dedicate their CPU design teams to making an Apple special that will never generate much revenue. If IBM won't play, Apple will go to Intel.
IBM does a Rhett Butler, and the rest is history. Note that you aren't hearing one way or the other from IBM on this story.
Class bunch, IBM.
Apple has been pulling these stunts for a long time, as anyone who followed the company's relationship with Motorola knows. Compare the quote above to the following selection from a five-year-old Paul DeMone article describing Apple's dysfunctional relationship with Motorola and the reasons for Motorola's long clockspeed stagnation:
In many ways Apple is the author of its own misfortune. Years of work and billions of dollars of investments are required to design, manufacture and maintain the competitiveness of a family of microprocessors for the desktop computer market. Time and time again Apple has changed business strategies abruptly, only to reverse itself again a short while later in ineffective attempts to stem its gradual but consistent losses in market share. The PowerPC semiconductor partners, Motorola in particular, has written off hundreds of millions of dollars in losses caused directly by the erratic actions of Apple Computer, such as encouraging and later crushing a nascent market for Macintosh clones. The mercurial nature of its primary customer, combined with its minuscule and generally diminishing share of the desktop computer market, have meant that at least the last two generations of PowerPC processors have been designed primarily with embedded control, and more recently, digital signal processing applications in mind. This has left Apple in the position of only being able to differentiate itself on the basis of curved system form factors and translucent plastic.
So this behavior has been going on for years and has spanned multiple CPU suppliers. The only thing that's different now is that the Mac is no longer the foundation for Apple's future growth.
For the real reason behind the switch, you have to look to the fact that it's the iPod and iTMS—not the Mac—that are now driving Apple's revenues and stock price. As I stated in my previous article on the switch, Apple is more concerned with scoring Intel's famous volume discounts on the Pentium (with its attendant feature-rich chipsets) and XScale lines than it is about the performance, or even the performance per Watt, of the Mac.
It's critical to understanding the switch that you not underestimate the importance of Intel's XScale to Apple's decision to leave IBM. The current iPods use an ARM chip from Texas Instruments, but we can expect to see Intel inside future versions of the iPod line. So because Apple is going to become an all-Intel shop like Dell, with Intel providing the processors that power both the Mac and the iPod, Apple will get the same kinds of steep volume discounts across its entire product line that keep Dell from even glancing AMD's way.
If you think XScale is too powerful for the iPod—it's used in powerful color PDAs—then you're not taking the device seriously enough as a portable media platform. The XScale is plenty powerful enough to do video playback, and I have reason to believe that Apple is currently working on a video iPod to counter the Sony PSP. (My guess is that we might even see it in time for Christmas.) When the video iPod hits the streets, Apple will have an iPod product that plays each of the media formats (music, pictures, video) represented in its iLife suite.
The cold, hard reality here is that the Mac is Apple's past and the iPod is Apple's future, in the same way that the "PC" is the industry's past and the post-PC gadget is the industry's future. This transition mirrors the industry's previous transition/expansion from the mainframe to the networked commodity PC—a transition that is still ongoing in some sectors of the market. Of course the PC will stick around, but as the hub of a growing and increasingly profitable constellation of post-PC gadgets. It's a shame that Steve Jobs can't be upfront with his user base about that fact, because, frankly, I think the Mac community would understand. The iPod and what it represents—an elegant, intuitively useful, and widely appealing expression of everything that Moore's Curves promise but so rarely deliver—is the "Macintosh" of the new millennium. There was no need to put on a dog and pony show about how IBM has dropped the performance ball, when what Jobs is really doing is shifting the focus of Apple from a PC-era "performance" paradigm to a post-PC-era "features and functionality" paradigm.
# posted by dark master : 7/11/2005 09:30:00 AM
Fingerprint Recognition with Linux & IBM's T42
"UPEK, provider of popular fingerprint sensors to IBM's T42 notebooks and others, has announced that they will be providing a BioAPI compliant library to perform biometric authentication under GNU/Linux. Will Linux be the first operating system to have integrated biometric user authentication 'out of the box'?"
# posted by dark master : 7/11/2005 09:27:00 AM
T-43 Hours and Counting
"As seen on NASA TV, for the first time in over two years, the countdown clock has started at 6:00 PM EDT for the Wednesday 3:51 PM EDT launch of Space Shuttle Discovery on the first of the return to flight test flights. The launch is not for certain due to weather issues associated with hurricane Dennis. Currently it is estimated for a 70% chance of launch on Wednesday, with the chances lowering later in the week. If you are confused on how T-43 hours equals almost 3 days, perhaps you should read Countdown 101."
# posted by dark master : 7/11/2005 09:26:00 AM
Half Life 2 Talk
Listening to Bill Van Buren talk about Half Life 2, I realised a key reason for its excellence: it shows you the story rather than telling it, just as a good author shows you scene details rather than spelling them out. It doesn't parade the story in a cut-scene but rather puts you right in the middle of it.
It's little surprise that only Valve have really gone down this path properly, as it clearly took a lot of work to make the "cut-scenes" unbreakable by the player. The powerful scripting system often allowed the designers to create scenes without the assistance of animators or storyboards - they just threw together a rough cut with existing animations and rough voice-over files (apparently Marc Laidlaw created some great ones, so much so that they were tempted to leave in his Father Grigori).
As you may be aware, they spent a lot of time getting eyes right - how they focus and even how the eyelids dip when looking down. They also used real people as character references (I wish I had a photo of the slide; it was really interesting to see the comparisons), though they ended up stylising them somewhat, as having them too realistic was "just creepy", as Bill put it. They're continuing to move forward in the area of facial animation and have even hired Bay Raitt, who worked on Gollum's facial animation.
Their character animation system is particularly impressive too - at one point Eli Vance was running, looking to the side and typing (!), all blended in real time. To create a scripted scene you layer things (an eyebrow movement here, a wave there and so on) and adjust line graphs to alter movement intensity. It's all extremely intuitive-looking stuff, so the designers can more easily get on with making the game.
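That layering boils down to a simple idea: each layer contributes a weighted offset on top of the base pose, and the designer's line graphs drive the weights over time. A minimal sketch with invented names, not Valve's actual system:

```python
# Minimal sketch of layered animation blending -- invented names, not
# Valve's actual system. A "pose" here is just joint -> angle in degrees.

Pose = dict[str, float]

def blend(base: Pose, layers: list[tuple[Pose, float]]) -> Pose:
    """Add each layer's offsets to the base pose, scaled by its weight.

    The weight is what the designer's intensity curve ("line graph")
    evaluates to at the current time.
    """
    out = dict(base)
    for offsets, weight in layers:
        for joint, delta in offsets.items():
            out[joint] = out.get(joint, 0.0) + weight * delta
    return out

run = {"hips": 10.0, "head_yaw": 0.0}
look_aside = {"head_yaw": 45.0}  # layered on the run, like Eli's glance
wave = {"arm_raise": 80.0}       # and a wave on top of that
print(blend(run, [(look_aside, 1.0), (wave, 0.5)]))
# {'hips': 10.0, 'head_yaw': 45.0, 'arm_raise': 40.0}
```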
One thing I didn't realise was that Half life 2 rewarded the inquisitive - players who looked around not only saw newspaper clippings and photos but in doing so triggered revealing comments from other characters.
Someone pointed out how much time was spent alone in Half Life 2. Bill replied that they were aware of this and were working on keeping NPCs with you for more of the time in Aftermath. This brings with it the problem of ever-present characters becoming irritating, but they're aware of that and working to address it so they're helpful rather than annoying.
One final interesting detail - they narrowed the field of view from 90° to 75° in Half Life 2, narrowing it even further to around 50° during the final cut-scene with Breen.
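Narrowing the FOV reads on screen as a zoom: apparent magnification scales with the tangent of half the field of view. A quick check of what those numbers mean (treating them all as horizontal FOV):

```python
import math

def zoom_vs_90(fov_deg: float) -> float:
    """Apparent magnification relative to a 90-degree field of view."""
    return math.tan(math.radians(45)) / math.tan(math.radians(fov_deg / 2))

print(f"{zoom_vs_90(75):.2f}x")  # ~1.30x -- the in-game narrowing
print(f"{zoom_vs_90(50):.2f}x")  # ~2.14x -- the final Breen scene
```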
It's pretty evident just how much attention Valve pay to details and how eager they are to keep moving forward with new ideas. Aftermath can't come soon enough.
# posted by dark master : 7/11/2005 09:25:00 AM
Researchers explore whether parrot has concept of zero
A bird may have hit on a concept that eluded mathematicians for centuries—possibly during a temper tantrum.
July 2, 2005 Special to World Science
Researchers are exploring whether a parrot has developed a numerical concept that mathematicians failed to grasp for centuries: zero.
Alex, a Grey parrot (Psittacus erithacus). (Courtesy Jenny Pegg)
Oddly, it seems he may have achieved the feat during a temper tantrum, the scientists say.
Although zero is an obvious notion to most of us, it wasn’t to people long ago. Scholars say it came into widespread use in the West only in the 1600s; India had it about a millennium earlier.
Yet Alex, a 28-year-old Grey parrot, recently began—unprompted—using the word “none” to describe an absence of quantity, according to researchers at Brandeis University in Waltham, Mass.
Alex thus possesses a “zero-like concept,” wrote the scientists.
Years earlier, Alex had been taught another meaning of “none,” as a lack of information, they added. But his feat was to extend the concept to a context involving numbers, during a test of his counting skills.
The researchers, Irene Pepperberg and Jesse Gordon, described the findings in the May issue of The Journal of Comparative Psychology, a research journal.
Alex’s apparent insight into nothingness doesn’t necessarily extend to other arithmetical talents, the researchers noted; they found these to lag in some respects behind those of young human children.
The scientists also said it will take further study to determine whether Alex—who has been the subject of intelligence and communication tests throughout his life—really understands zero.
Zero and none “are not identical,” Pepperberg wrote in a recent email. But since Alex never learned “zero,” the researchers said, it’s impressive that he started using a word he knew to denote something like it: an absence of a quantity.
Also unclear, though, was whether by “none” he meant no colors, no objects or something else.
“We just started yet another series of experiments to see if he can easily be trained to understand that ‘none’ can be used for true zero,” Pepperberg said via email. It looks like he can, she added, but it’s “far too early to make serious claims.”
Chimps and possibly squirrel monkeys show some understanding of zero, but only after training, the researchers said. So Alex’s feat is the first time this has been documented in a bird, “and the first time it occurred spontaneously,” Pepperberg said via email.
But the achievement didn’t come without a few bumps.
The story began when researchers started testing Alex to see whether he understood small numbers, between one and six. Zero wasn’t expected of him. The researchers would lay out an array of objects of different colors and sizes, and ask questions such as “what color four?”—meaning, which color are the objects of which there are four.
Alex performed well on this, with no training, for dozens of trials, the researchers recounted. But then he balked. Alex started ignoring questions, or giving wrong answers, seemingly deliberately. He seemed to enjoy the experimenters’ frustrated reactions, they said.
There was evidence, they added, that his stubbornness stemmed from boredom with the rewards he had been getting for right answers. The researchers found some more interesting toys to give as rewards. After two weeks of obstructionism, Alex grudgingly returned to the game, though he occasionally seemed to lapse back.
One of these apparent lapses occurred one day when an experimenter asked Alex “what color three?” Laid out before Alex were sets of two, three and six objects, each set differently colored.
Alex insisted on responding: “five.” This made no sense given that the answer was supposed to be a color.
After several tries the experimenter gave up and said: “OK, Alex, tell me: what color five?”
“None,” the bird replied. This was correct, in that there was no color that graced exactly five of the objects. The researchers went on to incorporate “none” into future trials, and Alex consistently used the word correctly, they said.
“We cannot determine what cognitive process led to this behavior,” the researchers wrote. “We suggest only that his action, occurring soon after a period of noncompliance, resulted from a lack of interest in the given task and was a possible attempt to make the procedure more challenging.”
In the future, the researchers said they want to test Alex for his ability to add and subtract small quantities, including possibly zero.
As they investigate whether Alex really understands zero, they will also have to untangle the meanings of “none” and “zero.”
Merriam-Webster’s online dictionary defines zero as follows: “the arithmetical symbol… denoting the absence of all magnitude or quantity,” or “the number between the set of all negative numbers and the set of all positive numbers.” The entry continues with several more definitions.
By contrast, the dictionary defines “none” as not any, not one, nobody, not any such thing or person, no part, or nothing.
Of course, these words may well mean different things to the authors of a dictionary, and to a parrot.
A related question is the history of both words. “None” seems to be older than “zero.”
Zero was common in the West only from the 1600s on, though similar concepts appeared earlier in fits and starts, according to J.J. O’Connor of the University of St. Andrews in St. Andrews, Scotland.
In pre-zero times, O’Connor wrote in an online essay, some mathematicians tied themselves in knots to solve problems that would have been much easier with a zero. But ancient peoples as a whole probably didn’t think of it because they didn’t need it: “If ancient peoples solved a problem about how many horses a farmer needed,” he wrote, “then the problem was not going to have 0 or –23 as an answer.”
“None” is considerably older than “zero” in Western cultures. It’s related to neinn—an early medieval Viking word—and is similar to the still older Latin word noenum, meaning “not one,” according to the Online Etymology Dictionary.
Whatever the etymological roots of Alex’s utterances, his performance has its limitations, the researchers said. Several years ago, they tried to teach him to recite a number line by presenting written numerals on their own, without reference to groups of items. Alex performed rather poorly. Schoolchildren, by contrast, can usually learn this fairly easily.
Thus Alex’s apparent insight into zero doesn’t necessarily reflect across-the-board mathematical brilliance. Alex’s abilities might parallel those of children “who have trouble learning language and counting skills,” the researchers wrote.