Monday, September 19, 2005
Multithreaded Game Scripting with Stackless Python
The very first problem that the creator of a game engine faces when it comes to scripting is the choice of language. Of course there is the option of creating a custom language for the purpose, but with the plethora of scripting languages available today it rarely makes sense to go through the full development process of a custom one. No matter how big the development team is, it will never match the man-months that have gone into an existing scripting language.
I will try to introduce the scripting language Python, and more specifically a special version of it, Stackless Python, as I used it in the implementation of the Sylphis3D game engine. Stackless is a modified version of mainstream Python based on the work of Christian Tismer. The main feature that makes Stackless so attractive as a scripting language is its support for tasklets. Tasklets make it possible to create "micro-threads", allowing the programmer to switch between several execution threads that exist only in the Python environment and have no dependencies on the underlying OS threads. Some would call these "green threads". These threads have a very small footprint in memory and CPU: you can create hundreds of them with almost no overhead, since every tasklet costs only a few bytes of memory, and scheduling takes O(1) time with a simple round-robin algorithm. With native threads we would pay almost 1MB of memory per thread and high scheduling costs for things we don't need. Add to that that the engine would behave very differently on different operating systems, or even on different versions of the same operating system.
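The tasklet idea can be approximated in plain Python with generators, which makes a useful mental model even though real Stackless tasklets are more capable. The sketch below uses invented names, not the Stackless API; it shows round-robin scheduling over a deque, where every context switch is an O(1) pop and push:

```python
from collections import deque

class Scheduler:
    """A minimal round-robin scheduler for cooperative "micro-threads".

    Each tasklet is an ordinary generator; yielding hands control back
    to the scheduler, which resumes the next tasklet by popping it off
    a deque. Real Stackless tasklets are full coroutines, but the
    scheduling idea is the same."""

    def __init__(self):
        self.ready = deque()

    def spawn(self, gen):
        self.ready.append(gen)

    def run(self):
        while self.ready:
            tasklet = self.ready.popleft()
            try:
                next(tasklet)               # run until the tasklet yields
                self.ready.append(tasklet)  # reschedule at the back
            except StopIteration:
                pass                        # tasklet finished; drop it

trace = []

def worker(name, steps):
    for i in range(steps):
        trace.append((name, i))
        yield                               # cooperative context switch

sched = Scheduler()
sched.spawn(worker("a", 2))
sched.spawn(worker("b", 2))
sched.run()
# trace interleaves the two workers: a0, b0, a1, b1
```

Spawning a "thread" here is just appending a generator to a deque, which is why hundreds of them cost essentially nothing.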
Game Engine, the game’s operating system
In most of today's game engines there is a fundamental flaw in the way the engine and the game code are viewed. The game engine is not treated as a concrete system capable of providing the low-level facilities a game needs. Instead it is viewed, in the best case, as a library: a collection of functions that encapsulates the low-level hardware driving. This leads to a situation where the game code is aware of the underlying procedures of the game engine.
Instead, the game engine should be the operating system of the game, and the game entities should be like the processes running in it. The engine designer should keep this in mind when creating any aspect of the engine. It becomes obvious that to accomplish this the game code should run at a different "level" than the actual engine, so that the engine has complete control over the execution of the game code. In real operating systems this is accomplished with support from the hardware. In our case this different "level" is simulated by running the game code in a virtual machine; hence the scripting language.
Trying to create such an analogous design for the game engine, we have to consider what the analog of an operating system process will be. In a game engine a process will be an "actor", the fundamental game-creation block. The design tries to make everything an actor: the player is an actor, the objects in the game are actors, and even the game world and the game itself are actors. These actors are autonomous entities actually running in the game engine. An actor must be totally encapsulated and never has to worry directly about other actors.
Multithreaded design
The nature of a game is multithreaded, because the game tries to simulate a world populated by objects that perform various tasks in parallel. Obviously this must be simulated, since we don't have as many processors as objects in a game! Most engines simulate it by iteratively calling an update function on the actors in the world: each actor updates its state according to the time passed and returns to the engine. Game engines with no scripting capabilities use this method, and it has been carried over as a legacy into the scripting environments of newer engines. Most engines with scripting go only one step further and implement the update() methods in the scripting language. This doesn't really give us much. It would be a waste not to exploit the fact that the game code runs in a virtual machine and hide that implementation detail. The actor should run in its own context. In Sylphis every actor can have at least one thread, just as an operating system process does.
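To picture the difference, here is a hypothetical sketch (not the Sylphis API) of an actor whose whole behavior is one linear routine; the engine advances every such routine once per frame, instead of calling an update() that must reconstruct its state from scratch each time:

```python
def door_actor(log):
    # The actor's whole life is one linear routine; each `yield`
    # returns control to the engine so the other actors can run
    # during this frame.
    log.append("opening")
    yield
    log.append("open")
    yield
    log.append("closing")
    yield
    log.append("closed")

log = []
door = door_actor(log)
for _frame in range(4):     # the engine's main loop steps every actor
    next(door, None)        # default=None swallows StopIteration at the end
# log == ["opening", "open", "closing", "closed"]
```

There is no `state` field and no switch over states; the "state" is simply where the routine is currently suspended.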
In a multithreaded design the actor's code is linear in execution and easier to write. For example, let's consider the code that closes a door in a game. In a classic game engine (Quake I/II/III, Unreal, etc.) the code looks something like this:
void update(t_actor *actor, double timedelta){
    if(actor->state == CLOSING){
        CVector3 pos = actor->origin;
        pos = pos + actor->velocity * timedelta;
        if(pos == actor->restpos){   /* reached the fully closed position */
            actor->state = CLOSED;
            return;
        }
        actor->origin = pos;
    }
}
Although the above example is far from the complexity of real code, which would have to handle all the states and so on, it is still ugly. Below is the code that does the same thing using threads, written in Python:
def close(self):
    self.setVelocity(self.mMoveVelocity)
    self.sleep(self.mCloseMoveTime)
    self.setVelocity(CVector3.ZERO)
The benefits are obvious. In the Python example one can tell at a glance what the code does. There is no need for state keeping: the state is encapsulated in the code currently executing. While the door's close() method runs, the door is in the closing state; when it returns, the door is in the closed state.
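A cooperative sleep() like the one above can be sketched by having each actor yield the amount of game time it wants to sleep, while the scheduler advances a simulated clock. All names here are invented for illustration; the real engine's scheduler is more involved:

```python
import heapq

class Clock:
    """Simulated game time: actors yield a delay, and the scheduler
    wakes them when that much game time has passed."""

    def __init__(self):
        self.now = 0.0
        self.sleepers = []   # (wake_time, seq, generator) min-heap
        self.seq = 0         # tie-breaker so generators never compare

    def spawn(self, gen):
        self._schedule(gen, 0.0)

    def _schedule(self, gen, delay):
        heapq.heappush(self.sleepers, (self.now + delay, self.seq, gen))
        self.seq += 1

    def run(self):
        while self.sleepers:
            wake, _, gen = heapq.heappop(self.sleepers)
            self.now = max(self.now, wake)   # advance game time
            try:
                delay = next(gen)            # actor yields its sleep time
                self._schedule(gen, delay)
            except StopIteration:
                pass                         # actor's routine finished

events = []

def close_door(clock):
    events.append(("start_closing", clock.now))
    yield 2.0                                # like self.sleep(mCloseMoveTime)
    events.append(("closed", clock.now))

clock = Clock()
clock.spawn(close_door(clock))
clock.run()
# events == [("start_closing", 0.0), ("closed", 2.0)]
```

Note that no wall-clock time passes at all: game time jumps straight to the next wake-up, which is exactly the separation of game time from real time discussed later.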
Why multithreaded design was avoided
Multithreaded environments can be a headache. Experienced programmers know this and try to avoid threads, while inexperienced programmers find them quite attractive and usually make a mess of their applications. It all boils down to synchronization. Synchronization of threads can be very hard to get right and is fertile ground for a large number of bugs. Add to that that races and bugs in threaded software can be extremely hard to hunt down, since we almost enter the zone of non-determinism. The efficiency of threads is also a concern. A game must be fast: the game world consists of many actors that need to be updated at least every frame. You don't want a scheduler eating half your CPU deciding which actor to run next as the number of actors grows. And if you have to spawn and delete bullet actors coming from a fast machine gun, you will soon be looking at thread pools and similar techniques, since spawning a thread can take a long time.
To sum up, below is a list of reasons why multithreaded environments were overlooked by game developers:
- Scheduling overhead
- Memory cost per thread
- Inefficient thread creation
- Synchronization problems
- More bug-prone
- Difficult to debug
Overcoming multithreaded design pitfalls
As we said before, using Stackless Python and tasklets solves the scheduling overhead, the memory cost, and the slow thread creation. Granted that, we have a foundation to work on, but the remaining problems, like synchronization, stay. We have to find ways to keep the benefits of the multithreaded environment while removing the problems.
What makes multithreading synchronization problems big, and races common, is preemption. Preemptive multithreading introduces non-determinism into the code execution that requires the programmer to cover his back while doing anything: you can be interrupted at any time, in any part of the code. Preemptive multithreading might be great for interactivity on the desktop, but it gives nothing to game code. It turns out that not only is there no need for it, it would not even fit in a game environment. To make that clear we have to look more closely at what a game engine does. The game engine simulates a world different from the world the computer is running in. This other world has its own time, time that ticks away in specific steps, either constant or variable. Time in the game world does not pass while the game code executes. So why would one want to preempt an actor running some calculations? There is no rush to put another actor to run, since time does not pass! To put it another way, it would be wrong, since actors with more complex behaviors would progress slower in the game world because they take more CPU. The actor's code executes in our world, while the actor runs in the game world. The only solution is non-preemptive multithreading. This necessity also solves many synchronization problems: a non-preemptive environment rarely needs locks and can be deterministic. In a non-preemptive environment a semaphore is just a variable, and there is no need to lock between reading a variable and updating it.
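The claim that synchronization degenerates to plain variables can be illustrated with a cooperative round-robin runner: the read-modify-write below needs no lock, because a context switch can only happen at an explicit yield (a sketch, not engine code):

```python
from collections import deque

counter = {"value": 0}

def incrementer(n):
    for _ in range(n):
        # Read-modify-write without a lock: safe, because no other
        # micro-thread can run between these two lines -- switches
        # happen only at the explicit yield below.
        value = counter["value"]
        counter["value"] = value + 1
        yield

def run_round_robin(tasks):
    ready = deque(tasks)
    while ready:
        task = ready.popleft()
        try:
            next(task)
            ready.append(task)
        except StopIteration:
            pass

run_round_robin([incrementer(1000), incrementer(1000)])
# counter["value"] is deterministically 2000, with no locking anywhere
```

With preemptive OS threads the same unguarded pattern would be a textbook race; here it is deterministic by construction.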
Handling events
The next big issue for game code is passing and handling events. Most of the work in a game concerns passing events around, so the engine must provide facilities for efficient and flexible event passing. Using Python alone is very helpful here, thanks to the dynamic nature of the language: the event framework can be very relaxed and simply provide the basic functionality to pass objects around. In Sylphis the event framework is based around the actor. When the game engine spawns an actor, it scans the actor's methods for a handleEvent() method. If the actor implements it, the engine creates an event-handling thread that calls handleEvent(), passing the event object, which can be anything. This creates an asynchronous event-passing system.
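The spawn-time scan for handleEvent() can be sketched with Python introspection. The class and method names below mirror the text but are otherwise hypothetical, and delivery is shown synchronously rather than on a separate event thread:

```python
class Engine:
    def __init__(self):
        self.handlers = []

    def spawn(self, actor):
        # At spawn time, look for a handleEvent() method; only actors
        # that implement one are registered for event delivery.
        if callable(getattr(actor, "handleEvent", None)):
            self.handlers.append(actor)

    def post(self, event):
        # The event object can be anything: a string, a collision
        # object, an arbitrary Python value.
        for actor in self.handlers:
            actor.handleEvent(event)

class Door:
    def __init__(self):
        self.seen = []

    def handleEvent(self, event):
        self.seen.append(event)

class Decoration:
    pass        # no handleEvent(): never receives events

engine = Engine()
door = Door()
engine.spawn(door)
engine.spawn(Decoration())
engine.post("collision")
# door.seen == ["collision"]; the Decoration was silently skipped
```

Because the check happens at spawn time, actors opt in to events simply by defining the method; no registration boilerplate is needed in game code.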
Using Python in this implementation allows us to build constructs that further beautify the game code. Consider the door we discussed above. Suppose now that we want the door to respond to collisions while closing and stop its movement, so that it does not crush the colliding actor. The usual way to handle such cases is to implement a callback function that the engine calls when a collision happens, and then change to the appropriate state. But this becomes more straightforward if we see collisions as language exceptions. The micro-thread implementation allows the engine to raise exceptions on threads, so we process the events in handleEvent() and, if we have an interesting collision, raise the collision object as an exception on the main actor thread or any other appropriate thread. Below is the modified close() method from above, enhanced to stop when colliding:
def close(self):
    try:
        self.setVelocity(self.mMoveVelocity)
        self.sleep(self.mCloseMoveTime)
    except CCollition, col:
        print "Oops.. sorry", col.name
    self.setVelocity(CVector3.ZERO)
The above code feels natural. The collision is an exception in the door's movement, and it should be modeled as one. You can actually explain this code to the non-programmers on your team and be optimistic that they will understand it. They may even start tweaking it and create interesting stuff. This is because the code is laid out the way people think of the procedure; they don't have to mess with complex state machines even for the simplest task.
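With generator-based micro-threads the same trick can be sketched using throw(), which raises an exception at the exact point where the thread is suspended. CCollision below is a stand-in for the engine's collision class:

```python
class CCollision(Exception):
    """Stand-in for the engine's collision object."""
    def __init__(self, name):
        self.name = name

log = []

def close_door():
    try:
        log.append("moving")
        yield                    # suspended here while the door moves
        log.append("finished move")
    except CCollision as col:
        log.append("bumped into " + col.name)
    log.append("velocity zeroed")   # runs on both the normal and collision path

door = close_door()
next(door)                           # the door starts closing
try:
    door.throw(CCollision("player")) # engine injects the collision mid-sleep
except StopIteration:
    pass                             # the routine ran to completion
# log == ["moving", "bumped into player", "velocity zeroed"]
```

The exception lands inside the suspended routine's own try block, so the cleanup code after the handler runs regardless of how the move ended, just as in the door example above.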
Actions
The work that goes into setting up a game world usually involves connecting actors with event dependencies. For example, we would like to set up a door so that once it is opened an alarm goes on. This should be easy to set up from inside the world editor and easy for the artists to understand and use. Most game engines today support this through special trigger actors that are capable of triggering other actors when an event occurs. Although this works most of the time, it is not very flexible, since in the end there is only one trigger signal. Every actor has to make good use of it, and even then it will not be enough. A door, for example, will probably open on a trigger event, or close if it was open. What happens when we want to trigger the door to do something different, like unlock? Then we have to start writing special-case code to support it, and the door is just a simple case. To take it a bit further, suppose we need to make the player's AI teammate say "Good job" when the door is unlocked. It is clear that a system that allows all this at the world-editor level is necessary.
So, in addition to events, let's introduce another concept: the action. Below are the basic properties of actions:
- The action can be the effect of an event, or the effect of another action.
- An actor can have several actions.
- An action is a concrete and discrete procedure that the actor can follow.
- An actor can be taking an action or not. There is no middle state.
- One can monitor the actions an actor is taking.
- It is possible to make an actor follow an action.
The actors must be designed around the action concept. Below we can see the various actions some actors have.
CDoor
- Open
- Close
- Lock
- Unlock
- Toggle
CSpeaker
- Play
- Pause
- Stop
- Rewind
It would be nice if we could bring this conceptual level down to the programming level. Programming the above actions with regular state machines and other usual methods is like trying to implement UML class diagrams in C: it can be done, but it is not going to be as smooth as with an object-oriented language. What we need is a language that supports action-oriented programming. It turns out that Python can support this model through the use of threads and the dynamic capabilities of the language.
Let's examine in more detail what we are trying to design, at the highest level. An actor can have a number of actions, which are basically implemented as methods of the actor class. It is a good idea to give these methods a special name, so that they can be distinguished from normal class methods. For example, the action method for when a door opens is named like this:
def Action_OnOpen(self, other):
    pass
Now the world editor should give the artist the ability to connect actions. For example, the artist might connect the OnOpen action of a door with the OnTurnOn action of a light. The engine will then dynamically intercept calls to the door actor's Action_OnOpen() method and also call the light actor's Action_OnTurnOn() method. This interception is possible thanks to the dynamic nature of Python. When the game engine spawns an actor in the game world, it looks at the action-connection table that the world editor provides and replaces the appropriate methods with an interceptor method that calls the original method plus the target action method on the target actor.
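The interception can be sketched in a few lines. connect_action() below is a hypothetical helper; in the real engine the world editor's connection table drives this at spawn time:

```python
def connect_action(source, action_name, target, target_action_name):
    """Replace source.<action_name> with a wrapper that calls the
    original method and then forwards to target.<target_action_name>.
    Hypothetical helper illustrating the interception idea."""
    original = getattr(source, action_name)       # bound original action
    forwarded = getattr(target, target_action_name)

    def interceptor(other=None):
        original(other)        # the door still does its own work...
        forwarded(source)      # ...and the light reacts to the action

    # Shadow the class method on this one instance only.
    setattr(source, action_name, interceptor)

class Door:
    def __init__(self):
        self.log = []

    def Action_OnOpen(self, other):
        self.log.append("door opened")

class Light:
    def __init__(self):
        self.log = []

    def Action_OnTurnOn(self, other):
        self.log.append("light on")

door, light = Door(), Light()
connect_action(door, "Action_OnOpen", light, "Action_OnTurnOn")
door.Action_OnOpen(None)
# door.log == ["door opened"], light.log == ["light on"]
```

Because the interceptor is installed per instance, other doors in the world keep their unconnected Action_OnOpen() untouched.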
It has to be clear that these are not dummy methods used just for signaling; they can be the methods that do the actual work. To clarify, remember the close() method of the door we described earlier. That method can be turned into an action just by renaming it and adding the extra parameter:
def Action_close(self, other):
    try:
        self.setVelocity(self.mMoveVelocity)
        self.sleep(self.mCloseMoveTime)
    except CCollition, col:
        print "Oops.. sorry", col.name
    self.setVelocity(CVector3.ZERO)
What we have now is an action method that can be connected to other actors to make them act on the event of the door closing. The action can be connected in the other direction also, to make the door close when something happens to another actor. Connecting the OnDie() of the player to the close() of the door will make the door close when the player dies, while connecting the close() of the door to the OnDie() of the player will make the player die when the door closes.
source:http://harkal.sylphis3d.com/2005/08/10/multithreaded-game-scripting-with-stackless-python/
With this system actors gain the ability to broadcast actions about anything, and to listen for the actions of other actors, all without changing the actor's code. Everything is set up in the world editor and linked dynamically when the actors are initialized.
Actor hibernation and migration
The micro-threads implemented in Stackless Python allow us to save the state of an actor without any hassle. Stackless can pickle (serialize) tasklet objects like normal objects. This means you can suspend a running thread and get a string representation of it, with all its running state included, just like a normal object. That string can be saved to disk, and later the tasklet can be reanimated. So using a multithreading model in the engine's scripting does not remove any of the easy save/load benefits of running in a virtual machine.
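Standard CPython cannot pickle a live generator, so the sketch below fakes hibernation by making the actor's resumable state an explicit step index. The point of Stackless is precisely that the live tasklet pickles as-is, with no such bookkeeping:

```python
import pickle

class DoorActor:
    """The "execution state" is an explicit step index here; Stackless
    pickles the suspended tasklet itself, so real actor code needs no
    such bookkeeping."""

    STEPS = ["opening", "open", "closing", "closed"]

    def __init__(self):
        self.step = 0

    def tick(self):
        if self.step < len(self.STEPS) - 1:
            self.step += 1

    @property
    def state(self):
        return self.STEPS[self.step]

door = DoorActor()
door.tick()                       # the door is now "open"
frozen = pickle.dumps(door)       # hibernate: a byte string, storable on disk
revived = pickle.loads(frozen)    # later, or on another machine entirely
revived.tick()                    # resumes exactly where it left off
# revived.state == "closing"; the original actor is untouched
```

Sending `frozen` over the network instead of writing it to disk is all that actor migration adds conceptually, which is why the actor itself never notices the move.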
With a bit more care we can also implement actor migration. For example, we can have a distributed system running the game world. This is common in a massively multiplayer online game, where the numbers of players and game objects are enormous; on a smaller scale it could be a peer-to-peer multiplayer game. In these cases it can be useful to migrate an actor from one machine to another. Using Stackless this is possible without the actor having the slightest idea that it was moved. Special care must be taken, however, when the actor uses Python objects implemented in C++.
IE flaw puts Windows XP SP2 at risk
The flaw, which also affects systems running Windows XP, is found in the default installations of Microsoft's IE, according to an advisory released by the security company on Thursday.
"The flaw is not wormable but allows for the remote execution (of code) with some level of end-user intervention," said Mike Puterbaugh, eEye's senior director of product marketing.
The discovery of this IE flaw comes just over a month after Microsoft issued a cumulative patch addressing three vulnerabilities for IE.
The new IE flaw also adds to another vulnerability, discovered last month, that affects systems using Windows XP SP2.
Microsoft's Windows XP with SP2 is designed to make it more difficult for attackers to run malicious software on users' computers.
A Microsoft representative confirmed that the company had received the report from eEye and said it will be investigating the issue. Because the details of the vulnerabilities have not been made public, users are not at risk of an exploit being developed to take advantage of the flaw, the representative said.
eEye has provided Microsoft with details about the flaw, but the security researcher does not release details to the public until a vendor has developed a relevant patch or issued an advisory.
source:http://news.com.com/2102-1002_3-5868867.html?tag=st.util.print
Trigonometry Redefined without Sines And Cosines
source:http://science.slashdot.org/article.pl?sid=05/09/17/1313249&tid=228&tid=14
A Microsoft switch: Vista feature to go cross-platform
As part of the announcement of the next generation look and feel for Windows Vista, Microsoft said that it will make a subset of the new presentation layer available for other platforms.
Windows Presentation Foundation, which provides the rich front end for Vista, will also eventually be available in compact form for other platforms, such as the Apple Macintosh, older versions of Windows, and smart devices such as phones or PDAs.
The portability is possible because the underlying technologies of Expression, including XAML, an XML-based user markup language for page layouts; and JavaScript, a scripting language for developing the controls on these pages, are not closely tied to any platform.
However, the impetus for Microsoft making parts of the rich Vista client available elsewhere is not simply that the technology makes it possible. Rather, the driver is that Microsoft and others are competing to deliver the de facto standard for the next-generation rich Internet client.
The cross-platform version, for now code named Windows Presentation Foundation Everywhere (WPF/E), would only get a subset of Vista richness.
According to Wayne Smith, UK-based Senior Product manager for Expression tools, WPF/E will likely support vector graphics, video, animation, text, and controls. However, 3D and hardware accelerators will probably not be part of the package. "Right now we're just scoping this out," he said, adding that WPF/E is mostly concept at this point.
It won't be part of the Vista release, set for the second half of next year.
When WPF/E becomes available, it will be in the form of an ActiveX control that can be embedded in applications or used as a browser plug-in.
source:http://www.cbronline.com/article_news_print.asp?guid=A88BEF62-83F2-4D8A-90DC-D8293FF51BDF
The Slurpee at 40
[Image: Aging well]
These childhood memories seemed improbably significant the other day when I came across two news items: one informing me that the Slurpee has turned 40; the other that a 7-Eleven has opened in Manhattan, an island I migrated to eight years ago in order to pretend I was never the type of person who drank Slurpees every day. As much as I want to be thrilled, or at least ironically nostalgic, the pop sociologist in me can't help but raise an eyebrow. Here we have yet another instance of adults (in this case New Yorkers) co-opting the territory of teenagers (blended icy beverages) in such a way that the trashy and indulgent becomes hip and respectable. The Frappuccino, the Coolatta: At some point over the last few years, grown-ups developed an ability to order these things and take themselves seriously. Which makes me wonder: Is my beloved Slurpee looking to reinvent itself, Gatsby-like, as a pseudo-sophisticated drinkable dessert? How long until my editor asks me to grab a Slurpee so we can discuss my next piece?
The Slurpee, like so many great innovations and perfectly nice human beings, was an accident. In the late '50s, a Kansas Dairy Queen owner named Omar Knedlik found his soda machine was on the fritz. He tossed some bottles of pop in the freezer and discovered people went into conniptions for the slushy texture that resulted when the soda partially froze. Wheels turned. He invented a machine to slushify water, CO2, and flavored syrup. In 1965, 7-Eleven bought the machines from Knedlik, hired an ad copywriter to coin an irresistible name, and the Slurpee was born. Back then it cost a dime. Four decades on there have been more than 200 flavors, ranging from the earnestly goofball of yesteryear (Blue Blunder Berry) to the quasi-classy of today (Mochaccino). Michael Jackson reportedly plunked down $75.62 to install a Slurpee machine at Neverland Ranch. Eleven million Slurpees are sold each month and hit the eager palate at a cryogenic 28 degrees. In total some 6 billion brains have been frozen since the dawn of the Slurpee. Here in the United States the drink is most beloved in Detroit, but, curiously, it's up in the Winnipeg tundra where the Slurpee is most popular—further evidence, at least to this patriotic-when-convenient mind, that Canadians really just want to be Americans.
But enough with the logistics. Explaining the appeal of the Slurpee is a bit like explaining the appeal of pure oxygen or terrific sex: Those who don't get it are simply not to be trusted. Slurpees are divine because of their unapologetic garishness, a giddy reminder that no amount of sugar is ever too much. That the expression "brainfreeze"—meaning the needling headache brought on by drinking something too cold too quickly—was trademarked in 1994 says it all: The point is masochistic, to find pleasure in pain, to embrace evil over good. (Sometimes this is taken too literally: Near my Maryland home, a teen was recently convicted of murdering another teen for trying to buy a girl a Slurpee.) My point here is to say that it's not (too) hyperbolic to equate drinking a Slurpee with surrendering to the greed and gluttony that is being a chronically shortsighted, diabolically unthinking American. In this, the Slurpee serves as a precursor to everything else 7-Eleven is about: namely, smoking cigarettes and drinking too much beer. (The franchise is the nation's No. 1 retailer of Budweiser.)
Or, wait, scratch that: The Slurpee represents everything 7-Eleven was about.
Celebrations of any sort—even those for drinkable sugar—are always somewhat preposterous, and in the strain to muster up excitement, something darker is often exposed: that whatever we're celebrating no longer exists in the form we're busy praising. 7-Eleven may be purporting to rejoice over the Slurpee, its neon-bright mascot, but in truth the Dallas-based franchise has spent the last year trying to distance itself from its identity as a haven for loitering teens looking to ignore their parents and husbands looking to pick up some beer and smokes before heading home to ignore their wives. A great deal of money was spent on an ad campaign lauding 7-Eleven's new line of designer food—turkey and zesty havarti on wheat-nut bread, a blue-corn wrap with turkey and tomatillo sauce, even a "chili-lime" hot dog to compete with the classic Big Bite. In a bygone era the glory of 7-Eleven was simple: Buy the food when you're 16 and it'll still be edible when you're in a nursing home 70 years later. Now they're proud to tell you that the sandwiches are made "fresh" daily. In other words, 7-Eleven is singing the praises of the Slurpee at the very moment when they're aggressively reaching out to an un-Slurpee demographic: self-consciously refined, ambitiously healthy yuppies.
Which brings us to 7-Eleven's glistening new Manhattan outpost. Apparently the location is doing well, having been dutifully covered in the New York Times and worshipped by burnished, carb-counting types looking to dupe themselves into thinking they're not burnished, carb-counting types. Slacker-hating sophisticates can now pretend to be slackers, projecting a false sense of value onto the very suburban childhoods that felt so valueless at the time. What's glimpsed here is a small piece of a much larger and much stranger social machinery: With misguided nostalgia comes a tendency to fetishize the mundane because the truth is either too earnest (I miss being young!) or just plain sad (When did I become this person?). As a result, people no longer simply wander inside and drink a Slurpee, but wander inside and "drink a Slurpee." I'd be concerned about this, worried that the point of the Slurpee will be missed, except years of experience have taught me that after three furious sips, the overly self-aware brain will be frozen, all meta-oriented cells will be annihilated, and, for a few painful seconds, we will all be bumbling freshman again. Truly.
source: http://www.slate.com/id/2126309/
Enthusiast uses Google to reveal Roman ruins
His eye was caught by unusual 'rectangular shadows' nearby. Curious, he analysed the image further, and concluded that the lines must represent a buried structure of human origin. Eventually, he traced out what looked like the inner courtyards of a villa.
Mori, who describes the finding on his blog, Quellí Della Bassa, contacted archaeologists, including experts at the National Archaeological Museum of Parma. They confirmed the find. At first it was thought to be a Bronze Age village, but an inspection of the site turned up ceramic pieces that indicated it was a Roman villa.
"Mori's research is interesting in its approach," says Manuela Catarsi Dall'Aglio, an archaeologist at the National Archaeological Museum of Parma. She says the find may be similar to a villa the museum is currently excavating at Cannetolo di Fontanellato, which was found during the construction of a high-speed rail network. "Only a scientific, archaeological dig will tell," she adds.
The local authorities will have to approve any archaeological digs before they can take place.
Story from news@nature.com:
http://news.nature.com//news/2005/050912/050912-6.html
Troubling Exits At Microsoft
Once the dream workplace of tech's highest achievers, it is suffering key defections to Google and elsewhere. What's behind the losses?
Things didn't turn out that way. In July, Lee bolted from Microsoft for Web search king Google Inc. (GOOG ), and once again his personal journey is emblematic of a shift in computing's balance of power. These days it's Google, not Microsoft, that seems to have the most momentum. Microsoft sued to stop Lee from working for the upstart, citing his noncompete agreement. But on Sept. 13 a state judge in Seattle ruled that Lee could work for Google, with some restrictions, pending a January trial. Microsoft said it was happy the judge limited the type of work Lee could do. Yet when court adjourned, Lee smiled broadly and threw both arms in the air. "I feel great," he said outside the courtroom. "I can't wait to start work tomorrow morning."
Contrast that with how Lee felt about Microsoft. During the two-day hearing he painted a distinctly unflattering picture of the company's inner workings. Lee, who opened Microsoft's research lab in China in 1998 and moved to headquarters in Redmond, Wash., two years later, fretted over what he saw as repeated missteps. In court he detailed how the more than 20 product-development centers in China tripped over one another, duplicating efforts and even fighting over the same job candidate. Lee called the company "incompetent." After the ruling he praised Google, noting, "the culture is very supportive, collaborative, innovative, and Internet-like -- and that's bottoms-up innovation rather than top-down direction."
For most of its three decades, Microsoft has faced intense criticism. But in the past it came from the outside world. Rivals complained about its heavy-handed tactics. PC makers griped that it was hogging the industry's profits.
Now much of the sharpest criticism comes from within. Dozens of current and former employees are criticizing -- in BusinessWeek interviews, court testimony, and personal blogs -- the way the company operates internally. This spring two researchers sent Chairman William H. Gates III a memo in which they wrote: "Everyone sees a crisis is imminent" and suggested "Ten Crazy Ideas to Shake Up Microsoft." Many workers, like Lee, are in effect saying: "I'm outta here." More than 100 former Microsofties now work for Google, and dozens of others have scattered elsewhere.
It's not a mass exodus. Microsoft has 60,000 employees, and many of them are undoubtedly happy with their jobs and the company's culture. While Microsoft's annual attrition rate rose one percentage point from fiscal 2003 to 2004, it's still just 9%, a bit lower than the industry average. Microsoft says it receives 45,000 to 60,000 job applications a month, and over 90% of the people offered jobs accept.
TOO BIG TO MOVE FAST?
Still, there's no doubt that Microsoft is losing some of its most creative managers, marketers, and software developers. Lenn Pryor, director of platform evangelism, left for Internet phone startup Skype Technologies, now being acquired by eBay (EBAY ). Stephen Walli, who worked in the unit set up to parry the open-source threat, split for an open-source consulting firm. A long list of talent has moved to Google, a trip made easier by the company's recent establishment of an office in nearby Kirkland, Wash. Joe Beda and Gary Burd, respected engineers, left and helped set up the instant messaging service Google Talk. Mark Lucovsky, who had been named one of Microsoft's 16 Distinguished Engineers, defected to Google last November. He blogged that Microsoft's size is getting in its way. "I am not sure I believe anymore that Microsoft knows how to ship software," he wrote.
Employees' complaints are rooted in a number of factors. They resent cuts in compensation and benefits as profits soar. They're disappointed with the stock price, which has barely budged for three years, rendering many of their stock options out of the money. They're frustrated with what they see as swelling bureaucracy, including the many procedures and meetings Chief Executive Steven A. Ballmer has put in place to motivate them. And they're feeling trapped in an organization whose past successes seem to stifle current creativity. "There's a distinct lack of passion," says one engineer, who would talk only on condition of anonymity. "We're missing some spunk."
No question, most companies would kill to have Microsoft's problems. It's comfortably the most profitable player in the tech industry. And it's making more money than ever, with net income of $12.3 billion on revenues of $39.8 billion for the past fiscal year. Its twin monopolies, the Windows PC operating system and the Office suite of desktop applications, give it important advantages when it thrusts into adjacent markets, such as server software for corporations and instant messaging for both businesses and consumers.
Ballmer maintains that the company is in terrific shape. In an interview in a Las Vegas hotel, he says one of Microsoft's strengths has always been its culture of self-criticism. What's different now, he says, is that the internal debate is spilling out into public view because of blogs and e-mail. He says internal surveys show that 85% of the company's employees are satisfied with their jobs, about the same level as in past years. "We have as excited and engaged a team of folks at Microsoft as I can possibly imagine," says Ballmer. "[Employees] love their work. They're passionate about the impact they're having on customers and society. [The 85% number] is a real, real powerful statement about where our people are."
Indeed, there are areas of excitement within Microsoft. One is MSN, the Internet operation, where the search group is the underdog competing against Google. Another is the Xbox group, which is racing full speed against Sony Corp.'s (SNE ) leading PlayStation 2 to win over the next generation of video gamers. It's launching Xbox 360 this Christmas, months ahead of PlayStation 3. "If you take a look at where we're going with innovation, what we have in the pipeline, I'm very excited. The output of our innovation is great," says Ballmer. "We won the desktop. We won the server. We will win the Web. We will move fast, we will get there. We will win the Web."
The company plans to release a series of major upgrades for most of its core products in the coming 18 months. That'll culminate late next year in the long-awaited update of Windows, called Vista. Analysts such as Drew Brosseau of SG Cowen & Co. expect it to help financial results -- he's predicting that revenues will rise 12% during the next fiscal year, to $44.5 billion, as profits increase 12%, to $13.8 billion. He thinks the stock, now at $27, will follow. "It can be a mid-30s stock in 12 months," says Brosseau.
Still, Microsoft faces serious long-term challenges: the rising popularity of the Linux open-source operating system, a plague of viruses attacking its software, and potent rivals such as Google in the consumer realm and IBM (IBM ) in corporate computing. It's the company's ability to respond to these challenges that current and former employees fear is being compromised by Microsoft's internal troubles. They're concerned that Ballmer and Gates aren't taking seriously enough the issues of morale and culture. "Why in the world did I start [this blog]?" writes Mini-Microsoft, an anonymous employee who writes a blog that has become a gathering place for the company's internal critics and reformers. "I love Microsoft, and I know we have the innate potential to be great again."
To many employees, Vista, the Windows update, exemplifies the company's struggles. When the project was conceived half a decade ago, it was envisioned as a breakthrough: an operating system that would transform the way users store and retrieve information. But the more revolutionary features have been dropped, and Vista will arrive three years after researcher Gartner Inc. originally predicted that it would ship. Worse yet, they say, nobody has been held accountable. "People look around and say: 'What are those clowns doing?"' says Adam Barr, a program manager in the Windows group.
In the past, when Microsoft faced an emerging threat, Gates could be depended on to lead it in a new direction. Most famously, in 1995 he belatedly recognized the importance of the Internet and led a furious charge to catch up. But in 2000 Gates passed on the chief executive job to Ballmer. When Ballmer took over, he was determined to overcome the looming challenge of corporate middle age. He pored over how-to management books such as Jim Collins' Good to Great. But since Ballmer took the helm, Microsoft has slipped the other way. The stock price has dropped over 40% during his tenure, and revenue, which grew at an average annual clip of 36% through the 1990s, rose just 8% in the fiscal year that ended on June 30. That's good for a company of Microsoft's size, but it is the first time the software giant has had single-digit growth.
The company's performance even has some anonymous writers on the Mini-Microsoft Web site calling on Gates to ask Ballmer to step down. That's very, very unlikely. Gates urged Ballmer to become chief executive nearly six years ago in the wake of the company's antitrust battles with the Justice Dept., when the top job became too overwhelming for him. The two have been close friends since their days at Harvard University, and together they hold 12% of the company's shares. And board members say they stand firmly behind Ballmer. "I am fully supportive of the transformation that Steve is leading the company through," James I. Cash Sr., a director and former professor at Harvard Business School, wrote in an e-mail to BusinessWeek. "He is one of the best leaders I've observed over the last four years I've been on this Board, and the Board stands in full support of him and his efforts."
Ballmer says he should be judged on his overall performance. "At the end of the day the proof is in the output. If you look at any of the critical dimensions, our company has performed well, and I'm bullish about how we will drive to continue."
While Microsoft's internal reformers don't directly criticize Gates, they're frustrated with the sluggish pace of product development. As the company's chief software architect, Gates bears that responsibility. He's the author of a strategy called "integrated innovation." The idea is to get Microsoft's vast product groups to work closely together to take advantage of the Windows and Office monopolies and bolster them at the same time. But with so much more effort placed on cross-group collaboration, workers spend an immense amount of time in meetings making sure products are in sync. It "translates to more dependencies among shipping products, less control of one's product destiny, and longer ship cycles," writes Dare Obasanjo, a program manager in Microsoft's MSN division, on his blog.
To shake Microsoft out of its malaise, radical surgery may be in order. "I think they should break up the company," says Raj Reddy, a professor of computer science and robotics at Carnegie Mellon University. Reddy is no passive industry observer: For the past 15 years he has served on Microsoft's Technical Advisory Board, a group of academics who help guide the company's research efforts. Reddy believes that a handful of Microsoft spin-offs, seeded with some of the company's $37.8 billion in cash, could compete more nimbly in the marketplace. Some insiders agree. Microsoft's Barr recently blogged that the company should be broken up after Gates and Ballmer retire.
There are plenty of bold thoughts floating around Microsoft. The two researchers who sent the "Ten Crazy Ideas" memo to Gates are Kentaro Toyama and Sean Blagsvedt. The 12-page document, reviewed by BusinessWeek, suggests giving product groups increased autonomy and calls for the creation of "bureaucracy police" with the authority to slash through red tape. "It's said that large organizations won't change their ways until a crisis really hits," the authors write. "Everyone sees a crisis is imminent. Incremental changes aren't enough. Are we the kind of company that can dodge the crisis before it happens?"
MAINTAINING, NOT INNOVATING
It's a question that echoes through the corridors in Redmond. To succeed, Microsoft needs motivated workers camping out in their offices at all hours to compete with tenacious rivals such as Google, Yahoo! (YHOO ), Salesforce.com (CRM ), and a reborn Apple (AAPL ) Computer. Yet current and former employees say there are many demoralized workers who are content to punch the clock and zoom out of the parking lot. "At this point there's a traffic jam at 9 o'clock in the morning and 5 o'clock at night," says ex-employee Walli.
Over its three decades of life, Microsoft has become an icon of American capitalism, a company that started with the intellectual firepower and relentless drive of Gates and his high school buddy, Paul Allen. It made billionaires out of its founders and multimillionaires out of thousands of its staff. And it created two of the most lucrative monopolies in American history -- one of them, Windows, so powerful that it ultimately brought trustbusters down on the company.
Now, strange as it seems, those monopolies are at the root of the company's malaise. As Microsoft fought the federal government and litigious rivals, it developed an almost reflexive instinct to protect Windows and Office, sometimes at the expense of looking for groundbreaking innovations. "Every time Bill and Steve made a change to be more like other big companies, we lost a little bit of what made Microsoft special," says a former Microsoft vice-president.
One reason some employees say Microsoft isn't innovating enough: It's too busy upgrading Windows. With some of its key breakthrough features gone, Vista's improvements include better handling of peripheral devices, such as printers and scanners, and cutting in half the time it takes to start up. Those are needed improvements, and there's no doubt that hundreds of millions of copies will be sold as people upgrade to new PCs. But the changes are hardly the stuff of cutting-edge software engineering. "So much of what Microsoft is doing right now is maintenance," says Mike Smith, a former software architect at Microsoft who left the company in 2003 to work for a Bay Area startup.
And that leads to an even more worrisome problem: discontent among its software programmers. Instead of coming up with the next great technology, Microsoft programmers have to cater to its monopolies. But top-flight engineers want to tackle the next great challenge. "They want to create new worlds, not defend old ones," says a former senior executive at Microsoft. "They want to storm the Bastille, not live in Versailles."
If Microsoft loses too many top developers, it will be hard-pressed to succeed in the new markets on which it has pinned so much hope. Google, for example, embarrassed Microsoft in October, 2004, by coming out with software that lets users quickly search the files on their Windows desktop before Microsoft released its own version.
Adding to employee frustration is the company's bureaucracy. After Ballmer became CEO, he put in place processes he hoped would help manage a bigger organization better. But instead of liberating employees to do great work, Ballmer's moves have been stifling, some workers say. With so much effort placed on cross-group collaboration, employees spend more time in meetings making sure product strategies are in sync. The company schedules executive product reviews several times a year, and preparing for them is hugely time-consuming. That prep work cuts into the more interesting work creating new technologies and products.
SWEATING THE SMALL STUFF
To Ballmer's chagrin, some of his up-and-coming programmers have left for Google. He was apoplectic about Lucovsky's departure, according to documents made public during the Lee trial. Lucovsky said in a sworn statement that after he told Ballmer about his plans to move to Google, the beefy CEO threw a chair and cursed Google's chief executive. "F__ing Eric Schmidt is a f__ing pussy. I'm going to f__ing bury that guy.... I'm going to f__ing kill Google," Ballmer said, according to Lucovsky. In a statement, Ballmer calls Lucovsky's account "a gross exaggeration of what actually took place."
Some workers express frustration that Microsoft is so busy protecting its PC-based businesses that it comes up short when competing on the Web. Take the customer relationship management (CRM) market -- software that companies use to track sales and customer service activities. Microsoft targeted it 2 1/2 years ago with a traditional software package, Microsoft CRM. Today roughly 4,000 companies run the software for nearly 100,000 staff. Not bad, but Microsoft hasn't been nearly as successful as Salesforce.com Inc., a trailblazer of Web-based CRM software, with 308,000 users at 17,000 companies.
The secret to Salesforce.com's success: the speed with which it can update its software. Microsoft last updated its original CRM software in January, 2004, with plans for a new version in the first quarter of 2006. Meanwhile, Salesforce constantly fixes bugs and adds features without interruption to the customer or added expense. All customers need to do is open a Web browser to run the program. Microsoft CRM boss Brad Wilson argues that business software is complex and best sold as a package that customers run on their own computers. "This is really about business process where you've got multiple steps," Wilson says. "It is a much more extensive thing that often requires a lot of people, a lot of time, and a lot of resources."
While upstarts like Google and Salesforce have Microsoft on the defensive, the biggest threat to the company may be its own moves. With revenue growth slowing, Ballmer has tried to squeeze more down to the bottom line to make the company more appealing to investors. In the past fiscal year he slashed $2.6 billion out of operating expenses. But that came at a price. Microsoft sliced health benefits, introducing, for example, a $40 copayment on some brand-name prescription drugs. Within a week of announcing the benefits proposal in May, 2004, human resources received 700 e-mails. Of those, 80% were negative, and fewer than 1% were positive, according to an internal e-mail obtained by BusinessWeek. One employee wrote in an e-mail: "Small things like this chip away at employee loyalty and morale and in the long run do more harm than benefit."
Even the cuts that seem trivial have dampened morale. Just whisper the word "towels" to any Microsoft employee, and eyes roll. Last year, Microsoft stopped providing a towel service for workers who used company locker rooms after bike rides or workouts. Employees who helped the company build its huge cash stockpile were furious.
And don't even mention stock options. Employees long counted on them to bolster their salaries. Microsoft minted thousands of employee millionaires as the stock climbed 61,000% from its 1986 public offering to its peak in 2001. Now shares are trading exactly where they were seven years ago. Microsoft has doubled its payroll in that time, adding more than 30,000 new employees, not including attrition. That means more than half of Microsoft's employees have received virtually no benefit from their stock holdings. Instead, they're working for a paycheck and not much else.
And even if the stock does begin to climb, employees won't hit the kind of jackpot their predecessors did. Two years ago Microsoft stopped issuing big dollops of stock options, retreating to more modest helpings of stock grants. The idea was to help retain workers by giving them a sure thing -- stock with some value, since so many options were underwater. Meanwhile, 90% of the tech industry still rewards employees with stock options.
RECRUITING SLACK
Microsoft's compensation moves have created a haves-vs.-have-nots culture. Newbies work for comfortable but not overly generous wages, while veterans have a lucrative treasure chest of stock options. Now a new pay scheme, scheduled to go into effect this fall, threatens to make the gulf even wider. If they meet incentive goals, the 120 or so vice-presidents will receive an eye-popping $1 million in salary a year, and general managers, the next level down, will get $350,000 to $550,000, according to a high-ranking source. But the rest of the staff is paid at market rates.
The pay disparity is exacerbated by Microsoft's rating system. The company uses a bell curve to rate employees in each group, so the number of top performers is balanced by the same number of underachievers. But Microsoft has a long history of hiring top-notch computer science grads and high-quality talent from the industry. Under the rating system, if a group works hard together to release a product, someone in the group has to get a low score for every high score a manager dishes out. "It creates competition in the ranks, when people really want community," says a former Microsoft vice-president. A company spokesman says managers don't have to apply the curve with smaller groups, where it's not statistically relevant.
Even on college campuses, long a fountain of talent for Microsoft, the tide seems to be turning. On Sept. 7, Massachusetts Institute of Technology's Science & Engineering Business Club held its annual recruiting barbecue, with about a dozen companies setting up booths to recruit as many as 1,500 students. "There was a lot of buzz around the Google table and not a lot around the Microsoft table," says Bob Richard, associate director of employer relations at MIT. How much? When Richard walked through, he says, students were lined up six deep to talk to Google recruiters, while only two students stood at the Microsoft table. Carnegie Mellon's Reddy says his top students opt for Google and Yahoo ahead of Microsoft these days. Microsoft points out that in a survey conducted by market researcher Universum, the company ranks No. 1 among computer science students as employer of choice.
Microsoft is hardly the first company to struggle as it moves from adolescence to maturity. And it could learn some lessons from others that have made the transition more gracefully. Take General Electric Co. (GE ) The conglomerate has long boasted an entrepreneurial culture, with hundreds of managers running fiercely independent businesses. Those leaders are given free rein yet are held accountable for their own results -- meaning they can get the boot if they don't perform. "The process is transparent and rigorous and constantly reinforced," says Noel M. Tichy, a University of Michigan professor and leadership guru.
Ballmer says Microsoft is finding its way through the challenges of being a more mature company just fine and that the complaints of some employees simply reflect the kind of company Microsoft is. "We have for ourselves incredibly high expectations," he says. "And that's in some senses is the greatest blessing and opportunity anybody can ever have. Our people, our shareholders, me, Bill Gates, we expect to change the world in every way, to succeed wildly in everything we touch, to have the broadest impact of any company in the world. It's great they're saying: 'Come on, we can still do better.' Great." Ballmer smacks his meaty hands together for emphasis. "We need those high expectations."
Microsoft certainly is chock-full of smart employees who want to do better. Still, many of them say that jumping through bureaucratic hoops and struggling to link products together is preventing them from being the best they can be. There's a plea for action to Gates and Ballmer to do more -- slash the bureaucracy, tend to morale, and make it easier to innovate. But is anyone listening?
source:http://www.businessweek.com/print/magazine/content/05_39/b3952001.htm?chan=gl
Stolen UC-Berkeley laptop recovered
SAN FRANCISCO, California (Reuters) -- A stolen laptop computer holding personal information of more than 98,000 California university students and applicants has been recovered, but it is uncertain whether the information had been tapped, the University of California, Berkeley said Thursday.
The laptop, which stored names and Social Security numbers, disappeared in March from a restricted area of the university's graduate division offices, forcing the university to alert more than 98,000 students and applicants of the theft.
The university said in a statement that a San Francisco man has been arrested and charged by the Alameda County district attorney with possession of stolen property after investigators discovered the laptop had been bought over the Internet by a man in South Carolina.
"UC police note that while a lab analysis could not determine whether the sensitive campus data was ever accessed, nothing in their investigation points to identity theft nor individuals involved in identity theft. It appears ... that the intent was simply to steal and sell a laptop computer," the university said in its statement.
Forensic tests showed files on the laptop had been erased and written over with a new operating system installation, leaving only residual data and making it virtually impossible to determine whether password-protected files had been breached, the university said.
"The San Francisco man who was arrested told police it is his practice to install a new operating system or erase and wipe clean old data from a computer before posting it for sale online," the university said.
Copyright 2005 Reuters. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
source:http://www.cnn.com/2005/TECH/ptech/09/15/berkeley.id.theft.reut/index.html
Global warming 'past the point of no return'
A record loss of sea ice in the Arctic this summer has convinced scientists that the northern hemisphere may have crossed a critical threshold beyond which the climate may never recover. Scientists fear that the Arctic has now entered an irreversible phase of warming which will accelerate the loss of the polar sea ice that has helped to keep the climate stable for thousands of years.
They believe global warming is melting Arctic ice so rapidly that the region is beginning to absorb more heat from the sun, causing the ice to melt still further and so reinforcing a vicious cycle of melting and heating.
source:http://news.independent.co.uk/world/science_technology/article312997.ece
Missing lab mice infected with plague
The research lab, located at the University of Medicine and Dentistry of New Jersey, is operated by the Public Health Research Institute, a center for infectious disease research.
The mice reportedly were infected with the bacterium Yersinia pestis that causes bubonic and other forms of plague.
Scientists, however, said with modern antibiotics, plague can be treated if quickly diagnosed and is not the scourge that wiped out a third of Europe during the 14th century.
Richard Ebright, a Rutgers University microbiologist and a critic of the government's rapid expansion of bio-terrorism labs, said federal guidelines call for only minimal security at such laboratories -- a lock on the lab door and a lock on the sample container and cage.
"You have more security at a McDonald's than at some of these facilities," Ebright told the newspaper.
source:http://www.physorg.com/news6533.html
Analysis: Why Apple picked Intel over AMD
Steve Jobs sent a seismic shocker across the tech landscape in June when he announced Apple would phase out PowerPC chips and put Intel processors inside Macs starting in 2006. To some, the move seemed puzzling: Why would Jobs, the king of cool design, make a deal with half of the empire that conquered the world with cookie-cutter beige boxes? Jobs had an answer at the ready during his Worldwide Developers Conference keynote—a switch to Intel chips means better Mac hardware down the line. And analysts agree that the move ensures Apple’s ability to craft unique designs.
But one aspect of the “Why switch processor suppliers?” question hasn’t been answered. Intel isn’t the only X86 chipmaker in town. Why didn’t Jobs, ever the maverick, opt for the scrappy challenger, Advanced Micro Devices, instead of the old-money establishment, Intel?
The reason, industry analysts say, is that Jobs has a clear goal in mind: innovative designs. And such designs require the lowest-voltage chips, which IBM and Freescale were not going to make with the PowerPC chip core—and which AMD has not yet perfected.
“This is a practical, pragmatic Steve Jobs decision,” says Shane Rau, program manager for PC semiconductors at market research firm IDC. Intel serves up the most complete line of low-power chips for mobile and small-form-factor computers, and a good-looking roadmap for its future. Also, Intel’s mammoth production capacity erases any supply worries.
Intel’s inside advantages
Mac users have come to see that Apple had good reasons for kissing PowerPC goodbye. The company knows trends when it sees them: mobile computing has moved past being a mere fad among a few users to become a way of life for many consumers. Yet PowerPC chips aren’t traveling down this road. Apple also needs faster chips, with more room to grow, and a chip partner with a clear roadmap for the future. Otherwise Wintel PCs could run too many miles ahead of Macs in the performance race.
Still, that doesn’t explain how AMD lost out to Intel. AMD has made a name for itself with super-fast machines, especially popular with gamers and bargain hunters, who value the couple hundred dollars you can often save by buying AMD-based PCs instead of Intel-powered ones. Jobs may have liked AMD’s hard-charging rep—but it’s possible he saw some problems he couldn’t ignore.
“One of the biggest considerations for Apple was getting a roadmap in all possible markets where they may play,” says IDC’s Rau, “and if you look at AMD’s product line, there are some holes.” Most notably, AMD hasn’t invested in creating a line of low voltage and ultra-low voltage processors that competes with what Intel offers.
AMD would need to develop a chip core especially suited to low-power, as Intel did with the Pentium M, a costly undertaking. Plus, the overall sales opportunity for such chips isn’t huge yet, says Nathan Brookwood, principal analyst for Insight 64. Because AMD’s research and development budget pales next to Intel’s, AMD has to pick its battles with Intel carefully—whereas Intel makes chips for almost every market niche. “Intel can afford to dedicate the resources,” Brookwood says.
By choosing Intel, Apple gets access to the highly-anticipated chip code-named Yonah, a low-power chip with a dual core processor, which aims to band together the power of two regular chips. Aimed at notebooks, Yonah should arrive in PCs in the first quarter of 2006; in keeping with its tradition of remaining tight-lipped about future products, Apple has not commented on when Yonah might show up in its mobile line.
“Yonah could have been the tipping point for Apple,” says Kevin Krewell, editor-in-chief of the Microprocessor Report. Yonah can power Apple notebooks that fly past today’s models.
AMD does not have a direct Yonah competitor that would be available in the same timeframe that Intel is discussing. Is AMD working on a Yonah-like competitor? AMD won’t discuss timeframe or specifics, but the company is currently developing a low-power, dual-core chip for thin and light notebooks, company spokesman Damon Muzny says.
Intel also employs a huge cadre of programmers, a resource that could be important to Apple as software gets rewritten for the x86 architecture, says Microprocessor Report’s Krewell. AMD’s programmer ranks don’t compare in size.
Future AMD opportunity
Interestingly, performance really isn’t the driving force behind Apple’s Intel vs. AMD decision. While the chip rivals have battled on performance for years, the machines now go toe-to-toe on everyday productivity applications. For most consumers on the PC side, the buying decision is much more about the PC maker than the chip supplier. (That said, on some measures, AMD shines. Gamers, for example, who want the absolute fastest speed on traditional apps know that AMD’s single-core Athlon 64 XP FX chips offer an edge over Intel’s best right now.) As more multi-threaded apps designed to better take advantage of dual core CPUs arrive, Intel and AMD will keep battling.
Dual-core chips, which both AMD and Intel are emphasizing, marry two CPUs together for horsepower, but can share certain parts like caches and buses. Unfortunately, the dual-core chips are currently throwing a lot of heat, so both CPUs cannot operate at their maximum clock speeds.
Intel will tackle this problem in the second half of 2006, revising its product line with a new generation of lower-power dual-core chips code-named “Merom” for mobile, “Conroe” for desktops, and “Woodcrest” for servers. Intel will emphasize low power consumption and performance, but not megahertz, Brookwood says. (AMD has emphasized performance, not megahertz ratings, for years.)
“Intel seems to have kicked the megahertz habit,” says Insight 64’s Brookwood. “It’s probably music to Steve Jobs’ ears,” he adds, noting how Jobs had to explain PowerPC chip performance on applications, not raw megahertz ratings.
Might Apple turn to AMD for future processor needs, post-transition to the x86 architecture? An AMD low-power chip line would be required for Apple to consider a switch, Brookwood says. But Intel will have a production capacity edge for at least a couple of years, an important factor, so a switch seems unlikely before then, Krewell says.
source:http://www.macworld.com/news/2005/09/15/intelvsamd/index.php
Revolution Controller Finally Revealed
Nintendo always emphasized they weren't following Sony and Microsoft, and boy, they weren't kidding. Nintendo decided Tokyo Game Show (an event the company typically forgoes in favor of Space World) was the right time to sit down with select members of the press and unveil their vision of gaming's future. And guess what? We were there.
We've seen the Revolution, touched and played with its radically different take on the game controller, talked with visionary designer Shigeru Miyamoto about the reasoning behind Nintendo's new approach and we're back with our lengthy, hands-on impressions.
Has Nintendo struck gold again? Read on to find out, and then watch the video of it in action that's available on our download page.
The Revolution Controller Basics: What The Hell Is It?
The controller for Nintendo's upcoming Revolution home console system is a cordless remote-control-like device designed to be used with only one hand. Two small sensors placed near the TV and a chip inside the controller track its position and orientation, allowing the player to manipulate the action on screen by physically moving the controller itself. For example, you could slash an in-game sword by actually swinging the controller from side to side, turn a race car just by twisting your wrist, or aim your gun in a shooter by pointing the controller where you want to fire.
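As an aside for readers of this blog: the pointer-style aiming described above is easy to prototype in Python, the scripting language Sylphis3D uses. The sketch below is purely illustrative -- Nintendo has published nothing about the sensor protocol, so the yaw/pitch inputs, the 45-degree field of view, and the function name are all assumptions, not anything Nintendo confirmed.

```python
import math

def aim_point(yaw_deg, pitch_deg, screen_w, screen_h, fov_deg=45.0):
    """Map hypothetical controller pointing angles to a screen pixel.

    Assumes the sensors report yaw/pitch relative to the screen
    center and that a symmetric field of view spans the screen.
    """
    half_fov = math.radians(fov_deg) / 2.0
    # Normalize each angle to [-1, 1] across the field of view.
    nx = math.radians(yaw_deg) / half_fov
    ny = math.radians(pitch_deg) / half_fov
    # Clamp so wild swings of the wrist stay on screen.
    nx = max(-1.0, min(1.0, nx))
    ny = max(-1.0, min(1.0, ny))
    # Scale to pixel coordinates; pitching up moves the aim up,
    # so the y axis is flipped (screen y grows downward).
    x = (nx + 1.0) / 2.0 * (screen_w - 1)
    y = (1.0 - (ny + 1.0) / 2.0) * (screen_h - 1)
    return round(x), round(y)
```

A game script would call this every frame and hand the resulting pixel to the crosshair sprite -- pointing dead ahead lands in the screen center, and twisting the wrist half the field of view to the right pins the aim to the right edge.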
A large "A" button sits in the prime spot under your thumb on the face of the controller, with a "B" trigger on the back of the unit for your index finger. Otherwise the button configuration is an interesting mix of old and new: standard D-pad up top, near the power button (to turn the Revolution console on and off), Start and Select in the middle, on either side of the intriguing "Home" button (Nintendo wouldn't go into detail, but it sounds like it has to do with navigating system menus, which will be important given the Revolution's promised WiFi connectivity), and two more buttons near the bottom labeled "a" and "b." These last two may seem uncomfortably low for your thumb until you turn the controller 90 degrees and it becomes just like an old 8-bit NES joypad, with the D-pad under your left thumb and "a" and "b" under your right. (Don't forget -- Nintendo has promised downloadable versions of their classic games for the Revolution's "Virtual Console.") Nintendo mentioned the button names and their exact sizes could still change slightly before production, but what you see here is close to the final design.
Elsewhere on the controller, the four lights at the very bottom indicate which player it belongs to, and the hatch on the back is the battery compartment. (The prototype Revolution controllers we saw used regular batteries, just like the GameCube's WaveBird wireless controller, and last a similar amount of time, according to Nintendo, but reps wouldn't say for certain whether the final unit would use batteries or some rechargeable option.) The effective maximum range for the wireless controller is expected to be somewhere between 10 and 15 feet. A variety of different colored controllers were on display, including red, lime green, white, gray, black, and silver. Finally, rumble functionality is built into the controller.
The Revolution Controller Demos: How It Works
Alright, so enough about sticks and buttons and lights: how does this crazy new controller actually work with games? To answer that question, Nintendo's legendary game creator Shigeru Miyamoto (creator of Mario, Zelda, Donkey Kong, Pikmin, you name it) walked specially selected members of the press through a series of hands-on technology demos. These were not real Revolution games (all the names for the demos are ours); they were super-simple, graphically crude offerings designed solely to show off different aspects of how the controller can work. Here's a rundown of what we saw, along with our thoughts on each:
A firing-range-like contest where two players compete to see who can shoot randomly appearing squares first. Aiming is done by pointing the controller itself at different points on screen, pulling the B trigger to fire.
IMPRESSIONS: A great demonstration of how intuitive the controller can be: pointing it to aim felt perfectly natural, right from the very first second, just like with a light gun. It always shot exactly where it felt like I was aiming, and it was incredibly responsive to even slight wrist movements; I barely had to move my hand at all.
DEMO: GONE FISHIN'
Grab a pole and lower it into a 3D pond full of fish. Keep the line steady and when you feel a nibble from the rumble of the controller, pull it up quick!
IMPRESSIONS: An interesting showcase of the controller's 3D movement detection: you position the fishing pole above the pond by moving the controller forward or back, left or right in actual space, then lower the hook by lowering the controller. It was a bit difficult to keep the line steady in the water, but flipping the controller up when you got a bite, mimicking the motion of pulling up a fish in real life, was a little thrill that just felt right.
DEMO: IRRITATING STICKS
Two players guide rotating sticks through a side-scrolling maze of tunnels and moving obstacles, gathering coins and avoiding touching the walls. (A lot like the PS1 game Irritating Stick, and exactly like the import-only GBA game Kuru Kuru Kururin.)
IMPRESSIONS: Another demo that needed no explanation; you just "got" it immediately: move the controller in whatever direction you want the stick to go. As a 2D game that requires exact movement (the caves get really narrow in parts), this one reinforced how precise and steady the controller's movement detection can be. Another interesting tidbit: if your controller fell outside the detection "box," the demo displayed an arrow pointing off the edge of the screen in that direction so you could get it back into the correct space.
DEMO: AIR HOCKEY
Exactly what it sounds like: Two players each control a flat stick on either side of a rink by moving around their controller, pushing a puck back and forth, trying to keep it out of the goal on their side.
IMPRESSIONS: A bit sloppy and more sluggish than the other demos. This one was supposed to show how you could put "english" on the puck by twisting the controller, but in practice it didn't work as well as the other demos did (and I'm not saying that just because I kept scoring on my own goal... wait... OK, actually it is partially because of that).
DEMO: BASKETBOWL
Two players drag or push a ball to their opponent's basket by making the ground under their controller-maneuvered cursor dip (by holding "B") or rise (by pressing "A").
IMPRESSIONS: This was oddly fun: you could try to move the ball either by making a hill next to it and pushing it along, or by making an indentation for it to fall into and then dragging it across the court. When you got close to the basket, suddenly turning the indentation under the ball into a hill would fling it up into the air.
DEMO: WHERE'S WALDOASAUR
A simple demonstration of depth perception: the player searches for a particular Pokémon on a giant map filled with the creatures (à la Where's Waldo), zooming in by pushing the controller toward the screen and zooming out by pulling it away.
IMPRESSIONS: Nothing much to say here except that, as a Nintendo rep commented, you can see how this might be put to use for aiming a sniper rifle in a first-person shooter.
DEMO: PILOT WANGS
Manipulate a biplane through the air, trying to fly through rings scattered around the Isle Delfino hub world of Super Mario Sunshine.
IMPRESSIONS: This demo was all about the different ways the Revolution can detect tilting of the controller. It was as if the controller were the airplane itself: as long as your movements weren't too sudden, the on-screen action would mimic them with very little lag. After about a minute I was pulling off dramatic dives and loop-de-loops, bull's-eyeing plenty of rings.
Nintendo saved the best for last. This was the first section of the GameCube game Metroid Prime 2: Echoes, retrofitted to be compatible with the Revolution controller and its analog add-on piece (the "Nunchaku" set-up mentioned earlier). As on the Cube, the analog stick controls movement, but instead of holding down a button to look around, you simply point the other controller in the direction you want to aim.
IMPRESSIONS: At first, I was standing up and swinging my hand all around to aim, and my arms got really tired really quickly. But once I sat down and relaxed, resting my hands on my legs as I would with a normal controller, everything clicked. It wasn't perfect yet: the Revolution controller functionality had just been added recently and hadn't been bug-tested or polished, so every so often the view would "spaz out" for a couple of seconds. But it was enough to get me excited. As odd as it may look holding the two separate controller pieces, one in each hand, looking around felt incredibly natural, even more so than my preferred PC-style keyboard-and-mouse setup. I have to wonder about precision and speed in multiplayer games, but for a more deliberate single-player game like Metroid Prime (and the series is already confirmed for an appearance on the Revolution) this setup already has huge potential.
The Revolution Controller Design Philosophy
So why has Nintendo decided to brazenly break with tradition and the conventions of every other modern console in creating the Revolution controller? According to Mr. Miyamoto, it was part of a conscious decision to make something simple and straightforward enough to reach out to a new audience. "We want a system that takes advantage of new technology for something that anyone, regardless of age or gender, can pick up and play. [Something with a] gameplay style that people who have never played games can pick up and not be intimidated by. We wanted a controller that somebody's mother will look at and not be afraid of."
Of course, Nintendo has no intention of leaving their traditional audience behind, and Mr. Miyamoto is quick to add that the controller is already well suited for a number of popular genres. "[We aimed for something] that is simple enough for everyone," he says, "but also something that people who've been playing games for years will be satisfied with."
Source: http://www.1up.com/do/newsStory?cId=3143782