Monday, October 17, 2005

Meet the Life Hackers

In 2000, Gloria Mark was hired as a professor at the University of California at Irvine. Until then, she had been working as a researcher, living a life of comparative peace. She would spend her days in her lab, enjoying the sense of serene focus that comes from immersing yourself for hours at a time in a single project. But when her faculty job began, that all ended. Mark would arrive at her desk in the morning, full of energy and ready to tackle her to-do list - only to suffer an endless stream of interruptions. No sooner had she started one task than a colleague would e-mail her with an urgent request; when she went to work on that, the phone would ring. By the end of the day, she would have been so constantly distracted that she had accomplished only a fraction of what she set out to do. "Madness," she thought. "I'm trying to do 30 things at once."

Lots of people complain that office multitasking drives them nuts. But Mark is a scientist of "human-computer interactions" who studies how high-tech devices affect our behavior, so she was able to do more than complain: she set out to measure precisely how nuts we've all become. Beginning in 2004, she persuaded two West Coast high-tech firms to let her study their cubicle dwellers as they surfed the chaos of modern office life. One of her grad students, Victor Gonzalez, sat looking over the shoulders of various employees all day long, for a total of more than 1,000 hours. He noted how many times the employees were interrupted and how long each employee was able to work on any individual task.

When Mark crunched the data, a picture of 21st-century office work emerged that was, she says, "far worse than I could ever have imagined." Each employee spent only 11 minutes on any given project before being interrupted and whisked off to do something else. What's more, each 11-minute project was itself fragmented into even shorter three-minute tasks, like answering e-mail messages, reading a Web page or working on a spreadsheet. And each time a worker was distracted from a task, it would take, on average, 25 minutes to return to that task. To perform an office job today, it seems, your attention must skip like a stone across water all day long, touching down only periodically.

Yet while interruptions are annoying, Mark's study also revealed their flip side: they are often crucial to office work. Sure, the high-tech workers grumbled and moaned about disruptions, and they all claimed that they preferred to work in long, luxurious stretches. But they grudgingly admitted that many of their daily distractions were essential to their jobs. When someone forwards you an urgent e-mail message, it's often something you really do need to see; if a cellphone call breaks through while you're desperately trying to solve a problem, it might be the call that saves your hide. In the language of computer sociology, our jobs today are "interrupt driven." Distractions are not just a plague on our work - sometimes they are our work. To be cut off from other workers is to be cut off from everything.

For a small cadre of computer engineers and academics, this realization has begun to raise an enticing possibility: perhaps we can find an ideal middle ground. If high-tech work distractions are inevitable, then maybe we can re-engineer them so we receive all of their benefits but few of their downsides. Is there such a thing as a perfect interruption?


Mary Czerwinski first confronted this question while working, oddly enough, in outer space. She is one of the world's leading experts in interruption science, and she was hired in 1989 by Lockheed to help NASA design the information systems for the International Space Station. NASA had a problem: how do you deliver an interruption to a busy astronaut? On the space station, astronauts must attend to dozens of experiments while also monitoring the station's warning systems for potentially fatal mechanical errors. NASA wanted to ensure that its warnings were perfectly tuned to the human attention span: if a warning was too distracting, it could throw off the astronauts and cause them to mess up million-dollar experiments. But if the warnings were too subtle and unobtrusive, they might go unnoticed, which would be even worse. The NASA engineers needed something that would split the difference.

Czerwinski noticed that all the information the astronauts received came to them as plain text and numbers. She began experimenting with different types of interruptions and found that it was the style of delivery that was crucial. Hit an astronaut with a textual interruption, and he was likely to ignore it, because it would simply fade into the text-filled screens he was already staring at. Blast a horn and he would definitely notice it - but at the cost of jangling his nerves. Czerwinski proposed a third way: a visual graphic, like a pentagram whose sides changed color based on the type of problem at hand, a solution different enough from the screens of text to break through the clutter.

The science of interruptions began more than 100 years ago, with the emergence of telegraph operators - the first high-stress, time-sensitive information-technology jobs. Psychologists discovered that if someone spoke to a telegraph operator while he was keying a message, the operator was more likely to make errors; his cognition was scrambled by mentally "switching channels." Later, psychologists determined that whenever workers needed to focus on a job that required the monitoring of data, presentation was all-important. Using this knowledge, cockpits for fighter pilots were meticulously planned so that each dial and meter could be read at a glance.

Still, such issues seemed remote from the lives of everyday workers - even information workers - simply because everyday work did not require parsing screenfuls of information. In the 90's, this began to change, and change quickly. As they became ubiquitous in the workplace, computers, which had until then been little more than glorified word-processors and calculators, began to experience a rapid increase in speed and power. "Multitasking" was born; instead of simply working on one program for hours at a time, a computer user could work on several different ones simultaneously. Corporations seized on this as a way to squeeze more productivity out of each worker, and technology companies like Microsoft obliged them by transforming the computer into a hub for every conceivable office task, and laying on the available information with a trowel. The Internet accelerated this trend even further, since it turned the computer from a sealed box into our primary tool for communication. As a result, office denizens now stare at computer screens of mind-boggling complexity, as they juggle messages, text documents, PowerPoint presentations, spreadsheets and Web browsers all at once. In the modern office we are all fighter pilots.

Information is no longer a scarce resource - attention is. David Rose, a Cambridge, Mass.-based expert on computer interfaces, likes to point out that 20 years ago, an office worker had only two types of communication technology: a phone, which required an instant answer, and postal mail, which took days. "Now we have dozens of possibilities between those poles," Rose says. How fast are you supposed to reply to an e-mail message? Or an instant message? Computer-based interruptions fall into a sort of Heisenbergian uncertainty trap: it is difficult to know whether an e-mail message is worth interrupting your work for unless you open and read it - at which point you have, of course, interrupted yourself. Our software tools were essentially designed to compete with one another for our attention, like needy toddlers.

The upshot is something that Linda Stone, a software executive who has worked for both Apple and Microsoft, calls "continuous partial attention": we are so busy keeping tabs on everything that we never focus on anything. This can actually be a positive feeling, inasmuch as the constant pinging makes us feel needed and desired. The reason many interruptions seem impossible to ignore is that they are about relationships - someone, or something, is calling out to us. It is why we have such complex emotions about the chaos of the modern office, feeling alternately drained by its demands and exhilarated when we successfully surf the flood.

"It makes us feel alive," Stone says. "It's what makes us feel important. We just want to connect, connect, connect. But what happens when you take that to the extreme? You get overconnected." Sanity lies on the path down the center - if only there was some way to find it.


It is this middle path that Czerwinski and her generation of computer scientists are now trying to divine. When I first met her in the corridors of Microsoft, she struck me as a strange person to be studying the art of focusing, because she seemed almost attention-deficit disordered herself: a 44-year-old with a pageboy haircut and the electric body language of a teenager. "I'm such a spaz," she said, as we went bounding down the hallways to the cafeteria for a "bio-break." When she ushered me into her office, it was a perfect Exhibit A of the go-go computer-driven life: she had not one but three enormous computer screens, festooned with perhaps 30 open windows - a bunch of e-mail messages, several instant messages and dozens of Web pages. Czerwinski says she regards 20 solid minutes of uninterrupted work as a major triumph; often she'll stay in her office for hours after work, crunching data, since that's the only time her outside distractions wane.

In 1997, Microsoft recruited Czerwinski to join Microsoft Research Labs, a special division of the firm where she and other eggheads would be allowed to conduct basic research into how computers affect human behavior. Czerwinski discovered that the computer industry was still strangely ignorant of how people really used their computers. Microsoft had sold tens of millions of copies of its software but had never closely studied its users' rhythms of work and interruption. How long did they linger on a single document? What interrupted them while they were working, and why?

To figure this out, she took a handful of volunteers and installed software on their computers that would virtually shadow them all day long, recording every mouse click. She discovered that computer users were as restless as hummingbirds. On average, they juggled eight different windows at the same time - a few e-mail messages, maybe a Web page or two and a PowerPoint document. More astonishing, they would spend barely 20 seconds looking at one window before flipping to another.

Why the constant shifting? In part it was because of the basic way that today's computers are laid out. A computer screen offers very little visual real estate. It is like working at a desk so small that you can look at only a single sheet of paper at a time. A Microsoft Word document can cover almost an entire screen. Once you begin multitasking, a computer desktop very quickly becomes buried in detritus.

This is part of the reason that, when someone is interrupted, it takes 25 minutes to cycle back to the original task. Once their work becomes buried beneath a screenful of interruptions, office workers appear to literally forget what task they were originally pursuing. We do not like to think we are this flighty: we might expect that if we are, say, busily filling out some forms and are suddenly distracted by a phone call, we would quickly return to finish the job. But we don't. Researchers find that 40 percent of the time, workers wander off in a new direction when an interruption ends, distracted by the technological equivalent of shiny objects. The central danger of interruptions, Czerwinski realized, is not really the interruption at all. It is the havoc they wreak with our short-term memory: What the heck was I just doing?


When Gloria Mark and Mary Czerwinski, working separately, looked at the desks of the people they were studying, they each noticed the same thing: Post-it notes. Workers would scrawl hieroglyphic reminders of the tasks they were supposed to be working on ("Test PB patch DAN's PC - Waiting for AL," was one that Mark found). Then they would place them directly in their fields of vision, often in a halo around the edge of their computer screens. The Post-it notes were, in essence, a jury-rigged memory device, intended to rescue users from those moments of mental wandering.

For Mark and Czerwinski, these piecemeal efforts at coping pointed to ways that our high-tech tools could be engineered to be less distracting. When Czerwinski walked around the Microsoft campus, she noticed that many people had attached two or three monitors to their computers. They placed their applications on different screens - the e-mail far off on the right side, a Web browser on the left and their main work project right in the middle - so that each application was "glanceable." When the ding on their e-mail program went off, they could quickly peek over at their in-boxes to see what had arrived.

The workers swore that this arrangement made them feel calmer. But did more screen area actually help with cognition? To find out, Czerwinski's team conducted another experiment. The researchers took 15 volunteers, sat each one in front of a regular-size 15-inch monitor and had them complete a variety of tasks designed to challenge their powers of concentration - like a Web search, some cutting and pasting and memorizing a seven-digit phone number. Then the volunteers repeated these same tasks, this time using a computer with a massive 42-inch screen, as big as a plasma TV.

The results? On the bigger screen, people completed the tasks at least 10 percent more quickly - and some as much as 44 percent more quickly. They were also more likely to remember the seven-digit number, which showed that the multitasking was clearly less taxing on their brains. Some of the volunteers were so enthralled with the huge screen that they begged to take it home. In two decades of research, Czerwinski had never seen a single tweak to a computer system so significantly improve a user's productivity. The clearer your screen, she found, the calmer your mind. So her group began devising tools that maximized screen space by grouping documents and programs together - making it possible to easily spy them out of the corner of your eye, ensuring that you would never forget them in the fog of your interruptions. Another experiment created a tiny round window that floats on one side of the screen; moving dots represent information you need to monitor, like the size of your in-box or an approaching meeting. It looks precisely like the radar screen in a military cockpit.


In late 2003, the technology writer Danny O'Brien decided he was fed up with not getting enough done at work. So he sat down and made a list of 70 of the most "sickeningly overprolific" people he knew, most of whom were software engineers of one kind or another. O'Brien wrote a questionnaire asking them to explain how, precisely, they managed such awesome output. Over the next few weeks they e-mailed their replies, and one night O'Brien sat down at his dining-room table to look for clues. He was hoping that the self-described geeks all shared some common tricks.

He was correct. But their suggestions were surprisingly low-tech. None of them used complex technology to manage their to-do lists: no Palm Pilots, no day-planner software. Instead, they all preferred to find one extremely simple application and shove their entire lives into it. Some of O'Brien's correspondents said they opened up a single document in a word-processing program and used it as an extra brain, dumping in everything they needed to remember - addresses, to-do lists, birthdays - and then just searched through that file when they needed a piece of information. Others used e-mail - mailing themselves a reminder of every task, reasoning that their in-boxes were the one thing they were certain to look at all day long.
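
The single-file trick is simple enough to sketch in code. What follows is a minimal, hypothetical illustration in Python of the approach O'Brien's correspondents describe - append every reminder to one plain-text file, then search it when something needs recalling. The file name ("brain.txt") and the command words are invented for the example; nothing here comes from the correspondents' actual setups.

    #!/usr/bin/env python3
    """A minimal 'extra brain': one text file you append to and search.
    Hypothetical sketch of the single-document habit described above."""
    import sys
    from datetime import date
    from pathlib import Path

    BRAIN = Path.home() / "brain.txt"   # the one file everything gets dumped into

    def add(note: str) -> None:
        """Append a dated line to the file."""
        with BRAIN.open("a", encoding="utf-8") as f:
            f.write(f"{date.today()}  {note}\n")

    def find(term: str) -> None:
        """Print every line containing the search term, case-insensitively."""
        if not BRAIN.exists():
            return
        for line in BRAIN.read_text(encoding="utf-8").splitlines():
            if term.lower() in line.lower():
                print(line)

    if __name__ == "__main__":
        if len(sys.argv) < 3:
            sys.exit('usage: brain.py add "note text" | brain.py find term')
        command, rest = sys.argv[1], " ".join(sys.argv[2:])
        add(rest) if command == "add" else find(rest)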

In essence, the geeks were approaching their frazzled high-tech lives as engineering problems - and they were not waiting for solutions to emerge from on high, from Microsoft or computer firms. Instead they ginned up a multitude of small-bore fixes to reduce the complexities of life, one at a time, in a rather Martha Stewart-esque fashion.

Many of O'Brien's correspondents, it turned out, were also devotees of "Getting Things Done," a system developed by David Allen, a personal-productivity guru who consults with Fortune 500 corporations and whose seminars fill Silicon Valley auditoriums with anxious worker bees. At the core of Allen's system is the very concept of memory that Mark and Czerwinski hit upon: unless the task you're doing is visible right in front of you, you will half-forget about it when you get distracted, and it will nag at you from your subconscious. Thus, as soon as you are interrupted, Allen says, you need either to quickly deal with the interruption or - if it's going to take longer than two minutes - to faithfully add the new task to your constantly updated to-do list. Once the interruption is over, you immediately check your to-do list and go back to whatever is at the top.

"David Allen essentially offers a program that you can run like software in your head and follow automatically," O'Brien explains. "If this happens, then do this. You behave like a robot, which of course really appeals to geeks."

O'Brien summed up his research in a speech called "Life Hacks," which he delivered in February 2004 at the O'Reilly Emerging Technology Conference. Five hundred conference-goers tried to cram into his session, desperate for tips on managing info chaos. When O'Brien repeated the talk the next year, it was mobbed again. By the summer of 2005, the "life hacks" meme had turned into a full-fledged grass-roots movement. Dozens of "life hacking" Web sites now exist, where followers of the movement trade suggestions on how to reduce chaos. The ideas are often quite clever: O'Brien wrote for himself a program that, whenever he's surfing the Web, pops up a message every 10 minutes demanding to know whether he's procrastinating. It turns out that a certain amount of life-hacking is simply cultivating a monklike ability to say no.
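
O'Brien's own script isn't reproduced here, but the idea is easy to approximate. The sketch below is a hypothetical stand-in in Python, using the standard tkinter dialog box: every ten minutes it pops up and asks the one question that matters. The interval comes from the description above; the wording and everything else is assumed.

    import time
    import tkinter as tk
    from tkinter import messagebox

    INTERVAL = 10 * 60  # seconds - the ten-minute nag described above

    def nag_forever() -> None:
        """Pop up a yes/no box every ten minutes asking whether you are procrastinating."""
        root = tk.Tk()
        root.withdraw()  # no main window, just the recurring dialog
        while True:
            time.sleep(INTERVAL)
            if messagebox.askyesno("Life hack", "Are you procrastinating right now?"):
                messagebox.showinfo("Life hack", "Close the browser. Back to work.")

    if __name__ == "__main__":
        nag_forever()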

"In fairness, I think we bring some of this on ourselves," says Merlin Mann, the founder of the popular life-hacking site 43folders.com. "We'd rather die than be bored for a few minutes, so we just surround ourselves with distractions. We've got 20,000 digital photos instead of 10 we treasure. We have more TV Tivo'd than we'll ever see." In the last year, Mann has embarked on a 12-step-like triage: he canceled his Netflix account, trimmed his instant-messaging "buddy list" so only close friends can contact him and set his e-mail program to bother him only once an hour. ("Unless you're working in a Korean missile silo, you don't need to check e-mail every two minutes," he argues.)

Mann's most famous hack emerged when he decided to ditch his Palm Pilot and embrace a much simpler organizing style. He bought a deck of 3-by-5-inch index cards, clipped them together with a binder clip and dubbed it "The Hipster P.D.A." - an ultra-low-fi organizer, running on the oldest memory technology around: paper.


In the 1920's, the Russian scientist Bluma Zeigarnik performed an experiment that illustrated an intriguing aspect of interruptions. She had several test subjects work on jigsaw puzzles, then interrupted them at various points. She found that the ones least likely to complete the task were those who had been disrupted at the beginning. Because they hadn't had time to become mentally invested in the task, they had trouble recovering from the distraction. In contrast, those who were interrupted toward the end of the task were more likely to stay on track.

Gloria Mark compares this to the way that people work when they are "co-located" - sitting next to each other in cubicles - versus how they work when they are "distributed," each working from different locations and interacting online. She discovered that people in open-cubicle offices suffer more interruptions than those who work remotely. But they have better interruptions, because their co-workers have a social sense of what they are doing. When you work next to other people, they can sense whether you're deeply immersed, panicking or relatively free and ready to talk - and they interrupt you accordingly.

So why don't computers work this way? Instead of pinging us with e-mail and instant messages the second they arrive, our machines could store them up - to be delivered only at an optimum moment, when our brains are mostly relaxed.

One afternoon I drove across the Microsoft campus to visit a man who is trying to achieve precisely that: a computer that can read your mind. His name is Eric Horvitz, and he is one of Czerwinski's closest colleagues in the lab. For the last eight years, he has been building networks equipped with artificial intelligence (A.I.) that carefully observes a computer user's behavior and then tries to predict that sweet spot - the moment when the user will be mentally free and ready to be interrupted.

Horvitz booted the system up to show me how it works. He pointed to a series of bubbles on his screen, each representing one way the machine observes Horvitz's behavior. For example, it measures how long he's been typing or reading e-mail messages; it notices how long he spends in one program before shifting to another. Even more creepily, Horvitz told me, the A.I. program will - a little like HAL from "2001: A Space Odyssey" - eavesdrop on him with a microphone and spy on him using a Webcam, to try and determine how busy he is, and whether he has company in his office. Sure enough, at one point I peeked into the corner of Horvitz's computer screen and there was a little red indicator glowing.

"It's listening to us," Horvitz said with a grin. "The microphone's on."

It is no simple matter for a computer to recognize a user's "busy state," as it turns out, because everyone is busy in his own way. One programmer who works for Horvitz is busiest when he's silent and typing for extended periods, since that means he's furiously coding. But for a manager or executive, sitting quietly might actually be an indication of time being wasted; managers are more likely to be busy when they are talking or if PowerPoint is running.

In the early days of training Horvitz's A.I., you must clarify when you're most and least interruptible, so the machine can begin to pick up your personal patterns. But after a few days, the fun begins - because the machine takes over and, using what you've taught it, tries to predict your future behavior. Horvitz clicked an onscreen icon for "Paul," an employee working on a laptop in a meeting room down the hall. A little chart popped up. Paul, the A.I. program reported, was currently in between tasks - but it predicted that he would begin checking his e-mail within five minutes. Thus, Horvitz explained, right now would be a great time to e-mail him; you'd be likely to get a quick reply. If you wanted to pay him a visit, the program also predicted that - based on his previous patterns - Paul would be back in his office in 30 minutes.

With these sorts of artificial smarts, computer designers could re-engineer our e-mail programs, our messaging and even our phones so that each tool would work like a personal butler - tiptoeing around us when things are hectic and barging in only when our crises have passed. Horvitz's early prototypes offer an impressive glimpse of what's possible. An e-mail program he produced seven years ago, code-named Priorities, analyzes the content of your incoming e-mail messages and ranks them based on the urgency of the message and your relationship with the sender, then weighs that against how busy you are. Superurgent mail is delivered right away; everything else waits in a queue until you're no longer busy. When Czerwinski first tried the program, it gave her as much as three hours of solid work time before nagging her with a message. The software also determined, to the surprise of at least one Microsoft employee, that e-mail missives from Bill Gates were not necessarily urgent, since Gates tends to write long, discursive notes for employees to meditate on.
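
Horvitz's real models are statistical and considerably more sophisticated, but the general idea the article describes - score each message by its urgency and by your relationship with the sender, weigh that against how busy you currently are, and hold anything that doesn't clear the bar - can be caricatured in a few lines of Python. The weights, thresholds, and class names below are invented for illustration; this is not Priorities itself.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Message:
        sender: str
        subject: str
        urgency: float        # 0..1, e.g. inferred from wording (assumed)
        relationship: float   # 0..1, closeness to the sender (assumed)

    @dataclass
    class GatedInbox:
        busyness: float = 0.0            # 0 = idle, 1 = deep in a task
        held: List[Message] = field(default_factory=list)

        def receive(self, msg: Message) -> None:
            """Deliver now only if the message outweighs the cost of interrupting."""
            importance = 0.7 * msg.urgency + 0.3 * msg.relationship  # toy weighting
            if importance > self.busyness:
                print(f"DELIVER NOW: {msg.subject} (from {msg.sender})")
            else:
                self.held.append(msg)    # queue it until the busy spell passes

        def user_went_idle(self) -> None:
            """Flush the queue once the user is no longer busy."""
            self.busyness = 0.0
            for msg in self.held:
                print(f"delivering deferred: {msg.subject}")
            self.held.clear()

    # Hypothetical usage:
    inbox = GatedInbox(busyness=0.8)
    inbox.receive(Message("boss", "server is down - call me ASAP", urgency=0.95, relationship=0.9))
    inbox.receive(Message("list", "weekly newsletter", urgency=0.1, relationship=0.2))
    inbox.user_went_idle()   # the newsletter arrives only now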

This raises a possibility both amusing and disturbing: perhaps if we gave artificial brains more control over our schedules, interruptions would actually decline - because A.I. doesn't panic. We humans are Pavlovian; even though we know we're just pumping ourselves full of stress, we can't help frantically checking our e-mail the instant the bell goes ding. But a machine can resist that temptation, because it thinks in statistics. It knows that only an extremely rare message is so important that we must read it right now.


So will Microsoft bring these calming technologies to our real-world computers? "Could Microsoft do it?" asks David Gelernter, a Yale professor and longtime critic of today's computers. "Yeah. But I don't know if they're motivated by the lust for simplicity that you'd need. They're more interested in piling more and more toys on you."

The near-term answer to the question will come when Vista, Microsoft's new operating system, is released in the fall of 2006. Though Czerwinski and Horvitz are reluctant to speculate on which of their innovations will be included in the new system, Horvitz said that the system will "likely" incorporate some way of detecting how busy you are. But he admitted that "a bunch of features may not be shipping with Vista." He says he believes that Microsoft will eventually tame the interruption-driven workplace, even if it takes a while. "I have viewed the task as a 'moon mission' that I believe that Microsoft can pull off," he says.

By a sizable margin, life hackers are devotees not of Microsoft but of Apple, the company's only real rival in the creation of operating systems - and a company that has often seemed to intuit the need for software that reduces the complexity of the desktop. When Apple launched its latest operating system, Tiger, earlier this year, it introduced a feature called Dashboard - a collection of glanceable programs, each of which performs one simple function, like displaying the weather. Tiger also includes a single-key tool that zooms all open windows into a bingo-card-like grid, uncovering any "lost" ones. A superpowered search application speeds up the laborious task of hunting down a missing file. Microsoft is now playing catch-up; Vista promises many of the same tweaks, although it will most likely add a few new ones as well, including, possibly, a 3-D mode for seeing all the windows you have open.

Apple's computers have long been designed specifically to soothe the confusions of the technologically ignorant. For years, that meant producing computer systems that seemed simpler than the ones Microsoft produced, but were less powerful. When computers moved relatively slowly and the Internet was little used, raw productivity - shoving the most data at the user - mattered most, and Microsoft triumphed in the marketplace. But for many users, simplicity now trumps power. Linda Stone, the software executive who has worked alongside the C.E.O.'s of both Microsoft and Apple, argues that we have shifted eras in computing. Now that multitasking is driving us crazy, we treasure technologies that protect us. We love Google not because it brings us the entire Web but because it filters it out, bringing us the one page we really need. In our new age of overload, the winner is the technology that can hold the world at bay.

Yet the truth is that even Apple might not be up to the task of building the ultimately serene computer. After all, even the geekiest life hackers find they need to trick out their Apples with duct-tape-like solutions; and even that sometimes isn't enough. Some experts argue that the basic design of the computer needs to change: so long as computers deliver information primarily through a monitor, they have an inherent bottleneck - forcing us to squeeze the ocean of our lives through a thin straw. David Rose, the Cambridge designer, suspects that computers need to break away from the screen, delivering information through glanceable sources in the world around us, the way wall clocks tell us the time in an instant. For computers to become truly less interruptive, they might have to cease looking like computers. Until then, those Post-it notes on our monitors are probably here to stay.

source:http://www.nytimes.com/2005/10/16/magazine/16guru.html?hp=&pagewanted=print


Is There a Future for Indie Games?

"If you've been following Greg Costikyan's recent rants (such as Death to The Games Industry), you would have seen mention of one developer's attempt at breaking the traditional games publisher funding model. Well, their game is now in the stores, and whats more it has been getting some pretty good reviews, but has anyone heard of it? Judging by some press, the marketing has been somewhat underwhelming. So the question is, is there still a viable space for good games developed outside the traditional corporate publisher model, or does E.A. already own the future of video games?" Moreover, when indie developers have to go up against the likes of EA and Steven Spielberg, what hope can they have for matching that kind of success? At least one company thinks they can do it by offering games for direct download. Is direct purchasing enough of an incentive for your average gamer to shell out money on something he's never heard of before?

source: http://games.slashdot.org/article.pl?sid=05/10/16/1323209&tid=187&tid=98&tid=4&tid=10

Oldest noodles unearthed in China



The remains of the world's oldest noodles have been unearthed in China.

The 50cm-long, yellow strands were found in a pot that had probably been buried during a catastrophic flood.

Radiocarbon dating of the material taken from the Lajia archaeological site on the Yellow River indicates the food was about 4,000 years old.

Scientists tell the journal Nature that the noodles were made using grains from millet grass - unlike modern noodles, which are made with wheat flour.

The discovery goes a long way to settling the old argument over who first created the string-like food.

Professor Houyuan Lu said: "Prior to the discovery of noodles at Lajia, the earliest written record of noodles is traced to a book written during the East Han Dynasty sometime between AD 25 and 220, although it remained a subject of debate whether the Chinese, the Italians, or the Arabs invented it first.

"Our discovery indicates that noodles were first produced in China," the researcher from the Institute of Geology and Geophysics, Chinese Academy of Sciences, Beijing, explained to BBC News.

The professor's team tells Nature that the ancient settlement at Lajia was hit by a sudden catastrophe.

Among the remains are skeletons thrown into various abnormal postures, suggesting the inhabitants may have been trying to flee the disaster that was enveloping them.

"Based on the geological and archaeological evidence, there was a catastrophic earthquake and immediately following the quake, the site was subject to flooding by the river," explained co-author Professor Kam-biu Liu, from Louisiana State University, US.

"Lajia is a very interesting site; in a way, it is the Pompeii of China."

It was in amongst the human wreckage that scientists found an upturned earthenware bowl filled with brownish-yellow, fine clay.

When they lifted the inverted container, the noodles were found sitting proud on the cone of sediment left behind.

"It was this unique combination of factors that created a vacuum or empty space between the top of the sediment cone and the bottom of this bowl that allowed the noodles to be preserved," Professor Kam-biu Liu said.

The noodles resemble La-Mian noodles, the team says - a traditional Chinese noodle that is made by repeatedly pulling and stretching the dough by hand.


To identify the plants from which the noodles were made, the team looked at the shape and patterning of starch grains and so-called seed-husk phytoliths in the bowl.

These were compared with modern crops. The analysis pointed to the use of foxtail millet (Setaria italica) and broomcorn millet (Panicum miliaceum).

"Our data demonstrate that noodles were probably initially made from species of domesticated grasses native to China. This is in sharp contrast to modern Chinese noodles or Italian pasta which are mostly made of wheat today," Professor Houyuan Lu said.


One-Fifth of Human Genes Have Been Patented, Study Reveals

A new study shows that 20 percent of human genes have been patented in the United States, primarily by private firms and universities.

The study, which is reported this week in the journal Science, marks the first time that a detailed map has been created to match patents to specific physical locations on the human genome.

Researchers can patent genes because they are potentially valuable research tools, useful in diagnostic tests or to discover and produce new drugs.

"It might come as a surprise to many people that in the U.S. patent system human DNA is treated like other natural chemical products," said Fiona Murray, a business and science professor at the Massachusetts Institute of Technology in Cambridge, and a co-author of the study.

"An isolated DNA sequence can be patented in the same manner that a new medicine, purified from a plant, could be patented if an inventor identifies a [new] application."

Hot Spots

Gene patents were central to the biotech boom of the 1980s and 1990s. The earliest gene patents were obtained around 1978 on the gene for human growth hormone.

The human genome project and the introduction of rapid sequencing techniques brought a deluge of new genetic information and many new patents. Yet there has been little comprehensive research about the extent of gene patenting.

The new study reveals that more than 4,000 genes, or 20 percent of the almost 24,000 human genes, have been claimed in U.S. patents.

Of the patented genes, about 63 percent are assigned to private firms and 28 percent are assigned to universities.

The top patent assignee is Incyte, a Palo Alto, California-based drug company whose patents cover 2,000 human genes.

"Gene patents give their owners property rights over gene sequences—for example in a diagnostic test, as a test for the efficacy of a new drug, or in the production of therapeutic proteins," Murray said.

"While this does not quite boil down to [the patent holders] owning our genes … these rights exclude us from using our genes for those purposes that are covered in the patent," she said.

Specific regions of the human genome are "hot spots" of patent activity. Some genes have up to 20 patents asserting rights to how those genes can be used.

"Basically those genes that people think are relevant in disease, such as Alzheimer's or cancer, are more likely to be patented than genes which are something of a mystery," Murray said.

Patent Maze

The effect of gene patenting on research and investment has been the subject of great debate.

Advocates argue that gene patents, like all patents, promote the disclosure and dissemination of ideas by making important uses of gene sequences publicly known.

Patents also provide important incentives to investors who would otherwise be reluctant to invest in ideas that could be copied by competitors.

But critics caution that patents that are very broad can obstruct future innovations by preventing researchers from looking for alternative uses for a patented gene.

"You can find dozens of ways to heat a room besides the Franklin stove, but there's only one gene to make human growth hormone," said Robert Cook-Deegan, director of Duke University's Center for Genome Ethics, Law, and Policy.

"If one institution owns all the rights, it may work well to introduce a new product, but it may also block other uses, including research," he said.

In cases where there are a lot of patents surrounding one area of research, the scientific costs of gene patents—financial and otherwise—can be extremely high.

"Our data raise a number of concerns about gene patents, particularly for heavily patented genes," Murray said. "We worry about the costs to society if scientists—academic and industry—have to walk through a complex maze of patents in order to make more progress in their research."

source:http://news.nationalgeographic.com/news/2005/10/1013_051013_gene_patent_2.html


Indie Gamers Hit the Right Buttons

Startup design outfits are staking their claim to a small-but-growing market niche -- so-called "casual games"


When Andy Schatz left an executive-level job at game maker TKO Software in December, 2004, to start his own gaming studio, he figured he would have to stretch his $100,000 savings for as long as four years before his new business caught on. His math was way off. Schatz's Pocketwatch Games just sold the rights to its first title, an animal-adventure game called Wildlife Tycoon: Venture Africa, to distributor MumboJumbo after a three-way bidding war. Schatz is hoping that MumboJumbo regulars including Wal-Mart (WMT) and Best Buy (BBY) will opt to carry Tycoon as early as next year.


Schatz is now looking for a full-time programmer and an artist to develop more animal-themed titles that would build on the success of Wildlife Tycoon, where players earn jewels by increasing the population size of elephant herds, flamingo stands, and crocodile congregations (see accompanying slide show "Big Fun with Small Developers"). And that's just the opening act of his grand gaming plan. "We want to be the Discovery Channel for games," Schatz says.

His timing couldn't be better. Independent game developers such as Pocketwatch now have a better chance of success in the cutthroat industry than they've had in years. For that they can thank a boom in what's known as casual gaming.

GROWING CORNER. Until recently, the market for electronic games was mainly young, male, and diehard. These days, a bigger, more age-diverse group that increasingly includes women is joining in the fun, spending anywhere from a few minutes a day to long stretches on online poker or games such as Bejewelled, Tetris, and The Sims.

As more people sign up for high-speed Internet access (almost 60% of the U.S. population now has access to broadband), the gaming experience -- both for games playable online, such as Bejewelled, and CD- or DVD-ROM titles with an online component, like The Sims -- has become more appealing. Casual games now make up about 1% of the $20.5 billion game-software market. By 2010, that figure may surge to $2.1 billion, or 5% of sales, says David Cole, an analyst at gaming consultant DFC Intelligence.

Little wonder that companies ranging from RealNetworks (RNWK) to Time Warner (TWX), which just launched a Web-based games channel called GameTap, are devoting more attention to casual gamers (see BW, 10/3/05, "Making a Play For All Those Non-Players").

"SMALL POTATOES." Many indie game developers are jumping into casual gaming partly because the barriers to entry are low. It costs a lot less to come up with a casual game than the graphics-heavy blockbusters developed by giants such as Electronic Arts (ERTS) and Take Two (TTWO).

Casual games are typically played on PCs or mobile phones and involve less complicated features and graphics. And costs can run as low as a few thousand dollars -- vs. $20 million to $100 million for a full-blown, hardcore video game that can take dozens of developers years to design. The price tag for Wildlife Tycoon: about $6,000.

Smaller developers are also benefiting, because big-name publishers don't yet pose much of a competitive threat to their niche. Casual games cost anywhere from $2 to $20 -- or else they're offered free, with publishers making money through ads that run alongside the games. Those markets "are still small potatoes for most of the big guys" accustomed to selling a million copies of a game that costs $50, says DFC's Cole.

What's more, casual games often venture into specialized areas of interest outside the core competency of traditional publishers, which tend to focus on sports and shoot-'em-up storylines.

FRESH IDEAS. "As we see greater diversification of genres, it's less practical to be working [on niche projects in-house] than working with different teams that ended up in this genre to begin with," says Patrick Kelly, vice-president of studios at Activision Value Publishing, an Activision subsidiary that puts out casual games. That's why Activision Value has bought all of its dozen or so casual titles, such as bike-building game American Chopper, from indie developers. The company plans to increase its casual-game portfolio by 15% to 20% a year.

Console makers Microsoft (MSFT), Sony (SNE) and Nintendo are starting to give independent developers a helping hand rather than a cold shoulder, in hopes of widening their markets. After all, some casual games have proven to be surprising hits. Nintendo, for one, has found that titles such as Nintendogs, which was released in August and lets players raise a virtual puppy, have boosted its Nintendo DS sales and put the gadget into the hands of people who wouldn't otherwise have bought a portable gaming console.

Plus, some console makers are looking for a broader range of games than those offered by the larger publishers. Many publishers have become afflicted by "sequelitis," says Greg Canessa, a group manager at Microsoft's Xbox division. They create a hit, then milk it by releasing three or four sequels that are predictable and less creative.

"Some of the most innovative game play will come from the independent-developer community," predicts Canessa. "We view the future of gaming as driven by independent game developers," adds Canessa (see BW Online, 10/14/05, "Microsoft Seeds the Indie-Game Ecosystem").

THE X-FACTOR. In fact, independent developers may be key to helping Microsoft grab share from No. 1 console maker, Sony. Microsoft recently sponsored the IndieGamesCon 2005 conference in Eugene, Ore., which attracted 300 indie developers from around the country. There, in an old banquet hall a block away from an abandoned car wash, the mostly male attendees crowded three rows deep to try each other's games, including Marble Blast Ultra, from conference organizer and indie label GarageGames.

Microsoft will distribute Marble Blast Ultra, which lets a player navigate a ball through a slew of 3-D structures and slides to score points, as an online offering for its Xbox 360 console, which will roll out Nov. 22. Marble Blast is the first indie game to be available through the Xbox 360 Live Arcade, a feature of the new console that lets users download games from the Web.

In the next year, Microsoft hopes to make it easier for indie developers to create and sell games through Xbox 360 Live, a service allowing gamers to download new game levels and to play against each other online, as well as MSN Messenger and possibly through Microsoft-powered mobile phones, says Canessa.

READY MARKET. Meantime, independent game makers eager to get distribution for their titles are relying on a relatively new class of middlemen, or agents, who say they can take an indie from obscurity to fame. Agent Joe Lieberman, based in Corvallis, Ore., says he can help developers find good artists, offer advice on improving games, or get a title reviewed by specialized gaming magazines.

Lieberman and other agents also try to push their clients' titles to content aggregators and distributors, which sell games online or at retail stores, or market them to larger publishers. It's a tricky business, Lieberman says. "If I screw up, that was that guy's second mortgage," he notes. Still, his fees are pretty low, running around $100 for a press release and $30 to $40 an hour in consulting.

With demand skyrocketing, many indies have little difficulty selling games to publishers directly. Eric Hartman, a Web-design student in Arlington, Va., recently spent three months developing a game in which Lego-sized players build structures out of Lego pieces on the floor of their bedroom. He then contacted Lego, the toy company, met with its reps, and now the company wants to buy his game, Hartman says. He declined to elaborate on the pending deal.

MAKING THE LEAP. Other developers believe they can recoup costs by selling games over the Web. That's Robert Clegg's plan. Clegg is chief product officer at indie outfit Tabula Digita, which will soon debut Dimenxian, its first algebra title, online. In Dimenxian, players use coordinates to hunt and measure various alien creatures. It's aimed at teachers and parents.

Mark Frohnmayer, president of GarageGames, summed it up for conference attendees this way: "It's definitely possible for independent developers to make a living from making games," he said. "If you can quit your day job, now is the time." That's just what Andy Schatz did. And look where he is now.


source:http://www.businessweek.com/innovate/content/oct2005/id20051013_044501.htm
