Wednesday, March 01, 2006

Study: In-flight cell calls pose risk to planes

CMU group finds danger increasing

You might want to think twice the next time you're tempted to make a call from your cell phone during an airplane flight. Or flip on your portable game player. Or work a spreadsheet on your laptop.

Besides possibly annoying fellow travelers and breaking federal regulations, you might be endangering the airplane, according to a Carnegie Mellon University study that quietly monitored transmissions on board a number of flights in the Northeast.

The study, by CMU's Department of Engineering and Public Policy, found that the use of cell phones and other portable electronic devices can interfere with the normal operation of critical aircraft components, even more so than previously believed.

Researchers concluded that such devices can disrupt the operation of cockpit instruments, including the Global Positioning System receivers that are becoming more common in helping to ensure safe landings.

Researchers noted that there is no definitive instance of an electronic device used by a passenger causing an accident. However, they said their data support the conclusion that use of devices like cell phones "will, in all likelihood, someday cause an accident by interfering with critical cockpit instruments such as GPS receivers."

The findings come as the Federal Communications Commission is considering lifting the ban on the use of cell phones during flight.

Both the FCC and the Federal Aviation Administration have barred the in-flight operation of cell phones because of concerns about interference, both to navigational and communications equipment in the air and to cell phone towers on the ground.

Verizon's Airfones, the phones mounted on the backs of passenger seats, operate at frequencies that do not interfere and have been on some planes for more than a decade.

Airlines typically allow the use of portable electronic devices, like game players and DVD players, above 10,000 feet, but not during takeoff or landing. Laptops also can be used, but not for communications purposes, such as sending or receiving e-mail.

In the past, the FAA has found nothing to indicate that the use of passive devices like laptops or game-playing electronics poses a threat to the aircraft.

However, the CMU study concluded otherwise. While the researchers looked primarily at cell phone use, they also discovered that emissions from other portable devices proved "problematic."

"We found that the risk posed by these portable devices is higher than previously believed," researcher Bill Strauss said in a release announcing the findings.

And despite the ban on cell phone use during flights, the researchers discovered that on average one to four cell phone calls are made from every commercial flight in the northeast United States.

Some are even made during critical flight times, such as the climb after takeoff or the final approach.

The study is featured in an article in the March issue of IEEE Spectrum, a monthly magazine for technology professionals.

The study was conducted over three months in late 2003 with the cooperation of the FAA, three major airlines and the Transportation Security Administration.

During that period, the researchers monitored radio emissions from cell phones and other electronic devices on commercial flights throughout the Northeast. The equipment used to take the measurements, including a laptop computer, had been modified for safe in-flight use and fit in a nondescript carry-on bag.

While the data date to late 2003, the researchers said they believe the study is the first to document in-flight emissions of portable electronic devices.

In response to the findings, they recommended better coordination between the FCC and FAA in developing electronic emissions standards. They also called for the routine monitoring of on-board radio emissions and the deployment of equipment that would allow flight crews to determine if a passenger is using a cell phone during final approaches.

They also recommended against lifting the ban on cell phone use. The issue is still under review by the FCC, a spokeswoman said.

The FAA, which also would have to approve lifting the ban, has said as recently as July that it opposes a change in the rules.

source:http://www.post-gazette.com/pg/06060/662669.stm


Open Source Legal Issues

Guide to Legal Issues in Using Open Source Software - February 2006

While the use of open source software has many benefits, it brings with it a number of legal risks not posed by proprietary or commercial software. These include an increased risk of exposure to faults and intellectual property claims, and the risk of forced disclosure of confidential code.

There is no reason why agencies should not consider open source software on the same basis as commercial software. But agencies should base their decisions on the overall merits of the software concerned. This means weighing the unique legal risks of open source software together with the usual factors such as cost, functionality, interoperability and security.

To simplify matters, this guide was prepared for the State Services Commission (SSC) by Chapman Tripp to help New Zealand government departments assess and mitigate the legal risks of using open source software.

source:http://www.e.govt.nz/policy/open-source/open-source-legal


Market Is Hot For High-Skilled In Silicon Valley

Five years after the dot-com bubble burst, job growth has returned to Silicon Valley. But it's a different kind of growth than in past recoveries, favoring higher-skilled workers.

Netflix Inc.'s hiring shifts are typical. During the tech boom, the online movie-rental service created 100 customer-service jobs near its Los Gatos, Calif., headquarters in the heart of Silicon Valley. After the tech bust in 2000, Netflix eliminated half of those positions. But the total headcount at Netflix's Silicon Valley offices has grown 20%, to nearly 200 staffers in the last few years.

That's because Netflix, while shedding some lower-end jobs, has aggressively created new, higher-level jobs. It's adding jobs in departments such as Web engineering and product development: That group's hiring of engineers jumped 20% to more than 50 people in 2005 alone. "Our new engineers have an average of seven to 15 years' experience," says Patty McCord, Netflix's chief talent officer. "Five years ago, we hired people with three to five years of experience."


Past tech recoveries tended to bring new lower-skilled jobs as well as high-skill jobs. This time, tech firms -- from big companies like Hewlett-Packard Co. to mid- and small-size firms such as Netflix, Adobe Systems Inc., and SanDisk Corp. -- have moved lower-skill jobs out of the Silicon Valley area to cheaper locations, or outsourced them to foreign countries. The new jobs they are creating locally often require specialized skills in engineering and design. Young companies like Google Inc. are simply starting out hiring at the high end, further shifting the overall balance.

A study last month by Joint Venture Silicon Valley, a nonprofit group representing businesses and government agencies in the area, found the nation's tech capital had a net increase in jobs in 2005 for the first time in four years. Most of the growth came in the category of creative and innovation services, including firms in research and development, scientific and technical consulting and industrial design. In total, the number of Silicon Valley jobs in these areas grew 4% from 2002 to 2005, reaching 72,734. At the same time, the number of jobs in electronic-component manufacturing -- which tend to involve assembly and other repetitive tasks -- dropped 28% to 23,772, while jobs in semiconductor-equipment manufacturing fell 23% to 58,133.

Overall, 14% of all the jobs in Silicon Valley today belong to a sector called core design, engineering and science. That exceeds the comparable 9.3% slice of the work force in Austin, Texas; 8.7% in Seattle; and 8.3% in San Diego, according to the study.

Doug Henton, an economist and co-author of the report, says with the growth in these creative engineering jobs, a new face of Silicon Valley is emerging. "Ten years ago, this was an engineering Valley that pumped out chips and computers," he says. "Now it's all about creative tech and staying on the cutting edge."

The shift highlights how Silicon Valley is working to establish a competitive advantage, as lower-cost geographic rivals chip away at its strongholds. The Silicon Valley region has taken the tack of moving up the skills curve before: As competition in chip making became more heated in the 1970s, Silicon Valley chip makers relocated their assembly and manufacturing overseas but retained their core design facilities in the region. Today these chip makers, such as Intel Corp., remain dominant.


Silicon Valley's changing employment makeup does have its downside. Wages are once again creeping up, making it more expensive to do business in the already pricey area. Average annual pay in Silicon Valley hit $69,455 in 2005, up 2.7% from 2004, though it remains below the heights of the average $80,000-plus that the region's workers earned in 2000, according to Joint Venture Silicon Valley.

What's more, as operations and lower-skill tech jobs leave the region, Silicon Valley has a narrower base of industries. That makes the area more vulnerable should another downturn occur, says Steve Levy, an economist at the Center for the Continuing Study of the California Economy in Palo Alto, Calif. "Los Angeles has a far more diverse economic base, with Hollywood, biotechnology, plastics and toys," says Mr. Levy. "But high-skill tech is all we're left with."

Tech companies say the shift toward the top end of the skills spectrum has largely been positive for them -- particularly in productivity. Consider SanDisk, a Sunnyvale, Calif., supplier of flash memory products. SanDisk had 300 operations and manufacturing jobs in Silicon Valley in 2000. The company moved about half of those jobs to Asia over the past few years, but the headcount at its headquarters jumped to 747 people by the end of 2005.

SanDisk's fastest-growing job category has been product development and research, where the company is now hiring "at the master's level and Ph.D. level," says Judy Bruner, SanDisk's chief financial officer. "We can't take just a general engineer."

While Ms. Bruner acknowledges average compensation at SanDisk is rising, she says higher-skill workers have helped the firm get more done. Last year, each SanDisk employee generated an average $2.4 million in revenue, up from around $1.5 million per employee in 2002.

Palm Inc., which makes hand-held computers and cellphones, has seen a similar productivity boost. Teresa Toller, the Sunnyvale, Calif., company's director of staffing, says Palm beefed up its engineering teams by 70% over the past two years, to more than 400 people in all. The company has sought engineers specializing in wireless technologies, such as Bluetooth and Wi-Fi. Palm's average revenue per employee was $1.61 million for its fiscal year ended June 3, 2005, almost double the fiscal 2002 level of $788,000 per employee.

Google is doing more specialized hiring in areas such as mechanical and electrical engineering, says Alan Eustace, senior vice president of engineering at the Mountain View, Calif., search company. Last year, Google brought 2,659 new employees on board, pushing its total work force to 5,680. "We definitely hire for creativity, since creative people look at problems a different way and come up with the most interesting solutions," Mr. Eustace says.

The type of Silicon Valley employee now in demand looks something like Simon Smith. The 36-year-old Mr. Smith, who works at software maker Adobe, doesn't have a typical engineering background: He has degrees in architecture and ran a Web development company before becoming creative director of software firm Macromedia Inc. in 2002. Adobe, which makes software programs such as Photoshop and Illustrator, bought Macromedia for $3.4 billion last year.

At Adobe, Mr. Smith heads the world-wide user experience practice, a 12-person consulting team that works with outside clients to design mobile and Internet applications using Adobe technology. "We want to usher in a whole new era of design for our clients by leveraging Adobe technology," says Mr. Smith. He says he plans to expand his team over the next 12 months, with job candidates ideally coming from design firms such as Frog Design Inc., a Palo Alto, Calif.-based company that helps design products such as Micron's PCs, Dell.com's Web site and Symantec's software packaging. A Frog Design spokesman says the company takes it as a compliment that tech firms are trying to recruit from it, but says its work force remains stable and has grown to 275 employees at the end of last year from 200 in mid-2005.

source:http://online.wsj.com/public/article/SB114109073448184889-CyWkFqDi0lZBMObCUIOK6QUzwu4_20070228.html?mod=blogs


Software Promises More Efficient Design Process

WEST LAFAYETTE, Ind., Feb. 28 (AScribe Newswire) -- Mechanical engineers at Purdue University have developed software that promises to increase the efficiency of creating parts for everything from cars to computer hardware by making it possible to quickly evaluate and optimize complex designs.

The new approach integrates the design and analysis processes, which are now carried out separately. Currently, the geometry of a part is first created using computer-aided design, or CAD, software. This geometry is then converted into a mesh of simple shapes, such as triangles or rectangles, which, when analyzed using a computer, indicates the part's strength and other characteristics. The painstaking procedure, called finite-element analysis, is extensively used in industry.
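To make that mesh-and-solve step concrete, here is a minimal one-dimensional sketch of the finite-element idea, written in Python with NumPy purely for illustration (it is not the Purdue tool, which is a separate and far larger system): a simple differential equation is broken into linear elements and reduced to an algebraic system a computer can solve.

```python
# Minimal sketch of finite-element analysis in 1-D (illustration only).
# The equation -u''(x) = 1 on [0, 1] with u(0) = u(1) = 0 is split into
# simple linear elements, turning the differential equation into an
# algebraic system K u = f.
import numpy as np

n = 10                                  # number of elements
h = 1.0 / n                             # element length
nodes = n + 1

K = np.zeros((nodes, nodes))            # global stiffness matrix
f = np.zeros(nodes)                     # global load vector

for e in range(n):                      # assemble element by element
    ke = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness
    fe = (h / 2.0) * np.array([1.0, 1.0])                   # element load for f(x) = 1
    idx = [e, e + 1]
    K[np.ix_(idx, idx)] += ke
    f[idx] += fe

u = np.zeros(nodes)
u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], f[1:-1])   # impose u(0) = u(1) = 0

x = np.linspace(0.0, 1.0, nodes)
exact = x * (1.0 - x) / 2.0             # known exact solution of this model problem
print("max nodal error:", np.max(np.abs(u - exact)))
```

In a mesh-based workflow, any change to the geometry means regenerating the mesh and redoing this assembly, which is the cost the Purdue approach aims to avoid.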

"It's like taking a continuous curve and breaking it into pieces," said Ganesh Subbarayan, a professor of mechanical engineering at Purdue. "Otherwise, the form is too complex to analyze."

After the finite-element model of the part is created, the part is analyzed to see how well it will perform. If a portion of the shape is found to need redesigning, the part's entire mesh must be recreated to reflect the change.

"After the designer designs the object, it is thrown over to the analyst, and the analyst says, 'OK, I think, based on my analysis, that your design has to be modified this way,' and then throws it back to the designer, who makes the modification," Subbarayan said. "That is not very integrated and not very efficient, and that's the reason these problems take so much time and computational power to solve."

"We are trying to speed up this process to make it more efficient by rethinking the way analysis is carried out. Instead of waiting until the end of the CAD process to do the analysis, we are trying to unify both the CAD design and analysis so that they are carried out concurrently."

Information about the software tool is detailed in a research paper recently published online and will appear in the May issue of the journal Advances in Engineering Software. The paper was written by doctoral student Xuefeng Zhang and Subbarayan. The software tool is based on theoretical work by another doctoral student, Devendra Natekar. Natekar graduated in 2002 and now works for Intel Corp., and Zhang graduated in 2004 and now works at General Electric's Global Research Center.

The method could be especially important when dealing with the corporate sensitivities of global competition.

"The overall philosophy behind the design approach can be extended to enable one to understand the impact of changes in suppliers' components on the performance of a complex system without revealing details of the components or the system," Subbarayan said. "This will enable suppliers to retain their proprietary design knowledge without revealing each other's intellectual property. Such strategies are critical as products are increasingly designed and produced in a globally distributed manner."

The software application, which was written by Zhang as part of his thesis, contains about 35,000 lines of Java code.

"That is a big and complex code," Subbarayan said. "If you take problems like finding the optimal shape for common automotive and aircraft structures, you have to somehow find the shape that has the least weight but at the same time won't break. We call that process shape optimization or topology optimization. These shapes have holes in them for bolting them in place or to reduce their weight. You have to decide whether to have one hole or two holes or 10 holes in a part, exactly where to put those holes and how to shape the holes."

Finite element modeling is the de facto analysis tool for numerous industries, Subbarayan said.

"When you use finite elements, you convert the complex differential equations that describe the physics of the part's behavior into simpler algebraic equations that the computer can solve," he said. "It's a powerful method because it enables you to take any complex problem and solve it."

"To describe the geometry, you take this complex object and break it into primitive objects like cubes, spheres or cones. With our approach, if I only modify some portion of the part, I only modify the primitives directly associated with that portion I am changing and not all of the primitives. If I only change the shape of a specific hole in the part, for example, the rest of the primitive objects are the same shape, so why should I need to reconstruct the whole geometry and remesh the whole geometry?"

Subbarayan calls the approach a "hierarchical, constructive, meshless procedure" because it enables engineers to analyze the changing design of a part without recreating the complex mesh of elements.
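As a rough illustration of the "constructive" part of that idea (a toy sketch under my own assumptions, not the Purdue code), a part can be represented as a tree of primitives combined by Boolean operations, so a point-membership query is answered directly from the tree and a design change touches only the primitive concerned.

```python
# Toy constructive-geometry sketch: the part is a tree of primitives,
# queried directly, with no mesh to rebuild when one primitive changes.
# Class names and structure are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class Rect:                              # primitive: a rectangular plate
    x0: float
    y0: float
    x1: float
    y1: float
    def inside(self, x, y):
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

@dataclass
class Circle:                            # primitive: a circular hole
    cx: float
    cy: float
    r: float
    def inside(self, x, y):
        return (x - self.cx) ** 2 + (y - self.cy) ** 2 <= self.r ** 2

@dataclass
class Difference:                        # constructive node: base minus hole
    base: object
    hole: object
    def inside(self, x, y):
        return self.base.inside(x, y) and not self.hole.inside(x, y)

plate = Rect(0.0, 0.0, 4.0, 2.0)
hole = Circle(1.0, 1.0, 0.3)
part = Difference(plate, hole)

print(part.inside(1.0, 1.0))             # False: this point lies in the hole
print(part.inside(3.0, 1.0))             # True: solid material

hole.r = 0.5                             # enlarge the hole; nothing else is rebuilt
print(part.inside(1.0, 1.4))             # False now: the bigger hole swallows this point
```

An analysis driven by such queries can be re-run after the change without regenerating the rest of the geometry, which is the saving Subbarayan describes.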

"The way it is now, the same CAD software used to make the shape of the part can't be used to analyze the mesh," Subbarayan said. "But now, the same CAD software or some similar CAD-friendly software will be able to do the analysis, and in a much more efficient manner because there is no remeshing."

Subbarayan began working on the project in 1998.

Purdue researchers are using the software tool to design new materials at the microscopic level, and the method also promises to help engineers create optimized shapes of droplets of solder to ensure longer-lasting circuit boards. A similar application is creating optimized arrangements of particles in "thermal interface materials" inserted into microprocessors for heat dissipation. The material is sandwiched between silicon chips and metal heat sinks to serve as a buffer between the two surfaces so that the expanding and contracting metal does not cause the brittle silicon to crack.

"These are all problems in which a shape needs to be modified," Subbarayan said. "In the case of solder, you are talking about what shape a droplet should take - the boundaries of the droplet are constantly modified until the optimal shape is found."

- - - -

CONTACTS: Ganesh Subbarayan, 765-494-9770, ganeshs@purdue.edu

Emil Venere, Purdue News Service, 765-494-4709, venere@purdue.edu

RELATED WEB SITE:

Ganesh Subbarayan -- http://tools.ecn.purdue.edu/ME/Fac_Staff/subbarayan.whtml

NOTE TO EDITORS: An electronic or hard copy of the research paper is available from Emil Venere (above). The paper will appear in Volume 37, No. 5, on pages 287-311 of the journal Advances in Engineering Software.

Media Contact: See above.



source:http://newswire.ascribe.org/cgi-bin/behold.pl?ascribeid=20060228.114424&time=12%2035%20PST&year=2006&public=0


Telltale Games Secures $825K

Telltale Games, the company behind the much-anticipated follow-up to the PC adventure game Sam & Max, has secured an additional $825,000 in funding. The new funding comes in the form of a convertible bridge note, and brings Telltale's total seed funding to $1.4 million. The company plans to use the funding to expand its technology development, marketing and investment efforts.

Telltale CEO Dan Connors said, "With our first titles based on Jeff Smith's popular comic books [Bone], Telltale is releasing accessible, innovative, interactive stories for the masses. This expansion funding gives us the ability to move even more aggressively in key areas of our business and further accelerate our company's strategy and growth."

Telltale stated that the Keiretsu Forum, North America's largest angel investment network, "made significant contributions" to the funding initiative. Financial advisory firm avanceventures helped acquire the funding.

As a new game developer and publisher, Telltale released its first downloadable episodic title in the form of Bone: Out from Boneville last year, with a follow-up coming in spring. In January, Telltale also launched an online casual gaming arcade.

source:http://www.next-gen.biz/index.php?option=com_content&task=view&id=2370&Itemid=2

Unintelligent Design

A monstrous discovery suggests that viruses, long regarded as lowly evolutionary latecomers, may have been the precursors of all life on Earth

Bernard La Scola peers into an electron microscope in his laboratory in Marseille, France.

Few things on Earth are spookier than viruses. The very name virus, from the Latin word for "poisonous slime," speaks to our lowly regard for them. Their anatomy is equally dubious: loose, tiny envelopes of molecules—protein-coated DNA or RNA—that inhabit some netherworld between life and nonlife. Viruses do not have cell membranes, as bacteria do; they are not even cells. They seem most lifelike only when they invade and co-opt the machinery of living cells in order to make more of themselves, often killing their hosts in the process. Their efficiency at doing so ranks them among the most fearsome killers: Ebola virus, HIV, smallpox, flu. Yet they go untouched by antibiotics, having nothing really biotic about them.


The existence of viruses was first surmised just over a century ago by Dutch botanist Martinus Beijerinck. He mashed up disease-riddled tobacco leaves and then passed the juicy pulp through a porcelain filter fine enough to trap everything down to the tiniest bacteria. When even that filtered fluid infected other plants, a world still acclimating to Louis Pasteur's germ theory now had an even tinier class of pathogens to contemplate. Here were entities so wraithlike that they remained unseen until 1935, when scientists armed with the newly invented electron microscope managed to take a picture of the "poison" lurking in Beijerinck's slime, today known as tobacco mosaic virus.


Less an organism than a jumbled collection of biochemical shards, the virus eventually yielded Wendell M. Stanley, the leader of the research team that exposed it, a Nobel Prize in chemistry rather than biology. The discovery also set off an intense scientific and philosophical debate that still rages: What exactly is a virus? Can it properly be described as alive? " 'Life' and 'living' are words that the scientist has borrowed from the plain man," the British virologist Norman Pirie wrote at the time. "Now, however, systems are being discovered and studied which are neither obviously living nor obviously dead, and it is necessary to define these words or else give up using them and coin others."


Seventy years later, the challenge continues to haunt science. So "other" are viruses that we're still trying to corral them with new metaphors: microzombies, pirates of the cell, submicroscopic hijackers. But even the more restrained characterizations betray a long-standing prejudice. Most biologists recognize three official branches of life: the eukaryotes, which are organisms whose cells have a nucleus; bacteria, single-celled organisms that lack a nucleus; and archaea, an ancient line of microbes without nuclei that may make up as much as a third of all life on Earth (See "Will the Methane Bubble Burst?" Discover, March 2004). Viruses, being dependent on these organisms to host them, are viewed as evolutionary latecomers: genomic scraps that fell out onto the floor back when life was assembling itself into more complex arrangements.


The sheer prevalence of viruses, however, is forcing a reconsideration about how these entities fit into the biological world. Researchers have characterized some 4,000 viruses, from several dozen distinct families. Yet that is a tiny fraction of the number of viruses on Earth. In the last two years, J. Craig Venter, the geneticist who decoded the human genome, has circled the globe in his sailboat and sampled ocean water every couple of hundred miles. Each time he dipped a container overboard, he discovered millions of new viruses—so many that he increased the number of known genes 10-fold. Although we tend to think of viruses only in terms of the damage they do, a broader and more benign picture is emerging. Scientists estimate that they have discovered and documented less than 1 percent of all the living things on the planet. But for every organism in that unidentified 99 percent, at least 10 times as many unknown viruses are thought to exist—the vast majority of which are harmless to life and yet integral to it.


Now, with the recent discovery of a truly monstrous virus, scientists are again casting about for how best to characterize these spectral life-forms. The new virus, officially known as Mimivirus (because it mimics a bacterium), is a creature "so bizarre," as The London Telegraph described it, "and unlike anything else seen by scientists . . . that . . . it could qualify for a new domain in the tree of life." Indeed, Mimivirus is so much more genetically complex than all previously known viruses, not to mention a number of bacteria, that it seems to call for a dramatic redrawing of the tree of life.


"This thing shows that some viruses are organisms that have an ancestor that was much more complex than they are now," says Didier Raoult, one of the leaders of the research team at the Mediterranean University in Marseille, France, that identified the virus. "We have a lot of evidence with Mimivirus that the virus phylum is at least as old as the other branches of life and that viruses were involved very early on in the evolutionary emergence of life."

That represents a radical change in thinking about life's origins: Viruses, long thought to be biology's hitchhikers, turn out to have been biology's formative force.

FRIENDS AND FOES

Bacteriologist Bernard La Scola first identified Mimivirus, the largest known virus, in 2003. Although Mimi infects only amoebas, many of its kin pose a direct threat to humans.

This is striking news, especially at a moment when the basic facts of origins and evolution seem to have fallen under a shroud. In the discussions of intelligent design, one hears a yearning for an old-fashioned creation story, in which some singular, inchoate entity stepped in to give rise to complex life-forms—humans in particular. Now the viruses appear to present a creation story of their own: a stirring, topsy-turvy, and decidedly unintelligent design wherein life arose more by reckless accident than original intent, through an accumulation of genetic accounting errors committed by hordes of mindless, microscopic replication machines. Our descent from apes is the least of it. With the discovery of Mimi, scientists are close to ascribing to viruses the last role that anyone would have conceived for them: that of life's prime mover.

The green hallways of the second-floor Rickettsia Unit at the Mediterranean University constitute what might be called an archive of agony. Stored there on freezer shelves behind a series of locked lab doors is an array of bacterial pathogens that have been caught and identified by the unit's crack detectives over the years. Rickettsia, the microbe that causes such diseases as typhus and Rocky Mountain spotted fever, resides there alongside many others, from Salmonella to strains of Legionella, a bacterium that causes a severe pneumonia-like disease in humans and was first identified following an outbreak at an American Legion convention in Philadelphia in 1976.


In the early spring of 2003, amid the familiar cast of microbial villains, Bernard La Scola, a bacteriologist at the unit, came upon something that none of his colleagues had ever seen. Focusing his electron microscope on a sample of what he assumed would prove to be an elusive new strain of Legionella, La Scola found himself staring instead at a viral monster.


"I went, 'Whoa,' " he recalls. On his computer he brings up an image, magnified 200,000 times, of the creature: a bug-eyed, hexagonal smurf with a head of electrified hair. That's but one of its looks. In three-dimensional imaging it appears more like a soccer ball. La Scola recognized both shapes as classically viral. Viruses seem to crave such crystal-like geometry, which is one of many reasons they've been thought of as more chemical than biological.


"I'm a bacteriologist," he says. "I don't think in terms of viruses, but this thing was way too big for that. So I called a friend of mine in the faculty of virology, and he came and looked at it and said, 'Oh, yes, it is a virus, but the size . . . ' "


While there is some evidence to suggest that it may have once caused a type of pneumonia in humans, Mimivirus now seems to infect only amoebas. Until its positive identification in 2003, it was known as Bradfordcoccus and falsely suspected as the cause of a 1992 pneumonia outbreak in the West Yorkshire mill town of Bradford, England. That was where Mimivirus was first found, hiding inside an amoeba at the base of an industrial cooling tower. Cooling towers—along with evaporative condensers, riparian soil, tap water, showerheads, and treated sewage—are all hangouts of Legionella. And amoebas, as Timothy Rowbotham, a former disease detective with Britain's Public Health Laboratory Service, discovered back in 1992, are an ideal tool for collecting Legionella. In the ongoing to and fro between the world's microbes, amoebas are a nearly indomitable foe, gobbling up nearly everything in their path. Legionella, however, turns the tables on amoebas, rendering them both a food source and a perfect lab culture for microbe hunters.


Rowbotham was unable to identify a number of the samples he collected in 1992, and he stored the mystery cultures in his lab freezer for future research. When budget cuts forced the closure of his lab in 1998, he had the presence of mind to call around and ask fellow scientists if they would be interested in any of the critters in his fridge. He had recently met Didier Raoult, he recalls, "and when I told him on the phone about the cooling tower cultures, he said he'd love those systems."


A student of Rowbotham's who had just accepted a postdoc position in Raoult's lab in Marseille took the samples south with him. Four new strains of Legionella would eventually be drawn out and identified at Raoult's lab, along with new bacteria closely related to Chlamydia, parasitic bacteria that, like Legionella, cause a variety of diseases. One last sample, however, defied all methods of examination for more than a year and a half, until La Scola turned his high-powered scope on the last Bradford holdout: Mimivirus.


Mimi, the largest known virus.

In addition to its signature viral shape, Mimi exhibited what is known as an eclipse phase, a bit of telltale viral creepiness recognizable to any fan of sci-fi horror movies. When a virus penetrates a cell, it disappears inside the nucleus for four to eight hours, giving the outward appearance of complete normalcy. Then the viral particles that the cell has been coerced into making suddenly burst forth, shattering the host.

Still, it wasn't until Raoult sought the assistance of Jean-Michel Claverie, a bioinformatics specialist at the Institute of Structural Biology and Microbiology in Marseille, that the true weirdness and wonder of their monster was revealed.


At about a half-millionth of a meter across, Mimi is one of the few viruses visible under a standard light microscope. Its genome weighs in at a whopping 1.2 million letters: at least 10 times larger than a typical virus's; nearly triple the size of that of its largest viral counterpart, canary pox, in the smallpox family; and larger than the genomes of 20 or more parasitic bacteria. Moreover, within Mimi's outsize helping of genetic material, Claverie found genes for such things as the translation of proteins, DNA repair enzymes, and other types of protein. Those functions were thought to be the exclusive province of more complex cellular organisms. The boundary between viruses and complex bacteria had become officially blurred.


"We already had very large viral genomes in the database before Mimivirus," says Claverie. "But before we saw that the virus and bacteria groups could overlap, we never asked ourselves why some large viruses had, for example, 300 genes, while the typical virus only needs 10. Then we see Mimi, with over 1,000 genes, and we're thinking we have a problem with our whole concept of viruses."

Viruses come in all shapes, sizes, and degrees of sturdiness, and with all manner of strategies for getting at the cellular machinery they lack. Some batter their way through the outer cell membrane. Some meld their membranes with a cell's and then suddenly revolve, like those faux bookcases in the movies, into the sacred chamber. Still others gain entry by disguising themselves as the sort of free-floating molecules that our cells routinely gobble up.


The manner of replication varies, too, depending on the virus's genetic identity. DNA viruses like smallpox, herpes, and now Mimivirus tend to be larger and more sophisticated genetically. They can exist for centuries outside a host and can afford to be more restrained when replicating inside one, making reliable, relatively error-free copies of themselves by hijacking the formula common to all life.


DNA makes a slight variant of itself known as RNA, which directs the production of the specific proteins of which all complex life-forms are composed. So-called RNA viruses are rogues: smaller, fast-replicating shape-shifters, descended from a time that evolutionary biologists refer to as the RNA world, back near the base of life's tree, before today's DNA-based organisms evolved. RNA viruses can direct the copying of their own proteins without using DNA—a shortcut that generates both more copies and more errors, or mutations. Although such activity might get you fired in the business world, in biology, mutations can offer a leg up. During unstable times—when environmental conditions shift or humans develop a successful vaccine—RNA viruses have the resiliency to adapt, outflank, and reemerge.

Influenza is the best known continuously morphing RNA virus. HIV is a particularly insidious RNA virus, known as a retrovirus because once inside the cell nucleus it reverses the DNA formula: a single strand of RNA manufactures its own double strand of viral DNA. That viral DNA is then directly spliced into the host cell's DNA and passed along with the cell's natural replication process.


There is even a newly discovered category of subviral agents known as viroids: naked snippets of RNA that lack even an outer protein coat and don't encode for anything. They are devoid of genes entirely, and yet they replicate and cause illness once inside a host. And then there are deeply derivative entities called satellites, metaviruses that can replicate only within a virus that is already busy inside a host.


Whatever neat conceptions and categorizations we develop, viruses have always found a way to poke holes in them. Scientists long assumed, for example, that viruses could only be made of DNA or RNA. Then in the late 1990s, a number of viruses were found to contain both. Retroviruses, meanwhile, were long thought to infect only animals. The only seemingly safe assumptions were that viruses will always be smaller in both physical size and genomic content than the simplest bacteria and that viruses had to have evolved after those same cellular organisms, on which their parasitism depends.




In the lab, a virus must first be extracted from its holding vial before its genes can be extracted.

Now the discovery of Mimivirus has rendered even these two viral paradigms questionable. What Claverie calls "the final click" came after comparative analysis of Mimi's DNA with that of other organisms in life's three domains: the eukaryotes, bacteria, and archaea. Mimi, it turns out, belongs to its own distinct and extremely ancient lineage of large DNA viruses. Moreover, certain signature Mimi genes, such as those that code for the production of the soccer-ball shape of its capsid (an outer protein coat common to all viruses), have been conserved in viruses that infect organisms from all three of the domains, particularly in eukaryotes. The implications of that finding are truly radical: that Mimi, or a Mimi-like ancestor, emerged prior to the three other domains and played a key role in inventing the very cells of which humans and all complex cellular life-forms are made.


It is a difficult concept to get one's head around. Parasites, to us, are derivative, necessarily descended from the biological entities they depend on for life. But simple does not always mean less evolved. Mimi's outsize complement of genes—so large that the virus is tantalizingly close to being an independent organism—suggests to many scientists that Mimivirus underwent reductive evolution early on and shed some of its genome, including the genes necessary to replicate on its own.


"With Mimi, we've captured by chance a picture of an organism that was undergoing such a reduction, evolving toward fewer genes," says Claverie. "This guy just retained more ancestral features than others." Biologists, Claverie says, can no longer view viruses as random assemblages of genes. "We have to confer to these guys a nobility, a genealogy. Not only a genealogy. They are very ancestral, and their ancestors are at least contemporary with ours and those of all present-day life-forms. Mimi is like the missing link."

With the aid of advanced gene sequencing, comparative DNA analysis, and endless cross-referencing of the genomes of organisms from the three—or perhaps four—domains of life, a fuller concept of viruses and their role in evolution has begun to coalesce. In the mere year and a half since Bradfordcoccus's true identity was revealed, more genetically distinct and extremely ancient viruses have been found. All of them lead scientists to the same conclusion: Evolution's archvillain looks more and more like its vital and formative force.


Even as Darwinism has come under attack from the theology of the intelligent design movement, scientists have never been closer to divining life's origins. With DNA evidence as solid as that used to convict criminals, researchers can trace the shared genetic lineage of life's different branches back to the very base of the tree, some 4 billion years ago, when the interaction between primordial bacteria and viruses culminated in the "mother cell," the common ancestor of all life on Earth. Although the remoteness and complexity of those events makes them difficult to piece together, viruses like Mimi are emerging as the key players in the picture.


"We are now able to draw a tree of life for the first time that includes viruses as their own branch," says Patrick Forterre, a molecular biologist at the University of Paris-Sud.


Last July Forterre held a weeklong conference in Les Treilles, France, where two dozen of the world's leading microbiologists, cell biologists, and evolutionary biologists met to discuss "The Origin of the Nucleus." The nucleus, the command-and-control center of the eukaryotic cell, is ultimately what distinguishes a human from a bacterium. For eons prior to the emergence of the nucleated cell, life on Earth was essentially slime: vast, directionless mats of single-celled bacteria and archaea.

With no nucleus to further modify and craft gene expression and protein translation, life thrived but literally could not get hold of itself, could not assume new shapes or diversify. How the first nucleus came to be is a question that has intrigued scientists ever since Scottish botanist Robert Brown first detected a cell nucleus while peering at orchids under a microscope one day in 1824.


The discovery of Mimivirus lends weight to one of the more compelling theories discussed at Les Treilles. Back when the three domains of life were emerging, a large DNA virus very much like Mimi may have made its way inside a bacterium or an archaean and, rather than killing it, harmlessly persisted there. The eukaryotic cell nucleus and large, complex DNA viruses like Mimi share a compelling number of biological traits. They both replicate in the cell cytoplasm, and in doing so, each uses the same machinery within the cytoplasm to form a new membrane around itself. They both have certain enzymes for capping messenger RNA, and they both have linear chromosomes rather than the circular ones typically found in a bacterium.

"If this is true," Forterre has said of the viral-nucleus hypothesis, "then we are all basically descended from viruses."


Claverie says, "That's quite a big jump in our thinking about viruses—to go from their not even being organisms to being all life's ancestor."


Some scientists go a step further. They believe that viruses played a role even earlier in the evolutionary mix. The precise order in which the three domains of life evolved—whether, say, the eukaryotes emerged before or after the archaea and bacteria—is a much-debated subject. So is the identity of the progenitor of those different domains, the so-called last universal common ancestor, or LUCA, as it was dubbed by Forterre at the first Les Treilles conference in 1996.


"I'm probably one who has asserted most sternly that LUCA was viral," says Luis Villarreal, the director of the Center for Virus Research at the University of California at Irvine. "The genes and gene functions suggest that we're dealing with one of the earliest and oldest forms of life. Mimivirus really stretches our sense of scale of what a virus can be."


But just how far can that scale be stretched? David Prangishvili, a virologist at the Pasteur Institute in Paris and a colleague with Forterre in studying viruses that infect archaea, now thinks that viruses swam in the primordial soup prior to the emergence of cellular life of any kind and only later became dependent on cells. Forterre is less convinced.


"It is difficult for me to imagine," he says. "You need to have some type of closed system to be sure that the different reactants of the metabolism, or different mechanisms, can interact with each other and also have a kind of Darwinian evolution. You need to have individuals. I think there was an RNA world prior to the DNA world, when you had a lot of RNA cells. Maybe viruses originated at the time of the RNA cell. You need to have a cell to even obtain a virus."


Yet to virologists like Prangishvili and Villarreal, the concept of viruses as the primordial soup's first built-in stirrers seems to align perfectly with their nature: highly creative replication, genetic reproduction, and sorting of gene fragments, not to mention their eerie biochemical straddle between life and nonlife.


"I think what confuses people is their assumption that parasites are only damaging things," says Villarreal. "How do you get creation and complexity out of them? You do because they persist, and to do that you have to take on all comers. You come up with inventions that prevent you from being displaced. It's no surprise that the number-one-selling software on the planet these days claims to be 'antiviral.' "


Information, whether biological or industrial, is passed along by replication. Create a new word-processing file and copy it: that's replication. But any replication process is susceptible to errors, which in turn can generate novelty. And novelty, especially in harsh, shifting conditions like those that prevailed on the newly formed Earth, is often an advantage: Some new life-forms will adapt better to the environment. To the utter abhorrence of the proponents of intelligent design, there is a certain randomness to evolution.
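For what it's worth, the mechanism is easy to demonstrate with a toy simulation (the parameters here are arbitrary, chosen only for illustration): copy a string many times with a small per-letter error rate, and variation accumulates on its own.

```python
# Toy illustration of replication with copying errors; all numbers are arbitrary.
import random

random.seed(1)
ALPHABET = "ACGU"                        # an RNA-style alphabet
ERROR_RATE = 0.01                        # chance any single letter is miscopied

def replicate(genome):
    """Copy a genome, occasionally substituting a random letter."""
    return "".join(
        random.choice(ALPHABET) if random.random() < ERROR_RATE else letter
        for letter in genome
    )

original = "ACGU" * 25                   # a 100-letter "genome"
genome = original
for _ in range(50):                      # fifty generations of blind copying
    genome = replicate(genome)

changed = sum(a != b for a, b in zip(original, genome))
print(f"{changed} of {len(genome)} letters differ after 50 generations")
```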


Some viruses, like Ebola or the new avian influenza, are basically runaway replicators, effectively burning their own life bridges in the process. But the majority, as Villarreal puts it, strive "to persist, not make a lot." Those that do persist eventually become both stable within, and staples of, evolution. The overwhelming majority of viruses are not harmful to their hosts. Each of us is infected with a huge array of viruses. The human genome, considered as a mass, contains more retrovirus sequences than actual genes.


"They're not doing anything," says Villarreal. "They're just persisting. And they were around long before humans evolved. The better part of the human genome is composed of viral DNA. That's true of nearly all eukaryotes, and the more complicated the organisms, the more of those sequences you have. We aren't sure exactly what they all do, but they are part of our genetic identity, this stuff we dismiss as junk. 'Junk' and 'parasite' are both words that will get you into a fight if you use them improperly. And yet they are where all life's creativity lies—its very origins."

What was the very first bit of life's biochemical code, and where did it come from? It may be no surprise to learn that viruses figure ever more prominently into this line of speculation. Some researchers go so far as to suggest that the very first life on Earth could have arrived in the form of a viral shard from afar, perhaps conveyed in the pore of a meteorite.


"Well, I used to laugh at the idea," says Mark Young, a Montana State University biologist who leads a research team that gathers new archaeal viruses from superhot aquatic environments in Yellowstone National Park and other places around the world. "But I wouldn't say it's absurd anymore. I think it has to at least be kept in the portfolio of the discussion."


Where researchers do agree is that a nearly immeasurable array of viruses remain to be discovered on this planet. A growing number of virologists and biologists are out to catalog them. Both Claverie's and Raoult's labs have already begun searching for more viruses like Mimivirus. Among the most likely sites are algae, the ocean, and of course, cooling towers. Claverie says he sees no good reason why there can't be viruses bigger than Mimi.


"I'm hoping Mimi isn't the only one of its kind on Earth," he says, "especially since that cooling tower in Bradford has been destroyed. But it can't be the only one. That would be ridiculously lucky for it to have just fallen into our lap."


Meanwhile, Young has been finding new archaeal viruses every time he looks for them. Asked why Mimivirus hadn't been discovered sooner, he says it may come down to the simple fact that we just haven't been looking.


"We haven't even begun to scratch the surface. The numbers are mind-boggling. If you put every virus particle on Earth together in a row, they would form a line 10 million light-years long. People, even most biologists, don't have a clue. The general public thinks genetic diversity is us and birds and plants and animals and that viruses are just HIV and the flu. But most of the genetic material on this planet is viruses. No question about it. They and their ability to interact with organisms and move genetic material around are the major players in driving speciation, in determining how organisms even become what they are."


We have been looking for our designer in all the wrong places. It seems we owe our existence to viruses, the least of semiliving forms, and about the only thing they have in common with any sort of theological prime mover is their omnipresence and invisibility. Once again, viruses have altered the way that we view them and, by extension, ourselves. As it turns out, they are not the little breakaway shards of our biology—we are, of theirs.



source:http://www.discover.com/issues/mar-06/cover/


Now Johnny CAN program, says Behlendorf

I had a chance to chat to Apache co-founder Brian Behlendorf late last week. Now CTO of CollabNet, Behlendorf these days is in the business of helping companies co-ordinate their software development. But I was surprised (and pleased) to find the extent to which that increasingly means welcoming large numbers of non-programmers into the development process — something that can only grow with the spread of Web 2.0-style mashups into the enterprise.

"The concept of who's a programmer is going to become very fuzzy," Behlendorf told me. "Mashups are really Excel macros 2.0 — and with the rise of web services, the more vehicles that are out there that expose data through programmable APIs, with Office 12.0 and Firefox with AJAX, the more people you'll see create applications. The line between hardcore developers and the average Joe will start to get very fuzzy."

I thought the analogy of mashups to Excel macros was especially insightful, and as I've sat down to write this, it's put me in mind of an article from a couple of years ago by spreadsheet inventor Dan Bricklin, about Why Johnny can't program (or 'wontprogram' as the URL has it). He ended the article with a hierarchy of programming systems ordered according to how acceptable they are to 'the average Joe' (or Johnny, in this case). Spreadsheets come next after dialog boxes and forms, because it's much easier for the average user to start getting results with them than with visual development environments, markup languages and (worst of all) raw programming languages such as Java, C and FORTRAN.

What Behlendorf is saying is that when you start putting mashup tools into the familiar environment of Microsoft's Office suite or into extensions to a popular browser like Firefox, then the effect is the same as putting macro languages into the spreadsheet — it opens them out to 'average Joe' users, because (as Bricklin explains) that's an environment where they can see results as they go along, and thus more easily learn by trial and error.

This is not just an analogy, of course. You can already see it literally in action today if you download and start using StrikeIron OnDemand Web Services for Microsoft Excel, a plug-in that lets you drag-and-drop Web services output fields directly into Excel. Or you can play with the "wkcHTTP" function that Dan Bricklin built into the latest Alpha release of wikiCalc — the collaborative spreadsheet program he's currently developing. The new function provides access to external web services during recalculation. In the Web 2.0 world, it's appropriate of course for Johnny not only to program, but to do it collaboratively, too.
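To make the pattern concrete, here is a rough Python sketch of the idea, with a placeholder URL and function names of my own invention rather than wikiCalc's or StrikeIron's actual APIs: a spreadsheet-style cell whose formula fetches its value from a web service each time the sheet recalculates.

```python
# Rough sketch of "a web service call inside a spreadsheet recalculation".
# The URL is a placeholder and the cell/recalc model is deliberately minimal.
import urllib.request

def http_fetch(url):
    """Return the body of an HTTP GET as text, or an error marker on failure."""
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except OSError as err:
        return f"#HTTP_ERROR: {err}"

# A tiny "sheet": each cell is a constant or a zero-argument formula.
sheet = {
    "A1": 3,
    "A2": lambda: http_fetch("https://example.com/quote?symbol=ACME"),  # placeholder service
    "A3": lambda: sheet["A1"] * 2,
}

def recalc(sheet):
    """Evaluate every formula cell, as a spreadsheet recalculation would."""
    return {name: (cell() if callable(cell) else cell) for name, cell in sheet.items()}

print(recalc(sheet))  # A2 is fetched from the web service again on every recalc
```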

Collaboration is a key element of lifting everyone's game in the CollabNet world. So I asked Behlendorf about the dangers of unleashing newbies into the development arena. Wouldn't it end up creating a lot of Frankenstein mashups, just like the horror artwork from the early days of desktop publishing?

"I don’t know what we could do to keep users from writing bad software," he laughed. "I'm fully in favour of giving people more rope to hang themselves with."

Professional developers have themselves been all too guilty of creating APIs that change frequently, thus breaking dependent applications, he said. The best way of placing a check on bad programming is by opening up communication channels and feedback forums.

"If there is room for feedback where they can be flamed to hell for breaking things, they'll be less likely to do that."

CollabNet's experience is that large numbers of users do get involved in the development process. A typical CollabNet site will start off with a couple of hundred seats for all its core software people, but once it's been live for a year and a half or more, that number will expand to include even larger numbers of non-core or "light" users.

"There's this spectrum of development all the way between naive users and core developers," said Behlendorf. He believes the non-programmers have a crucial stake in the applications that are being developed, and should have access to a framework that lets them participate.

source:http://blogs.zdnet.com/SAAS/index.php?p=125


Who's Reading Your Cell's Text Messages?

An eWeek article that's something of a life lesson: Don't be too smart for your own good. The article tells the tale of a college student who cleverly chose null@vtext.com as his cellphone email address. He's been getting thousands of wayward emails and text messages since 2001. From the article: "Initially, the content of the messages was innocuous, he said. It was things like 'don't forget to drop the car off at baker's' and 'call mom at 781-XXX-XXXX,' stuff like that, Bubrouski wrote. The problem worsened in mid-2002, when Bubrouski's phone began channeling what he claims were dozens of messages from an e-mail address used by General Motors' then-new OnStar system. The messages quickly filled up the memory on his cell phone and contained diagnostic responses to tests on a beta version of OnStar. 'Basically, people's cars were sending messages to my phone,' Bubrouski wrote."

source:http://slashdot.org/article.pl?sid=06/02/28/1630206

Smaller Brands Rise in Electronics Market

SAN JOSE, Calif. - Steve Forman is getting ready to join the flat-panel television craze. Lower-priced 42-inch models are top candidates.

But don't ask the 62-year-old retired engineer to name the brands: He can't remember them.

Like many consumers, Forman, the current owner of a Toshiba TV, has abandoned his former penchant for sticking with well-known brands, reflecting an agnosticism that has empowered underdogs in electronics.

The major consumer electronics makers still dominate the market, but analysts say the collective presence of lesser-known brands for products ranging from TVs to computers has helped keep prices down while boosting product choices.

In some sectors, like LCD TVs, "they are emerging as forces to reckon with," said iSuppli Corp. analyst Riddhi Patel.

At a heavily trafficked Costco store in Redwood City, gleaming high-definition TVs from household names like Panasonic, Pioneer and Sony share the shelf space with generally less expensive sets from less familiar Akai and Vizio.

At Amazon.com, a 32-inch Olevia flat-panel LCD TV was recently the No. 1 selling TV. The relatively unknown brand consistently ranks in Amazon's top TV vendor list, helped by positive customer reviews, according to Amazon officials.

"Twenty years ago, you couldn't create a consumer electronics brand this fast," said Vince Solitto, chief executive of Syntax-Brillian Corp., which introduced its first Olevia TV in 2004. "The only reason we were able to do this was because of the Internet."

The Tempe, Ariz.-based company started by selling its LCD TVs online only, at 20 to 30 percent below name-brand prices. Like other upstart electronics makers, Syntax-Brillian benefited from customer and media reviews posted online, while shopping-comparison sites and other Web outlets helped point consumers to good deals or new brands.

Now, Olevia TVs are also sold at two dozen retail chains, including Circuit City, Kmart and CompUSA.

In the holiday quarter of 2005, Syntax-Brillian was the eighth-largest seller of LCD TVs in the United States, according to iSuppli. At the same time, Vizio TVs made by Costa Mesa, Calif., newcomer V Inc. emerged from obscurity to rank No. 11, notching a 2 percent share, just under the much better known Toshiba Corp., which had 3 percent.

With their lower prices, Olevia and Vizio are nipping away as Sharp Corp. fights to hold onto its leading 13 percent share, with Sony Corp. on its tail and stiffening competition from such other makers as Royal Philips Electronics NV and Samsung Electronics Co.

The underdogs' market gains are considerable feats, analysts say, since the brands were nonexistent a few years ago and are now competing against an estimated 60 TV makers.

But flat-panel televisions marked a frontier in the new millennium, up for grabs for electronics behemoths and newcomers alike. No one has a long track record yet in plasma or LCD TVs.

And many electronics makers often get core components from the same big-name panel makers, like Samsung and LG Philips LCD Co. or a variety of Taiwanese manufacturers.

Using the same suppliers doesn't mean product quality is the same across the board. Some companies, and especially deep-pocketed vendors, invest in other components and proprietary technologies that could make differences in features, reliability and picture quality, analysts say.

The question, however, has become how much extra consumers are willing to pay for the difference in quality or brand reassurance. Recent surveys show consumers often name price over brand as their main buying criterion in electronics, said Forrester Research analyst Ted Schadler.

Forman, for instance, says he doesn't want to spend more than $2,000 on a 42-inch flat-panel TV.

"I don't have a good way of judging quality or long-term reliability, so I'm looking at all brands," he said. "And nowadays I believe that most companies have got the quality and technology up to a good level or they wouldn't be in business."

Consumers are also putting more faith in retailers, which would rather not carry expensive televisions or computers that are unreliable, since those could become customer-service liabilities, Current Analysis analyst Sam Bhavnani said.

At the same time, there are simply more retailers latching onto electronics.

When Office Depot Inc. approached Syntax-Brillian last fall to stock up for holiday shoppers, Solitto was surprised a place that sold staplers would want TVs. But he was more than happy to comply.

Five years ago, Taiwan-based Acer Inc. was one of the faceless manufacturers for top-tier computer companies. But after starting its own line of products, Acer now competes head-on with some of its former customers.

The computer maker strives to run the leanest possible operation so it can offer low prices and still profit. It has a skeleton sales force and avoids expensive marketing methods like rebates that draw customers but require extra overhead.

"Just having a simple, everyday low price works for us," said Maarten de Haas, Acer's vice president of product management and marketing.

The strategy is paying off.

In the fourth quarter of 2005, Acer sold 7 percent of computer notebooks in the U.S. retail market, stealing share from retail stalwarts Toshiba and Hewlett-Packard Co., according to Current Analysis. With its strength in Europe and the Middle East, Acer now ranks as the world's fourth-largest PC maker and is climbing in the U.S., where it is eighth, according to market researcher IDC.

Syntax-Brillian's tight focus on TVs similarly contributes to its success, Solitto said.

"We're not marketing a lifestyle," Solitto said. "We're not selling everything from MP3 players to refrigerators."

Indeed, electronics titans like Sony and Samsung spend millions in marketing and research across a wide swath of cutting-edge technologies.

That's necessary for Sony to compete against its core, big-name rivals, said Randy Waynick, a Sony senior vice president. Waynick insists Sony isn't looking to compete on price against the lesser-known brands.

"There's always a place for a good, better, best story, and sometimes, it's a cheap, good and a best," Waynick said.

It's up to consumers to decide what works for them. And there's certainly no shortage of options now, Patel said: "Consumers have brand choices at each price level."

source:http://news.yahoo.com/s/ap/20060227/ap_on_bi_ge/underdog_electronics

