Monday, May 29, 2006

High performance FFT on GPUs

"The UNC GAMMA group has recently released a high performance FFT library which can handle large 1-D FFTs. According to their webpage, the FFT library is able to achieve 4x higher computational performance on a $500 NVIDIA 7900 GPU than optimized Intel Math Kernel FFT routines running on high-end Intel and AMD CPUs costing $1500-$2000. The library is supported for both Linux and Windows platforms and is tested to work on many programmable GPUs. There is also a link to download the library freely for non-commercial use."

source:http://hardware.slashdot.org/article.pl?sid=06/05/29/1424213

Freshman MIT Students Automate Dorm Room

"Two freshman MIT students have automated their dorm room, complete with a big red party button which generates an instant party. Their custom-engineered system is called MIDAS, the Multi-Function In Dorm Automation System. According to the MIT News office, "Gone are the light switches and glaring fluorescent lights of a typical dorm room. Zack Anderson and RJ Ryan's room has several lighting schemes, remote web access, voice activation, a security system, electric blinds and more ... With the touch of one red button, their dorm room becomes a rave. The lights go out, the blinds close, the displays read, "feel the energy" as a voice repeats the same phrase over a deep bass beat.""

source:http://hardware.slashdot.org/article.pl?sid=06/05/29/132210

Our Indie Experiment - MadMinute Games

"MadMinute Games' Norb Timpko has contributed the first installment in a series on independent game developers. He describes the balancing act required to get a game like Take Command: 2nd Manassas out the door while still having families and day jobs."

source:http://games.slashdot.org/article.pl?sid=06/05/29/1150200

Alternate Reality Gaming V2.0

"Alternate Reality Games [ARGs] have been bubbling under for the past 10 years now. Usually completely homebrew or attached to big budget productions, they have been used to create buzz around a game, product or movie. Perplex City have bucked that trend. Their ARG is completely independent of anything else; it's entirely self-contained. With fresh ideas on income generation and a $200,000 top prize to whoever finds the real life buried treasure - is this the future of an entirely new form of entertainment?"

source:http://games.slashdot.org/article.pl?sid=06/05/29/1135232

Laser enrichment could cut cost of nuclear power

Nuclear power could become significantly cheaper thanks to world-leading laser technology being developed in Sydney.

A team of about 25 scientists, engineers and technicians at Lucas Heights, home of Australia's only atomic reactor, has succeeded where other nations, with budgets stretching into billions of dollars, have failed.

After a decade of work they have tested a new way to process, or enrich, the uranium needed to drive power plants.

The technology, said Michael Goldsworthy, a nuclear scientist and leader of the project, may halve enrichment costs, which he estimated accounted for 30 per cent of the price of nuclear fuel.

Power stations are fuelled by a specific blend of two types of uranium. About 5 per cent must be uranium 235, with the rest made from uranium 238. But natural uranium is 0.7 per cent U-235 and 99.3 per cent U-238.

There are at present only two methods for sifting uranium atoms, or isotopes, to create the right mix. One, called diffusion, involves forcing uranium through filters. Being lighter, U-235 passes through more easily and is thus separated from its heavier counterpart. The second method, widely adopted in the 1970s, uses centrifuges to spin the heavier and lighter atoms apart.

Both, said Dr Goldsworthy, are "very crude. You have to repeat the process over and over," consuming enormous amounts of electricity. The spinning method requires "thousands and thousands of centrifuges".
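Dr Goldsworthy's point about repeating the process can be made concrete with some back-of-the-envelope cascade arithmetic. The sketch below assumes an ideal cascade and textbook per-stage separation factors (roughly 1.004 for gaseous diffusion, around 1.3 for a modern centrifuge) - both are illustrative assumptions, not figures from the article:

```python
import math

def stages_needed(x0, x_target, alpha):
    """Ideal enrichment stages to go from feed fraction x0 to x_target,
    where alpha is the per-stage separation factor (ratio of the
    U-235/U-238 abundance ratios between product and feed)."""
    r0 = x0 / (1 - x0)                    # abundance ratio of the feed
    rt = x_target / (1 - x_target)        # abundance ratio of the product
    return math.ceil(math.log(rt / r0) / math.log(alpha))

# Natural uranium is 0.7% U-235; reactor fuel needs about 5%.
print(stages_needed(0.007, 0.05, 1.004))  # diffusion: 504 stages
print(stages_needed(0.007, 0.05, 1.3))    # centrifuge: 8 stages
```

The hundreds of diffusion stages versus a handful of centrifuge stages is exactly why both methods end up needing either enormous electricity bills or "thousands and thousands of centrifuges" run in parallel for throughput.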

The Lucas Heights team, working for Dr Goldsworthy's research company Silex (Separation of Isotopes by Laser Excitation), is the only one in the world developing a third technique that involves streaming uranium through lasers tuned to a frequency that only "sees" the U-235 atoms.

The lasers electrically charge the atoms, which become trapped in an electromagnetic field and are drawn to a metal plate for collection. "It's absolutely cutting-edge technology, incredibly difficult to develop," Dr Goldsworthy said.

During the 1980s and '90s the US, France, Britain, Germany, South Africa and Japan attempted to develop laser-enrichment technology, but all failed. One US effort involving 500 scientists gave up after spending $2 billion.

"By world standards, we have worked on a shoestring budget," Dr Goldsworthy said, estimating the "more elegant and sophisticated" Australian concept at about $65 million.

This week Silex, which has no government funding, signed a deal giving General Electric the rights to commercialise the technology. The first laser-enrichment plant will be built in the US, but others could follow in Australia.

Dr Goldsworthy hopes that in 20 years the laser technology could be enriching a third of the world's power station uranium, returning "handsome royalty streams" to Australia.

Asked if the Federal Government, which this week speculated Australia could "value-add" mined uranium through enrichment, was aware of his team's progress, Dr Goldsworthy said that, due to regulation, "we report to the Government regularly".

source:http://www.smh.com.au/news/national/laser-enrichment-could-cut-cost-of-nuclear-power/2006/05/26/1148524888448.html


One small breath for man

Scientists have paved the way for the first permanently manned base on the Moon by developing a way to 'squeeze' oxygen out of lunar soil.

Nasa experts say the technique will allow astronauts of the future to create their own supplies of the gas instead of transporting it all from Earth.

The space agency plans to take its extraction system to the Moon in 2011 as part of its Robotic Lunar Exploration Program, which will test a range of equipment designed to support human life. If the technique is successful, it could lead to a permanent station like Moon-base Alpha from the popular Seventies series Space: 1999.

To extract oxygen from lunar soil, scientists used a lens-like structure to focus sunlight on to it, heating it to 2,500C.

In Nasa's latest tests, a 12ft-wide dish was used to concentrate the sun's rays on to 100g of a substance similar to Moon soil. After a few hours, one fifth of the substance had turned into oxygen.

The soil is kept in a vacuum during the process to help suck out the oxygen.

Lunar soil brought back to Earth is in short supply and highly prized, so Nasa researchers have been using matter with the same composition for their tests.

The soil contains about 45 per cent oxygen by weight, but it is mostly 'trapped' in the form of silicon dioxide.
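The two figures above - a one-fifth yield from a 100g sample, and 45 per cent oxygen by weight - imply how much of the available oxygen the test actually recovered. A quick back-of-the-envelope check:

```python
sample_g = 100.0
yield_fraction = 0.20          # "one fifth of the substance had turned into oxygen"
oxygen_content = 0.45          # lunar soil is ~45% oxygen by weight

extracted = sample_g * yield_fraction          # 20 g of oxygen recovered
theoretical_max = sample_g * oxygen_content    # 45 g chemically bound in the soil
recovery = extracted / theoretical_max         # fraction of available oxygen freed
print(f"{extracted:.0f} g extracted, {recovery:.0%} of the oxygen present")
```

So the solar-furnace test freed a little under half of the oxygen locked in the sample - respectable for a first pass, and a clear target for the temperature-reduction work mentioned below.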

Nasa plans to repeat the same process on the Moon to produce oxygen, which could support life and be used to help fuel rockets setting out on deep-space missions.

At the moment, all oxygen supplies would have to be brought from Earth, which is so expensive and energy-inefficient that it effectively rules out a permanent Moon base.

Dr Eric Cardiff, an engineer at Nasa's Goddard Space Flight Centre in Maryland, explained: 'Part of the advantage of the technique is that we're using the resources that are present on the Moon. We're living off the land, as it were.'

He added: 'You can breathe pure oxygen. There are some trace gases mixed with the oxygen we produced but they're in very small amounts. There is nothing dangerous.'

The next step for scientists will be to reduce the temperature needed to extract oxygen.

Carrying out the method on the Moon would involve a mining operation to collect soil and feed it into a reactor, where the oxygen could be drawn off.

It will be easier to lift the soil than it would be on Earth because the Moon has a lower gravity.

Alternative methods to extract oxygen from Moon soil are also under investigation, including melting the rocks into a liquid and freeing oxygen with an electric current.

source:http://www.dailymail.co.uk/pages/live/articles/news/news.html?in_article_id=388052&in_page_id=1770


Welcome to Google Checkout, that will be $3.14

The first time I looked up the domain "GDrive.com" it appeared that someone other than Google had it registered. A trip down memory lane takes us to my very first article that describes how I determined GDrive.com is in fact owned by Google, despite what it looks like on the surface.

Well, by the same logic I have found that a brand new set of domains appearing to be registered to someone else were actually registered by Google on May 25th.

The domains googlecheckout.net/org/info (.com is owned by someone else at the moment) have all been registered to a company called DNStination, Inc. Don't be fooled, the registrar is MarkMonitor — a company that prides itself on the protection of your corporate identity. There is no way they would let just anybody register a domain with "Google" in it — especially since Google is one of their clients.

So who is this DNStination, Inc.? Googling the address of this "company" tells us exactly who it is. The address maps directly to none other than MarkMonitor itself.

Since we know Google is behind its registration, what is Google Checkout going to be? I think it will be a shopping cart system to help websites accept payment for their items online. The money site owners make will be deposited into a holding account at Google — just like AdSense works.

Isn't this starting to sound a lot like PayPal? Who knows, they could even offer a Google branded Mastercard "debit card" like PayPal's ATM/Debit Card — after all, the domain googlemastercard.com is registered to Google too.

If this is indeed what they are planning, it would make sense for Google Checkout to tie into Google Analytics so website owners can easily track with certainty how their AdWords campaign is directly affecting sales — right through the checkout process.

Maybe one day Google will even provide an inventory management solution with an API so websites can have their inventory in Google Base and on their own website without double entry.

source:http://blogs.zdnet.com/Google/?p=208


A look at how the PS3 got to be $600


While the entire internet (at least the gaming part) continues to reel from Sony's $600 price announcement, we have to consider just how it came to be that the video game console leader decided to create such an expensive piece of machinery. There's no confusing this: Sony has known for some time that the PlayStation 3 would cost more than any game system before it (two jobs, remember?). But what corporate machinations influenced that decision?

Last week, Next Generation ran a fascinating article written by David Cole, of strategic market research firm DFC Intelligence, following the circumstances that resulted in the unpopular price tag. With the growing homogenization of consumer technology, and increased competition, Sony -- under the direction of Ken Kutaragi as head of Sony's semiconductor operations -- looked to custom-built technology like the Cell processor and Blu-ray to distinguish their product from the others (compare this to Microsoft's more nimble strategy of outsourcing the 360's chip-design to IBM).

Kutaragi was demoted after being passed over for the role of CEO and, when former Sony Pictures head Howard Stringer assumed the position, the relationship between the content and technology divisions of Sony became even more intimate. Stringer "quickly dubbed the PlayStation 3 as one of the company's 'champion' products." Kutaragi's desire to stratify the console market with Cell technology in effect wed Sony to the unpalatable prospect of charging an unprecedented price. Coupled with Sony's desire to not only push their own content on HD discs, but to control that medium with their proprietary Blu-ray format, the final price was escalated by two very advanced (and very expensive) pieces of Sony technology.


During the hectic Xbox 360 holiday shopping craze, Slate's "Everyday Economist" Tim Harford wondered, considering the prices they were fetching on eBay, "Why doesn't Microsoft price them at $700 instead?" Microsoft was already losing money on every console sold and, as a public company, that's got to be a hard thing to shrug off. If 360s were netting in excess of $700 on eBay, shouldn't Microsoft have taken advantage of that demand to help offset the initial investment?

Like consumer electronics (DVD or, perhaps, Blu-ray players), prices start off high -- appealing to the early-adopter audience who absorb the brunt of the cost -- only to lower to more palatable mass-market prices once economies of scale are reached and manufacturing processes are streamlined. Sony could be modeling their admittedly very high-tech PS3 after this model (which has earned them some small amount of success in the similarly competitive world of consumer electronics), opting for more gradual price adjustments instead of the long term static prices we're used to today.

The problem is, if Sony chooses to follow this model (that is, if the price will drop after initial peak demand is fulfilled and then continue to drop), will consumers know enough to base their purchasing decisions on that future possibility when an Xbox 360 will cost them significantly less now? In a follow-up to his Xbox 360 piece, Slate's Harford returned to the question with some more input on the high demand on eBay being related to the paucity of gamers willing to resell their newly earned 360s: "only a few consoles are up for sale and only the most desperate buyers compete for them. If more people put their consoles up for auction, the price would drop."

If consumer perception reflects a belief that Sony's price will remain static, a larger worry for Sony may be waning publisher support. With next-gen development costs increasing, it will be increasingly difficult to persuade publishers to isolate their titles to one platform, especially when that platform is being sold as a (relatively cheap) next-gen movie player and not a gaming console. As Cole concludes in Next Generation, "The PlayStation 3 needs to justify its price difference as a game machine." They haven't done that yet ... but there's still time.

source:http://www.joystiq.com/2006/05/27/a-look-at-how-the-ps3-got-to-be-600/

Why There Are No Indie Video Games: And why that's bad for gamers

Don't believe all those recent articles about how video game manufacturers are headed to the poor house. The global market for video games should surpass $40 billion by 2010. Thanks to powerful new consoles, their graphics are approaching CGI quality. And they now attract an audience well beyond teenage boys. Remember, this all started with two paddles and a bouncing ball.

But success can have a downside. Just ask Pong's creator, Nolan Bushnell, who often gripes that flashy game technology stifles creativity. Bushnell might be a dinosaur, but he has a point. Over the last decade, the costs of high-gloss, graphics-intensive titles have mushroomed. Console games are now so expensive that only a few companies can afford to make them. Today's game industry is like Hollywood in the first half of the 20th century, when an oligopoly of studios controlled the business. Instead of MGM, Paramount, and Universal, we have Electronic Arts, Sony, and Activision.

In today's movie business, it's possible for an indie film like Napoleon Dynamite to become a sensation. Saw, which cost a mere $1.2 million, grossed 100 times that amount. That just doesn't happen in video games. The average PlayStation 2 game costs about $8 million. Studios often need large development teams—usually 40 or more people—to meet their tight deadlines. They spend money to license everything from comic book heroes to graphics engines. They record A-list actors. And if they burn their own CDs or do their own marketing, costs can really soar.

Most independent developers take money from the big publishers in exchange for the rights to the games they've developed. The publishers market and distribute the games to retailers. The developers pay back the initial loan from the royalties they earn. Several industry types told me that an indie studio will typically get a $5 million advance on 15 percent royalties. If the game has a wholesale price of $30, the developer must sell more than a million units to get out of hock. In other words, the game has to be a blockbuster, something on the order of Tomb Raider or Splinter Cell. The cost of the average PlayStation 3 game is expected to rise to $15 million-$20 million, plus another $10 million or so for marketing. That means indie developers, who already go bust with great regularity, will have even less wiggle room.
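The break-even arithmetic in that paragraph is worth spelling out; under the article's figures:

```python
advance = 5_000_000        # typical publisher advance (per the article)
royalty_rate = 0.15        # 15% royalty on wholesale
wholesale = 30.0           # dollars per unit

royalty_per_unit = royalty_rate * wholesale       # $4.50 earned per copy sold
units_to_recoup = advance / royalty_per_unit      # copies before any royalty cheque
print(f"{units_to_recoup:,.0f} units just to pay back the advance")
```

At $4.50 a copy, the studio has to move about 1.1 million units before it sees a cent beyond the advance - hence the claim that every indie-developed console title effectively has to be a blockbuster.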

For today's indie developer, a safety net is just as important as a good idea. Stardock, the company behind the hit PC strategy game Galactic Civilizations, gets most of its revenue from sales of office software. Other indies make deals with the government to work on defense technology, then plow these funds back into game development.

Why should gamers and industry bigwigs care if it's tough for the little guy? Because back when games were cheaper to make, the independents came up with the ideas that moved the business forward. Richard Garriott peddled Ultima, the first major role-playing title, in plastic bags. Sid Meier's Civilization and Westwood's Dune II cracked open the strategy genre. Id Software's John Carmack and John Romero created the pioneering first-person shooter Doom. Will Wright gave us SimCity and open-ended "sandbox" simulations.

What happened to these pioneers? Garriott never produced another breakthrough like Ultima; he now works for online multiplayer giant NCsoft. Meier has spent most of the last decade updating his previous hits at a company owned by Grand Theft Auto publisher Take-Two Interactive. Id Software has clung to its independence but produced nothing further in the way of milestone games. Perhaps the lone indie superstar to retain his auteur status is Will Wright, who now has his own "studio" within Electronic Arts. Wright takes years to cook up his always-innovative games. In 2000, he released The Sims, which transported players into the first of many "digital dollhouses" and became the best-selling computer game of all time. Wright's next game, Spore, aims to simulate the evolution of life from microorganism to space-faring civilization. It will probably be the only innovative title EA releases next year.

But Wright and his studio, Maxis, are an exception. The most successful indies get bought by the industry giants, where they often become casualties of consolidation. Westwood Studios, which created the hit Command & Conquer, was bought by Electronic Arts in 1998 and shut down in 2003. Wolfpack Studios, which made the game Shadowbane, was bought by Ubisoft in 2004 and shut down last week.

Instead of adopting the solo developers' pioneering mojo, the risk-averse major studios stick with proven formulas. Can't wait to fire up Final Fantasy 13? It'll be out soon. So will the 19th iteration of EA's Madden football game, complete with updated player names and numbers. EA, the industry's leading publisher, had a nasty case of sequelitis last year. I've looked at nearly 50 games that EA released last year, and I've yet to find one that isn't a rehash like NBA Live 06 or a movie tie-in like Batman Begins.

With costs rising, the console market looks nearly impossible to break into. If there's a niche indie developers can make their own, it's PC gaming, which accounts for about 15 percent of the domestic market. Greg Costikyan, a purveyor of experimental PC games with less-demanding graphics, plans to bypass publishers' distribution chains by selling his games online. His online publishing effort, Manifesto Games, is due to launch this year with about 100 titles.

It seems doubtful, though, that someone like Costikyan can impact mainstream tastes. In the online gaming magazine Escapist, he wrote that gamers are partially to blame for the lack of an independent scene:

Indie rock fans may prefer somewhat muddy sound over some lushly orchestrated, producer-massaged score; indie film fans may prefer quirky, low-budget titles over big-budget special FX extravaganzas; but in gaming, we have no indie aesthetic, no group of people (of any size at least) who prize independent vision and creativity over production values.

It's true that mainstream consumers have given big publishers little incentive to change. When the movie business decentralized in the mid-20th century, the door opened for avant-garde filmmakers free from studio control. Just as important, though, was the social change of the 1960s—moviegoers wanted something new. It happened again in the 1990s, when Quentin Tarantino and the Weinsteins re-invigorated American indie film and transformed the business.

Today, the video-game industry is at a similar crossroads. In one direction is an ever-deepening Madden playbook, and in the other is progress. The video game now holds much promise as a cultural mover. If the big studios stay in charge, it may return to its former status: the pastime of teenage boys and middle-aged nerds at gaming conventions.

source:http://www.slate.com/id/2142453/


Pakistan plans largest mobile WiMax rollout

Pakistan plans to roll out the largest mobile WiMax network yet, Motorola announced Tuesday.

Motorola is providing the country's Wateen Telecom with an 802.16e-based Motowi4 network. An initial uptake of a million subscribers is expected, with a nationwide rollout to follow.

As a developing country, Pakistan has until now lacked the infrastructure for widespread broadband.

The deployment is a milestone in the spread of WiMax, a superfast wireless technology that has a range of up to 30 miles and can deliver broadband at a theoretical maximum of 75 megabits per second. The 802.16-2004 standard, which is used in fixed WiMax networks, is being skipped in favor of a large-scale introduction of 802.16e, which was only recently agreed upon by the WiMax Forum.

"We made the decision 18 months ago to jump over (802.16-2004) and go straight to 802.16e," Paul Sergeant, Motorola's marketing director for Motowi4, told ZDNet UK on Tuesday. "We've been working on it for a while, which is how we're able to ship so soon after agreement."

"802.16e leads to a much larger market as it addresses mobility needs, but we also felt it could be just as good a solution for fixed broadband," he added.

Some analysts said the Pakistan deal is proof that major players in the industry are throwing their weight behind mobile WiMax in a way they haven't with the fixed version.

"The really interesting thing is that Motorola is really focusing on the mobile version, as are Alcatel and Siemens," Julien Grivolas, a telecom analyst at Ovum, said.

"Mobile WiMax is going to be something for the big players, as opposed to fixed WiMax, where (they) set up OEM (original equipment manufacturer) agreements with smaller vendors," Grivolas said.

On Tuesday, Motorola also made its first public demonstration of third-party interoperability of its WiMax products. At the WiMax World Europe Conference in Vienna, it showed off a third-party PCMCIA card that incorporates a mobile WiMax chip from Beceem Communications.

"The market is looking for carrier class (802.16e) solutions that either support mobility from the beginning or can be upgraded," Sergeant said.

source:http://news.com.com/2100-1039_3-6075684.html


Mars robots to get smart upgrade

Dust devils on Mars: Catch them if you can
The US space agency's rovers will get a software upgrade to allow them to make "intelligent" decisions in the study of Martian clouds and dust devils.

The new algorithms will give the robots' computers the onboard ability to search through their images to find pictures that feature these phenomena.

Only the most significant data will then be sent to Earth, maximising the scientific return from the missions.
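The article doesn't describe JPL's actual onboard algorithms, but the score-then-prioritise idea can be sketched. Everything below - the brightness-threshold "cloud score" and the downlink budget parameter - is illustrative, not Nasa's implementation:

```python
# Hypothetical sketch of onboard image triage: score every image for
# features of interest, then downlink only the highest-priority ones.

def cloud_score(image):
    """Toy interest score: fraction of pixels above a brightness threshold."""
    bright = sum(1 for px in image if px > 200)
    return bright / len(image)

def select_for_downlink(images, budget):
    """Keep only the highest-scoring images that fit the downlink budget."""
    ranked = sorted(images, key=cloud_score, reverse=True)
    return ranked[:budget]

# Three fake 8-pixel "images"; only the two cloudiest make the downlink.
frames = [
    [10, 12, 11, 250, 245, 240, 9, 13],    # bright "cloud" in 3/8 pixels
    [10, 11, 12, 10, 11, 12, 10, 11],      # clear sky: score 0
    [250, 251, 248, 252, 10, 11, 12, 13],  # bright "cloud" in 4/8 pixels
]
kept = select_for_downlink(frames, budget=2)
print(len(kept))  # 2
```

The payoff described in the article follows directly: the rover can image far more often than the downlink allows, and the clear-sky frames simply never leave Mars.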

Nasa says its robotic craft will become increasingly autonomous in the future.

"An instrument can acquire considerably more data than can be down-linked - this is a recurring theme on all spacecraft," explained Rebecca Castano, from the agency's Jet Propulsion Laboratory (JPL).

"The idea now is to collect as much data as the instrument can, analyse them onboard for features of specific interest, and then down-link only the data that have the highest priority," she told BBC News.

Currently, the rovers are allocated time to look for clouds and dust devils, which may or may not appear - they are naturally transient events. And getting humans to sift the images is time-consuming.

The software upgrade, due to take place in the next month, will make the whole process much more efficient.

"Clouds typically occur in 8-20% of the data collected right now," Castano said.

"If we could look for a much more extended time and select only those images with clouds then we could increase our understanding of how and when these phenomena form. Similarly with the dust devils."

Quick reactions

Leaving the robots to "get on with it" - to do the decision-making - is the way ahead, Nasa believes.

The agency's Mars Odyssey orbiter, which has been mapping the Red Planet since 2001, will get new autonomous flight software later this year.

Distant Erebus: Scientists were alerted rapidly to the event
This will give the satellite the ability to react to sudden changes on the Martian surface. It will be "tuned" to look for temperature anomalies, rapid changes in the polar caps, the emergence of dust storms and the formation of water-ice clouds.

If its algorithms mark an event of interest, the spacecraft will be able to break into its routine and take more images, without waiting for commands from Earth.

Scientists say this will capture short-lived, but highly significant, events that might otherwise have been missed.

The approach has been pioneered on Nasa's Earth Observing-1 satellite, which has now made thousands of autonomous data collects since 2003.

Operational costs on EO-1 have been reduced dramatically
A classic example was an eruption on Antarctica's Mt Erebus volcano in 2004. Typically, it could take several weeks to learn such a remote volcano had gone into an active phase; but as soon as EO-1 detected heat from the lava lake at the mountain's summit, it reprogrammed its camera to take more pictures.

The spacecraft also sent a rapid alert to volcanologists on the mission's science team.

So successful has EO-1's Autonomous Sciencecraft Experiment software been that it is now running the satellite's main science operations.

"This has helped us reduce the operations cost of this mission from $3.6m to $1.6m a year - over half that reduction was directly attributed to the onboard automation that we're talking about. That was critical in getting the mission extended," said Steve Chien, principal investigator for autonomous sciencecraft at JPL.

"The approach has shown its worth and it is applicable to a wide range of future missions."

Distant control

Ralph Lorenz, from the University of Arizona, Tucson, works on the current Cassini-Huygens mission to Saturn and its moons.

He was thrilled by the performance of the Huygens lander, which touched down on the ringed planet's largest satellite, Titan, in January 2005; and he is already thinking about a return flight some time in the next decade.

On distant worlds, autonomous flight control could be essential
He said self-reliant spacecraft would open up new science opportunities on far-distant missions, where probes might be out of contact with Earth for hours or even days at a time.

Lorenz envisages that the next craft on Titan will be a blimp that could fly itself around the moon and select the most interesting locations to set down to do investigations.

"It's important to note also that launch dates will no longer limit technological capabilities," he added.

"We've seen how the Mars rovers are constantly being updated. To get to Titan, it will take about seven years, during which time we can improve and finesse the type of autonomous software we might apply. In the future, the capability will be there not just to patch flight software but to completely re-write it."

Castano, Chien and Lorenz were explaining the latest developments in autonomous spacecraft operations here at the American Geophysical Union Joint Assembly.


source:http://news.bbc.co.uk/2/hi/science/nature/5022524.stm


Amnesty International vs. Internet Censorship

"Amnesty International has a new online campaign against governments which censor websites, monitor online communications, and persecute citizens who express dissent in blogs, emails, or chat-rooms. The website, Irrepressible.info contains a web-based petition (to be presented at a UN conference in November 2006) and also a downloadable web gadget which displays random excerpts of censored material on your own website."

source:http://yro.slashdot.org/article.pl?sid=06/05/28/136247

The Potential of Science With the Cell Processor

"High Performance Computing Newswire is running an article on a paper by computer scientists at the U.S. Department of Energy's Lawrence Berkeley National Laboratory. They have evaluated the processor's performance in running several scientific application kernels, then compared this performance against other processor architectures. The full paper is available from the Computer Science department at Berkeley."

source:http://science.slashdot.org/article.pl?sid=06/05/28/047223

Some Cell Phone Owners Spurn Gadgetry

OVERLAND PARK, Kan. (AP) - Nathan Bales represents a troubling trend for cellular phone carriers. The Kansas City-area countertop installer recently traded in a number of feature-laden phones for a stripped-down model. He said he didn't like using them to surf the Internet, rarely took pictures with them and couldn't stand scrolling through seemingly endless menus to get the functions to work.

"I want a phone that is tough and easy to use," said Bales, 30. "I don't want to listen to music with it. I'm not a cyber-savvy guy."

But the wireless industry needs him to be comfortable with advanced features and actively use them. As the universe of people who want a cell phone and don't already have one gets smaller, wireless carriers are counting on advanced services to generate the bulk of new revenue in coming years.

Consumers last year paid $8.6 billion for so-called data applications on their phones, up 86 percent from the year before, according to wireless trade group CTIA.

But they've also shown a growing frustration with how confusing those added features can be. A J.D. Power & Associates survey last year found that consumer satisfaction with mobile devices has declined since 2003, with some of the largest drops linked to the user interface for Internet and e-mail services.

That has providers working hard to make their devices easier to use - fewer steps, brighter and less cluttered screens, different pricing strategies - so consumers will not only use data functions more often but also be encouraged to buy additional ones.

For Sprint Nextel Corp. (S), the process begins in a suite of small rooms on its operations campus in suburban Kansas City.

On one recent day, a trio of researchers watched through one-way glass and overhead cameras as a volunteer navigated her way through a prototype program that lets parents set limits on their children's phone use.

The observers monitored how many steps it took for the woman to make the program work, how easily she made mistakes and how quickly she could get herself out of trouble. The results could be used to further tweak the program, said Robert Moritz, director of device development.

"If you bring somebody in and they have problems, it's not because they're dumb, but we were dumb with the design," Moritz said, adding that the lab typically tests devices and programs with up to 50 users over three to nine months. The company also uses focus groups to determine what people want from their phones and what they say needs fixing.

The results of those studies can sometimes push back the release of a product. For example, Michael Coffey, vice president of Sprint's user-experience design, said the company delayed releasing its walkie-talkie Ready Link service for about a year after testers said they didn't like the short delay between when the user pushes the button and the recipient answers.

Coffey said the testing is worth it because ease-of-use can be a competitive edge.

"IPod was not the first MP3 player on the market, but once they figured it out (the user interface), they became the predominant one overnight," he said. "Whether you make it a marketing message or not, the public will discover that usability and choose your product over a competitor's."

So far, Sprint Nextel is doing something right as its subscribers spend the highest average amount for data services in the industry.

"We believe there's a strong correlation between our standard of success and how usable the products are," he said.

The other major wireless providers use similar techniques to improve their devices and programs.

Cingular Wireless, the nation's largest wireless provider, developed MEdia Net, which allows users to personalize their phones for using the Internet, downloading ringtones or getting e-mail.

Verizon Wireless has V-Cast, a service that makes it easier to download music and video. The company has also pushed designs that allow users to accomplish many things with one button press.

"It's not fun to download a ringtone and have to figure out how to get that on your phone," said Verizon spokeswoman Brenda Ramey. "We do not shy away from testing. If the device or service doesn't work, it's a reflection on our network."

T-Mobile also has focused on a few key areas, introducing T-Zone to help customers find ringtones and screen wallpaper by subject and decreasing the number of steps to take and send photos, for example.

"Communication and personalization will continue to be the driver for phone use," said Michael Gallelli, director of product marketing at T-Mobile.

Industry experts say the companies understand the stakes involved in making sure their designs attract customers and keep them loyal.

"Five years ago, I wouldn't have seen a commercial from Cingular that you can customize your layout," said David Chamberlain, principal wireless analyst for research firm In-Stat. "To think that they're putting this kind of effort into the interface is welcome news."

How well they're doing is a different matter.

Some analysts pointed to phones from niche providers, such as youth-oriented Amp'd Mobile and sports-centric ESPN Mobile, as good examples of intuitive design, marrying easy-to-understand menus with pared-down lists of content aimed at their particular markets.

But Roger Entner of the market research firm Ovum said none of the major carriers impresses him. He says most of them are trying to replicate how people use personal computers instead of coming up with a new approach.

"What do (customers) do best on the phone? They talk. What do they do worst? Type. Why is every user interface based on typing?" Entner said. "Right now, the software developers take advantage of every weakness a device has and none of the strengths."

Some wireless carriers and third-party companies are experimenting with voice-recognition technology. Kirkland, Wash.-based VoiceBox Technologies, for instance, plans to release a product later this year that recognizes words and context in a customer's speech to immediately bring them content on their phones.

Charles Golvin of Forrester Research said a recent survey indicated few cellular customers choose a phone based on its usability, typically because they either don't think there's anything better or, like Bales in Kansas City, don't think they need those services.

But Golvin said for the market to truly grow, the programs and phones themselves are going to have to become more graceful and not just the purview of tech-junkies.

"Early adopters are less retarded by the user interface," he said. "As we're moving from the early adopters to the more mainstream customers, it will make a huge difference."

source:http://apnews1.iwon.com//article/20060527/D8HS7D300.html


When Escape Seems Just a Mouse-Click Away

Stress-Driven Addiction to Online Games Spikes in S. Korea


SEOUL -- Unable to pass tough university entrance exams and under intense pressure from his parents to study harder, 20-year-old Kim Myung gradually retreated to the one place where he could still feel invincible -- the virtual world of electronic games.

In front of his computer screen, Kim played hours upon hours of interactive role-playing games with other anonymous online gamers. When he slew zombies and ghouls with particular dexterity, he recalled, the flashing words "Excellent!" or "Masterstroke!" fired him up. Kim played from 8 a.m. until well after midnight -- and in the process, over four months, gained 10 pounds while surviving largely on one meal a day of instant noodles.

"I guess I knew I was becoming addicted, but I couldn't stop myself," Kim recalled from a clinic where he was undergoing counseling. "I stopped changing my clothes. I didn't go out. And I began to see myself as the character in my games."

In South Korea, the nation that experts describe as home to the world's most extreme gamer culture, authorities are alarmed by what many here are calling an epidemic of electronic game addiction.

Last month, the government -- which opened a treatment center in 2002 -- launched a game addiction hotline. Hundreds of private hospitals and psychiatric clinics have opened units to treat the problem.

An estimated 2.4 percent of the population from 9 to 39 are believed to be suffering from game addiction, according to a government-funded survey. Another 10.2 percent were found to be "borderline cases" at risk of addiction -- defined as an obsession with playing electronic games to the point of sleep deprivation, disruption of daily life and a loosening grip on reality. Such feelings are typically coupled with depression and a sense of withdrawal when not playing, counselors say.

The situation has grown so acute that 10 South Koreans -- mostly teenagers and people in their twenties -- died in 2005 from game addiction-related causes, up from only two known deaths from 2001 to 2004, according to government officials. Most of the deaths were attributed to a disruption in blood circulation caused by sitting in a single, cramped position for too long -- a problem known as "economy class syndrome," a reference to sitting in an airplane's smallest seats on long flights.

In one instance, a 28-year-old man died in the central city of Taegu last year after reportedly playing an online computer game for 50 hours with few breaks. He finally collapsed in a "PC baang" -- one of the tens of thousands of Internet game cafes that have become as common as convenience stores across South Korea. Users can pop in to these small, smoky dens -- with walls covered in gothic game posters -- for about $1 an hour, day or night.

"Game addiction has become one of our newest societal ills," said Son Yeongi, president of the Korea Agency for Digital Opportunity, which offers government-funded counseling. "Gaming itself is not the problem. Like anything, this is about excessive use."

Experts are seeing more cases of game addiction in many industrialized nations -- particularly the United States and Japan. But sociologists and psychiatrists have identified South Korea as the epicenter of the problem.

That is in part because young people here suffer from acute stress as they face educational pressures said to far exceed those endured by their peers in other countries. It is not uncommon, for instance, for South Korean students to be forced by their parents into four to five hours of daily after-school tutoring. With drug abuse and teenage sex considered rare in the socially conservative country, escape through electronic games can be a hugely attractive outlet.

At the same time, South Korea boasts an unparalleled gaming culture. In 2000 in Seoul, the capital, South Koreans inaugurated the World Cyber Games -- a sort of gaming Olympics that now draws players from 67 nations. Professional South Korean gamers can earn more than $100,000 a year in domestic and international competitions.

When Escape Seems Just a Mouse-Click Away

In many other nations, video game consoles such as Nintendo or Sony PlayStation rule. But South Koreans largely opt for online, interactive role-playing games. Such games have no end and allow multiple players to come together via the Internet.

Online games are hot here partly because South Korea is the world's most wired nation. Nearly 70 percent of South Koreans -- compared with 45 percent of Japanese and 33 percent of Americans -- now gain access to the Internet via the super-fast broadband connections required for the most popular online games, according to Telecompaper, an Internet research organization in the Netherlands. Now, Koreans can also play sophisticated games via cellphones.

But hard-core and casual gamers alike tend to while away their time inside PC baangs, which translates as "PC rooms." At one PC baang in southern Seoul on a recent afternoon, the sounds of electronic swords, guns and fists pounding cyber-opponents filled a dim room lit mostly by the glow of computer screens and smoldering cigarettes. Engrossed in their games, few of the young men and women inside conversed with one another.

Web sites allow players to individualize their game characters by purchasing clothing, weapons and other items -- and for some, such characters can become extensions of their own personalities. Rare items are sold through highly developed online markets. Moon Sung Hoon, a 31-year-old Web page designer who spends about five hours a day inside PC baangs, said he paid $800 in an online auction last year for a virtual sword.

"This is my way of releasing stress," he said. "I'm not hurting anyone, so what's the problem?"

But doctors cite a growing toll on Korean family life. M.H. Kim, a 37-year-old homemaker in Seoul, forced her 14-year-old son into treatment at a private clinic two months ago. The boy had slipped deeper and deeper into his computer games as he entered junior high school.

"My husband began putting an English book into my son's hands and demanding that he memorize the entire lesson in one night," said Kim, who asked that only the initials of her first name be used to preserve privacy. "He would not be allowed to go to sleep until he had finished. But he ended up not studying at all and just playing his games instead."

Mental health counseling of any sort still carries a heavy stigma here, and it took Kim months to persuade her husband to put their boy into game addiction treatment. After their son ran away for three months -- scrounging money from relatives to play games at PC baangs -- Kim's husband gave in.

"I can understand my son's suffering," she said. "He could never satisfy his father and was failing at school. But when he plays his games, he becomes an undefeatable warrior."

The boy's doctor, Chin Tae Won, said the most serious addictions result in violence. He cited a case last year in which a game-addicted grammar school boy with confused concepts of life and death killed his little brother with a hammer after the younger boy interrupted his game playing.

"There is nothing wrong with kids relieving stress through games," Chin said. "But parents need to watch for the warning signs of addiction. If a child gets violent when told to stop playing a game, that's one of the first indications that there's a problem."

source:http://www.washingtonpost.com/wp-dyn/content/article/2006/05/26/AR2006052601960_2.html


BSA Claims 35% of Software is Pirated

"Business Software Alliance says 35% of packaged software installed on PCs globally is pirated, and estimates the losses at $34 bln. From the article: 'The countries with the highest piracy rates were Vietnam (90%), Zimbabwe (90%), Indonesia (87%), China (86%), and Pakistan (86%). The countries with the lowest piracy rates were the United States (21%), New Zealand (23%), Austria (26%), and Finland (26%).' TechDirt analysis debunks some of the myths: 'The BSA claims that all of these "lost sales" represent real harm to the economy. It's the same bogus argument they've trotted out before, which is easily debunked. Much of that unauthorized software is being used to make firms much more productive than they would be otherwise -- probably benefiting the overall economy quite a bit.'"

source:http://slashdot.org/article.pl?sid=06/05/27/2112244

Teens arrested in alleged MySpace extortion scam

Two New York teenagers have been arrested and charged with attempting to extort $150,000 from MySpace, the popular community Web site.

Shaun Harrison, 18, and Saverio Mondelli, 19, both of whom are from Suffolk County, N.Y., were arrested in a sting operation last week, the Los Angeles County District Attorney's office said Wednesday. The pair had traveled to Los Angeles to meet people they allegedly believed were MySpace employees, but who were in fact undercover investigators, according to the district attorney's statement.

The alleged crimes began late last year when the two young men took advantage of a flaw they had discovered in the MySpace Web site in order to obtain personal information on MySpace users, the district attorney said.

MySpace discovered the intrusion earlier this year and blocked it. The Los Angeles-based company also reported the incident to authorities. During the course of the investigation, threats were made that unless $150,000 was paid, new exploit code would be released, according to the statement.

By this time, the sting operation had been set up, so instead of meeting with MySpace late last week, the pair from New York met with undercover officers from the U.S. Secret Service and the Los Angeles District Attorney's Bureau of Investigation.

Harrison and Mondelli, both programmers, operated a Web site called MySpacePlus.com, according to the district attorney's office. The Web site looks to be a storefront for SpyFuse, which appears to be a tool that can be used to manipulate MySpace. A statement on the Web site says the tool is currently unavailable due to an "unexpected legal complication."

MySpace declined to comment other than to say that executives are cooperating with law enforcement. The company, owned by News Corp., recently reported that membership has grown to 70 million.

Jane Robinson, press secretary for the L.A. district attorney's office, said that Harrison and Mondelli were charged with multiple felony counts, including illegal computer access, sending a threatening letter for extortion and attempted extortion.

The pair could be sent to prison for more than four years if convicted, Robinson said. Both men were arraigned this week and pleaded not guilty. A preliminary hearing has been scheduled for June 5.

source:http://news.zdnet.com/2100-9588_22-6077455.html


NYC Mayor Advocates U.S. Worker Database

NEW YORK

Republican Mayor Michael Bloomberg thrust himself into the national immigration debate Wednesday, advocating a plan that would establish a DNA or fingerprint database to track and verify all legal U.S. workers.

The mayor also said elements of the legislation moving through Congress are ridiculous and said lawmakers who want to deport all illegal immigrants are living in a "fantasy."

In an editorial for The Wall Street Journal and two nationally televised interviews, the mayor reiterated his long-standing belief that the 12 million undocumented immigrants in the United States should be given the opportunity for citizenship, saying that deporting them is impossible and would devastate the economy.

Aides said Bloomberg believes his views are relevant because he has a rare perspective as a former businessman who ran a company for two decades before he became mayor, in charge of enforcing the laws in a city with an estimated half-million illegal immigrants. They said that the editorial was his idea and that CNN and Fox News approached him to discuss his views on the air.

In the article and on air, Bloomberg slammed lawmakers who want to deport all illegal immigrants, saying on Fox News that "they are living in a fantasy world."

Asked in that interview whether his opinions put him at odds with his political party, the mayor, a former Democrat, shot back: "With which party?

"I'm not a partisan guy," Bloomberg said. "I am a mayor who has to deal with 500,000 people who are integral to our economy but are undocumented."

Bloomberg compared his proposed federal identification database to the Social Security card, insisting that such a system would not violate citizens' privacy and was not a civil liberties issue.

"You don't have to work, but if you want to work for a company you have to have a Social Security card," he said. "The difference is, in this day and age when everybody's got a PC on their desk with Photoshop that can replicate anything, it's become a joke."

The mayor said DNA and fingerprint technology could be used to create a worker ID database that will "uniquely identify the person" applying for a job, ensuring that cards are not illegally transferred or forged.

Donna Lieberman, director of the New York Civil Liberties Union, said a DNA or fingerprint database "doesn't sound like the free society we think we're living in."

"It will inevitably be used not just by employers but by law enforcement, government agencies, schools and all over the private sector," she said.

source:http://www.breitbart.com/news/2006/05/24/D8HQE6B80.html


Dell With Google, Yahoo with eBay. Microsoft: Left Outside Alone?

Last week should have been dominated by talk of Windows Vista Beta 2, Microsoft Office 2007 Beta 2 and the other things Microsoft presented at WinHEC, but events took an unexpected turn as the Internet giants decided it was time to strike a series of surprising alliances.

Google signed a surprise partnership with Dell – one of Microsoft's traditional allies – a partnership which doesn't directly concern online search, as one might have expected, but the installation of Google software on Dell PCs. In other words, a frontal attack on Microsoft and, arguably, a response to MSN being the default search engine in Internet Explorer 7.

Yahoo didn't linger over Windows Vista Beta 2 either; it rushed into a deal with eBay – an alliance whose main purpose is to offer a counterweight to Google, but which may well threaten MSN Search too.

All these changes in the Portal Wars leave Microsoft facing a complicated question: whom should it ally with? While the Google-Dell partnership – the boost to Google's image aside – covers only a small share of the hardware makers expected to line up behind Windows Vista, in web services it is the Yahoo-eBay combination that will probably hurt Microsoft more.

As for online search, the studies agree: Google is first, with nearly double Yahoo's share, while MSN fights to hold the same 11 percent it has had for some time. If eBay now gives Yahoo a chance to close some of the gap with Google, Microsoft must find an alliance – or some wonder-solution – to raise, or at least maintain, its share of the search market.

As formidable as the Yahoo-eBay combination may be, Google is well enough positioned in the race to take its time looking for new solutions. From its challenger's position, MSN Search can't afford that luxury. Squeezed on one side by Google-Dell, which shows Microsoft it holds no monopoly on preinstalled software, and on the other by Yahoo-eBay, Microsoft has to look for solutions – many, and good ones.

The trend is clear, and web services are leading it. However long Windows Vista is delayed, the delay almost certainly won't change Microsoft's desktop supremacy in any radical way; the real danger is that Microsoft misses the start in web services. The efforts the Redmond-based company is making with its Windows Live platform are remarkable and demonstrate that Microsoft has the abilities it takes. But can it face the attack unleashed by Google and Yahoo all by itself?

source:http://www.playfuls.com/news_02733_Dell_With_Google_Yahoo_with_eBay_Microsoft_Left_Outside_Alone.html

BitTorrent: Shedding no tiers

Newsnight's ubergeek talks to BitTorrent inventor Bram Cohen and finds him distinctly equivocal about fears of a two speed internet.

[Image caption: Many ISPs have virtually banned BitTorrent from their network]
So there's me driving up to Homebase to get some new wine glasses for my posh media chums to come round and watch the World Cup. And I get to within half a mile of the store and my car starts to slow down.

Before I know it, I'm doing five miles an hour. What's more, half the other cars around me are doing the same. But the cars on the other side of the road are all fine. So I turn round and head home and suddenly it's all back to normal. "What on earth is going on?" as our man Paxman would say.

"It's simple" said the grease monkey at my local garage. "The people who made your car have done a deal with B&Q. They've fixed it so that if you ever drive towards Homebase, you'll start going at 5 miles an hour."

Network neutrality

Alert readers among you might observe that I'm talking rubbish, and, despite this being the BBC, I must admit I made the whole incident up. But imagine if such a thing were possible. How happy would you be if you were on the receiving end? Which brings us to the principle of network neutrality.

In a network neutral world, every piece of internet data is treated equally. Whether you're downloading porn from Japan or buying music from iTunes in California or reading a blog in Russian, the Internet doesn't care. It's all just data and it's all treated the same, all given the same priority on the information superhighway.

And if some big corporation were to start paying your internet service provider to start prioritising their offerings over their rivals, then that ISP would, arguably, be selling that internet connection twice, once to you and once to the corporation. That would be a violation of network neutrality.

Last week the U.S. House of Representatives Judiciary Committee produced a draft bill to make network neutrality an explicit legal requirement in the US. This follows a big row where the infrastructure manufacturers like Cisco and 3M have been lobbying heavily against network neutrality while internet companies like Google and Microsoft have been calling for the opposite.

BitTorrent

[Image caption: Before there can be toll networks someone has to build them]
Why? One reason, perhaps, is because if toll roads are to be allowed on the internet, then someone has to build them, and that means jobs for the hardware boys. But the internet companies may not fancy having to pay those tolls and dance attendance on a new gatekeeper.

At which point enter our old friend the BitTorrent. You'll recall that this protocol has lately spread across the internet like Japanese knotweed, gobbling up perhaps a third of internet capacity, so that many service providers have virtually banned it from their networks before they become choked up completely.

Technically that is perhaps a violation of network neutrality, but one born of practicality rather than any darker motive, they would argue if they were here. Anyway the main losers are pirates and they can look after themselves.

Bram Cohen, the 'ubergeek' who gave us BitTorrent, is right up there in the pantheon of Internet gods. But unlike such luminaries as Shawn Fanning and Tim Berners-Lee, Bram still hopes to make money from the fruits of his intellect. To which end he's done a deal with Warner Brothers to help them to distribute their movies on BitTorrent.

One of the things that's hoped might sweeten the deal is a new kind of faster torrent which the makers hope will make the current version look like paint drying. At the same time it will also unblock those congested pipes, so that his invention can avoid getting banned from networks quite so often.

Massive acceleration

[Image caption: Does speeding up serving some data negate network neutrality?]
The new version is currently trialling as a collaboration between Bram, NTL and a company called Cachelogic here in Britain. Cachelogic are offering a series of data stores strategically placed around the Internet which the new BitTorrent system talks to. Whenever they see a commercially approved BitTorrent, they make a copy of the data.

The next time someone on the Internet requests that data, it comes not from the original sender but from the Cachelogic store, only this time massively accelerated.
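The arrangement described above is essentially an edge cache sitting between the original sender and later downloaders: the first request pays the full price of a trip to the origin, and every request after that is served from the nearby copy. A minimal sketch of that idea in Python (the class and function names are invented for illustration and are not Cachelogic's actual interface):

```python
# Minimal sketch of an edge cache. The first request for a piece of
# content is fetched from the slow origin and stored; later requests
# are served straight from the local copy. All names are hypothetical,
# not Cachelogic's real API.

class EdgeCache:
    def __init__(self, fetch_from_origin):
        self._fetch = fetch_from_origin  # slow path back to the sender
        self._store = {}                 # cached copies, keyed by content id
        self.hits = 0
        self.misses = 0

    def get(self, content_id):
        if content_id in self._store:
            self.hits += 1               # served from the fast local copy
            return self._store[content_id]
        self.misses += 1
        data = self._fetch(content_id)   # fall back to the origin once
        self._store[content_id] = data   # keep a copy for the next request
        return data


# Toy "origin" standing in for the original seeder of the data.
def origin(content_id):
    return f"payload-for-{content_id}"

cache = EdgeCache(origin)
cache.get("movie-1")   # miss: fetched from the origin
cache.get("movie-1")   # hit: served from the cache
print(cache.hits, cache.misses)  # 1 1
```

The design choice that raises the neutrality question is visible in the hit path: content the cache has been told to store travels a much shorter route than content it ignores.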

You can see where this is going. The companies who subscribe to the service will see their data race down the toll roads much faster than everyone else's can travel. What then for network neutrality?

We asked Bram about network neutrality. He told me "I most definitely do not want the internet to become like television where there's actual censorship... however it is very difficult to actually create network neutrality laws which don't result in an absurdity like making it so that ISPs can't drop spam or stop... (hacker) attacks."

Does the Cachelogic proposal violate network neutrality? "Depending on how you define net neutrality that violates some definitions of it," says Cohen.

And would he feel comfortable if a media company using BitTorrent did start seeking network priority for its data?

"It depends really on the nature of the whole thing... I'm against net censorship. However when you're talking about large file transfers going to very large numbers of people there frequently are significant costs involved... (the media companies) are frequently bearing a lot of costs already today. They make some stuff available and pay for bandwidth on it so it's just a question of the download costs as well as the upload costs."

Taking its toll

[Image caption: Companies could also upgrade our ability to receive data]
He has a point. Big media corporations already pay a fortune for powerful internet capacities so that you can more easily read articles like this one. This would just be the logical next step - rather than merely improving their capacity to send data out the door, the companies upgrade your ability to receive it as well.

To go back to our analogy, it's not that your car will necessarily slow down when you head to Homebase. It's just that you'll suddenly start travelling at several hundred miles per hour if you go to the rival store. They're not doing anything to harm your surfing.

Objectively they're making it better. Even if you don't want to download their movies you might still benefit from the relief in congestion over the whole internet. And if capital wants to build something and people want to pay for it, well, chances are it's going to get built.

Which is exactly what you'd say about a toll road.

source:http://news.bbc.co.uk/1/hi/programmes/newsnight/5017542.stm


Top 10 Strangest Gadgets of the Future

"This week, the editors of TechEBlog have compiled a list of the 'Top 10 Strangest Gadgets of the Future,' from solar powered LEDs to memory LCD screens, it's all there." Urinal gaming stations! How did no one implement this sooner?

source:http://slashdot.org/article.pl?sid=06/05/27/1250237

TALKS COOL BETWEEN MICROSOFT AND EBAY

May 26, 2006 -- For several weeks Microsoft has been in discussions about a possible acquisition of online auctioneer eBay, The Post has learned.

According to multiple sources close to the matter, Microsoft has considered buying eBay and merging it with its MSN portal - a deal that would give MSN and eBay considerable clout to take on Google.

Sources indicate that the talks, while still active, have cooled somewhat in the last two weeks as executives considered antitrust issues.

It is unclear what the full impact of yesterday's advertising and search alliance between Yahoo! and eBay will be for talks between MSN and eBay.

One source close to the matter suggested the Yahoo-eBay tie-up would not stop Microsoft from pursuing the online auctioneer.

Last year Microsoft was close to a deal with Time Warner that would have combined America Online and MSN. But Microsoft eventually lost out to Google.

The potential deal was considered important for MSN, as it would have resulted in AOL switching from Google's search technology to MSN's technology.

MSN is focused on expanding its search engine, which lags behind Google and Yahoo!

Google eventually inked a deal with AOL, which included a broad online advertising partnership and an extension of the lucrative search deal. Google also invested $1 billion for a 5 percent stake in AOL.

But Microsoft had been in talks with AOL over a much broader deal, and at times the two companies had considered merging AOL and MSN into a 50-50 joint venture.

But once the AOL deal fell apart, Microsoft turned to both Yahoo! and eBay for possible tie-ups that would help it compete with Google.

Sources said that talks with eBay were put on Microsoft's front-burner only recently, while the Yahoo! talks simmered.

One source said Microsoft boss Bill Gates came to the conclusion that Yahoo! was more a content company than a tech company. This source said Gates has no interest in owning a content company.

source:http://www.nypost.com/business/64226.htm


Gonzales pressures ISPs on data retention

U.S. Attorney General Alberto Gonzales and FBI Director Robert Mueller on Friday urged telecommunications officials to record their customers' Internet activities, CNET News.com has learned.

In a private meeting with industry representatives, Gonzales, Mueller and other senior members of the Justice Department said Internet service providers should retain subscriber information and network data for two years, according to two sources familiar with the discussion who spoke on condition of anonymity.

The closed-door meeting at the Justice Department, which Gonzales had requested, according to the sources, comes as the idea of legally mandated data retention has become popular on Capitol Hill and inside the Bush administration. Supporters of the idea say it will help prosecutions of child pornography because in many cases, logs are deleted during the routine course of business.

In a speech last month at the National Center for Missing and Exploited Children, Gonzales said that Internet providers must retain records for a "reasonable amount of time."

"I will reach out personally to the CEOs of the leading service providers and to other industry leaders," Gonzales said. "Record retention by Internet service providers consistent with the legitimate privacy rights of Americans is an issue that must be addressed."

Until Gonzales' speech, the Bush administration had generally opposed laws requiring data retention, saying it had "serious reservations" about them. But after the European Parliament last December approved such a requirement for Internet, telephone and voice over Internet Protocol providers, top administration officials began talking about the practice more favorably.

During Friday's meeting, Justice Department officials passed around pixellated (that is, slightly obscured) photographs of child pornography to emphasize the lurid nature of the crimes police are trying to prevent, according to one source.

A Justice Department spokesman familiar with the administration's stand on data retention was in meetings on Friday and unavailable for comment, a department representative said.

Privacy advocates have been alarmed by the idea of legally mandated data retention, saying that, while child exploitation may be the justification today, those records would be available in all kinds of criminal and civil suits--including terrorism, tax evasion, drug, and even divorce cases.

It was not immediately clear what Gonzales and Mueller meant by suggesting that network data be retained. One possibility is requiring Internet providers to record the Internet addresses their customers are temporarily assigned. A more extensive mandate would require companies to keep track of e-mail messages sent, Web pages visited and perhaps even instant-messaging correspondents.

'Preservation' vs. 'retention'
Two proposals to mandate data retention have surfaced in the U.S. Congress. One, backed by Rep. Diana DeGette, a Colorado Democrat, says that any Internet service that "enables users to access content" must retain records that would permit police to identify each user, and those records could be discarded no earlier than one year after the user's account was closed.

The other was drafted by aides to Wisconsin Rep. F. James Sensenbrenner, the chairman of the House Judiciary Committee, a close ally of President Bush. Sensenbrenner said through a spokesman last week, though, that his proposal is on hold because "our committee's agenda is tremendously overcrowded already."

At the moment, Internet service providers typically discard any log file that's no longer required for business reasons such as network monitoring, fraud prevention or billing disputes. Companies do, however, alter that general rule when contacted by police performing an investigation--a practice called data preservation.

A 1996 federal law called the Electronic Communication Transactional Records Act regulates data preservation. It requires Internet providers to retain any "record" in their possession for 90 days "upon the request of a governmental entity."

Because Internet addresses remain a relatively scarce commodity, ISPs tend to allocate them to customers from a pool based on whether a computer is in use at the time. (Two standard techniques used are the Dynamic Host Configuration Protocol and Point-to-Point Protocol over Ethernet.)
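Dynamic allocation is exactly why this data matters to investigators: the customer-to-address mapping changes constantly, so identifying a user requires both the address and the time of the activity. A minimal sketch of what such a retention log might look like -- the schema, customer IDs, and addresses here are all hypothetical, not drawn from any real ISP:

```python
from datetime import datetime

# Hypothetical lease log: because addresses are assigned dynamically,
# knowing who used 203.0.113.7 requires knowing *when* as well.
lease_log = [
    # (customer_id, ip_address, lease_start, lease_end)
    ("cust-001", "203.0.113.7",
     datetime(2006, 5, 1, 9, 0), datetime(2006, 5, 1, 17, 0)),
    ("cust-002", "203.0.113.7",
     datetime(2006, 5, 1, 18, 0), datetime(2006, 5, 2, 8, 0)),
]

def who_had_address(ip, when):
    """Return the customer holding `ip` at time `when`, if a record survives."""
    for customer, addr, start, end in lease_log:
        if addr == ip and start <= when <= end:
            return customer
    return None  # no retained record covers that moment
```

The same address maps to different subscribers at different hours, which is why investigators want the logs kept rather than discarded when the lease expires.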

In addition, Internet providers are required by another federal law to report child pornography sightings to the National Center for Missing and Exploited Children, which is in turn charged with forwarding that report to the appropriate police agency.

When adopting its data retention rules, the European Parliament approved U.K.-backed requirements, saying that communications providers in its 25 member countries--several of which had enacted their own data retention laws already--must retain customer data for a minimum of six months and a maximum of two years.

The Europe-wide requirement applies to a wide variety of "traffic" and "location" data, including the identities of the customers' correspondents; the date, time and duration of phone calls, voice over Internet Protocol calls or e-mail messages; and the location of the device used for the communications. But the "content" of the communications is not supposed to be retained. The rules are expected to take effect in 2008.

source:http://news.zdnet.com/2100-1009_22-6077654.html


Earth's ozone layer appears to be on the road to recovery

May 26, 2006: Think of the ozone layer as Earth's sunglasses, protecting life on the surface from the harmful glare of the sun's strongest ultraviolet rays, which can cause skin cancer and other maladies.

People were understandably alarmed, then, in the 1980s when scientists noticed that manmade chemicals in the atmosphere were destroying this layer. Governments quickly enacted an international treaty, called the Montreal Protocol, to ban ozone-destroying gases such as CFCs, then found in aerosol cans and air conditioners.

Right: The Antarctic ozone hole.

Today, almost 20 years later, reports continue of large ozone holes opening over Antarctica, allowing dangerous UV rays through to Earth's surface. Indeed, the 2005 ozone hole was one of the biggest ever, spanning 24 million sq km in area, nearly the size of North America.

Listening to this news, you might suppose that little progress has been made. You'd be wrong.

While the ozone hole over Antarctica continues to open wide, the ozone layer around the rest of the planet seems to be on the mend. For the last 9 years, worldwide ozone has remained roughly constant, halting the decline first noticed in the 1980s.

The question is why? Is the Montreal Protocol responsible? Or is some other process at work?

It's a complicated question. CFCs are not the only things that can influence the ozone layer; sunspots, volcanoes and weather also play a role. Ultraviolet rays from sunspots boost the ozone layer, while sulfurous gases emitted by some volcanoes can weaken it. Cold air in the stratosphere can either weaken or boost the ozone layer, depending on altitude and latitude. These processes and others are laid out in a review just published in the May 4th issue of Nature: "The search for signs of recovery of the ozone layer" by Elizabeth Weatherhead and Signe Andersen.

Sorting out cause and effect is difficult, but a group of NASA and university researchers may have made some headway. Their new study, entitled "Attribution of recovery in lower-stratospheric ozone," was just accepted for publication in the Journal of Geophysical Research. It concludes that about half of the recent trend is due to CFC reductions.

Lead author Eun-Su Yang of the Georgia Institute of Technology explains: "We measured ozone concentrations at different altitudes using satellites, balloons and instruments on the ground. Then we compared our measurements with computer predictions of ozone recovery, [calculated from real, measured reductions in CFCs]." Their calculations took into account the known behavior of the sunspot cycle (which peaked in 2001), seasonal changes in the ozone layer, and Quasi-Biennial Oscillations, a type of stratospheric wind pattern known to affect ozone.

What they found is both good news and a puzzle.

The good news: In the upper stratosphere (above roughly 18 km), ozone recovery can be explained almost entirely by CFC reductions. "Up there, the Montreal Protocol seems to be working," says co-author Mike Newchurch of the Global Hydrology and Climate Center in Huntsville, Alabama.

Right: The ozone layer begins roughly 15 km above Earth's surface.

The puzzle: In the lower stratosphere (between 10 and 18 km) ozone has recovered even better than changes in CFCs alone would predict. Something else must be affecting the trend at these lower altitudes.

The "something else" could be atmospheric wind patterns. "Winds carry ozone from the equator where it is made to higher latitudes where it is destroyed. Changing wind patterns affect the balance of ozone and could be boosting the recovery below 18 km," says Newchurch. This explanation seems to offer the best fit to the computer model of Yang et al. The jury is still out, however; other sources of natural or manmade variability may yet prove to be the cause of the lower-stratosphere's bonus ozone.

Whatever the explanation, if the trend continues, the global ozone layer should be restored to 1980 levels sometime between 2030 and 2070. By then even the Antarctic ozone hole might close--for good.

source:http://science.nasa.gov/headlines/y2006/26may_ozone.htm?list832167


The Rich Get Richer: Google Needs Some Ad Sense

After 29 years of working in high-tech companies and writing about them, I have noticed how insular they tend to be, often not seeing either the world or themselves at all clearly. Whether intended or not, this cultural artifact comes to control how the world in turn sees them, which rarely works in their favor. The classic example is Microsoft, where hiring smart people fresh from school and working them 60 hours or more per week -- in an environment where they don't even leave the building to eat -- leads to a state of corporate delusion, where lying and cheating suddenly begin to make sense. But it isn't just Microsoft that does this. It is ANY high tech company that hires young people, isolates them through long hours at work, feeds them at work, and effectively determines their friends, who are their co-workers. This trend even extends to the anti-Microsoft, to Google, where the light of day is sorely needed.

Google is secretive. This started as a deliberate marketing mystique, but endures today more as a really annoying company habit. Google folks don't understand why the rest of us have a problem with this, but then Google folks aren't like you and me. The result of this secrecy and Google's "almighty algorithm" mentality is that the company makes changes -- and mistakes -- without informing its customers or even doing all that much to correct the problems. It's all just beta code, after all. But the business part is real, as is the money that some people have lost because of Google's poor communication skills combined, frankly, with poor follow-through.

First there is click fraud. Google makes its money when people click on Google ads, but some of those clicks are fraudulent -- are not honestly intended to gain information or to buy products. Click fraud generally comes in two varieties that I'll call "buy" and "sell." An example of buy-side click fraud would be my little sister religiously clicking on every Google ad on this page (What? We have no Google ads?) in the mistaken belief that doing so would make me some money. It is mass clicking by a single person without an intention to actually buy or even to gain information. Sell-side click fraud would be one advertiser clicking on the ads of a competitor with the intention of costing that competitor money without increasing their sales. Both types of click fraud ought to be detectable, and in fact, Google says it already detects the 10 percent or so of clicks that are fraudulent (Business 2.0 magazine says it is more like 30 percent), and adjusts the bill before the advertiser even knows what is happening.

But not all click fraud is detected automatically. It is one thing to notice the same IP address being used to click 30 ads in three seconds (that is obviously fraud), but quite another if the clicks are spread out or appear to come from a variety of users. There, too, Google pledges to make things right, though it may take some time -- too much time, I think.
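The obvious case -- one IP address clicking dozens of times in a few seconds -- really is a simple rate check. A toy illustration of that check, with the thresholds chosen arbitrarily for the example (Google's actual systems are undisclosed and weigh far more signals than this):

```python
from collections import deque, defaultdict

def make_click_monitor(max_clicks=30, window_seconds=3.0):
    """Flag an IP that produces too many ad clicks inside a short window.

    A toy rate check only; real fraud detection is far more elaborate.
    """
    history = defaultdict(deque)  # ip -> timestamps of recent clicks

    def record_click(ip, timestamp):
        clicks = history[ip]
        clicks.append(timestamp)
        # Drop clicks that have aged out of the sliding window.
        while clicks and timestamp - clicks[0] > window_seconds:
            clicks.popleft()
        return len(clicks) >= max_clicks  # True => looks fraudulent

    return record_click

# 30 clicks from one address, a tenth of a second apart:
suspicious = make_click_monitor()
flags = [suspicious("10.0.0.1", t * 0.1) for t in range(30)]
```

The first clicks pass unflagged; the 30th, still inside the three-second window, trips the detector. The hard cases Cringely describes -- clicks spread out over time or across many apparent users -- defeat exactly this kind of per-IP counting.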

My friend Mario Fantoni is a victim of click fraud, which in this case is simply defined as his Google AdWords bill climbing from $250 one month to $4,000 the next with no change in the campaign or increase in sales. Mario contacted Google, which, after an "investigation," decided that he was, indeed, a victim of click fraud. Good for Google! But that was seven weeks ago and Mario is still waiting for his credit card to be reimbursed for $3,750. Google has yet to explain why it is taking so long for Mario to get his money back. For that matter, Google has yet to actually say that Mario will GET his money back. They are still "investigating," which could mean anything because the company will not explain what it means.

If you have been a victim of acknowledged click fraud on Google (where Google admits there is a problem), please let me know about it this week (bob@cringely.com), and especially tell me what it took to get reimbursed.

The next problem I have with Google came to me courtesy of Luis Dias, a software developer with IO Software in the UK. Luis's product is a mathematical equation editor cleverly called Equations! It is great for users of LaTeX, Leslie Lamport's page formatting system built on Don Knuth's TeX, especially if they want output readable in other applications, like Microsoft Word. You can find more about the program in this week's links.

Luis decided to sell his program online using a Google ad campaign, targeting terms like "physics equations," "equation editor," and of course "LaTeX." Because he didn't expect much competition selling equation editors, Luis thought that he could get most of these words for about Google's minimum price, which in the UK is 1p. In practice, though, he found that the minimum price was 3p for most words, and that minimum shortly jumped and then jumped again until some words cost as much as £2.75 (about $5.15). Since there was no competition for these ads, Luis couldn't figure out what was going on, and frankly, Google wasn't much help. They said that his words had low "Quality Scores," which meant that the minimum charge per word had to go up by the amount specified. That made no sense to Luis or to me, so I contacted Jeff Huber at Google.

To his and Google's credit, Jeff became very involved in explaining this situation and Google's position. I happen to think it is the wrong position, but at least we have had good communication.

What I learned is that the Quality Score of Luis's words was low, suggesting that it was doubtful many readers would find the ads useful or click on them. "In the recent past," explained Jeff, "ads with low Quality Scores were disabled -- i.e., not shown -- which could create frustration for an advertiser since it was a binary (on/off) decision. More recently, we evolved from the binary approach to a more flexible economic model that instead of disabling lower performing ads entirely would allow them to participate as long as the minimum bid was set at an appropriate level. This feature has both provided advertisers with greater control, as well as helped reduce the number of low quality ads by better aligning economic incentives."

This explains Luis's surreal experience of having to pay more and more to get less and less. Unfortunately Google doesn't do a very good job of explaining this change, perhaps because it appears to be precisely the kind of paid placement they do at Overture Systems (now part of Yahoo). This almost total lack of explanation may have been part of the reason why Luis was scratching his head.

We have two concepts here -- a Quality Score for words and a History for campaigns. A low Quality Score can lead to an increase in minimum word price and a poor history (lots of low Quality Scores along with low clicks-through) can lead to minimum prices being raised not just for one word, but for all words. In the case of Luis, all this took place in three days and half a dozen words, which I'd hardly call much history, yet there is very little for him to do about it short of canceling his Google account and starting all over.

The funny thing about history at Google is that it exists even before people see the ads. The AdWords algorithm takes a look at ads before they are posted and makes a quality estimate that is the starting point for the history even before an ad is shown. It apparently looks at ads in preparation, too, which is how it is able to assign minimum bid prices. Even ads you decide never to run can affect prices.

There appears to me to be a fundamental error here in Google methodology. While they talk a lot about keywords and their quality scores, the Google system appears not to work with keywords at all, but with campaigns. A poorly performing keyword will drag down all the other keywords in the campaign, regardless of whether their own quality scores are good, whether they logged only a couple of impressions, or whether they were ever even active.
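The distinction is easy to see in a toy model. Google has never published its pricing formula, so everything below -- the scores, the reciprocal bid rule, the averaging -- is invented purely to illustrate the campaign-versus-keyword difference Cringely is describing:

```python
def min_bid(quality_score, base_bid_pence=1.0, floor=1.0):
    """Hypothetical rule: the minimum bid rises as quality falls."""
    return max(floor, base_bid_pence / quality_score)

def campaign_min_bids(keyword_scores):
    """Score the whole campaign by its average quality, then apply
    that single score to every keyword in it."""
    avg = sum(keyword_scores.values()) / len(keyword_scores)
    return {kw: min_bid(avg) for kw in keyword_scores}

# One weak keyword in a small campaign (scores are made up):
scores = {"equation editor": 0.9, "physics equations": 0.8, "latex": 0.02}

per_campaign = campaign_min_bids(scores)            # campaign-level scoring
per_keyword = {kw: min_bid(q) for kw, q in scores.items()}  # keyword-level
```

Under campaign-level scoring, the one weak keyword drags every bid up, so "equation editor" costs more than its own quality would justify. A big advertiser with thousands of strong keywords sees the opposite effect: the average absorbs any weak keyword, keeping even irrelevant words cheap.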

What's happening here is just that Luis is trying to sell something that hardly anybody wants to buy.

The Google system -- THOUGH I AM SURE IT IS NOT INTENDED TO OPERATE THIS WAY -- works poorly for small sellers trying to reach buyers of obscure products. This may come down, frankly, to Google's concept of small business, which I have never seen defined.

The system appears to be optimized for people selling goods of interest to millions of people, to huge marketers no matter what they are selling, but not to tiny businesspeople trying to mine narrow niches like equation editors. By targeting campaigns and not keywords, big advertisers can use any keywords they wish regardless of the relevance simply because they exist in a much larger corpus. Because keywords are treated by Google not individually but in the context of the whole campaign and these companies receive so many clicks overall, none of their keywords are likely to perform poorly using this algorithm. The result is that no matter how inappropriate or even offensive, those words will also be cheap to buy. And because these keywords have performed "well," their "history" is then updated with regard to the campaign of the big advertiser, effectively blocking out small advertisers who can't compete with the massive click rates of big companies.

Google says "we'd much rather show nothing (white space) than a poorly targeted or non-relevant ad." But on the basis of pure performance -- what Google actually DOES, rather than what Google SAYS -- it would appear that ad quality is irrelevant in the presence of huge ad budgets. The data suggest Google really cares about massive click rates, which under most circumstances come from big companies that have a huge built-in advantage.

So the rich get richer.

Google attracts advertisers like Luis with the idea that their ads will be cheaper because, frankly, they are selling something that is only thinly traded. The dream is that the system scales and scales fairly, only it isn't fair at all because if Amazon wants to advertise an equation editor USING EXACTLY THE SAME AD TEXT AND FORMATTING AS LUIS -- their words will cost one-hundredth of what the same words cost Luis. It's not that Amazon (or any other big Google advertiser) has better copywriters, it is just that they sell a broader range of things.

"A large percentage of impressions & clicks do have £0.01 minimum bids," said Jeff from Google, "but these are our very highest quality ads/advertisers."

In other words, the minimum word price is 1p, BUT NOT FOR YOU.

But what is Luis to do, I mean really? All his keywords are now at very high prices and will not come down. The only way to escape this vicious circle is to open a new account, but even then he'll still be at the mercy of badly behaved keywords that come with the equation editing territory. He has to describe his product SOMEHOW.

It would be far better, I suppose, for Google customer service to simply suggest Luis not use Google ads to sell his equation editor.

I am sure this is not what Google intended, but it is misleading, unfair, and poorly explained.

"The system does scale fairly, and provides a level playing field for both small and large advertisers," says Jeff Huber. "If Mr. Dias has relevant ads, keywords, and landing page, he should be able to do just as well as other advertisers, regardless of size. It does not mean, however, that Mr. Dias or any other advertiser will be able to economically show ads that are not relevant and not consistent with user intent. If Mr. Dias or other advertisers want a large quantity of untargeted impressions, there are a variety of media that offer these relatively cost effectively (e.g., web banner ads, TV, newspapers, magazines). It is fair to observe that if there are any advertisers who may have a slight advantage, it's advertisers who have strong brands that users recognize and trust, and therefore users find more compelling when they show relevant ads -- but that's very consistent with the 'real world' and value of brands."

It all comes down to the AdWords algorithm and its intent, which isn't to help Luis OR Amazon, but to simply maximize profit for Google.

source:http://www.pbs.org/cringely/pulpit/pulpit20060525.html

