Monday, March 20, 2006

The Surprising Truth About Ugly Websites

"After seeing the example of Plenty of Fish and the reports of the site earning over $10,000/day in Adsense revenues, I quickly realized that there are a lot of ugly websites that are extremely successful. The reason for this, according to the article, is that ugly websites do a few things that beautiful websites tend to lack."

source:http://developers.slashdot.org/developers/06/03/20/175212.shtml

Nike and Google launch Joga.com

"Given the increasing popularity of social-networking sites among the young and affluent, Nike has introduced a new site dedicated to the world's most popular sport: soccer. While Nike provides the content (via its army of sponsored athletes, among others), Google provides the technical expertise. Orkut has been very popular in soccer-crazed Brazil, so Google may be able to make a brand extension here. Joga.com is currently invite only, though a form at the bottom of the home page takes requests for invitations." I actually found the launch of a site like this interesting not because of the content, but because of the trend in "private label" sites. It's a Shake'n'Bake Social Network, and you helped make it.

source:http://slashdot.org/articles/06/03/20/1346220.shtml

Microsoft's Plans For Handheld Game Player And "iPod Killer"

Here's a story that came from reporting for my book, which is coming out in May under the title "The Xbox 360 Uncloaked: The Real Story Behind Microsoft's Next-Generation Video Game Console," www.spiderworks.com/xbox360.

In a bid to capture the huge audience for handheld entertainment gadgets, Microsoft is designing a product that combines video games, music and video in one handheld device, according to sources familiar with the project.

The Microsoft product would compete with Sony, Nintendo and Apple Computer's products, including the iPod. And Microsoft has some of its most seasoned talent from the division that created its popular Xbox 360 working on it. Game executive J Allard leads the project, and its director is Greg Gibson, who was the system designer on the Xbox 360 video game console. Bryan Lee, the finance chief on the Xbox business, is leading the business side of the project.

By anchoring its entertainment device as a handheld game player, Microsoft is starting from its position of strength in the entertainment business that it hopes Apple cannot match, even with its iPod. The game press has dubbed it an "iPod killer," but its functions would likely more closely resemble Sony's PlayStation Portable multimedia gaming device.

While details are sketchy, the pedigree of the people in charge of the business shows how strategic the project is to Microsoft's future.

"That would certainly be an interesting development in the market,'' said Anita Frazier, a game industry analyst at the NPD Group.

The other competitors have huge leads on Microsoft. But the Xbox veterans have been underdogs for a while. Gibson, 35, is an electrical engineer who joined Microsoft in 1997 to help design computer mice and other hardware. He shifted to the Xbox division in 1999 to help design the innards of the original Xbox. In 2002, he became the system designer in charge of the overall design of the Xbox 360.

Allard, a 36-year-old programmer who became famous for prompting Bill Gates to take the Internet seriously, commanded much of the hardware and software teams that put together the Xbox 360. Lee, a longtime entertainment executive, joined Microsoft as finance chief for the Xbox a few years ago.

The approval of the project spurred the reorganization of the leadership team in the Home and Entertainment Division in December. In September, Robbie Bach, formerly the chief Xbox officer, was promoted to lead the Entertainment and Devices Group, which combined the Xbox with other mobile and entertainment businesses in one of four major product groups.

Then in December, the jobs of the top Xbox executives were broadened so that they could manage all of the businesses related to the broader Entertainment and Devices Group, which included the Xbox business, mobile devices, MSN, music, and home productivity software. Allard, whose group designed the Xbox 360, was named to head "experience and design" for the entire group.

Sources say that the reason for the reorganization was to bring Allard, Lee, Gibson and all of the relevant businesses into a single group, which is supervised by Robbie Bach. The participation of these highly regarded Xbox veterans suggests that Microsoft is very serious about catching up with Sony's PlayStation Portable handheld game player, Apple's iPod music players, and Nintendo's handheld GameBoy Advance and Nintendo DS game players.

In the past, all of Microsoft's efforts to compete have fallen short. The company considered making an "Xboy" game player a few years ago but shelved the idea. It considered making a game handheld at the same time it devised plans for the Xbox 360 in 2002 and 2003, but it again decided to delay its entry.

Meanwhile, Microsoft's efforts in PocketPC handhelds and Portable Media Players have fallen short in competition with the iPod. Last week, Microsoft unveiled Project Origami, a handheld Windows computer. But that device isn't aimed at pure entertainment the way the Xplayer is. The existence of these other projects suggests that there is still some infighting within Microsoft about its best approach to portable gadgets.

The handheld project is still in its early stages. Microsoft is still figuring out which strategy to pursue in music technology, according to sources familiar with the matter. The code name for its music service, which would be the equivalent of Apple's iTunes, is "Alexandria."

One benefit of waiting longer is that the handheld will likely have sufficient technology in it to run a lot of original Xbox games from a few years ago. Hence, it wouldn't be hard to create a new library of games for the handheld.

Signs of activity have surfaced. Transmeta, a maker of low-power chip technology, said last year that it had assigned 30 engineers to work with Microsoft on a secret project. Transmeta's engineers work on ways to cut the power consumption of computing chips so that they can be used in handheld devices with long battery lives.

In an interview with Business Week in January, Xbox corporate vice president Peter Moore said "it can't just be our version of the iPod" and added the Xbox brand "is an opportunity" if Microsoft decides to enter the mobile entertainment competition. He declined to comment on the rumor about the handheld. But sources familiar with the project confirmed its existence within the Xbox organization.

What remains to be seen is when Microsoft will launch the device. Gibson may not need a large engineering team to run the project. But his group of hardware engineers only became free last fall, when most work on the Xbox 360 was completed.

It could be 2007 before the device hits store shelves. That gives rivals such as Sony, Nintendo and Apple considerable time to consolidate their position and come up with their own new gadgets in the meantime.

The Mercury News strives to avoid use of unnamed sources. When unnamed sources are used because information cannot otherwise be obtained, the newspaper generally requires more than one source to confirm the information.

source:http://blogs.mercurynews.com/aei/2006/03/microsofts_plan.html

Silicon Valley Start-Ups See Cash Everywhere

The market for high-technology start-up businesses is so intense in Silicon Valley that some companies are being showered with millions of dollars from investors -- without even asking for it.

It is a phenomenon called "pre-emptive financing," and it has become more common in the past several months.

The question is whether venture capitalists are moving too quickly, funding risky, untested start-up businesses -- just as they did during the heady, and ultimately unsustainable, technology-stock boom of 1999 and 2000.

Pre-emptive financing happens when a venture capitalist seeks out a promising start-up business and offers it money out of the blue, before the company tries to raise a second or third round of cash. If the offer is good enough, in theory, the venture investor will snag a piece of the company quickly, thus avoiding a costly bidding war that could erupt later once the company says publicly it is looking for cash and attracts several suitors.

Such bidding wars are increasingly common these days and have pushed up prices investors pay for stakes in some start-up companies. The median valuation of venture-funded start-up businesses -- the amount investors think these companies are worth -- soared to $15.2 million in 2005, from $10 million two years earlier, according to research firm VentureOne, a unit of Dow Jones & Co., which publishes The Wall Street Journal. Venture capitalists typically take stakes in small, private-sector companies, hoping for a payout later through a company sale or an initial public offering of stock.

The trend puts many start-up companies in the driver's seat. "I've had several [venture-capital] firms come to me long before we were looking for money," says Jason Goldberg, chief executive officer of Seattle online job-search concern Jobster Inc. His company, after getting unsolicited calls from two venture firms last year, raised $19.5 million in additional financing much earlier than it had planned. Young companies in wireless communications, computer games and consumer Internet services are big targets for pre-emptive funding calls these days.

It all highlights how desperate some venture capitalists have become to find homes for the huge amounts of cash they have raised, and how few companies there are that really deserve their money. Last year, venture-capital firms raised $25.2 billion from investors, the most since 2001, and they are struggling to put all that money to work.

"There's such a dearth of very high-quality opportunities," says Michael Greeley, a general partner with IDG Ventures in Boston. "Investors will move very aggressively right now." IDG has hired a recruiting firm to set up meetings between IDG partners and promising start-up executives, he says, to help his firm develop relationships and, in some cases, possibly pre-empt the process of formal fund raising.

Some investors are so eager to one-up each other that they don't even bother to find out whom to call at promising start-ups. A Pasadena, Calif., home-improvement Web site, Done Right, operated by a company called Perform Local Inc., received an email in January from a well-known investment firm inquiring about putting cash into the company. Paul Ryan, Done Right's chief executive officer, says the missive wasn't sent to him or to his executives -- it landed in a general corporate email inbox. Mr. Ryan wasn't put off by the impersonal plea: "We're having very good discussions with [the firm] right now," he says, declining to name the potential investor.

The risk of pre-emptive financing is that investors are so eager to seal deals early that they will overlook flawed business models and management and invest imprudently. The trend "absolutely harkens back to the bubble days" of 1999 and 2000, says Tom Blaisdell, a general partner with DCM-Doll Capital Management, Menlo Park, Calif. Mr. Blaisdell and others note pre-emptive funding has its upsides: If venture capitalists can offer financing before anyone else, they sometimes have more time to investigate a company's business and work with it to come up with mutually favorable investment terms. Many investors say the start-up market isn't as frothy as it was in the bubble, with sentiment tempered by the lethargic market for start-up initial public offerings of stock.

Indeed, not all venture firms are embracing the practice. Still, many start-up companies are leveraging pre-emptive funding calls into even more money.

Jobster, which does business with Microsoft Corp. and Cisco Systems Inc., raised $8 million from two well-known venture-capital firms in 2004. Just a few months later, Mr. Goldberg, the CEO, received unsolicited calls from two other firms offering him more money, and at terms that valued the company much more richly than had its first two investors, Ignition Partners and Trinity Ventures. Mr. Goldberg declined to name the two new firms that contacted him.

Those calls got him thinking that perhaps he did need more funding. So Mr. Goldberg called another venture firm, Mayfield Fund, which had expressed interest in Jobster months before but hadn't invested. Mayfield wound up leading the $19.5 million round of funding in August; the two pre-emptive cold-callers didn't get a piece of the deal.

"Nothing gets a VC moving like the idea that someone else might get their deal," Mr. Goldberg says. Mayfield Managing Director Allen Morgan says he isn't surprised top-drawer start-up businesses like Jobster are getting pre-emptive offers. "We're all hungry for returns," he says.

Other start-up companies also are benefiting. Gibu Thomas, a co-founder and CEO of Sharpcast Inc., which makes software that helps people access, share and back up digital content between various electronic devices, says he struggled to raise money for his start-up company in late 2004 and early 2005. But last August, Sharpcast snagged $3 million from two firms, Draper Fisher Jurvetson and Selby Venture Partners, and two months later, other venture capitalists clamored to put more money into the company.

Greg Gretsch, a managing director with venture firm Sigma Partners, visited Sharpcast after hearing about it from an engineer who worked there. When Mr. Gretsch asked if the company was thinking about fund raising, Sharpcast executives said they had just raised money and weren't looking for more.

"I'd like to do a pre-emptive round," Mr. Thomas recalls Mr. Gretsch saying. Adds Mr. Thomas, "I didn't even know what a pre-emptive round meant."

He does now. Sigma recently signed on as the lead investor in a new $13.5 million funding round for Sharpcast, an investment that valued the company at a price more than three times its valuation in August. Mr. Gretsch says he was so impressed with Sharpcast that "I was motivated enough to jump through hoops and do a deal with them fairly quickly" at a rich price.

Sharpcast's product won't be launched until the spring, Mr. Thomas says.

source:http://online.wsj.com/public/article/SB114282030825002662-WsKRxguaW3lK1G_Le2jFAW9dCn8_20070319.html?mod=blogs


Aging Japan builds robot to look after elderly

TOKYO (AFP) - A Japanese-led research team said it had made a seeing, hearing and smelling robot that can carry human beings and is aimed at helping care for the country's growing number of elderly.

Government-backed research institute Riken said the 158-centimeter (five-foot) RI-MAN humanoid can already carry a doll weighing 12 kilograms (26 pounds) and could be capable of bearing 70 kilograms within five years.

"We're hoping that through future study it will eventually be able to care for elderly people or work in rehabilitation," said Toshiharu Mukai, one of the research team leaders.

Covered by five millimeters (0.2 inches) of soft silicone, RI-MAN is equipped with sensors that tell it a body's weight and position.

The 100-kilogram (220-pound) robot can also distinguish eight different kinds of smells, can tell which direction a voice is coming from and uses powers of sight to follow a human face.

"In the future, we would like to develop a capacity to detect a human's health condition through his breath," Mukai said.

Japan is bracing for a major increase in the need for elderly care due to a declining birth rate and a population that is among the world's longest-living.

The population declined in 2005 for the first time since World War II as more young people put off starting families.

source:http://news.yahoo.com/s/afp/20060314/lf_afp/afplifestylejapan


DNA Origami

"Caltech scientist Paul Rothemund has developed a new technique for designing and generating self-assembling 2D nanostructures out of DNA. To demonstrate the technique, which is reportedly simple enough that a high-schooler can design with it, Rothemund created patterns like smiley faces, text, and a map of the Americas. The technique might be useful for generating 'nanobreadboard' scaffolds for things like molecular-scale circuitry, protein-based factories, and quantum computers. Rothemund is currently working to extend the technique to 3D nanostructures."

source:http://science.slashdot.org/article.pl?sid=06/03/20/0311242

What a tangled Web we weave: Being Googled can jeopardize your job search

Using Google, the wildly popular Internet search engine, as an action verb has been a part of our cultural fabric for years now. Daily, millions use it to "Google" old flames, long-lost friends and even themselves in hopes of digging up dirt. But who would have thought this addictive habit could stand in the way of landing your next job?

But it's not your Googling another person that starts the trouble: the danger occurs when a potential boss Googles you.

An increasing number of employers are investigating potential hires online to find out more about an applicant than what's on their résumé.

You may be the perfect candidate for the job, but if your name pulls up something incriminating in a Google search, you could lose your shot. "People do need to keep in mind that the information they post online - whether in a résumé, profile or otherwise - should be considered public information," warns Danielle C. Perry, director of public relations at Monster.com. Sure, you may not have intentionally posted something controversial about yourself online, but from blogs to dating profiles, the Web has become a place where people air dirty laundry without a thought, making it a dangerous place to mix business with pleasure.

Just ask 27-year-old Colleen Kluttz. Type the freelance television producer's name into Google and the second item that comes up is her popular MySpace profile. This online social network has become an outpost for photographic and written self-expression, but it's not always an asset in landing a job. "A friend of mine posted a picture of me on MySpace with my eyes half closed and a caption that suggests I've smoked something illegal," says Kluttz.

While the caption was a joke, Kluttz now wonders whether the past two employers she interviewed with thought it was so funny. Both expressed interest in hiring Kluttz, but at the 11th hour went with someone else. "As a freelancer, I'm constantly on the lookout for the next best opportunity, but I haven't been having much luck recently," Kluttz explains. "I really haven't been concerned that people are Googling me, but now that I'm doing the math, it seems like this is definitely going to be a constant concern from this day forward."

In addition to all the other stresses of a job search, do you really have to assume you'll get Googled any time you apply for a job? Employment experts say yes. "More and more companies are doing background checks," says Michael Erwin, senior career adviser at Career Builder.com. "If you have something on Google, it's better to let them know in advance." He also warns, "Make sure what you put on your résumé is truthful."

FOIBLES AND RANTS EXPOSED

Bloggers may also have reason for concern. When Ciara Healy applied for a job at a university, she had no idea her personal blog could get her into trouble. But when a member of the search committee Googled her, he found she had called him a "belligerent jerk," though not by name, and canceled the interview. "I almost immediately deleted the blog," wrote Healy via E-mail. Healy doesn't think employers should Google candidates, partly for obvious reasons, but also because she doesn't believe that one's entire life should be up for review. "What is on the table at an interview should be skills, detectible levels of craziness, overall impression and a good fit in the workplace," she writes, "not your foibles, rants, petty opinions or brilliant insights."

While Kluttz can change her MySpace profile and Healy has axed her blog, other Google-addled job seekers, like Jason Hartley, find themselves stuck. Hartley, 34, a full-time blogger and writer, has always been careful about what he posts on his personal music blog, Advanced Theory. But there's nothing he can do about the two other Jason Hartleys that appear when you type his name into Google.

"There's a guy who's a dancer," says Hartley. "We're the same age and I used to be a dancer, so people assume it's me." If that weren't enough, there's a third Jason Hartley and he's a well-known blogger. "He's a soldier who's gotten a lot of recognition for writing about the Iraq war. He's a real standup guy and again people think he's me." Needless to say, whenever Hartley goes on an interview he has to be upfront. "If I were going on a job interview, I would have to say I'm not that guy."

Be good for Google

Worried about what'll pop up if a potential boss looks you up? On Google, you can't afford to fudge a date or a job title, so be sure your résumé information matches your Web presence. If you keep a blog, be careful how much you reveal about your personal life. Even if it doesn't affect your getting hired, it may expose aspects of your life you'd rather keep out of the office.

Of course Google can work to your advantage, too. If you're looking to seal the deal on a job, it can't hurt to search for your employer's interests and job history to see what you have in common. Hint at a shared interest and he or she might just overlook that compromising MySpace picture after all.

source:http://www.nydailynews.com/front/story/401069p-339405c.html


SPECULATIONS ON THE FUTURE OF SCIENCE

Science will continue to surprise us with what it discovers and creates; then it will astound us by devising new methods to surprise us. At the core of science's self-modification is technology. New tools enable new structures of knowledge and new ways of discovery. The achievement of science is to know new things; the evolution of science is to know them in new ways. What evolves is less the body of what we know and more the nature of our knowing.

[ED. NOTE: As part of the activities of the Long Now Foundation, Stewart Brand has organized a series of seminars which are held at Fort Mason in San Francisco. "The purpose of the series", Brand writes, "is to build a coherent, compelling body of ideas about long-term thinking, to help nudge civilization toward Long Now's goal of making long-term thinking automatic and common instead of difficult and rare."

Speakers in the series so far include a number of Edgies: Brian Eno, Jared Diamond, George Dyson, Kevin Kelly, Clay Shirky, and Bruce Sterling. All seminars are archived and freely downloadable.

The following Edge feature is based on Kevin Kelly's March 10th talk on "The Next 100 Years of Science: Long-term Trends in the Scientific Method." He's been exploring the theme on his blog, The Technium — JB ]


Introduction by Stewart Brand

Science, says Kevin Kelly, is the process of changing how we know things. It is the foundation of our culture and society. While civilizations come and go, science grows steadily onward. It does this by watching itself.

Recursion is the essence of science. For example, science papers cite other science papers, and that process of research pointing at itself invokes a whole higher level, the emergent shape of citation space. Recursion always does that. It is the engine of scientific progress and thus of the progress of society.

A particularly fruitful way to look at the history of science is to study how science itself has changed over time, with an eye to what that trajectory might suggest about the future. Kelly chronicled a sequence of new recursive devices in science...

2000 BC — First text indexes
200 BC — Cataloged library (at Alexandria)
1000 AD — Collaborative encyclopedia
1590 — Controlled experiment (Francis Bacon)
1600 — Laboratory
1609 — Telescopes and microscopes
1650 — Society of experts
1665 — Repeatability (Robert Boyle)
1665 — Scholarly journals
1675 — Peer review
1687 — Hypothesis/prediction (Isaac Newton)
1920 — Falsifiability (Karl Popper)
1926 — Randomized design (Ronald Fisher)
1937 — Controlled placebo
1946 — Computer simulation
1950 — Double blind experiment
1962 — Study of scientific method (Thomas Kuhn)

Projecting forward, Kelly had five things to say about the next 100 years in science...

1) There will be more change in the next 50 years of science than in the last 400 years.

2) This will be a century of biology. It is the domain with the most scientists, the most new results, the most economic value, the most ethical importance, and the most to learn.

3) Computers will keep leading to new ways of science. Information is growing by 66% per year while physical production grows by only 7% per year. The data volume is growing to such levels of "zillionics" that we can expect science to compile vast combinatorial libraries, to run combinatorial sweeps through possibility space (as Stephen Wolfram has done with cellular automata), and to run multiple competing hypotheses in a matrix. Deep realtime simulations and hypothesis search will drive data collection in the real world.

4) New ways of knowing will emerge. "Wikiscience" is leading to perpetually refined papers with a thousand authors. Distributed instrumentation and experiment, thanks to minuscule transaction cost, will yield smart-mob, hive-mind science operating "fast, cheap, & out of control." Negative results will have positive value (there is already a "Journal of Negative Results in Biomedicine"). Triple-blind experiments will emerge through massive non-invasive statistical data collection: no one, not the subjects or the experimenters, will realize an experiment was going on until later. (In the Q&A, one questioner predicted the coming of the zero-author paper, generated wholly by computers.)

5) Science will create new levels of meaning. The Internet already is made of one quintillion transistors, a trillion links, a million emails per second, 20 exabytes of memory. It is approaching the level of the human brain and is doubling every year, while the brain is not. It is all becoming effectively one machine. And we are the machine.

"Science is the way we surprise God," said Kelly. "That's what we're here for." Our moral obligation is to generate possibilities, to discover the infinite ways, however complex and high-dimension, to play the infinite game. It will take all possible species of intelligence in order for the universe to understand itself. Science, in this way, is holy. It is a divine trip.

Stewart Brand

KEVIN KELLY helped launch Wired magazine in 1993, and served as its Executive Editor until January 1999. He is now Editor-At-Large for Wired. From 1984 to 1990 Kelly was publisher and editor of the Whole Earth Review. In the late 80s, Kelly conceived and oversaw the publication of four versions of the Whole Earth Catalogs. He was a founding board member of the WELL.

Kelly is the author of Out of Control and New Rules for the New Economy, and his writing has appeared in many national and international publications such as the New York Times, The Economist, Time, Harpers, Science, GQ, and Esquire.

Kevin Kelly's Edge Bio page


SPECULATIONS ON THE FUTURE OF SCIENCE

(KEVIN KELLY:) Science will continue to surprise us with what it discovers and creates; then it will astound us by devising new methods to surprise us. At the core of science's self-modification is technology. New tools enable new structures of knowledge and new ways of discovery. The achievement of science is to know new things; the evolution of science is to know them in new ways. What evolves is less the body of what we know and more the nature of our knowing.

Technology is, in its essence, new ways of thinking. The most powerful type of technology, sometimes called enabling technology, is a thought incarnate which enables new knowledge to find and develop new ways to know. This kind of recursive bootstrapping is how science evolves. As in every type of knowledge, it accrues layers of self-reference to its former state.

New informational organizations are layered upon the old without displacement, just as in biological evolution. Our brains are good examples. We retain reptilian reflexes deep in our minds (fight or flight) while the more complex structuring of knowledge (how to do statistics) is layered over those primitive networks. In the same way, older methods of knowing (older scientific methods) are not jettisoned; they are simply subsumed by new levels of order and complexity. But the new tools of observation and measurement, and the new technologies of knowing, will alter the character of science, even while it retains the old methods.

I'm willing to bet the scientific method 400 years from now will differ from today's understanding of science more than today's science method differs from the proto-science used 400 years ago. A sensible forecast of technological innovations in the next 400 years is beyond our imaginations (or at least mine), but we can fruitfully envision technological changes that might occur in the next 50 years.

Based on the suggestions of the observers above, and my own active imagination, I offer the following as possible near-term advances in the evolution of the scientific method.

Compiled Negative Results — Negative results are saved, shared, compiled and analyzed, instead of being dumped. Positive results may increase their credibility when linked to negative results. We already have hints of this in the recent decision of biochemical journals to require investigators to register early phase 1 clinical trials. Usually phase 1 trials of a drug end in failure and their negative results are not reported. As a public health measure, these negative results should be shared. Major journals have pledged not to publish the findings of phase 3 trials if their earlier phase 1 results had not been reported, whether negative or not.

Triple Blind Experiments – In a double blind experiment neither researcher nor subject are aware of the controls, but both are aware of the experiment. In a triple blind experiment all participants are blind to the controls and to the very fact of the experiment itself. This way of science depends on cheap non-invasive sensors running continuously for years, generating immense streams of data. While ordinary life continues for the subjects, massive amounts of constant data about their lifestyles are drawn and archived. Out of this huge database, specific controls, measurements and variables can be "isolated" afterwards. For instance, the vital signs and lifestyle metrics of a hundred thousand people might be recorded in dozens of different ways for 20 years, and then later analysis could find certain variables (smoking habits, heart conditions) and certain ways of measuring that would permit the entire 20 years to be viewed as an experiment – one that no one knew was even going on at the time. This post-hoc analysis depends on pattern recognition abilities of supercomputers. It removes one more variable (knowledge of experiment) and permits greater freedom in devising experiments from the indiscriminate data.
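
A minimal sketch of that post-hoc style, assuming a hypothetical archive of passively logged records (the field names, numbers, and the smoking/heart-rate comparison below are invented purely for illustration):

    # Hypothetical sketch: carving an "experiment" out of passively collected data
    # after the fact. Records, field names, and the effect measure are invented.
    import statistics

    def post_hoc_experiment(records, exposure, outcome):
        # Split the archive into exposed/unexposed groups and compare mean outcomes.
        exposed = [r[outcome] for r in records if r.get(exposure)]
        unexposed = [r[outcome] for r in records if not r.get(exposure)]
        return statistics.mean(exposed) - statistics.mean(unexposed)

    # Toy stand-in for 20 years of continuously logged vital signs.
    archive = [
        {"smoker": True, "resting_hr": 78},
        {"smoker": False, "resting_hr": 66},
        {"smoker": True, "resting_hr": 81},
        {"smoker": False, "resting_hr": 70},
    ]
    print(post_hoc_experiment(archive, "smoker", "resting_hr"))  # difference in means

No subject, and no experimenter, needed to know at collection time that those two fields would later be paired up as an "experiment."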

Combinatorial Sweep Exploration – Much of the unknown can be explored by systematically creating random varieties of it at a large scale. You can explore the composition of ceramics (or thin films, or rare-earth conductors) by creating all possible types of ceramic (or thin films, or rare-earth conductors), and then testing them in their millions. You can explore certain realms of proteins by generating all possible variations of that type of protein and then seeing if they bind to a desired disease-specific site. You can discover new algorithms by automatically generating all possible programs and then running them against the desired problem. Indeed all possible Xs of almost any sort can be summoned and examined as a way to study X. None of this combinatorial exploration was even thinkable before robotics and computers; now both of these technologies permit this brute force style of science. The parameters of the emergent "library" of possibilities yielded by the sweep become the experiment. With sufficient computational power, together with a pool of proper primitive parts, vast territories unknown to science can be probed in this manner.
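
As a toy illustration of such a sweep, the sketch below enumerates every combination of a few made-up recipe parameters and scores each one; the score() function is a hypothetical stand-in for a robotic assay, not any real materials model:

    # Sketch of a combinatorial sweep: enumerate every recipe, test each one.
    # Parameter ranges and the score() function are hypothetical stand-ins.
    from itertools import product

    def score(recipe):
        # Placeholder for a real measurement (e.g. conductivity of a test sample).
        a, b, c = recipe
        return -(a - 3) ** 2 - (b - 0.5) ** 2 + c

    candidates = product(
        range(0, 10),                  # e.g. percent of dopant A
        [x / 10 for x in range(11)],   # fraction of binder B
        [0, 1],                        # annealed or not
    )
    library = {recipe: score(recipe) for recipe in candidates}
    best = max(library, key=library.get)
    print(best, library[best])

The "library" here is just the exhaustive table of recipes and scores; the exploration is brute force rather than hypothesis-driven.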

Evolutionary Search – A combinatorial exploration can be taken even further. If new libraries of variations can be derived from the best of a previous generation of good results, it is possible to evolve solutions. The best results are mutated and bred toward better results. The best testing protein is mutated randomly in thousands of ways, and the best of that bunch kept and mutated further, until a lineage of proteins, each one more suited to the task than its ancestors, finally leads to one that works perfectly. This method can be applied to computer programs and even to the generation of better hypotheses.
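
The loop described here is essentially a simple evolutionary search: keep the best variant, mutate it, and repeat. A minimal sketch, with a made-up string-matching fitness function standing in for something like a binding assay:

    # Minimal evolutionary search: keep the best variant, mutate it, repeat.
    # fitness() is a hypothetical stand-in for a real assay such as binding strength.
    import random

    ALPHABET = "ACDEFGHIKLMNPQRSTVWY"  # amino-acid letters, for flavor

    def fitness(seq, target="MKT"):
        return sum(a == b for a, b in zip(seq, target))

    def mutate(seq):
        i = random.randrange(len(seq))
        return seq[:i] + random.choice(ALPHABET) + seq[i + 1:]

    best = "AAA"
    for generation in range(200):
        children = [mutate(best) for _ in range(50)]   # a brood of variants
        champion = max(children, key=fitness)
        if fitness(champion) >= fitness(best):
            best = champion                            # keep any improvement
    print(best, fitness(best))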

Multiple Hypothesis Matrix – Instead of proposing a series of single hypotheses, in which each hypothesis is falsified and discarded until one theory finally passes and is verified, a matrix of many hypothesis scenarios is proposed and managed simultaneously. An experiment travels through the matrix of multiple hypotheses, some of which are partially right and partially wrong. Veracity is statistical; more than one thesis is permitted to stand with partial results. Just as data are assigned a margin of error, so too will hypotheses be. An explanation may be stated as: 20% is explained by this theory, 35% by this theory, and 65% by this theory. A matrix also permits experiments with more variables and more complexity than before.
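
One hedged way to read "veracity is statistical" is as weight bookkeeping: every hypothesis in the matrix keeps a weight that is updated by how well it predicts each new observation. The hypotheses and likelihood numbers in the sketch below are invented purely to show that bookkeeping, not any particular inference method:

    # Toy hypothesis matrix: each hypothesis keeps a weight proportional to how
    # well it has predicted the observations so far. All numbers are invented.
    weights = {"hypothesis A": 1.0, "hypothesis B": 1.0, "hypothesis C": 1.0}

    # Likelihood of each observation under each hypothesis (made up).
    observations = [
        {"hypothesis A": 0.2, "hypothesis B": 0.7, "hypothesis C": 0.6},
        {"hypothesis A": 0.3, "hypothesis B": 0.8, "hypothesis C": 0.5},
    ]

    for likelihoods in observations:
        weights = {h: w * likelihoods[h] for h, w in weights.items()}
        total = sum(weights.values())
        weights = {h: w / total for h, w in weights.items()}  # renormalize

    for h, w in weights.items():
        print(f"{h} carries {w:.0%} of the weight")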

Pattern Augmentation – Pattern-seeking software which recognizes a pattern in noisy results. In large bodies of information with many variables, algorithmic discovery of patterns will become necessary and common. These exist in specialized niches of knowledge (such as particle smashing) but more general rules and general-purpose pattern engines will enable pattern-seeking tools to become part of all data treatment.

Adaptive Real Time Experiments – Results are evaluated, and large-scale experiments modified, in real time. What we have now is primarily batch-mode science. Traditionally, the experiment starts, the results are collected, and then conclusions are reached. After a pause the next experiment is designed in response, and then launched. In adaptive experiments, the analysis happens in parallel with collection, and the intent and design of the test is shifted on the fly. Some medical tests are already stopped or re-evaluated on the basis of early findings; this approach would be extended to other realms. Proper methods would be needed to keep the adaptive experiment objective.
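
A minimal sketch of the analyze-while-collecting idea, assuming a hypothetical three-arm trial and an invented rule for dropping laggards at each interim look (a real adaptive design would use a pre-registered stopping rule):

    # Sketch of an adaptive experiment: after every batch of results, look at the
    # interim data and drop arms that clearly lag. All rates and rules are invented.
    import random

    random.seed(0)
    true_rates = {"A": 0.20, "B": 0.50, "C": 0.45}   # unknown to the experimenter
    arms = {name: [] for name in true_rates}

    for batch in range(10):
        for name, outcomes in arms.items():
            outcomes.extend(random.random() < true_rates[name] for _ in range(20))
        rates = {name: sum(o) / len(o) for name, o in arms.items()}
        leader = max(rates.values())
        # Interim analysis: keep only arms within 0.2 of the current leader.
        arms = {name: o for name, o in arms.items() if rates[name] > leader - 0.2}

    print({name: round(sum(o) / len(o), 2) for name, o in arms.items()})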

AI Proofs – Artificial intelligence will derive and check the logic of an experiment. Ever more sophisticated and complicated science experiments become ever more difficult to judge. Artificial expert systems will at first evaluate the scientific logic of a paper to ensure the architecture of the argument is valid. They will also ensure the paper publishes the required types of data. This "proof review" will augment the peer-review of editors and reviewers. Over time, as the protocols for an AI check become standard, AI can score papers and proposals for experiments for certain consistencies and structure. This metric can then be used to categorize experiments, to suggest improvements and further research, and to facilitate comparisons and meta-analysis. A better way to inspect, measure and grade the structure of experiments would also help develop better kinds of experiments.

Wiki-Science – The average number of authors per paper continues to rise. With massive collaborations, the numbers will boom. Experiments involving thousands of investigators collaborating on a "paper" will be commonplace. The paper is ongoing, and never finished. It becomes a trail of edits and experiments posted in real time — an ever evolving "document." Contributions are not assigned. Tools for tracking credit and contributions will be vital. Responsibilities for errors will be hard to pin down. Wiki-science will often be the first word on a new area. Some researchers will specialize in refining ideas first proposed by wiki-science.

Defined Benefit Funding — Ordinarily science is funded by the experiment (results not guaranteed) or by the investigator (nothing guaranteed). The use of prize money for particular scientific achievements will play greater roles. A goal is defined, funding secured for the first to reach it, and the contest opened to all. A Turing Test prize, for example, would be awarded to the first computer to pass the Turing Test as a passable intelligence. Defined Benefit Funding can also be combined with prediction markets, which set up a marketplace of bets on possible innovations. The bet winnings can encourage funding of specific technologies.

Zillionics – Ubiquitous always-on sensors in bodies and environment will transform medical, environmental, and space sciences. Unrelenting rivers of sensory data will flow day and night from zillions of sources. The exploding number of new, cheap, wireless, and novel sensing tools will require new types of programs to distill, index and archive this ocean of data, as well as to find meaningful signals in it. The field of "zillionics" — dealing with zillions of data flows — will be essential in health, natural sciences, and astronomy. This trend will require further innovations in statistics, math, visualizations, and computer science. More is different. Zillionics requires a new scientific perspective in terms of permissible errors, numbers of unknowns, probable causes, repeatability, and significant signals.

Deep Simulations – As our knowledge of complex systems advances, we can construct more complex simulations of them. Both the successes and failures of these simulations will help us to acquire more knowledge of the systems. Developing a robust simulation will become a fundamental part of science in every field. Indeed the science of making viable simulations will become its own specialty, with a set of best practices, and an emerging theory of simulations. And just as we now expect a hypothesis to be subjected to the discipline of being stated in mathematical equations, in the future we will expect all hypotheses to be exercised in a simulation. There will also be the craft of taking things known only in simulation and testing them in other simulations—sort of a simulation of a simulation.

Hyper-analysis Mapping – Just as meta-analysis gathered diverse experiments on one subject and integrated their (sometimes contradictory) results into a large meta-view, hyper-analysis creates an extremely large-scale view by pulling together meta-analyses. The cross-links of references, assumptions, evidence and results are unraveled by computation, and then reviewed at a larger scale which may include data and studies adjacent but not core to the subject. Hyper-mapping tallies not only what is known in a particular wide field, but also emphasizes unknowns and contradictions based on what is known outside that field. It is used to integrate a meta-analysis with other meta-results, and to spotlight "white spaces" where additional research would be most productive.

Return of the Subjective – Science came into its own when it managed to refuse the subjective and embrace the objective. The repeatability of an experiment by another, perhaps less enthusiastic, observer was instrumental in keeping science rational. But as science plunges into the outer limits of scale – at the largest and smallest ends – and confronts the weirdness of the fundamental principles of matter/energy/information such as that inherent in quantum effects, it may not be able to ignore the role of observer. Existence seems to be a paradox of self-causality, and any science exploring the origins of existence will eventually have to embrace the subjective, without becoming irrational. The tools for managing paradox are still undeveloped.

source:http://www.edge.org/3rd_culture/kelly06/kelly06_index.html


Kinderstart sues Google over lower page ranking

SAN FRANCISCO, March 18 (Reuters) - A parental advice Internet site has sued Google Inc. (GOOG.O: Quote, Profile, Research), charging it unfairly deprived the company of customers by downgrading its search-result ranking without reason or warning.

The civil lawsuit filed in U.S. District Court in San Jose, California, on Friday by KinderStart.com seeks financial damages along with information on how Google ranks Internet sites when users conduct a Web-based search.

Google could not immediately be reached for comment but the company aggressively defends the secrecy of its patented search ranking system and asserts its right to adapt it to give customers what it determines to be the best results.

KinderStart charges that Google without warning in March 2005 penalized the site in its search rankings, sparking a "cataclysmic" 70 percent fall in its audience -- and a resulting 80 percent decline in revenue.

At its height, KinderStart counted 10 million page views per month, the lawsuit said. Web site page views are a basic way of measuring audience and are used to set advertising rates.

"Google does not generally inform Web sites that they have been penalized nor does it explain in detail why the Web site was penalized," the lawsuit said.

While an entire sub-industry exists to help Web sites feature prominently in Google results, the company is known to punish those who try to trick the system into boosting their search rankings.

The lawsuit notes that rival search systems from Microsoft Corp.'s (MSFT.O: Quote, Profile, Research) MSN and Yahoo Inc. (YHOO.O: Quote, Profile, Research) feature Kinderstart.com at the top of their rankings when the name "Kinderstart" is typed in.

The complaint accuses Google, as the dominant provider of Web searches, of violating KinderStart's constitutional right to free speech by blocking search engine results showing Web site content and other communications.

KinderStart contends that once a company has been penalized, it is difficult to contact Google to regain good standing and impossible to get a report on whether or why the search leader took such action.

The suit was filed the same day a federal judge denied a U.S. government request that Google be ordered to hand over a sample of keywords customers use to search the Internet while requiring the company to produce some Web addresses indexed in its system.

source:http://today.reuters.com/business/newsArticle.aspx?type=ousiv&storyID=2006-03-19T020938Z_01_KRA907614_RTRIDST_0_BUSINESSPRO-TECH-GOOGLE-LAWSUIT-DC.XML


Scientists make 'bionic' muscles

Scientists have developed artificial, super-strength muscles which are powered by alcohol and hydrogen.

And they could eventually be used to make more advanced prosthetic limbs, say researchers at the University of Texas.

Writing in Science, they say these artificial muscles are 100 times more powerful than the body's own.

They said the muscles could even be used in "exoskeletons" to give superhuman strength to people in certain professions, such as firefighters, soldiers and astronauts.

Two types of muscle are being investigated by US researchers at the Nanotech Institute at the University of Texas in Dallas, working with colleagues from South Korea.

Both release the chemical energy of fuels, such as hydrogen and alcohol, while consuming oxygen.

In effect they are replicating the first stage in "breathing" - by taking in oxygen. Existing forms of artificial muscle are driven by batteries.

However, neither of the types developed by the Texan researchers resembles a normal muscle - being made up of wires, cantilevers and glass bottles.

'Mimicking nature'

The most powerful type, "shorted fuel cell muscles," converts chemical energy into heat, causing a special shape-memory metal alloy to contract.

Turning down the heat allows the muscle to relax.

Lab tests showed that these devices had a lifting strength more than 100 times that of normal skeletal muscle.

Another kind of muscle being developed by the team converted chemical energy into electrical energy which caused a material made from carbon nanotube electrodes to bend.

Dr John Madden, from the University of British Columbia in Vancouver, Canada, writing in Science, said "the approach could transform the way complex mechanical systems were built".

He said the artificial muscles mimicked nature in a number of ways.

"The muscle consumes oxygen and fuel that can be transported via a circulation system; the muscle itself supports the chemical reaction that leads to mechanical work; electrochemical circuits can act as nerves, controlling actuation; some energy is stored locally in the muscle itself; and, like natural muscle, the materials studied contract linearly."

But he said the challenge now was to create a circulation system like that of humans that replaces the wires in the artificial muscles.

Dr Madden said pressures needed to be generated so that waste gases could be produced, and the artificial muscles could truly be described as "breathing".


The Great Escape

You're trapped in a high tech Spanish slammer, crawling through real tunnels, behind real bars. First-person gameplay breaks out of the box.

I just fought my way up a wind tunnel, scrambled through a ventilation duct, clambered across 40 yards of rope netting, rolled under a fence, and burrowed through a mass of grapefruit-sized plastic spheres. Now I'm facing two doors. One leads to freedom. The other to a room with something nasty in it, possibly involving torture.

I've got a full sweat going, my pulse is hammering, and the countdown on my wrist-mounted navigation unit tells me I'm running out of time. Minutes ago, a pictogram flashed up at me on a video monitor. Now I have to match it to one of a dozen symbols on a column between the two doors. Pick the correct one and I'm free. Mess up and I'm toast. I make my choice. Bzzzt. The door to my right swings open to reveal a large chair bristling with wires and leather straps.

Until this moment, I thought I had mastered La Fuga.

This medieval-looking electric chair sits deep inside an old bank in Madrid. The building has been remodeled to house La Fuga, a real-life role-playing game. Think of La Fuga (The Escape) as a $20 million cross between Halo and laser tag. The goal is simple: Decipher visual riddles to navigate and escape Mazzina, a high tech prison.

The company behind La Fuga is called Négone. It was founded by a sister-and-brother team, network engineer Silvia Garcia Alonso and former investment banker Jorge, who owned a piece of a dotcom that sold to Yahoo! for $400 million. They put their share of the money into live immersive gaming, starting Négone in 2002 and opening La Fuga last October. "There were lots of advances in in-home entertainment," Silvia says, "but in real-world entertainment, there was nothing happening."

A standard first-person shooter was one option, but the duo wanted something more cinematic. "There are certain plots that work again and again," Silvia says. "Finding treasure, a robbery, a big escape. The idea I think we all have when we see these movies is that it would be great to be the main character."

Creating the game presented both physical and intellectual challenges: They needed to erect a maze of steel and exposed concrete, and they needed to build a database to track the progress of each player through the labyrinth. Négone's coders didn't have to worry about writing the sort of physics-simulation software used in videogames, but Silvia says the logic engine - which keeps track of who's where in the building and what they're doing - gave her fits. "For video RPGs, you can use an off-the-shelf game engine, the way EA or Id does," she says. "But there's nothing that could handle all the kinds of data we need to use, so we had to build it ourselves." Now that the Madrid facility is operational, the company is focusing on opening a game center in Manhattan early next year - with plans for 60 more worldwide in the next decade.
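
Négone hasn't published how its logic engine works, but the behavior described (RFID reads updating each player's location and score) maps onto straightforward event-driven state tracking. A purely hypothetical sketch, with invented reader IDs, rooms, and bonus rules:

    # Hypothetical sketch of a player-tracking "logic engine": each RFID read
    # updates the player's room and score. Reader IDs, rooms, and the bonus rule
    # are invented; this illustrates the bookkeeping, not Négone's actual system.
    from dataclasses import dataclass, field

    @dataclass
    class Player:
        tag_id: str
        room: str = "lobby"
        score: int = 0
        visited: set = field(default_factory=set)

    READER_ROOMS = {"R01": "air duct", "R02": "mirror maze", "R03": "library"}
    BONUS = 200

    def handle_read(player, reader_id):
        player.room = READER_ROOMS.get(reader_id, player.room)
        if reader_id not in player.visited:   # bonus boxes pay out once
            player.visited.add(reader_id)
            player.score += BONUS

    p = Player("wrist-0042")
    for reader_id in ["R01", "R02", "R02", "R03"]:
        handle_read(p, reader_id)
    print(p.room, p.score)   # library 600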

I pay 15 euros, set up an account, and receive a navigational unit with a networked PDA and an RFID chip that I strap to my forearm. The chip tracks my progress through the prison.

I have three lives - three incorrect test answers - before the system will spit me back into the lobby. A kiosk scans the RFID chip in my wrist unit, and I head down a set of stairs and through a dark passageway into a room lined with steel. From a 17-inch flatscreen, a severe-looking woman with slicked-back hair tells me I've been assigned for reprogramming. Everything's in Spanish. I'm accompanied by a translator, but I need no help getting the gist: Resistance is futile.

The screen goes static and then switches to a view of a sweaty prisoner with a 5 o'clock shadow who tells me that I can liberate myself and all the other drones stuck in the prison. Those who have escaped before me will contact me to assist in my quest. The door opens, and I enter a sort of closet before another door opens to reveal a metal air duct. I try to step in, but I slip, fall hard on my ass, and slide down the chute into a room containing a baggage carousel surrounded by screens.

One monitor displays another female executive type extolling the virtues of the prison's reeducation system. I hold my wrist unit to the screen; up pops a member of the resistance, Lieutenant Gunderson, a sultry young woman wearing a combat helmet who claims to be my hacker guide. Between come-hither looks, she tells me I'll need a circuit-welding tool to get to the next level. I can't decide whether she's trying to help me or seduce me. But I need to stay focused. To get the tool, I have to solve a puzzle.

I'm shown four pictures, only one of which contains a pair of perfectly parallel lines. I have 30 seconds to choose the correct one. As the time ticks away, I push a button on my wrist unit: C. Bingo. A circuit-welder icon lights up on my screen.

I reach another door, which opens into another dark room crisscrossed by a maze of metal grates and mirrors, with more mirrors overhead. Gunderson appears on a ceiling-mounted screen, telling me I need a cloak of invisibility. Metal-and-glass-encased RFID readers are scattered throughout the room, each encircled by pulsing yellow LEDs. I hold my wrist unit next to one just to see what happens. Two hundred bonus points! I successfully answer another question, identifying the longest of four tangents extending from a circle. I get the cloak. Gunderson casts more sexy glances in my direction, and a map on the monitor guides me to the doorway out of the labyrinth.

I head up a set of stairs into a room filled with dance-floor fog pierced by red laser beams. I feel like Ethan Hunt in Mission: Impossible. A couple of other players are making their way between beams, contorting themselves and sliding along the floor. My invisibility cloak hides me from the lasers, and I move through the room quickly. I solve another puzzle; a door opens into the library. While most of the rooms are concrete and metal, straight out of Doom, the library is pure Ian Schrager: glowing glass walls with a cluster of monitors in the center. A code flashes up on one of them. Each wall is backlit in blue, orange, or white and contains book-shaped glass bricks. There's a code on each brick. I need to find the matching sequence among the dozens of bricks. I get it and move on.

One final room. The outside walls are lined with monitors, and a smaller glass chamber juts out from the far wall. Another puzzle opens another door, revealing several monitors mounted in a column like totems on a pole. There's a door to my right and to my left. Gunderson tells me I'm almost free. A screen flashes a complicated pictogram. I have to pick the matching one from a dozen similar tiles in various places on the pole. I press a button beneath one of the choices, and it's over - I've won. Back in the Négone lobby, I'm presented with a commemorative mug. It's almost too easy.

I take a walk to Négone headquarters and size up my score - 38,000 - on the company's Web site. Respectable, but the single-round record is more than 50,000. Either I need to start hunting down more of those little bonus-point boxes or there's a better way to do it.

The second time I play, Négone's system knows that I've mastered a basic version, so it sends me on a course that's more physically challenging. The system drives up my pulse rate just before I have to take the tests - by sending me up a flight of steep stairs, for example, or into a wind tunnel. This makes it much harder to concentrate on the puzzles. I reach the final glass chamber with two mistakes, and I think I might be able to pull off another win. Wrong. The right-hand door opens to reveal the electric chair, and I'm shown the exit. It dawns on me that my initial run was Négone's version of the bunny slope.

Next stop: Times Square. Will Négone's planned 30,000-square-foot game center at 49th and Broadway appeal to US gamers? Today's videogame graphics and story lines are so sophisticated that aspects of La Fuga seem a bit canned and low-budget by comparison. La Fuga may be shooting for Halo's apocalyptic look, but the emphasis on discovery and puzzle-solving makes the experience flow more like Myst. The overwrought video clips smack of a bad telenovela. On top of that, the game's most compelling aspect - its physicality - could be too much for gamers used to moving only their thumbs.

But climbing ropes, tunneling through a roomful of plastic spheres, and squeezing through air ducts (just like in Aliens) can be pretty damn exciting - even without the rocket launchers, railguns, and frag grenades you get in the typical RPG. Maybe Négone's games will appeal to parents trying to get their Gen Y progeny out of the living room for a bit of exercise.

Even though I've never seen anything like it, the game somehow seems familiar. Then, on the flight home, it hits me: Eight years ago I went on a hardcore Doom jag for a few months and started having dreams that took place inside the game. That's what playing La Fuga feels like. It's a fully realized dream sequence. One thing I'm painfully aware of during my 13-hour plane ride: Falling down in a real-world game definitely leaves a real-world bruise.

source:http://www.wired.com/wired/archive/14.03/lafuga.html?pg=2&topic=lafuga&topic_set=


Security flaws could cripple missile defense network

The network that stitches together radars, missile launch sites and command and control centers for the Missile Defense Agency (MDA) ground-based defense system has such serious security flaws that the agency and its contractor, Boeing, may not be able to prevent misuse of the system, according to a Defense Department Inspector General’s report.

The report, released late last month, said MDA and Boeing allowed the use of group passwords on the unencrypted portion of MDA’s Ground-based Midcourse Defense (GMD) communications network.

The report said that neither MDA nor Boeing officials saw the need to install a system to conduct automated log audits on unencrypted communications and monitoring systems. Even though current DOD policies require such automated network monitoring, such a requirement “was not in the contract.”

The network, which was also developed to conform to more than 20-year-old DOD security policies rather than more recent guidelines, lacks a comprehensive user account management process, the report said. Neither MDA nor Boeing conducted required Information Assurance (IA) training for users before they were granted access to the network, the report stated.

Because of this poor information security, the DOD IG report said, MDA and Boeing officials “may not be able to reduce the risk and magnitude of harm resulting from misuse or unauthorized access or modification of information [on the network] and ensure the continuity of the system in the event of an interruption.”

David Wright, a senior scientist with the Union of Concerned Scientists, said he was surprised by the network flaws outlined in the report. It “sounds like the kind of stuff routinely done with this kind of network,” he said. “It’s hard to imagine they would design one without it.”

Stephen Young, an MDA analyst at UCS, said the security flaws could affect operation of the entire GMD project. “The network is absolutely essential to GMD…without it, the system can’t work.”

President Bush directed DOD in 2002 to develop GMD to counter missile threats from countries such as North Korea as well as terrorists, and Boeing on its Web site describes the project as “the first missile defense program deployed operationally to defend the homeland against ballistic missile attacks conducted by terrorists or rogue states.”

GMD consists of missile interceptors based in underground silos at Fort Greely, Alaska, and Vandenberg Air Force Base, Calif., and high-powered sea- and land-based radars to track incoming missiles, a Boeing fact sheet said.

Spokesmen for MDA, Boeing and Northrop Grumman, contractor for the unencrypted portion of GCN, all declined to answer questions from Federal Computer Week on the security flaws in the GMD network. Boeing and Northrop Grumman deferred to MDA, and an MDA spokesman said his agency would not answer any press questions until it responds to the IG report on March 24.

Harris Corp., a GCN subcontractor, described the network on its Web site as “the largest synchronous optical networking ring in the world that includes more than 20,000 miles of fiber crossing 30 states and will connect all GMD sites.”

MDA budget documents describe the GCN as a fiber-optic network interconnected with military satellites. These budget documents said the GCN connects the two missile silo sites with control and communications nodes at Fort Greely and at Schriever Air Force Base and the Cheyenne Mountain Operations Center, both in Colorado, as well as radars in Alaska and a test bed in Huntsville, Ala.

source:http://www.fcw.com/article92640-03-16-06-Web&newsletter%3Dyes


Statistical Analysis Bolsters Theory Linking Warmer Oceans to Stronger Hurricanes



Since the 1970s, ocean surface temperatures around the globe have been on the rise--by one half to one degree Fahrenheit, depending on the region. Last summer, two studies linked this temperature rise to stronger and more frequent hurricanes. Skeptics pointed to other factors, such as natural variability, but a new statistical analysis shows that only the increase in sea surface temperature explains the trend.

Climate researcher Judith Curry and her colleagues at the Georgia Institute of Technology looked at the hurricane records for storms between 1970 and 2004 in all of the world's ocean basins, yielding a total sample of 210 seasons across the six regions. They subjected the records to a mathematical test derived from information theory--so-called mutual information, which measures the amount of information two variables share; if the variables share no information at all, the measure is zero.

The researchers then looked at sea surface temperature, specific humidity, wind shear and wind variation over longitude to see what, if anything, these variables shared with the increasing number of strong storms the world over. According to the analysis appearing online today in Science, the trend depends only on sea surface temperature. "If you examine the intensification of a single storm, or even the statistics on intensification for a particular season, factors like wind shear can play an important role," Curry says. "However, there is no global trend in wind shear or the other factors over the 35-year period."
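To make the test concrete, here is a minimal sketch of a binned mutual-information estimate in Python. The data, bin count, and variable names are illustrative assumptions, not the Georgia Tech team's code or dataset; the point is the quantity being measured, which sits near zero when two series share no information and grows when they do.

    # Minimal sketch: estimate mutual information (in nats) between two series
    # by discretizing them into a joint histogram. Synthetic data for illustration.
    import numpy as np

    def mutual_information(x, y, bins=10):
        joint, _, _ = np.histogram2d(x, y, bins=bins)
        p_xy = joint / joint.sum()              # joint probability estimate
        p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of x
        p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of y
        outer = p_x @ p_y                       # product of marginals
        nz = p_xy > 0
        return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / outer[nz])))

    # Illustrative use: 210 hypothetical "seasons" of sea surface temperature
    # anomalies and counts of intense storms (both synthetic).
    rng = np.random.default_rng(0)
    sst = rng.normal(0.0, 0.5, size=210)               # degrees C anomaly
    storms = rng.poisson(lam=np.exp(0.5 + 0.8 * sst))  # intense-storm counts
    print(mutual_information(sst, storms, bins=8))

A real analysis would also need to correct the estimate for the bias introduced by binning a small sample, which is part of what makes the published result more involved than this sketch.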

The link between rising ocean temperatures and overall climate change remains murky because of the overlap between natural cycles and any global warming. "But if you buy the argument that global warming is causing the increase in sea surface temperatures--and everybody seems to be buying this--then it's a pretty small leap to say global warming is causing this increase [in hurricane frequency]," Curry says. Her team will now focus on clarifying the mechanisms at work in the North Atlantic by separating out the 75-year natural cycle and climate change. "The last peak was in 1950, the next is in 2025," she adds. "We're only halfway up [the cycle] and we're already 50 percent worse [in terms of storms]. To me, that's a compelling issue that needs to be confronted." --David Biello

source:http://www.sciam.com/article.cfm?chanID=sa003&articleID=000051A6-DE14-1419-9E1483414B7F0000

This Green Beer's the Real Deal

A time-honored tradition on St. Patrick's Day is to drink green beer. But Brooklyn Brewery won't partake in the ritual. Instead, the brew pub serves up beer that's honey-colored or amber gold. In terms of environmental impact, however, its beers are as green as they come.

Brooklyn Brewery, located on Brewer's Row in Brooklyn, New York, is one of a handful of breweries around the country that use sustainable energy when producing their beer.

Its method of choice is wind power, which provides 100 percent of the brewery's energy needs, making the 1,658,000 gallons of beer it produces green year-round. Brooklyn Brewery's energy bills are 10 percent to 13 percent higher than they would be otherwise, but its operators say leaning on alternative energy just makes sense.

"It's the right thing to do, and not too many years down the road it will be a common choice," said Steve Hindy, founder of Brooklyn Brewery and co-author of Beer School. "If you are going to be in business, it's good to have principles."

Brooklyn Brewery isn't equipped with wind turbines on the warehouse roof, however. Instead, the brewery pays a premium rate to Con Edison so that the 285,000 kilowatt-hours it draws from the grid every year are replaced by energy produced at a wind farm located in Madison County, in upstate New York.

Community Energy, which manages the 20 General Electric wind turbines that produce electricity for the beer house, estimates that Brooklyn Brewery's commitment to green power stops 335,000 pounds of carbon dioxide, 1,500 pounds of sulfur dioxide and 500 pounds of nitrogen oxide from being emitted into the atmosphere annually.

That's equivalent to what 22,000 trees absorb in one year or what is produced by driving a car 290,000 miles.
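As a quick sanity check on those equivalences, the per-unit rates implied by the article's own figures can be backed out directly. The snippet below uses only the numbers quoted above and derives the implied rates; it does not assert any independent emission factors.

    # Back-of-envelope check using only the figures quoted in this article.
    annual_kwh = 285_000   # grid electricity replaced by wind each year
    co2_lb = 335_000       # avoided carbon dioxide, pounds per year
    trees = 22_000         # stated tree-equivalents
    car_miles = 290_000    # stated car-mile equivalents

    print(f"Implied grid emissions: {co2_lb / annual_kwh:.2f} lb CO2 per kWh")
    print(f"Implied absorption:     {co2_lb / trees:.1f} lb CO2 per tree per year")
    print(f"Implied car emissions:  {co2_lb / car_miles:.2f} lb CO2 per mile")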

It's just one of many socially conscious programs that the $12 million beer company runs to make its beverages environmentally friendly.

It also pays farmers in New Jersey to swing by and pick up the "spent grain" -- the remaining husks that are left over after brewing. The farmers then feed the nutritious grains to their livestock, making good use of Brooklyn Brewery's waste.

But Brooklyn Brewery isn't alone in its conservation efforts. New Belgium Brewery in Fort Collins, Colorado, has developed a unique method that uses its waste to power its factory.

After producing its libations, New Belgium puts its waste water inside closed pools filled with anaerobic bacteria. The microbes feed on the water, rich in nutrients from the brewing process, and produce methane gas, which is then pumped back to the factory where it becomes electrical and thermal energy.

Right now, New Belgium meets 30 percent of its energy needs -- between 40,000 and 60,000 kWh per month -- through this cogeneration process. The remaining 70 percent comes from wind, which means no fossil fuels are burned making New Belgium's various beers.
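Taking the 30 percent share and the quoted monthly range at face value, a couple of lines of arithmetic give the brewery's implied total consumption and the wind-supplied remainder. This is a rough sketch based only on the figures above.

    # Rough arithmetic implied by the article's own numbers (assumption: the
    # 30 percent share applies to the 40,000-60,000 kWh/month cogeneration range).
    cogen_low, cogen_high = 40_000, 60_000   # kWh/month from waste-water methane
    share = 0.30                             # cogeneration's fraction of total use

    total_low, total_high = cogen_low / share, cogen_high / share
    wind_low, wind_high = total_low - cogen_low, total_high - cogen_high
    print(f"Implied total use:  {total_low:,.0f}-{total_high:,.0f} kWh/month")
    print(f"Implied wind share: {wind_low:,.0f}-{wind_high:,.0f} kWh/month")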

The waste water, once cleaned by the anaerobic microbes, is reused in the factory for cooling and cleaning. Then the water is retreated and returned to the municipal system.

The system cost $5 million to set up, but Kim Jordan, CEO and co-founder of New Belgium, said the investment was worth every penny.

"It's a gratifying way to use money, to try and push the envelope and the practice of alternative energy," she said. "It's our goal to completely close that loop, so all our energy use comes from our own waste stream."

source:http://www.wired.com/news/technology/0,70361-0.html?tw=wn_index_4


Ringside Seat to the Universe's First Split Second

You don't get much closer to the big bang than this.

Scientists peering back to the oldest light in the universe have evidence to support the concept of inflation, which posits that the universe expanded to many trillion times its size, faster than a snap of the fingers, at the outset of the big bang.

The expansion of the universe over most of its history has been relatively gradual. The notion that a rapid period of "inflation" preceded the Big Bang expansion was first put forth 25 years ago. The new WMAP observations favor specific inflation scenarios over other long-held ideas. [Image: Time Line of the Universe. Credit: NASA]

We're talking about when the universe was less than a trillionth of a trillionth of a second old. In that crucial split second, changes occurred that allowed for the creation of stars and galaxies hundreds of millions of years later.

The new finding was made with NASA's Wilkinson Microwave Anisotropy Probe (WMAP) and is based on three years of continuous observations of the cosmic microwave background, the afterglow light from the first moments of the universe.

It's admittedly mind-boggling. Inflation posits that the universe expanded far faster than the speed of light and grew from subatomic size to golf-ball size almost instantaneously. This concept, however, was a mere product of calculations done with pencil and paper around 1980. The idea stands on much firmer ground today.

"Inflation was an amazing concept when it was first proposed 25 years ago, and now we can support it with real observations," said WMAP team member Dr. Gary Hinshaw of NASA Goddard Space Flight Center in Greenbelt, Md., a lead author on one of the scientific papers submitted for publication.

How do we gaze back to the infant universe? The cosmic microwave background is a fossilized record of what occurred way back when. Embedded in this light are subtle patterns that point to very specific conditions in the early universe.

Previous observations have focused on the temperature patterns of this light, which have provided an accurate age of the universe and insights into its geometry and composition. The temperature differences, varying by about a millionth of a degree, point to density differences---a little more matter here, a little less matter there. Over the course of millions of years, gravity exploited the density differences to create the structure of the universe---stars and galaxies separated by vast voids.

The new WMAP observations give not only a more detailed temperature map, but also the first full-sky map of the polarization of the microwave background.

This major breakthrough enables scientists to obtain much deeper insight into what happened within the first trillionth of a second, when cosmic inflation perhaps occurred. The polarization signal is at least 100 times fainter than the temperature signal.

The WMAP team is announcing two major results: evidence for cosmic inflation, and confirmation of when stars first turned on. Both results depended on a combination of temperature and polarization data.

WMAP finds that the first stars---the forebears of all subsequent generations of stars and of life itself---were fully formed remarkably early, only about 400 million years after inflation. This is called the era of reionization, the point when the light from the first stars ionized hydrogen atoms, liberating electrons from the protons.

Polarization is affected by the environment through which the light passes, such as the polarized glare of sunlight produced when it reflects off a shiny object. Scientists are hunting for two kinds of polarization signals in the microwave background. One, called the E-mode, points to the era of reionization. This is the polarization caused by the microwave background scattering off the ionized hydrogen. The other is called B-mode, which points directly to inflation.

WMAP has detected E-mode polarization but not yet B-mode. A B-mode detection could provide smoking-gun evidence for inflation. But with the temperature map plus the E-mode polarization map, the WMAP team can say several things about inflation.

For example, scientists now have an upper limit on the energy of inflation. Also, WMAP data support basic predictions of inflation about the size and strength of spacetime fluctuations and how they get weaker on smaller length scales.

"It blows my mind that we can now distinguish between different versions of what happened within the first trillionth of a second of the universe," said Dr. Charles Bennett of the Johns Hopkins University in Baltimore, WMAP principal investigator.

And it's only going to get better as WMAP continues to soak up light. The polarization detection will grow stronger. "The longer WMAP observes, the more it reveals about how our universe grew from microscopic quantum fluctuations to the vast expanses of stars and galaxies we see today," Bennett said.

The European Space Agency plans to launch a mission called Planck by 2008 that will study microwave background polarization. A proposed NASA Beyond Einstein inflation probe would search for B-mode signals, the calling card of inflation.

WMAP was launched on June 30, 2001, and is now a million miles from Earth in the direction opposite the Sun. The WMAP team includes researchers at NASA Goddard; Johns Hopkins University; Princeton University; the Canadian Institute of Theoretical Astrophysics in Toronto; University of Texas at Austin; Cornell University; University of Chicago; Brown University in Providence, R.I.; University of British Columbia; University of Pennsylvania in Philadelphia; and University of California, Los Angeles.

source:http://www.nasa.gov/vision/universe/starsgalaxies/wmap_pol.html
