Wednesday, June 07, 2006

Making Science Machine Readable

"New Scientist is reporting on a new open source tool for writing up scientific experiments for computers, not humans. Called EXPO, it avoids the many problems computers have with natural language, and can be applied to any experiment, from physics to biology. It could at last let computers do real science - looking at published results and theories for new links and directions."

source:http://science.slashdot.org/science/06/06/07/1442221.shtml

Personal data on 2.2 million troops stolen

WASHINGTON (Reuters) - Personal information on about 2.2 million active-duty, National Guard and Reserve troops was stolen last month from a government employee's house, officials said on Tuesday in the latest revelation of a widening scandal.

This means nearly all current U.S. military personnel may be at risk for identity theft, the Pentagon said.

The Department of Veterans Affairs said the information, including names, Social Security numbers and dates of birth, may have been stored in the same stolen electronic equipment that contained similar personal data on 26.5 million U.S. military veterans.

Lawmakers and veterans' advocates have expressed alarm that the government failed to safeguard the data, which in the wrong hands could be used in credit card fraud and other crimes.

Law enforcement agencies investigating the incident have no indication the stolen information has been used to commit identity theft, officials said.

Veterans Affairs Secretary Jim Nicholson disclosed last month that unidentified burglars on May 3 broke into the Maryland residence of a Veterans Affairs data analyst who had violated official procedures by taking the data home. The thieves stole equipment containing the veterans' data.

The government over the weekend said personal information on about 50,000 active-duty, National Guard and Reserve personnel may also have been involved in the theft.

But now Veterans Affairs said that as it and the Pentagon compared electronic files, officials discovered that personal information on as many as 1.1 million military members on active duty, 430,000 National Guard troops and 645,000 members of the Reserves may have been taken in the theft.

'CAREFULLY MONITOR'

Bryan Whitman, a Pentagon spokesman, said, "We want to encourage service members to be vigilant and carefully monitor their personal information and any statements related to recent financial transactions."

Whitman said the Pentagon was helping Veterans Affairs inform the affected military personnel about steps they can take to protect against identity theft.

The Department of Veterans Affairs said it receives records for all military troops because they become eligible to receive certain benefits, such as GI Bill educational assistance and a home-loan guaranty program.

Veterans groups have criticized the government for allowing personal data to be compromised and for responding slowly to the theft. Officials have said Nicholson first heard of the May 3 crime on May 16 and only informed the public on May 22, almost three weeks after the theft occurred.

"VA remains committed to providing updates on this incident as new information is learned," Nicholson said in a statement.

Nicholson previously has said the senior career data analyst who took the information home would be fired and that a senior official in whose office the employee worked had been placed on administrative leave. Another senior Veterans Affairs official has resigned.

The burglary from the employee's home in Aspen Hill, Maryland, involved a laptop computer with an external disk drive, officials have said.

Identity theft, or obtaining personal or financial information about someone else to make transactions in that person's name, has mushroomed with the growth of the Internet and electronic business.

source:http://today.reuters.com/news/newsArticle.aspx?type=domesticNews&storyID=2006-06-07T024423Z_01_N06435124_RTRUKOC_0_US-CRIME-USA-VETERANS.xml


Scientists to try to clone human embryos

Stepping into a research area marked by controversy and fraud, Harvard University scientists said Tuesday they are trying to clone human embryos to create stem cells they hope can be used one day to help conquer a host of diseases.

"We are convinced that work with embryonic stem cells holds enormous promise," said Harvard provost Dr. Steven Hyman.

The privately funded work is aimed at devising treatments for such ailments as diabetes, Lou Gehrig's disease, sickle-cell anemia and leukemia. Harvard is only the second American university to announce its venture into the challenging, politically charged research field.

The University of California, San Francisco, began efforts at embryo cloning a few years ago, only to lose a top scientist to England. It has since resumed its work but is not as far along as experiments already under way by the Harvard group.

A company, Advanced Cell Technology Inc. of Alameda, Calif., is trying to restart its embryo cloning efforts. And British scientists said last year that they had cloned a human embryo, though without extracting stem cells.

Scientists have long held out the hope of "therapeutic cloning" against diseases like diabetes, Parkinson's disease and spinal cord injury. But such work has run into ethical objections, a ban on federal funding and the embarrassment of a spectacular scandal in South Korea.

Now, using private money to get around the federal financing ban, the Harvard researchers are joining the international effort to produce stem cells from cloned human embryos.

"We're in the forefront of this science and in some ways we're setting the bar for the rest of the world," said Dr. Leonard Zon of the Harvard Stem Cell Institute.

Dr. George Daley of Children's Hospital Boston, a Harvard teaching hospital, said his lab has begun its experiments. He declined to describe the results so far, saying the work is in very early stages.

Two other members of the Harvard Stem Cell Institute, Douglas Melton and Kevin Eggan, have also received permission from a series of review boards to begin human embryo cloning, the institute announced.

Daley's work is aimed at eventually creating cells that can be used to treat people with such blood diseases as sickle-cell anemia and leukemia. Melton and Eggan plan to focus on diabetes and neurodegenerative disorders like Lou Gehrig's disease, striving to produce cells that can be studied in the lab to understand those disorders.

"We think that this research is very important, very promising, and we applaud Harvard for taking the initiative to move this work forward," said Sean Tipton, president of the Coalition for the Advancement of Medical Research, which supports cloning to produce stem cells.

Cloning an embryo means taking DNA from a person and inserting it into an egg, which is then grown for about five days until it is an early embryo, a hollow ball of cells smaller than a grain of sand. Stem cells can then be recovered from the interior, and spurred to give rise to specialized cells or tissues that carry the DNA of the donor.

So this material could be transplanted back into the donor without fear of rejection, perhaps after the disease-promoting defects in the DNA have been fixed. That strategy may someday be useful for treating diseases, though Daley said its use in blood diseases may be a decade or more away.

Daley's current research is using unfertilized eggs from an in-vitro fertilization clinic and DNA from embryos that were unable to produce a pregnancy. Both are byproducts of the IVF process and should provide a ready supply of material for research, Daley said in a statement. Later, his team hopes to use newly harvested eggs and DNA from patients.

Eggan said he and Melton will collaborate on work that uses DNA from skin cells of diabetes patients and eggs donated by women who will be reimbursed for expenses but not otherwise paid.

Harvesting stem cells destroys the embryo, one reason that therapeutic cloning has sparked ethical concerns. The Rev. Tad Pacholczyk, director of education for the National Catholic Bioethics Center in Philadelphia, said he found the Harvard developments troubling.

By cloning human embryos to extract stem cells, he said, "you are creating life precisely to destroy it. You are making young humans simply to strip-mine them for their desired cells and parts. And that is at root a fundamentally immoral project that cannot be made moral, no matter how desirable the cells might be that would be procured."

Apart from the controversy, human embryo cloning has also been the subject of a gigantic fraud.

Hwang Woo-suk of Seoul National University in South Korea caused a sensation in February 2004 when he and colleagues claimed to be the first to clone a human embryo and recover stem cells from it. He hit the headlines again in May of last year when he said his lab had created 11 lines of embryonic stem cells genetically matched to human patients.

But the promise came crashing down last December and January when Hwang's university concluded that both announcements were bogus.

On the Net:

Harvard Stem Cell Institute: http://stemcell.harvard.edu


Children's Hospital Boston: http://www.childrenshospital.org



source: http://news.yahoo.com/s/ap/20060606/ap_on_sc/harvard_cloning


Puritan Work-Ethic, How I Loathe Thee

I do not like work, even when someone else does it -- Mark Twain

Most video games are a rip-off.

Nearly every video game since "tank pong" has buried its best content behind layers of work. Unlike any other retail product I can think of, when you buy a video game, the chances that you will actually get what you paid for are infinitesimal. I can't think of a single game I've played where I am confident that I've seen every single level; unveiled every coveted secret; unlocked every whatsit and pretty and soundtrack left like kipple by the designers in the dark corners of the code.

I bought it. I want my game.

When I buy a book, the only thing that stands in the way of completing it is the page count. And I know the page count going in. I can read the last chapter standing in the bookstore. If I find Chapter 56 too cumbersome, I can skip it entirely and move on to the better parts of Moby Dick (there are better parts, trust me).

Sometime about 10 years ago, we started measuring video games in terms of "hours". A game that gave you 5 hours of gameplay was somehow a ripoff. A game that proffered 100 was some kind of opus. But the reality is that most gamers play a small fraction of even those 5 hours. Let's face it, a lot of games suck. I buy the game. I play it for an hour or two. I see the pretty. I hear the boom. I go "cool" at the twist or the plot or the theme that made me want to buy it in the first place. Then back it goes into the GameStop "used" bin.

Some genres are bigger offenders, bigger overall ripoffs, than others.

From Doom to Halo, first person shooters have propagated the evil concept of levels. I'm playing level 12. Level 12 sucks. I suck. I've killed random demon #212 and I'm 5 minutes past the last save point. But I have to get through level 12 to play level 13 where the giant pretty monster of death awaits, and that's supposed to be the coolest thing ever. Excuse me, I bought the game. I'd like to go directly to the pretty monster of death.

Zelda, Oblivion, and all the role playing games in-between take this curse of Calvinist suffering and at least attempt to justify it through story. If I want to meet the Grey Fox I've got some thieving to do first. These games are marginally more palatable and feel like less of a ripoff, because at least the barriers to the content are logical. But as a capitalist consumer, they still keep me from getting what I paid for without an additional investment of hours and hours of time.

But by far the worst offenders are MMORPGs. Oh how the hours have drained from my life as I've made cloth caps or shot rabbits solely to get to the shiny I've ostensibly already paid for with my $14.95. Even highly refined and otherwise excellent games like World of Warcraft, or more recently Guild Wars: Factions, suffer from this curse. Hey, at least with the offline offenders I can spend half an hour with my friend Google to find a magic "cheat" that gives me the hollow satisfaction of sneaking out what I paid for in the first place.

I play games to escape. To go somewhere else. But our industry has so ingrained this concept of "earning" our fun that the best is somehow always saved for last. Like modern day Puritans, we've convinced ourselves that we are not worthy of that for which we've already paid. Sinners in the hands of an angry god, we don't deserve our fun until we pay in blood.

But verily I say unto you all is not lost. I think we might--just might--be in the midst of a rebellion. Increasingly, developers realize that to be successful they must treat the consumer with respect. Most FPS titles are now designed for content-leveling multiplayer from the ground up, or at least contain enough plot and story that they break the "level" mentality. Even Battlefield 2 with its unlocks doesn't systematically leave the casual player subject to the eternal boot-licking scorn of the cheese-eating high school student. The Grand Theft Auto series, while maintaining some traditional story-based barriers, is popular precisely for the sense of freedom it evokes. There's still content below the surface, but you feel like you can go find it if you want it.

Real Time Strategy games, which are often conceived entirely as competitive multiplayer experiences, have dropped their content barriers as well. A few hours with a modern RTS title and most players will have seen the full spectrum of what the game has to offer, and can focus on (heaven forfend) actually playing the game. Sims (all kinds) break the very mold of the problem. In Microsoft Flight Simulator 2004: A Century of Flight (FS9), you can fly any plane, from any location in the world, at any time, in any conditions. All that stands in your way is your own competence. Sandbox games like Spore (we hope!) and Second Life replace the very concept of levels with building and evolution--violent and competitive evolution occasionally--but not illogical puritan denial.

Perhaps this curse of puritan work-ethic is easy to understand. Games are supposed to be challenging. When the opponent is simply a machine, there needs to be some reason to keep playing, a goal. The only coin the game can offer is what it can hide in the bits and bytes of the game itself.

But you know what? Real life is hard enough. Give me back my game.

source:http://www.gamerswithjobs.com/node/24946


Blizzard's 'Secret Sauce'

In 1991, the internet didn't exist.

That is to say, it did exist (and had for some time), but to the majority of Americans it might as well have been a heffalump until the creation of the World Wide Web in (approximately) 1992, when the internet would begin to become both widely understood and easy-to-use (therefore "of interest" to most people).

Yet in 1991, the internet (such as it was) was neither widely understood nor easy-to-use, which is why the prospect of playing games on the internet may have seemed like a good and bad idea simultaneously. On one hand, nobody was doing it yet - it was a virgin market; on the other, nobody was doing it yet - the risks were terrible.

In 1991, videogame industry leader Sierra launched the Sierra Network (later called the ImagiNation Network). It was geared more-or-less toward children, with cartoon-ish art and themes, but it allowed users to play a variety of games and chat with friends in online chat rooms - all for an hourly fee, of course. It was, in every way, ahead of its time.

Particularly in terms of what users were willing to pay. At one point, the hourly rate for access to Sierra's network had climbed as high as $6 per hour. This was in addition to the subscription fees users were already paying for dial-up access to the internet itself and (in some extreme cases) long distance telephone charges levied by the telephone company. By contrast, many telephone sex chat services charged less than half that amount.

The Sierra Network, not surprisingly, failed and was shut down in 1996 by AOL, who had acquired it from AT&T. Ironically, this was not too long after the internet had become both widely understood and easy-to-use, and right around the same time that several other online gaming services had begun to flourish. Among them, an exciting new service offered by a company called Blizzard.

The Sleeper Has Awakened
In 1992, a revolutionary videogame was released that captured the imaginations of gamers the world over, almost immediately selling half a million copies. One of the first "real-time strategy" games ever made, it tasked the player with building a virtual army by collecting resources and then constructing buildings that would produce their machines of war - all in "real time." While the player was at it, their "enemy" was doing the same, building up to an eventual showdown between the competing armies, after which one side would claim total victory. Whoever had the most machines or the best strategy would win the day. It was like chess combined with backgammon wrapped up in an erector set, and gamers loved it.

That game was not Warcraft.

Westwood Studios' Dune II, predating Warcraft by at least two years, was based on the science fiction books by Frank Herbert, and cast the player as one of three races bent on controlling the spice-infested planet of Arrakis. It has been described as among the best PC games ever made, and many still consider it the best example of its genre ever made. Yet, it was not without its share of problems.

As with any game based on a license, Dune II relied on the players' familiarity with the premise of the original works. The Dune series had sold millions of copies of books world-wide, and had been made into a feature-length film in 1984, but to many people, the story was simply too dense to get their heads around. Case in point: The resource Dune II players were tasked with mining, the spice "Melange," took Herbert an entire novel to attempt to explain. Called "the spice of spices" in his appendices, the fictional Melange has been credited with prolonging life, allowing users to foresee the future, astrally project objects through time and space, turn people's eyes blue and make giant worms try to kill you. "Catchy" is not the first word which comes to mind here.

Still, the game was among the first of its kind, and as such is fondly remembered and universally considered the grandfather of the RTS genre. The criticism of its universe did not prevent Westwood from controlling RTS production for almost a decade, but combined with the soon-to-be glaring lack of multiplayer capability, did leave a hole large enough for rival Blizzard to drive an entire franchise through.

How the West Was Won
Officially founded in 1991 as Silicon & Synapse, Blizzard Entertainment had been making their bones producing console titles and second-rate DOS games like Battle Chess II (1990) and The Death and Return of Superman (1994). As with any business, their goal in the first few years was to simply survive. Condor Software co-founder Dave Brevik explains early corporate life by saying "console games were paying the bills."

He would know - Condor was doing the same. Founded by Brevik in 1993 with Max and Erich Schaefer, Condor had been making ends meet by developing low-budget console titles. Then, they got a call from publisher Sunsoft to develop a comic book franchise title for the Sega Genesis.

Dave Brevik tells the story: "We were developing a fighting game (like Street Fighter) using [DC's] Justice League characters ... [Part-way] through development, we got approval to show the game off at CES. This was before E3 existed."

source:http://www.escapistmagazine.com/issue/48/3


Intel's Sales Down, Current Gen of Products Weak

"According to an article in EETimes, Intel's processor sales dropped 52 percent this April as compared with April one year ago. Unit sales dropped 21 percent and prices dropped 40 percent. The article concludes with an industry analyst's assertion that 'Intel has obviously given up on making any money on their current generation of processors and has started a price war with AMD.' The San Jose Mercury News is reporting that Intel has just put several of its money-losing communications businesses up for sale and notes that 'it remains to be seen what Intel will do with its other money-losing businesses, Itanium microprocessors and flash memory chips.' The article quotes an industry analyst saying 'If you look at Intel today, it's hard to find a trace of the technology or the people that they spent more than $10 billion on.' Ouch."

source:http://hardware.slashdot.org/article.pl?sid=06/06/06/1538243

Hidden Dimensions - It's No Game at Apple

Every once in a while I read a story on the Internet that just doesn't seem to sync with my experiences at Apple. Most of the time, the story is derived from what the author wishes Apple would do for their own benefit. Or perhaps, more generously, they see it as a benefit to Apple, but the perception is based on an incomplete understanding of Apple.

Recently, I read a story about Apple that questioned why Apple hasn't been more active in the gaming world. Let's just start with the general premise that many people who are enthusiastic about Apple and its products and are enthusiastic gamers often express disappointment that Macintoshes aren't stronger gaming platforms and that Apple doesn't seem to ever take steps to make it one. The idea, of course, is that if Macs were supreme game computers, sales would go up. What could be better?

What I'm going to express next is just my opinion, but an opinion derived from experience: Apple has no real corporate interest in the gaming community and does not see computer games as a path to success or a better image for Apple. That's not to say that some parts of Apple don't enjoy games and their promotion. Just look. But the reality is that Apple has struggled for a long time to avoid the perception that Macs are toys, and so their principal emphasis is on science, small business, education, and the creative arts. All very grownup stuff. If a market doesn't appear on Apple's main page tab, you can be sure it's a secondary market.

Of course, all that may seem obvious to many observers of Apple. And yet, many continue to wish that their favorite computer company would put so much effort into the market that the Mac would become the premier game platform. Right now, that's not a realistic expectation.

One reason is the practical realities of business. Historically, the slim profit margins for modestly priced games require large sales numbers to recoup the investment and turn a profit. For years and years, Apple's market share has barely been sufficient to entice game developers, although there are some notable exceptions.

In my view, this long drought in the gaming business has allowed Apple management to reflect on how they really feel about the game business. Especially during the time that the iCEO became the CEO. To some extent, the recent "Get A Mac" commercials provide some insight into Apple's thinking.

Note that gaming relates to power. The user is in control of his universe and seeks to exert his will. So any discussion of games has to include the utilization of power.

First off, let's look at some facts.

1. Without making any judgments and without getting into a discourse on current military events, it is nevertheless no secret that Steve Jobs has concerns about some components of the military and its leadership. Now that's a complex statement because it has a lot of overtones that I don't need to get into. Because you don't earn respect by being disrespectful, any further comment is irrelevant.

2. The "Get a Mac" ads say something subtle about power. Recall what I said previously about the two actors representing the computer, not the user. There is some additional, subtle symbolism in those ads that says something about Apple's public (not internal) image of power. The PC, who wears a suit, is the computer that's used as an instrument of power. Having been in federal sales, I can tell you that the U.S. Air Force and the U.S. Navy have embraced Microsoft almost completely. [1] The PC can be taken as an instrument of willfulness and power that shouldn't be but often is abused in that role.

3. If you look at the WWDC 2006 list of presentations, there is very little explicit material on gaming. The tracks are focused on core OS technologies and information technologies. And while there has always been a gaming center at WWDC where young developers are kept entertained, you'll see very little high level emphasis. It's just something that is tolerated and allowed to grow and flow at its own pace.

4. Apple sells consumer and professional computers. They differentiate them based on the power of the graphics subsystem. The message there is clear: If you're a professional, you should be editing movies with Final Cut HD or manipulating RAW photos with Aperture. If you're a consumer on a more modest budget, then you get hardware more oriented towards writing and surfing. This is a clear marketing message from Apple that de-emphasizes games for the consumer, no doubt about it.

On the other hand, those who are really into games want the fastest possible hardware and the lowest possible cost. It doesn't take long to find a litany of negative comments on the Internet about how Apple's most affordable consumer systems are just not up to serious gaming. Rather than a reason to complain, this should be taken as an outward sign of Apple's most serious branding intentions:

Yes, games are fun, and we love many of them, but this is not the most significant message we want to deliver as a company.

This mixed message confuses and annoys many Apple customers.

I want to close with a comment on why Apple's culture is so mixed on the subject of games. I think it's a recognition by Apple's management that this is a fact of life for most of its younger employees. But amongst many more senior managers, including Steve himself, I suspect there is some lingering concern about the essence of the game market. Computer games, as we've come to know them, are mostly (not always) about aggressive behavior, conflict, battle, wars of power, domination, and sometimes, in the worst cases, some very unwelcome social behavior. To put it bluntly, death and destruction.

Apple's public culture appears to celebrate, on the other hand, creation and life. When you have several hundred senior managers at Apple who are most likely married and typically have children, you'll find a culture of affirmation, family, and life. There have been many instances of Steve doing a keynote and demoing, say, iMovie, in which children are involved. More than once, I heard Steve say, after editing one of those movies on stage, "This is why we do what we do."

Games are a part of life, learning, and growing. Some computer games have terrific redeeming value, and many do not. Action movies and games permeate our culture, and in some ways, they just can't be ignored in our day-to-day lives. But that doesn't mean that Apple's management believes that considerable emphasis needs to be placed on this market when there are so many other more important things for people to do with their lives and their computers.

Remember, it's not in Apple's culture to hold people back. They create insanely great tools for people to build whatever their imagination can conjure up. In addition, Apple could try to build the greatest game machine on earth. Silicon Graphics, Inc. (SGI) built those kinds of expensive graphics toys for years. Today, they are in bankruptcy.

Finally, Apple likes control. They need and love to manage and control the image of their company. If Apple computers were to become the darling of the gaming industry, then the natural evolution of the worst driving out the best would infect their culture. So Apple doesn't mind supporting game developers, but they just don't want to let outrageous success in gaming cause them to lose control of the Apple message.

I know, it's contradictory and complex. But that's the hidden dimension of Apple.

source:http://www.macobserver.com/columns/hiddendimensions/2006/20060605.shtml

