Friday, February 24, 2006
Enzyme computer could live inside you
A molecular computer that uses enzymes to perform calculations has been built by researchers in Israel.
Itamar Willner, who constructed the molecular calculator with colleagues at the Hebrew University of Jerusalem in Israel, believes enzyme-powered computers could eventually be implanted into the human body and used to, for example, tailor the release of drugs to a specific person's metabolism.
The team built their computer using two enzymes - glucose dehydrogenase (GDH) and horseradish peroxidase (HRP) - to trigger two interconnected chemical reactions. Two chemical components - hydrogen peroxide and glucose - were used to represent input values (A and B). The presence of each chemical corresponded to a binary 1, while the absence represented a binary 0. The chemical result of the enzyme-powered reaction was determined optically.
The enzyme computer was used to perform two fundamental logic computations known as AND (where A and B must both equal one) and XOR (where A and B must have different values). The addition of two further enzymes - glucose oxidase and catalase - connected the two logical operations, making it possible to add together binary digits using the logic functions.
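In Boolean terms, the two connected gates form a half adder: XOR yields the sum bit and AND yields the carry bit. Here is a minimal Python sketch of that logic (our own illustration, not the team's code), with 1 standing for "chemical present" and 0 for "absent":

    # Half-adder logic equivalent to the enzyme gates described above.
    def half_adder(a: int, b: int) -> tuple[int, int]:
        sum_bit = a ^ b      # XOR: 1 when exactly one input chemical is present
        carry_bit = a & b    # AND: 1 only when both input chemicals are present
        return sum_bit, carry_bit

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, half_adder(a, b))  # e.g. inputs 1,1 -> (0, 1), i.e. binary 10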
Intelligent drug delivery
Enzymes are already widely used to assist calculations using specially encoded DNA. These DNA computers have the potential to surpass the speed and power of existing silicon computers because they can perform many calculations in parallel and pack a vast number of components into a tiny space.
But Willner says his enzyme computer is not designed for speed – it can take several minutes to perform a calculation. Rather, he envisages it eventually being incorporated into bio-sensing equipment and used, for example, to monitor and react to a patient's response to particular dosages of a drug.
"This is basically a computer that could be integrated with the human body," Willner told New Scientist. "We feel you could implant an enzyme computer into the body and use it to calculate an entire metabolic pathway."
Martyn Amos of the University of Exeter, UK, also sees great potential for such devices. "The development of fundamental devices such as counters is vital for the future success of bio-molecular computers," he told New Scientist.
"If such counters could be engineered inside living cells, then we can imagine them playing a role in applications such as intelligent drug delivery, where a therapeutic agent is generated at the site of a problem," Amos says. "Counters would also offer a biological 'safety valve', to prevent engineered cells proliferating in an uncontrolled fashion."
Journal reference: Angewandte Chemie International Edition (vol 45, p 1572)
source:http://www.newscientist.com/channel/info-tech/dn8767.html
Kim Talks Halo 3 and Second Wave
Microsoft Game Studios' Shane Kim discusses Halo 3, the second wave of Xbox 360 games, and lessons learned from the recent platform launch
We caught up with Shane Kim at the recent DICE summit to talk about the various issues surrounding first and third party Xbox 360 development. Not surprisingly, we wanted to talk about Halo 3, but Kim points out there's a whole lot more going on at Microsoft than that game...
Next Generation: Let's talk about Halo 3. What are we up to?
Shane Kim: I call it The Mythical Halo 3, because we haven't announced any such game yet! Obviously the Halo franchise is very important to us. When you have Bill Gates being quoted fairly constantly, talking about a game, you know it's important to the company. But his recent comments reflect the position accurately: if there were a Halo 3, we would be careful about how we announce and introduce it.
It's exactly the same way we talked about Halo 2, where people wanted it a year after Halo. That would not have accomplished anything.
Is it coming out this year?
It depends. If it's the game that everyone is expecting then, yes. For us it's about making a proper impact on the platform. It has to be something with huge significance, so we won't be rushed.
We don't want all the hype and speculation to overshadow some of the great titles we do have coming this holiday and thereafter. Gears of War has an incredible level of anticipation. I think it has more anticipation than Halo did before Halo came out.
If you remember the E3 before the launch [of Xbox], a lot of people were not sure that Halo would be so great. We learned a lot of lessons at that E3. We chose not to show a lot of titles at this past E3. For people in the industry, their first assessment of a title is based on the visual impact.
PlayStation 3 announcements are coming. When they do, will you guys be making amendments to your plans?
I don't think anything is going to change. We've had our plans in place for some years now. The development of titles, of hardware and of Xbox Live takes many years. Our portfolio for the next few years is well on the way.
Those guys [SCEA] tend not to share their plans with us, so we pretty much develop our own strategy based on our vision and where we want to go. Leaders can't afford to be reactors. We have a very good plan and we will execute that.
What have you learned in the last few months since the launch of Xbox 360?
Some of the innovations on the platform have created new opportunities for us that were not available in the first generation of games. Xbox Live Arcade is an example: we are learning new ways to extend our relationship with customers.
We are starting to stretch the way we imagine what the platform is really capable of delivering, as are the developers who have spent more time with the hardware for the next wave of titles. They are really showing the power of the platform and the power of Xbox Live.
What are the big bets coming up?
We made some pretty significant announcements at X05, but the attention then was focused on the [hardware] launch. We talked about the Crackdowns and the Too Humans or the Mass Effects, and you'll be able to see more from those titles soon, as well as Gears of War.
How much of an improvement will we see in the second wave?
Our general strategy has been to focus on quality rather than quantity. You'll hear a lot more about those four titles at E3. We have a few other things that we are working on that we'll announce between now and E3, so we haven't told the world about everything.
The hardware shortages must have had an effect on third parties' desire to release games at this time. Surely, there'll be a gap until the installed base is bigger?
There's always a trade-off when you are the first to go out on a platform. The overall portfolio is smaller so there is less competition at the start of the platform life cycle and that leads some publishers to want to be there. But they understand that the installed base is smaller.
Some people also have this desire to lead from a creative standpoint and to show their ability to harness the power of a new platform, so they want to jump in earlier. Other people have a business model that is predicated on selling to a larger installed base, and those people, who maybe had plans to be out in the second quarter, would like to see more units out there.
We're working hard to produce as many units as possible. I'm pretty confident we'll catch up in the next few weeks and be able to supply all the demand. Of course we are gratified by the overwhelming demand for Xbox 360, but we would have liked to have sold more units, had they been available. I don't see it as lost sales though, only as a time-shift.
Looking back at the launch, and leaving shortages aside, any regrets?
Any game developer will tell you they wish they had more time with the hardware. That's always the case. If you look at the complexity of this console and you add Live...
With the first generation of Xbox we waited a year before launching Live; this year everything went out together. It would have been better to have more time with the development hardware. But this is not a trivial undertaking. It's very challenging to create this hardware and a network like this.
But I look at the quality of the launch games that came out and that is how to analyse the platform's success. Call of Duty is a great game, and Perfect Dark, Gotham, Kameo, Need For Speed are all great games. The launch portfolio was full of good games. There wasn't anything quirky that might hurt the platform or anything that really rose to the top. It was a level playing field for everybody.
Could it be argued that you had too many titles at launch?
If we had gone with half of the titles, you would have more people criticizing us for having too small a portfolio. The breadth and quality of the portfolio offered something for everyone both from the first and third parties. It appealed to a broad audience, not just the hardcore.
Choice is never a bad thing, unless there is a degradation in quality and I don't think we had that situation. It was the strongest launch portfolio in history.
Some have said that the launch lacked a single killer game.
I don't think that's fair. Everyone is looking for a Halo and a GTA, but we have to realize there are only two of those titles in history, and they weren't those big legends before they launched.
It is easy to look back and say 'nothing the size of Halo came out', but the launch titles were all solid good games and there was some amazing work.
I would put this portfolio up against the original Xbox launch portfolio any day. I don't see the fact that there wasn't one game that everyone wanted to buy as a criticism. The strength was in the variety.
If you look at the attach rate, it speaks to the quality of the portfolio: it's the highest attach rate in history, which is an indication of real quality.
source:http://www.businessweek.com/innovate/content/feb2006/id20060221_374710.htm
Health concerns limit wireless Internet at Lakehead University
Students at Lakehead University won’t be getting campus-wide wireless Internet access any time soon. That’s because president Fred Gilbert won’t allow it until he’s satisfied EMF (electric and magnetic field) exposure doesn’t pose a health risk, particularly to young people.
Gilbert, who was interviewed last week on the CBC about the university’s policy as stated in a town hall meeting last fall, told ITBusiness.ca he based his decision on scientific literature that indicates the potential for “some fairly significant” health consequences.
“These are particularly relevant in younger people (who have) fast-growing tissues, and most of our student body are late teenagers and still growing, so it’s just a matter of taking precautions and providing an environment that doesn’t have a potential risk associated with it,” he said.
Gilbert cited studies done by scientists for the California Public Utilities Commission, whose findings boil down to the fact that while there is no proven link between EMF exposure and diseases such as leukemia and brain tumours, the possible risk warrants further investigation.
He also said Canadian regulation allows for a higher degree of exposure to EMFs than do some other countries.
“All I’m saying is while the jury’s out on this one, I’m not going to put in place what is potential chronic exposure for our students,” he said. “Admittedly that’s highest around the locations of the antenna sites and the wireless hotspots, but those are the places people tend to gravitate to because they get the best reception.”
Gilbert added he believes there are many environmental impacts that are not manifest for 30 to 40 years after exposure. “Second-hand tobacco exposure is a case in point,” he said. “We’re just finding out now what some of those impacts are. Asbestos is another example.”
Lakehead, which is located at the head of Lake Superior in Thunder Bay, Ont., has some wireless access, but only where the university’s fibre optic network doesn’t reach. There are plenty of computers around campus where students can access the Internet 24 hours a day, so it’s not like they’re cut off, Gilbert said.
And it doesn’t necessarily mean there will never be ubiquitous wireless at Lakehead, he said.
“When we get to the stage where the evidence is conclusive that there is no health impact, I have no problem putting wireless in place,” said Gilbert. “Even the World Health Organization in its international review says it doesn’t have a great deal of concern, but it admits the information is not 100 per cent.”
That will probably change by sometime this year, however, said Robert Bradley, director of consumer and clinical radiation protection at Health Canada.
Bradley said Canada has been working for a number of years with the WHO on the International EMF Project, which looks at the research done to date, current research and gaps in studies.
He said he expects documents to be published sometime this year that say there are no identified health risks at the exposure levels being set as a regulatory limit. “There have been extensive reviews of the science and I’m pretty confident that’s the way it’s going,” he said.
Bradley said Health Canada is in the third or fourth revision of the standard that has been adopted by Industry Canada as the regulator for telecommunications equipment. The standard looks at the amount of radio-frequency energy, across various frequencies, that would have to be absorbed by the human body before its internal temperature rises by more than one degree. The limit for the public or the consumer is one-fiftieth of the level deemed safe for a work environment (a sketch of that arithmetic appears at the end of this article).
“So it’s quite a reasonable level and it’s consistent with what most other governments have done,” said Bradley. Other countries that go beyond these standards “basically go beyond what the science says is required,” he added.
But while Gilbert is probably the only university president in Canada to take this position, he’s not alone in thinking safety should come before convenience. Jorg-Rudiger Sack, a computer science professor at Carleton University, says he agrees there is not enough information on potential long-term effects to say unequivocally that wireless is safe. “In fact, the long-term effects of such technologies, when combined with other sources, have not been studied,” he said in an e-mail interview.
Furthermore, he said, in his personal opinion, while wireless is useful in some situations, such as at airports or in cafes, it’s not really needed on a university campus, although that’s not how most students see it. “If faculty/staff/students are at fixed locations, like their offices and labs, wireless connectivity is not necessary and, I believe, could be avoided,” he said. “We do not move our PCs or even laptops around once we are in our offices.”
Sack said people who work in wireless environments might see a similarity to the second-hand smoke issue and demand employers ensure they aren’t exposed to EMFs until there is solid proof of no negative health impact.
For Andrew McAusland, executive director of instructional and information technology services at Concordia University in Montreal, EMF exposure was also a factor in deciding whether or not to implement wireless networks on campus. Ultimately, McAusland was satisfied the risk was low enough to proceed, and made sure the wireless LANs conform to Health Canada safety codes.
“We reviewed the literature, we researched the topic, we provide ongoing documentation on our Web sites on the topic and we stay on top of it,” said McAusland. “It’s not an issue you should ignore at all, but wireless local area networks use a very low level of frequency.”
Concordia currently has about 80 per cent wireless coverage, and usage has been growing by hundreds of percentage points a year. “It’s a necessary part of our infrastructure now,” he said.
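As a rough illustration of the safety margin Bradley describes, here is a minimal Python sketch of the one-fiftieth rule. The occupational limit and measured value below are assumed placeholders for illustration, not Health Canada's actual Safety Code figures.

    # Illustration of the public-exposure margin Bradley describes.
    # Numbers are assumed placeholders, not actual Safety Code limits.
    OCCUPATIONAL_SAR_LIMIT = 0.4                     # assumed workplace limit, W/kg
    PUBLIC_SAR_LIMIT = OCCUPATIONAL_SAR_LIMIT / 50   # public limit is one-fiftieth of it

    measured = 0.002                                 # hypothetical measured absorption, W/kg
    print(measured <= PUBLIC_SAR_LIMIT)              # True: within the public limit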