Monday, April 10, 2006
Return of the Web Mob
As director of the Rapid Response Team at VeriSign-owned iDefense, of Dulles, Va., Ken Dunham and his team of malware hunters infiltrate black hat hacker forums, chat rooms and newsgroups, posing as online criminals to gather intelligence on the dramatic rise in rootkits, Trojans and botnets.
Based on all the evidence gathered over the last two years, Dunham is convinced that groups of well-organized mobsters have taken control of a global billion-dollar crime network powered by skillful hackers and money mules targeting known software security weaknesses.
"There's a well-developed criminal underground market that's connected to the mafia in Russia and Web gangs and loosely affiliated mob groups around the world. They're all involved in this explosion of phishing and online crime activity," Dunham said in an interview with eWEEK.
Just two years after the Secret Service claimed a major success with "Operation Firewall," an undercover investigation that led to the arrest of 28 suspects accused of identity theft, computer fraud, credit card fraud and money laundering, security researchers say the mobsters are back, with a level of sophistication and brazenness that is "frightening and surreal."
"They never really went away," Dunham said. "They scurried away for a few months and tightened their security controls. It became harder to get on their lists and into their chat rooms."
Not these days. A law enforcement official familiar with several ongoing investigations showed eWEEK screenshots of active Web sites hawking credit card numbers, Social Security numbers, PayPal and eBay credentials, and bank login data by the bulk.
"They're very public about all this, especially on the Russian sites. It's almost comical how open and barefaced they are," said the official, who requested anonymity because of the sensitive nature of the ongoing probe.
Black hat hackers have set up e-commerce sites offering private exploits capable of evading anti-virus scanners. An e-mail advertisement intercepted by researchers contained an offer to infect computers for use in botnets at $25 per 10,000 hijacked PCs.
Skilled hackers in Eastern Europe, Asia and Latin America are selling zero-day exploits on Internet forums where moderators even test the validity of the code against anti-virus software.
"I saw one case where an undetectable Trojan was offered for sale and the buyers were debating whether it was worth the price. They were doing competitive testing to ensure it actually worked as advertised," said Jim Melnick, a member of Dunham's team.
"We even have proof of actual job listings on Russian-language sites offering lucrative pay for coders who can create exploits and launch denial-of-service attacks. We've seen evidence of skilled hackers stealing corporate data on behalf of competitors. This isn't just about credit card and bank information. It has all the elements of traditional mafia-type crime," Melnick said.
Roger Thompson, a computer security pioneer who created the first Australian anti-virus company in the late 1980s, is convinced the secretive Russian mafia is masterminding the use of sophisticated rootkits in botnet-seeding Trojans.
"They are paying to recruit bright young hackers and using teenage kids around the world to move money around. They're into everything: spyware installations, denial-of-service shakedowns, you name it. It's the traditional mafia finding it easy to make money on the Internet," said Thompson, who now runs Exploit Prevention Labs, in Atlanta.
Yury Mashevsky, a virus analyst at Kaspersky Lab, said there is even evidence of turf wars in the criminal underworld. "They use malicious programs that destroy the software developed by rival groups and include threats directed at each other, anti-virus vendors, police and law enforcement agencies in their creations," Mashevsky said, in Woburn, Mass.
He has also seen fierce online confrontation in the battle to control the resources of infected computers. In November 2005, Mashevsky discovered an attempt to hijack a botnet. "[The] network of infected computers changed hands three times in one day. Criminals have realized that it is much simpler to obtain already-infected resources than to maintain their own botnets, or to spend money on buying parts of botnets which are already in use," he said.
On message boards and newsgroups where malicious code is put up for sale, Mashevsky said flame wars and attacks aimed at stealing each other's virtual property amount to normal everyday activity.
Dunham, who frequently briefs upper levels of federal cyber-security authorities on emerging threats, said there have been cases in Russia where mafia-style physical torture has been used to recruit hackers.
"If you become a known hacker and you start to cut into their profits, they'll come to your house, take you away and beat you to a pulp until you back off or join them. There have been documented cases of this," Dunham said.
One key aspect of Web mob activity that flies under the radar is use of "money mules," or individuals who help to launder and transfer money from hijacked online bank accounts.
On career Web sites such as Monster.com, a job listing for a "private financial receiver," "shipping manager" or "country representative" is invariably an active attempt to recruit people around the world to withdraw funds and deliver them to crime bosses, according to a detailed research report by iDefense on the so-called money mules.
Money is transferred into the mule's account, withdrawn as cash and then wired to an offshore account.
"We've only scratched the surface of what's going on in the underworld. It's like the iceberg that took down the Titanic. No one knew how big and dangerous it was," Dunham said.
He cited the recent discovery of MetaFisher, also known as SpyAgent, a Trojan connected to a Web-based command and control interface that highlighted just how advanced the attackers have become.
"In just a few weeks, MetaFisher spread to thousands of computers. We found conclusively that these attacks were going on undetected for more than a year. Can you imagine the amount of data that has already been stolen? It's unimaginable," Dunham said.
Eric Sites, vice president of R&D at Sunbelt Software, in Clearwater, Fla., showed eWEEK screenshots of the Web interface, which displayed targeted phishing attacks against European banks and kept detailed statistics on actual bot infections around the world.
The interface can also be used to add exploits, track anti-virus signature definitions and monitor callbacks from infected machines.
"This isn't the work of the guy in the basement. This is organized and simplified to make it super easy to control all those bot drones," Sites said.
source:http://www.eweek.com/article2/0,1895,1947576,00.asp
Google wins rights to Aussie algorithm
Google has snapped up the rights to an advanced text search algorithm invented by a University of NSW student.
The algorithm, or search engine tool, is called Orion and was developed by UNSW PhD student Ori Allon at the university's School of Computer Science.
Orion works as an add-on to existing search engines to improve the relevance of searches and won praise from Microsoft founder Bill Gates last year.
An algorithm is a problem-solving computational procedure and is the building block of all search engines, such as those operated by Google and Yahoo!
Orion finds pages where the content is about a topic strongly related to the key word. It then returns a section of the page, and lists other topics related to the key word so the user can pick the most relevant.
The results of the query are displayed immediately in the form of expanded text extracts, giving the searcher the relevant information without having to go to the website - although there is still that option.
Mr Allon, a 26-year-old computer scientist, was born in Israel but came to study at Melbourne's Monash University in the '90s. After completing his bachelor's and master's degrees, he moved to UNSW to further his studies and research.
The Israeli newspaper Ha'aretz reported on Sunday that Google had acquired Mr Allon's advanced text search algorithm.
Mr Andrew Stead, the business development manager at UNSW's NewSouth Innovations agency, confirmed that Mr Allon left Australia six weeks ago and was now working at Google's headquarters in Mountain View, California.
Mr Stead said the move was not a secondment; Mr Allon's move was permanent.
Some work on the project, however, would continue to be undertaken by Mr Allon's supervisor in Sydney, Dr Eric Martin.
Mr Stead confirmed that the university had held talks with the big three internet search operations: Google, Yahoo! and MSN.
Beyond confirming that Mr Allon was now working for Google, Mr Stead was not able to confirm any other of the reported details.
However, given Google's desire to continue dominating the search business and the fact that there were other interested parties, the deal could potentially be worth millions.
While Mr Allon is the key person behind Orion, the university retains ownership of the intellectual property as it was developed within the university's research facilities.
Mr Stead said Mr Allon, who is an Australian citizen, hoped to complete his PhD with the university and one day hoped to return to Australia.
source:http://www.smh.com.au/news/breaking/aussies-formula-for-a-fortune/2006/04/10/1144521239582.html
Biotechnology’s advance could give malefactors the ability to manipulate life processes--and even affect human behavior.
source:http://science.slashdot.org/article.pl?sid=06/04/09/1944226
Startup called Webaroo touts 'Web on a hard drive'
Putting the Web on a hard drive is a seeming impossibility, but it is what a Bellevue, Wash.-based startup called Webaroo has set out to realize -- they call it "search unplugged" -- and even company president Brad Husick concedes that he found the idea "crazy" at first blush.
Of course, it wasn't that long ago that a laptop with an 80-gigabyte hard drive seemed crazy, too. But ever-more-monstrous drives are common today and they serve as the foundation upon which Webaroo is basing its free, ad-supported search service. The company and service officially emerge from behind their stealth shield tomorrow armed with a flashy bundling agreement from laptop maker Acer.
"It's not inconceivable that a couple of years from now laptops are going to have 400- or 500-gigabyte drives in them," says Husick, who co-founded Webaroo in 2004 with CEO Rakesh Mathur and CTO Beerud Sheth. "What if you could take that space and it would be enough to carry the Internet with you? If you think about searching the Web without being tied to a connection of some kind -- and then periodically connecting to get refreshed -- that was the kernel of our idea. How do you put the Web on a hard drive? … That's why it was so crazy."
The first thing to acknowledge is that the phrase "put the Web on a hard drive" is not to be taken literally. As Husick explains: “Let's say the HTML Web is 10 billion pages -- it's actually a little less than that -- but at 10K per page that's 1 million gigabytes, also known as a petabyte. It's going to be a long time before notebooks have million-gigabyte hard drives. So how do you get a million gigabytes down to what you need?”
Webaroo does it, he says, through "a server farm that is of Web scale" and a set of proprietary search algorithms that whittle the million gigabytes down to more manageable chunks that will fit on a hard drive: up to 256 megabytes for a growing menu of "Web packs" on specific topics -- your favorite Web sites, city guides, news summaries, Wikipedia and the like -- that make up the service's initial offerings; and something in the neighborhood of 40 gigabytes for the full-Web version the company intends to release later this year.
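The article's figures imply a simple per-page storage budget. A back-of-envelope sketch of that arithmetic, using only the numbers quoted above (10,000 pages per "Web pack," a 256-megabyte pack, and a roughly 40-gigabyte full-Web version) -- the calculation itself is mine, not Webaroo's:

```python
# Back-of-envelope storage arithmetic from the figures in the article:
# 10,000 pages per "Web pack", 256 MB per pack, ~40 GB full-Web build.

PAGES_PER_PACK = 10_000
PACK_BUDGET_BYTES = 256 * 1024 ** 2   # 256 MB pack budget
FULL_WEB_BYTES = 40 * 1024 ** 3       # ~40 GB full-Web version

# Average storage available per page within one Web pack,
# including graphics and cached linked pages.
bytes_per_page = PACK_BUDGET_BYTES / PAGES_PER_PACK
print(f"~{bytes_per_page / 1024:.1f} KB per page")  # → ~26.2 KB per page

# At the same per-page budget, how many pages would the
# full-Web build hold?
pages_full_web = FULL_WEB_BYTES / bytes_per_page
print(f"~{pages_full_web / 1e6:.1f} million pages")  # → ~1.6 million pages
```

So the 40-gigabyte "full Web" would hold on the order of a couple of million pages at that per-page budget -- a tiny slice of the 10 billion pages Husick mentions, which is exactly why the proprietary ranking algorithms doing the whittling are the heart of the service.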
"We've developed these algorithms that give you a set of meaningful, relevant results for anything on which you search," Husick says. "In effect, we give you the first couple pages of results."
That's all you really need, the company argues, because studies show that most people rarely look beyond the first 10 to 20 results returned by a typical search. With Webaroo you're being returned not just a list of pages, but the pages themselves -- with all graphics intact -- as well as key live links from those pages and the pages to which they lead. They're talking roughly 10,000 pages per "Web pack," or plenty to provide a meaningful search experience for whatever the subject matter at hand, Husick says.
Users must download and install 5 megabytes worth of Webaroo software to get started and then synch up with the Webaroo service site to refresh the content in their topic-specific packs, or, later this year, the full-Web version. Husick insists these pack updates take only minutes, but I’m already seeing corporate network managers wincing at the notion of this application sweeping the workplace.
All in all, though, there's no denying the "wow" factor here.
"It's kind of surprising that nobody else has done something like this," says Rob Enderle, president of the Enderle Analyst Group. "It's one of those things that a lot of folks will download."
Enderle believes the service could be a big hit among those whose jobs regularly take them away from their 'Net connections -- frequent fliers, for example.
"It's going to be a while before hot spots are in all the places we need to have them," he says.
Which isn't to say that ever more ubiquitous 'Net connections won't pose a challenge to the Webaroo business model.
"Long-term their opportunity may have more to do with [search] performance" than the offline capability itself, Enderle says.
Husick tells me the performance benefit was reinforced for the company by the rousing reception their service received from Japanese mobile operators, who he says were salivating over Webaroo as a means to siphon search traffic away from their increasingly crowded wireless broadband networks.
Webaroo will also be touting the potential cost savings and convenience of its service.
"Every hotel I go to wants to charge me $10 to $15 a night for Internet. Every airport wants to charge me another $10 to get connected," Husick says. "If I've got five minutes before I have to board my flight, do I want to spend that five minutes connecting or do I want to spend five minutes getting my search answer?"
There's a fine line between crazy and audacious -- we'll know soon enough which side Webaroo falls on.
source:http://www.networkworld.com/community/?q=node/5388
Nanotech Gone Awry?
source:http://science.slashdot.org/article.pl?sid=06/04/09/0333234
This Boring Headline Is Written for Google
JOURNALISTS over the years have assumed they were writing their headlines and articles for two audiences — fickle readers and nitpicking editors. Today, there is a third important arbiter of their work: the software programs that scour the Web, analyzing and ranking online news articles on behalf of Internet search engines like Google, Yahoo and MSN.
The search-engine "bots" that crawl the Web are increasingly influential, delivering 30 percent or more of the traffic on some newspaper, magazine or television news Web sites. And traffic means readers and advertisers, at a time when the mainstream media is desperately trying to make a living on the Web.
So news organizations large and small have begun experimenting with tweaking their Web sites for better search engine results. But software bots are not your ordinary readers: They are blazingly fast yet numbingly literal-minded. There are no algorithms for wit, irony, humor or stylish writing. The software is a logical, sequential, left-brain reader, while humans are often right-brain readers.
In newspapers and magazines, for example, section titles and headlines are distilled nuggets of human brainwork, tapping context and culture. "Part of the craft of journalism for more than a century has been to think up clever titles and headlines, and Google comes along and says, 'The heck with that,' " observed Ed Canale, vice president for strategy and new media at The Sacramento Bee.
Moves to accommodate the technology are tricky. How far can a news organization go without undercutting its editorial judgment concerning the presentation, tone and content of news?
So far, the news media are gingerly stepping into the field of "search engine optimization." It is a booming business, estimated at $1.25 billion in revenue worldwide last year, and projected to more than double this year.
Much of this revenue comes from e-commerce businesses, whose sole purpose is to sell goods and services online. For these sites, search engine optimization has become a constant battle of one-upmanship, pitting the search engine technologists against the marketing experts and computer scientists working for the Web sites.
Think of it as an endless chess game. The optimizer wizards devise some technical trick to outwit the search-engine algorithms that rank the results of a search. The search engines periodically change their algorithms to thwart such self-interested manipulation, and the game starts again.
News organizations, by contrast, have moved cautiously. Mostly, they are making titles and headlines easier for search engines to find and fathom. About a year ago, The Sacramento Bee changed online section titles. "Real Estate" became "Homes," "Scene" turned into "Lifestyle," and dining information found in newsprint under "Taste," is online under "Taste/Food."
Some news sites offer two headlines. One headline, often on the first Web page, is clever, meant to attract human readers. Then, one click away on a second Web page, a more quotidian, factual headline appears with the article itself. The popular BBC News Web site does this routinely on longer articles.
Nic Newman, head of product development and technology at BBC News Interactive, pointed to a few examples from last Wednesday. The first headline a human reader sees: "Unsafe sex: Has Jacob Zuma's rape trial hit South Africa's war on AIDS?" One click down: "Zuma testimony sparks HIV fear." Another headline meant to lure the human reader: "Tulsa star: The life and career of much-loved 1960's singer." One click down: "Obituary: Gene Pitney."
"The search engine has to get a straightforward, factual headline, so it can understand it," Mr. Newman said. With a little programming sleight-of-hand, the search engine can be steered first to the straightforward, somewhat duller headline, according to some search optimizers.
On the Web, space limitations can coincide with search-engine preferences. In the print version of The New York Times, an article last Tuesday on Florida beating U.C.L.A. for the men's college basketball championship carried a longish headline, with allusions to sports history: "It's Chemistry Over Pedigree as Gators Roll to First Title." On the Times Web site, whose staff has undergone some search-engine optimization training, the headline of the article was, "Gators Cap Run With First Title."
The Associated Press, which feeds articles to 11,000 newspapers, radio and television stations, limits its online headlines to less than 40 characters, a concession to small screens. And on the Web, there is added emphasis on speed and constant updates.
"You put those demands, and that you know you're also writing for search engines, and you tend to write headlines that are more straightforward," said Lou Ferrara, online editor of The Associated Press. "My worry is that some creativity is lost."
Whether search engines will influence journalism below the headline is uncertain. The natural-language processing algorithms, search experts say, scan the title, headline and at least the first hundred words or so of news articles.
Journalists, they say, would be wise to do a little keyword research to determine the two or three most-searched words that relate to their subject — and then include them in the first few sentences. "That's not something they teach in journalism schools," said Danny Sullivan, editor of SearchEngineWatch, an online newsletter. "But in the future, they should."
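The keyword check Sullivan describes amounts to a simple set-membership test: do the most-searched terms for a subject actually appear in the headline the bot will read first? A toy illustration, using the BBC headlines quoted above (the keyword list is hypothetical):

```python
# Toy illustration of headline keyword checking: which of a set of
# commonly searched terms actually appear in a given headline?
# The "searched" keyword list below is hypothetical.

def contains_keywords(text: str, keywords: list) -> list:
    """Return the keywords that appear as words in the text."""
    words = {w.strip(".,:;!?'\"").lower() for w in text.split()}
    return [kw for kw in keywords if kw.lower() in words]

clever = "Tulsa star: The life and career of much-loved 1960's singer"
plain = "Obituary: Gene Pitney"
searched = ["Gene", "Pitney", "obituary"]  # hypothetical top searches

print(contains_keywords(clever, searched))  # → []
print(contains_keywords(plain, searched))   # → ['Gene', 'Pitney', 'obituary']
```

The clever headline, for all its human appeal, contains none of the terms a searcher would type -- which is precisely why the BBC steers the bots to the plain one.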
Such suggestions stir mixed sentiments. "My first thought is that reporters and editors have a job to do and they shouldn't worry about what Google's or Yahoo's software thinks of their work," said Michael Schudson, a professor at the University of California, San Diego, who is a visiting faculty member at the Columbia University Graduate School of Journalism.
"But my second thought is that newspaper headlines and the presentation of stories in print are in a sense marketing devices to bring readers to your story," Mr. Schudson added. "Why not use a new marketing device appropriate to the age of the Internet and the search engine?"
In journalism, as in other fields, the tradition of today was once an innovation. The so-called inverted pyramid structure of a news article — placing the most important information at the top — was shaped in part by a new technology of the 19th century, the telegraph, the Internet of its day. Putting words on telegraph wires was costly, so reporters made sure the most significant points were made at the start.
Yet it wasn't all technological determinism by any means. The inverted pyramid style of journalism, according to Mr. Schudson, became standard practice only in 1900, four decades or more after telegraph networks came into use. It awaited the rise of journalists as "an avowedly independent, self-conscious, professionalizing group," confident of their judgments about what information was most important, he said.
The new technology shaped practice, but people determined how the technology was used — and it took a while. Something similar is the likely path of the Internet.
"We're all struggling and experimenting with how news is presented in the future," said Larry Kramer, president of CBS Digital Media. "And there's nothing wrong with search engine optimization as long as it doesn't interfere with news judgment. It shouldn't, and it's up to us to make sure it doesn't. But it is a tool that is part of being effective in this medium."
source:http://www.nytimes.com/2006/04/09/weekinreview/09lohr.html?ei=5058&en=2704d8f63693bc0f&ex=1145160000&partner=IWON&pagewanted=all
ARM offers first clockless processor core
ARM and Handshake (Eindhoven, The Netherlands) announced they were developing the processor back in October 2004, along with an unnamed lead customer, which it appears could be Philips.
The processor was designed to use Handshake Solutions' clockless IC design technology and is said to be suitable for automotive, medical and deeply embedded control applications. Reduced power consumption, due to the lack of clock circuitry, is one benefit; the clockless design also produces a low electromagnetic signature because of the diffuse nature of digital transitions within the chip.
Because clockless processors consume zero dynamic power when there is no activity, they can significantly extend battery life compared with clocked equivalents.
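The battery-life claim follows from the standard CMOS dynamic-power relation, P = a * C * V^2 * f: a clocked core keeps its clock tree toggling even when idle, while an asynchronous core's activity factor drops to essentially zero when nothing is happening. A minimal sketch with purely illustrative numbers (the capacitance, voltage, frequency and activity factors below are assumptions, not ARM figures):

```python
# Standard CMOS dynamic-power estimate, P = a * C * V^2 * f, showing
# why a clockless core idles at essentially zero dynamic power.
# All component values below are illustrative assumptions.

def dynamic_power(activity, capacitance_f, voltage_v, freq_hz):
    """Switching power in watts for a given activity factor."""
    return activity * capacitance_f * voltage_v ** 2 * freq_hz

C = 1e-9   # 1 nF total switched capacitance (hypothetical)
V = 1.2    # supply voltage, volts
F = 100e6  # 100 MHz clock / equivalent event rate

busy = dynamic_power(0.2, C, V, F)           # core doing real work
idle_clocked = dynamic_power(0.05, C, V, F)  # idle, clock tree still toggling
idle_async = dynamic_power(0.0, C, V, F)     # idle, no events, no switching

print(f"busy: {busy*1e3:.1f} mW")                  # → busy: 28.8 mW
print(f"idle (clocked): {idle_clocked*1e3:.2f} mW")  # → idle (clocked): 7.20 mW
print(f"idle (clockless): {idle_async*1e3:.2f} mW")  # → idle (clockless): 0.00 mW
```

In a clocked design, idle power can be attacked with clock gating, but the clockless approach gets the same effect by construction: no events, no transitions, no dynamic power.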
“Handshake Solutions has collaborated with ARM to provide the design community with a new type of low-power processor with very low EMI,” said Wouter Van Roost, chief executive officer of Handshake Solutions, in a statement issued by ARM (Cambridge, England).
“Now we can provide our Partners with the first commercially-available clockless processor that is very reliable over a wide range of conditions, maintaining real-time responsiveness while also extending battery life for applications in automotive, medical and deeply embedded consumer devices,” said John Cornish, vice president of marketing for the ARM processor division.
“As long-time users of Handshake Technology, we recognize its potential to become a key ingredient for automotive microcontrollers, addressing important automotive requirements such as robustness, low power and low EMI,” said Harry Inia, general manager of the automotive business line at Philips Semiconductors.
At that time the design was expected to be completed early in 2005, with the lead customer shipping in a 0.14-micron manufacturing process before the end of 2005.
The ARM996HS processor can be used in both synchronous and asynchronous system-on-chip designs, ARM said. The lack of clock-edge driven current peaks should enable easier integration with analog components, in mixed-signal SOCs.
ARM has recognized the potential of clockless IC design for many years and supported the Amulet project, based on an early ARM instruction set, led by Professor Steve Furber at Manchester University. Handshake’s technology, when originally developed by Philips, was used in 8051 microcontroller products that were produced in millions.
Handshake has been included on the last two iterations of the Silicon 60, a list of emerging technology startups.
source:http://www.eetimes.com/news/latest/showArticle.jhtml?articleID=179101800
Study, in a First, Explains Evolution's Molecular Advance
By reconstructing ancient genes from long-extinct animals, scientists have for the first time demonstrated the step-by-step progression of how evolution created a new piece of molecular machinery by reusing and modifying existing parts.
The researchers say the findings, published today in the journal Science, offer a counterargument to doubters of evolution who question how a progression of small changes could produce the intricate mechanisms found in living cells.
"The evolution of complexity is a longstanding issue in evolutionary biology," said Joseph W. Thornton, professor of biology at the University of Oregon and lead author of the paper. "We wanted to understand how this system evolved at the molecular level. There's no scientific controversy over whether this system evolved. The question for scientists is how it evolved, and that's what our study showed."
Charles Darwin wrote in The Origin of Species, "If it would be demonstrated that any complex organ existed which could not possibly have formed by numerous, successive, slight modifications, my theory would absolutely break down."
Discoveries like that announced this week of a fish with limblike fins have filled in the transitions between species. New molecular biology techniques let scientists begin to reconstruct how the processes inside a cell evolved over millions of years.
Dr. Thornton's experiments focused on two hormone receptors. One is a component of stress response systems. The other, while similar in shape, takes part in different biological processes, including kidney function in higher animals.
Hormones and hormone receptors are protein molecules that act like pairs of keys and locks. Hormones fit into specific receptors, and that attachment sends a signal to turn on — or turn off — cell functions. The matching of hormones and receptors led to the question of how new hormone-and-receptor pairs evolved, as one without the other would appear to be useless.
The researchers found the modern equivalent of the stress hormone receptor in lampreys and hagfish, two surviving jawless primitive species. The team also found two modern equivalents of the receptor in skate, a fish related to sharks.
After looking at the genes that produced them, and comparing similarities and differences among the genes, the scientists concluded that all descended from a single common gene 450 million years ago, before animals emerged from oceans onto land and before the evolution of bones.
The team recreated the ancestral receptor in the laboratory and found that it could bind to the kidney-regulating hormone aldosterone and the stress hormone cortisol.
Thus, it turned out that the receptor for aldosterone existed before aldosterone itself. Aldosterone is found only in land animals, which appeared tens of millions of years later.
"It had a different function and was exploited to take part in a new complex system when the hormone came on the scene," Dr. Thornton said.
What happened was that a glitch produced two copies of the receptor gene in the animal's DNA, a not-uncommon occurrence in evolution. Then, for reasons not understood, two major mutations made one receptor sensitive just to cortisol, leading to the modern version of the stress hormone receptor. The other receptor became specialized for kidney regulation.
Dr. Thornton said the experiments showed how evolution could and did innovate functions over time. "I think this is likely to be a very common theme in how complex molecular systems evolved," he said.
Christoph Adami, a professor of life sciences at the Keck Graduate Institute in Claremont, Calif., who wrote an accompanying commentary in Science, said the research showed how evolution "takes advantage of lucky circumstances and builds upon them."
Dr. Thornton said the experiment refutes the notion of "irreducible complexity" put forward by Michael J. Behe, a professor of biochemistry at Lehigh University.
Dr. Behe, a main advocate of intelligent design, the theory that life is so complicated that the best explanation is that it was designed by an intelligent being, has compared an irreducibly complex system to a mousetrap. Take away any piece, and the mousetrap fails to catch mice. Such all-or-none systems could not have arisen with incremental changes, Dr. Behe has argued.
Dr. Thornton said the key-and-lock mechanism of a hormone-receptor pair was "an elegant exemplar of a system that has been called irreducibly complex."
"Of course," he added, "our findings show that it is not irreducibly complex."
Dr. Behe described the results as "piddling." He wondered whether the receptors with the intermediate mutations would be harmful to the survival of the organisms and said a two-component hormone-receptor pair was too simple to be considered irreducibly complex. He said such a system would require at least three pieces and perform some specific function to fit his notion of irreducibly complex.
What Dr. Thornton has shown, Dr. Behe said, falls within the range of incremental changes that he allows evolutionary processes can cause.
"Even if this works, and they haven't shown that it does," Dr. Behe said, "I wouldn't have a problem with that. It doesn't really show that much."
source:http://www.nytimes.com/2006/04/07/science/07evolve.html
Two-year-old Academic Initiative enhances computer science curricula, seeks to reverse student decline
11 Apr 2006
Gina Poole of IBM® talks about computer science as a major and as a career, including the role of outsourcing, mainframes, open source and open standards, and particularly the IBM Academic Initiative, of which she is director.
The IBM Academic Initiative began in 2004 as an outgrowth of the IBM Scholars Program, with a focus on enhancing the computer science curricula at universities around the world and on encouraging more students to earn degrees and enter careers in computer science. In the two years since its establishment, the IBM Academic Initiative has attracted more than 1,900 institutions, over 11,000 faculty members, and more than 440,000 students to its training programs.
What are the motivations for the Initiative? How does it work? What is outsourcing doing to the job market? Are there still career opportunities in computer science in the U.S.?
developerWorks asked Gina Poole about the purposes and directions of the Academic Initiative and what it can do for the industry as a whole. As IBM Vice President for Innovation and University Relations, Gina has worldwide responsibility for developing and executing internal programs that drive the IBM strategic imperative for innovation further into the IBM culture and external programs for collaborating with clients, partners, governments, and academia to foster innovation. She was previously Vice President of Developer Relations for IBM with worldwide responsibility for the IBM developer programs and also led the IBM Academic Initiative. Gina began her career with IBM in 1984 as a programmer in the personal computer division. She has held a number of management positions in IBM software and hardware divisions. She is a certified Project Management Professional (PMP) and holds degrees in computer science, business management, and economics.
You cite an alarming decline in the number of U.S. students majoring in computer science and engineering, particularly among women and minorities. What has caused this decline?
Gina: In the U.S., we've seen a decline in science and engineering degrees over the past ten years, and the number of newly declared computer science majors has declined by 32% over the last four years. Ever since the dot-com bust, there's been a steep drop-off. Clearly, women and under-represented minorities are leaving at alarming rates or not even considering science and engineering programs.
There are a couple of reasons. One is a myth, believed by parents, students, and high school guidance counselors, that computer science and engineering jobs are all being outsourced to China and India. This is not true. The percentage of jobs in this space that are actually outsourced is quite small -- less than 5%. According to a government study, voluntary attrition in the U.S. has outpaced the number of jobs outsourced to emerging nations. Further, for every job outsourced from the U.S., nine new jobs are actually created in the U.S.
Do perceptions of computer science and engineering as careers tend to discourage students, especially women and minorities?
Gina: There are programs in place to encourage women and minorities to take STEM (Science, Technology, Engineering, and Mathematics) classes and to choose those careers, but they aren't moving the needle to the degree that we'd like. Why do women shy away from this field? Reason number one is the view that it is for loners and geeks, kind of like Dilbert, while women prefer more interactive, team-based positions. But there are very few jobs in the computer field where you're not working and collaborating as part of a team.
The way computer science was traditionally taught, you were assigned individual projects, you worked on them alone, and that became your view of the working world. I believe if the education better matched the real, team-based experience, where skills are applied to solving real-world problems, it would have more appeal. As an industry, we need to do more to get that message out.
Also, a lot of students don't understand the flexibility they can have. You can travel the globe; you have flexibility whether working from an office, from home, full-time, part-time. That flexibility isn't available in a lot of other careers.
You name software engineering as the fastest-growing occupation in coming years. Where is most of this growth taking place?
Gina: The growth is everywhere. The U.S. Bureau of Labor Statistics has identified computer-based jobs as one of the hottest areas, and those involving specific skill sets -- systems analysts, database administrators, computer scientists -- as some of the fastest-growing occupations through 2012, with growth rates anywhere from 40 to 70% in the U.S. alone. Further, at least 1.5 million additional IT field professionals will be needed by the end of this year.
Another factor: approximately 70 million baby-boomers will leave the workforce over the next 15 years, with only 40 million new workers coming in, and that will make the shortage of computer-skilled folks even more dramatic. Canada and EMEA foresee similar retirement rates. And even looking at India or China or Russia, where there are explosions of activity, they are trying to move as quickly as possible from agricultural to manufacturing to services economies. In developed nations in Europe and North America, about 70% of the economy is based on services and knowledge workers, and this is where India, China, and Russia would like to be.
Over 50% of students entering university in India and China select degree programs in science, technology, math, and computer science, but they still don't have enough skilled workers to meet the demand.
Some estimates place the percentage of IT jobs eligible for outsourcing at 20%. With that in mind, can you still predict significant IT job growth here at home?
Gina: Absolutely. To say, "20% of IT jobs are being outsourced" is alarming, but there are whole new fields opening up, new disciplines that will be in huge demand. Some of the more traditional IT positions -- application maintenance, transcription services, base application development -- may be outsourced for a number of reasons, principally cost and availability of workers.
But if you think of the exciting jobs marrying technology and business and really making an impact -- data mining, business intelligence, network architecture, Internet and Web architecture, Web services -- these will be the hot jobs as technology becomes more pervasive, less costly, and as more uses are found for it. There's even a view that outsourcing actually will help grow jobs.
How are "open source" and "open standards" technologies defined in the Academic Initiative?
Gina: Open source basically refers to any program whose source code is available for use or modification by users or developers as they see fit. The power of open source software is that it's developed as a public collaboration and made freely available. Linux® is the most obvious example, but there are others where IBM has been actively involved. Eclipse is a good example, where IBM donated some originally proprietary software to the open source community to build a strong framework for application development.
Open standards are publicly available specifications that provide a common method of achieving a particular goal. Open standards enable interoperability which, again, helps fuel collaborative innovation, because they make working together a lot easier. Individual companies can build value-added components on top of them. IBM is actively involved in all of the open standards communities and is building its offerings on open standards.
Will learning and adopting open source and open standards technologies enable IT and software engineers to avoid the obsolescence associated with older, proprietary technologies?
Gina: I think it will, because the collaborative model used to develop open source and open standards-based technologies ensures that these technologies will always be leading-edge and available for everyone in the developer community to see and use. If you keep up with the latest developments in open source and open standards, you'll be way ahead of those who focus narrowly on proprietary offerings, because they won't see the explosive evolution that you're seeing in open source and open standards-based offerings. Interoperability also ensures that you have a much broader set of places where you can apply your skills. Ultimately, you'll be in higher demand.
How is the penetration by open source and Linux in college and university curricula progressing?
Gina: IDC says that the overall revenue for servers, desktops, and packaged software running on Linux will probably reach about US$36 billion in the next four years, so the growth is significant. People view Linux as a great choice because the costs are lower, the reliability is better, and it scales from the smallest platforms to the biggest supercomputers, so it's applicable across a wide range of platforms.
In academia, they like to focus on the hottest technologies and what's going to be of most value to the industry. I don't think I've met with a single university that isn't using Linux in its curriculum, if not running its own infrastructure on it -- not just in the U.S. but around the world.
Would you say Linux is being adopted in higher education faster or more slowly than by industry as a whole?
Gina: Linux is being used somewhere in most of the enterprises that we talk to. It may start out in departmental solutions and work its way up to enterprise-wide solutions. There's Linux activity in just about every enterprise as well as in just about every university. IBM's entire portfolio -- all of our hardware platforms, our software -- can run Linux or run on Linux, and we're seeing really good demand and a lot of growth in that business.
Does faculty participation in the initiative require institutional cooperation and/or commitment of resources or time?
Gina: There are absolutely no strings attached to being a member of the Academic Initiative. Any faculty member at any school can go to the www.IBM.com/university site, sign up as a member, and get free access to all of the resources available through the program -- from our software portfolio to curriculum and courseware to faculty enablement materials. All of it is available free of charge.
How is the curriculum linked to teaching or use of IBM technology?
Gina: We find there's huge demand from our key customers for skills in what in the past were called mainframes -- IBM System z™ and System i™ -- the big systems. I think 98 of the Fortune 100 companies are using either System z or System i in the enterprise. At one time, universities shifted away from using and teaching big systems in their curricula, so what students learned from the academic environment was that the world must run on PCs. But when they get out into just about any medium or large enterprise, they find these large systems, and that skill set is going to be in huge demand because the baby-boomer folks who had those skills are going to be leaving the workforce.
We've stepped up our investment in academic programs for System z and System i. We're seeing more and more universities begin to include this training or even to build curricula around enterprise computing degree programs. We have some excellent partnerships with universities around the world where we're able to match students to customers the minute they graduate, so every one of those students gets snapped up.
So students can look forward to really interesting jobs?
Gina: Oh, yes. If we go back to women or anyone who wants to work on significant, perhaps change-the-world kinds of projects, some of the coolest I've seen require massive computing power, large amounts of streaming data, historical data, and real-time data, helping to make business decisions.
For example, we have a partnership in the Netherlands with a university and 15 supply chain partners in the fresh foods industry. Their concern is the 40 to 50% waste that occurs from the time fresh foods or flowers leave the farm until they reach retail outlets. If they can reduce the waste by 5%, that's significant in an industry with incredibly small margins. They're looking at different technologies -- temperature sensors and actuators, historical data, real-time data during transport -- to determine safe delivery times. A lot of the data analysis must be done in real-time, and that requires big, powerful systems.
How can IBM Business Partners participate in the Academic Initiative?
Gina: We have three-way partnerships with universities and clients to help ensure that the universities meet the needs of the clients. We're also doing a program with Texas A&M called the SSI [Shared Software Infrastructure] Hub. With some grants and investments, Texas A&M built the hardware and software infrastructure and curriculum, which it shares openly with some 20 schools. While system management and maintenance are done at Texas A&M, the other schools can log in and take advantage of the environment, get their students set up, and actively build and share the curriculum.
Our initial business partner in this effort is Avnet, one of our big distributors, which joined in the initial investment to create this environment. I believe this kind of partnership is going to do a lot, particularly at schools that really can't invest in creating and maintaining their own environments. Smaller schools really appreciate being able to tap into it.
Are courses taught in a classroom or are they of the "distance learning" or e-learning variety?
Gina: They can be anything. Many universities still focus on classroom learning, but a lot of courses, or pieces of them, are done by distance learning. The SSI Hub is really an enabler of distance learning. A lot of universities use IBM's e-learning offerings from our Lotus® portfolio. They're also supplementing the materials with more than 500 on-demand tutorials on the developerWorks site -- short, one-hour tutorials that you can download to learn new aspects of a technology or a product. That really helps supplement the classroom learning.
Do you encourage the Academic Initiative's courses or courseware to become permanent parts of the curriculum at participating schools?
Gina: Yes, absolutely. We have courseware and curriculum materials available free from this program. Faculty can do anything they want with it -- use an entire chunk or modify it to their heart's content. We provide faculty enablement training, often onsite, or we'll invite them to sessions or provide some of it in an e-learning mode.
We're also working with schools to create new curriculum. We don't view ourselves as the curriculum creation experts, so we encourage schools to work with us or with each other, because they are the experts in creating curricula that work best for the way they deliver training, and then to share it with the broader university community.
Do participating schools gain an incentive, financial or otherwise, to acquire IBM equipment, software, or other technology?
Gina: Yes. The software is free for training purposes. There are also opportunities to get significant discounts on hardware through 1% leases, loaner programs and other discount programs. We also do a lot of joint research projects, faculty awards, and Ph.D. fellowships with universities around the world. The more engaged a university becomes with us, the greater the mutual benefits.
Universities are looking to disseminate knowledge and to prepare their students to be future leaders, so they want their students to get good jobs, but they don't want to be tied to one vendor's offering. In this program, they can take advantage of all the open source and open standards-based offerings we have. They don't have to use DB2® in the classroom; they can use open source databases, and so on, but they still get value from the program, and there are no strings attached.
source:http://www-128.ibm.com/developerworks/power/library/pa-nl29-directions/?ca=dgr=lnxw01GPQA
Microsoft buyout of ailing Sony possible
In a recent interview with a European media outlet, Mr. Fornay, vice president of SCEE (Sony Computer Entertainment Europe), made some rather astounding remarks regarding the PS3 price point and the views being placed upon it. He hinted that at a price in the neighborhood of $599.99, a Blu-ray player would be an absolute steal, but a standalone gaming machine would be quite overpriced, as the past has proven. Sony was quick to issue a rebuttal on his behalf, stating these were his personal views and not official comment on the price of the PS3. Damage control at its finest. But keep in mind the massive cover-up after Sony’s own CEO Howard Stringer hinted at the PS3's launch delay to Vanity Fair. Sony PR was quick to label it "pure speculation" -- speculation coming from the company’s own CEO -- which was later proven completely accurate when the delay was made official in mid-March 2006. Sony isn’t one for honesty as a corporate entity, yet its head honchos' loose lips seem to deliver the only truth passing Sony’s door these days.
If the PS3 is delivered at $599.99, what does this mean for Sony? Historically, gaming machines never sold for anything north of $299.99 (that is, until the Xbox 360 moved the line to $399.99 in winter 2005). Just look at the complete failure of the 1990s powerhouse, the 3DO, at the relatively respectable price of $399.99 (roughly $449.99 inflation-adjusted). With the PS3 now touting the largest price tag of any console in the history of gaming -- higher than most low- to mid-range PCs at this point -- does this spell disaster for the ailing hardware giant? Sony hadn’t posted a net gain in years until 2005, when it reported a depressing $580 million profit after billions in losses over the preceding three fiscal years.
The true death blow at this point may be the HDCP compliance required for Blu-ray movie playback (and for HD DVD as well), which makes the PS3 no more than a video game system with 50GB game capacity for anyone who purchased an HDTV in the past six years. So where does the PS3's $600.00 price tag come into play for us non-HDCP HDTV owners? The answer is nowhere.
PS3 production pricing at this point has been pegged at anywhere from $850 to $956 per unit, meaning a loss of approximately $250 to $350 on each console sold. If one million consoles sell on the first day of availability, Sony will have wiped out more than half of its entire fiscal gain for 2005 in under 24 hours.
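As a quick sanity check on the figures above, here is a minimal sketch of the arithmetic, assuming the $599.99 retail price, the rumored $850-$956 production cost, one million first-day units, and the reported $580 million 2005 profit:

```python
# Sanity check on the quoted PS3 loss math (all figures from the article).
RETAIL_PRICE = 599.99
COST_LOW, COST_HIGH = 850.00, 956.00     # rumored production cost per unit
FIRST_DAY_UNITS = 1_000_000
FY2005_PROFIT = 580_000_000              # Sony's reported 2005 profit

# Per-console loss: production cost minus retail price.
loss_low = COST_LOW - RETAIL_PRICE       # ~$250 per console
loss_high = COST_HIGH - RETAIL_PRICE     # ~$356 per console

# Total first-day loss if one million units sell.
first_day_low = loss_low * FIRST_DAY_UNITS
first_day_high = loss_high * FIRST_DAY_UNITS

print(f"Per-console loss: ${loss_low:.2f} to ${loss_high:.2f}")
print(f"First-day loss: ${first_day_low / 1e6:.0f}M to ${first_day_high / 1e6:.0f}M")
print(f"Share of 2005 profit erased: "
      f"{first_day_low / FY2005_PROFIT:.0%} to {first_day_high / FY2005_PROFIT:.0%}")
```

On these numbers, a million first-day units would erase roughly half to two-thirds of the 2005 profit, which is the core of the argument that the announced price point is untenable.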
In retrospect, Microsoft lost $150 per original Xbox console, ending in a $4 billion loss over that console's lifespan. With pockets deeper than the Atlantic Ocean, even Microsoft had to reconsider its standing in the console market, which led to the Xbox's abruptly ended life span. So how will Sony bounce back from a potential $2 billion hole in 2006? The answer is that it won't, at the price point it has "announced".
Sony’s official statement in the face of the $599.99 price tag was "The console will not cost $750 dollars" -- meaning, I believe, that the damage control was laid down not because the rumored price was too high, but because it was too low. Expect the PS3 to weigh in above the $650 mark if Sony is to have a substantial chance at a 2006 fiscal year that doesn’t sink the proverbial ship.
But what does all this mean for gamers as a whole? Basically, with an astoundingly high price and seemingly useless features for those of us with standard-definition televisions, or with high-definition televisions purchased in the past six years, we will be paying a fully equipped hi-def player price for a half-complete machine. This is Sony’s main hurdle: finding a way around the HDCP protocol. Even a $100.00 Blu-ray player is still worthless to most of the roughly 300 million Americans who make up, I might add, Sony’s main demographic for hardware sales (North America).
Ailing stock prices, poor movie sales, and a failing UMD (Sony PSP media) market might add up to a low-cost buyout in Sony’s future. I personally see this happening, regardless of the Blu-ray market, by 2014. I for one can just picture the looks on the faces of Sony’s main competitors, all the gleaming in the world coming from One Microsoft Way (Microsoft's Redmond address, just down the road from Nintendo of America's headquarters), as this could turn out to be the largest upset in consumer electronics history.
All this added up, you have to ask yourself: will the next PlayStation you purchase post-PS3 run a Microsoft operating system and have backwards compatibility for the PS1, PS2, PS3, Xbox, and Xbox 360? Putting your rabid love for Sony aside, this doesn’t seem as far-fetched as it once did, now that the Sony name is covered in enough red tape to fill the Grand Canyon.
Blu-ray, Cell, and a self-destructive obsession with one-upping Microsoft may have put this king to rest... for good.
source:http://www.bonafidereviews.com/article.php?id=148
In-Depth ajaxWrite Review
source:http://developers.slashdot.org/article.pl?sid=06/04/07/2240230