Friday, May 12, 2006
Congress may slap restrictions on SSN use
WASHINGTON--Democratic and Republican politicians on Thursday both promised to enact new federal laws by the end of the year that would restrict some commercial uses of Social Security numbers, which are often implicated in identity fraud cases.
"Whether Social Security numbers should be sold by Internet data brokers to anyone willing to pay, indistinguishable from sports scores or stock quotes... to me, that's a no-brainer," Texas Republican Joe Barton, chairman of the U.S. House of Representatives Energy and Commerce Committee, said at a hearing. Such a practice should not be allowed, he said, "period, end of debate."
In both the House and the Senate, there are at least three pieces of pending legislation that propose different approaches to restricting the use and sale of SSNs. Politicians have expressed astonishment at what they see as a rising identity fraud problem, frequently pointing to a 2003 Federal Trade Commission survey that estimated nearly 10 million consumers are hit by such intrusions each year.
One bill, sponsored by Massachusetts Democrat Edward Markey, would require the FTC to make new rules limiting the sale and purchase of those identifiers, with exceptions for law enforcement, public health, certain emergency situations and selected research projects.
Another measure, sponsored by Florida Republican Clay Shaw, would restrict the display of SSNs on credit reports and on various government-issued documents and identification tags. It would also make it illegal in certain cases for anyone to refuse to do business with people who decline to supply their SSNs.
Testifying at Thursday's hearing, FTC Commissioner Jon Leibowitz stopped short of endorsing either bill, but he readily acknowledged that the identifiers "are overused, and they are underprotected."
"Users of Social Security numbers should migrate toward using less-sensitive identifiers whenever possible," he said, adding that companies also need to do more to protect the data they possess.
The SSN hasn't always had such broad applications. Back in 1935, Congress first directed the Social Security Administration to develop an accounting system to track payments to the fund. Out of that mandate came a unique identifier that has ultimately found applications in everything from issuing food stamps to tracking down money launderers.
One practice of particular concern to the privacy community is the compilation of vast databases about the American population by commercial "data brokers," which financial institutions can use to verify identities. One such company, ChoicePoint, grabbed headlines last year after a breach of its database came to light. That incident and other high-profile breaches unleashed a number of proposals in Congress, some of which target what some deem unregulated data brokers.
The controversy over the connection between SSNs and identity fraud is hardly new, and a number of states have already enacted restrictions in that area. Several federal laws, including the Fair Credit Reporting Act and the Health Insurance Portability and Accountability Act, better known as HIPAA, also include restrictions on use and disclosure of the identifiers.
As they pursue new laws, politicians said they're facing a difficult "balancing act" between rooting out abuses of Social Security numbers and protecting uses that tax collectors, the financial sector and law enforcement officials, among others, claim are invaluable.
Numerous industries have found a number of "beneficial uses" for SSNs, said Oliver Ireland, who testified on behalf of the Financial Services Coordinating Council. That group represents trade associations for the banking, securities, and insurance industries.
The numbers, for instance, "are critical for fraud detection," Ireland said in prepared testimony.
Also on Thursday, a California Senate committee approved an identity fraud bill that would improve state residents' ability to freeze their own credit reports when mischief is suspected.
source:http://news.com.com/Congress+may+slap+restrictions+on+SSN+use/2100-7348_3-6071441.html?tag=nefd.top
Poll: Most Americans Support NSA's Efforts
The new survey found that 63 percent of Americans said they found the NSA program to be an acceptable way to investigate terrorism, including 44 percent who strongly endorsed the effort. Another 35 percent said the program was unacceptable, which included 24 percent who strongly objected to it.
A slightly larger majority--66 percent--said they would not be bothered if NSA collected records of personal calls they had made, the poll found.
Underlying those views is the belief that the need to investigate terrorism outweighs privacy concerns. According to the poll, 65 percent of those interviewed said it was more important to investigate potential terrorist threats "even if it intrudes on privacy." Three in 10--31 percent--said it was more important for the federal government not to intrude on personal privacy, even if that limits its ability to investigate possible terrorist threats.
Half--51 percent--approved of the way President Bush was handling privacy matters.
The survey results reflect initial public reaction to the NSA program. Those views could change or deepen as more details about the effort become known over the next few days.
USA Today disclosed in its Thursday editions the existence of the massive domestic intelligence-gathering program, which began soon after the Sept. 11 terrorist attacks. Since then, the agency has collected call records on tens of millions of personal and business telephone calls made in the United States. Agency personnel reportedly analyze those records to identify suspicious calling patterns but do not listen in on or record individual telephone conversations.
Word of the program sparked immediate criticism on Capitol Hill, where Democrats and Republicans criticized the effort as a threat to privacy and called for congressional inquiries to learn more about the operation. In the survey, big majorities of Republicans and political independents said they found the program to be acceptable while Democrats were split.
President Bush made an unscheduled appearance yesterday before White House reporters to defend his administration's efforts to investigate terrorism and criticize public disclosure of secret intelligence operations. But he did not directly acknowledge the existence of the NSA records-gathering program or answer reporters' questions about it.
By a 56 percent to 42 percent margin, Americans said it was appropriate for the news media to have disclosed the existence of this secret government program.
A total of 502 randomly selected adults were interviewed Thursday night for this survey. The margin of sampling error is five percentage points for the overall results. The practical difficulties of conducting a survey in a single night represent another potential source of error.
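The five-point figure quoted above is consistent with the textbook margin-of-error formula for a sample proportion at 95 percent confidence. The sketch below shows the generic calculation only, not the Post's actual methodology; the raw value rounds up to the reported five points, and pollsters typically also allow for design effects.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Textbook 95%-confidence margin of error for a sample proportion.

    p = 0.5 is the conservative (worst-case) assumption pollsters use
    when quoting a single figure for a whole survey.
    Returns the margin in percentage points.
    """
    return z * math.sqrt(p * (1 - p) / n) * 100

moe = margin_of_error(502)
print(f"n=502 -> +/-{moe:.1f} points")  # about +/-4.4, rounded up to 5
```

Note how the margin shrinks only with the square root of the sample size: quadrupling the sample merely halves the error, which is why one-night polls of ~500 adults carry errors this large.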
source:http://www.washingtonpost.com/wp-dyn/content/article/2006/05/12/AR2006051200375_pf.html
Phone tower cancer fears
As staff reacted with shock, the university yesterday shut the top two floors of the Bourke Street building and ordered more than 100 employees to work from home for the next fortnight.
The closure follows the discovery of five brain tumours in the past month and two others in 1999 and 2001. Two were malignant and five were benign.
WorkCover has launched an investigation and RMIT has promised its own inquiry.
The academics' union last night expressed concern that the tumours were caused by the communications towers on the roof of the former Tivoli Theatre site.
National Tertiary Education Union state secretary Matthew McGowan warned that anecdotal reports from hastily arranged staff meetings yesterday suggested the number of people affected would grow.
"You have to ask some pretty serious questions and we're obviously concerned that it could be linked to the tower," he said.
"This would appear to be much more than coincidence and RMIT has a responsibility to leave no stone unturned in seeking the truth."
Five of the seven affected work on the top floor of the 17-storey building. All except one have worked in the building for at least a decade.
An RMIT academic who did not want to be named said staff — the 16th and 17th floors are home to offices of senior management and lecturers — were "in disbelief, concerned and upset" as they attended meetings and left the building late yesterday.
Medical experts contacted by The Age said no definitive link had been proved between mobile phone tower radiation and cancer.
Australian Medical Association president Mukesh Haikerwal said there was no proof of a connection but "if you get clusters of disease it's sensible to investigate."
Dr John Gall, from private health company Southern Medical Services, which has been called in to assess the sick, said last night three of those affected had tumours showing symptoms consistent with radiation.
But he said there was no causal link with the building based on preliminary observations.
A spokesman for state Health Minister Bronwyn Pike said WorkCover would investigate the matter and the Department of Human Services would provide any expertise needed.
RMIT chief operating officer Steve Somogyi said testing was carried out on the building after the first two of the seven tumours were reported in 1999 and 2001. It found radiation and air quality levels within recommended guidelines.
"We value the health and safety of our staff and students very highly. The incidence of illness is disturbing and we shall continue to check for any possible cause connected to the building," Mr Somogyi said.
But RMIT union branch president Jeanette Pierce said the university agreed to shut the two floors only after being pressured by the union. "I'm a bit mystified that the university wasn't planning to vacate and that we had to make the point that they needed to vacate those two floors," she said.
There are more than 160 mobile phone towers in central Melbourne alone. A Telstra spokeswoman said last night the company had two towers at the Tivoli site, but both met health and safety standards and were tested regularly.
"An enormous amount of medical research has been conducted without any substantiated evidence of a link between mobile phone technology and adverse health effects, including cancer," she said.
RMIT management emailed all staff and students late yesterday and said health check-ups and counselling would be made available. About 600 staff work in the building.
Mr McGowan said shutting the two floors should be just the first step. "We think they should be testing all staff who have worked on those levels and not just for tumours. We need to understand what are the health risks that people are suffering," he said.
A help line for students and family members is available on 1800 155 945.
Tanya Stoianoff, the executive director of the Mobile Carriers Forum, which represents mobile phone companies, said there was no credible scientific evidence of health effects from living or working near a mobile phone base station.
source:http://www.theage.com.au/news/national/phone-tower-cancer-fears/2006/05/11/1146940676777.html?page=2
Turning viral videos into a net brand: Interview with Steve Chen and Chad Hurley, co-founders of YouTube
(FORTUNE Magazine) - In just five months, YouTube has gone from beta testing to part of the national zeitgeist. The website is a place where anyone with a home video can post it online and create an endlessly entertaining diversion for bored office workers--who've been watching 40 million clips a day.
Giving new meaning to the value of eyeballs over revenue (of which YouTube has precious little), the startup has raised $11.5 million in venture capital. FORTUNE's Adam Lashinsky recently visited the company's founders, CEO Hurley, 29, and chief technology officer Chen, 27, both early employees of PayPal.
They chatted at YouTube's San Mateo, Calif., office about how they're keeping pornography off the site, what ideas they've got for collecting money, and why they're not scared of Google and Yahoo.
What made you think there was a burning desire out there for do-it-yourself videos?
Hurley: Steve and I were at a dinner party in January 2005, and we were taking digital photos and videos. The next day we found it difficult to share the video files because they were too large to e-mail and it took too much time to get them online. We thought there could be a better way. In February we started developing the product. In May we had our first public preview. And in December we officially launched YouTube. By that time we were serving over three million videos a day. Today it's well over 40 million. We're three to four times larger than Google Video or Yahoo Video Search. And growing much faster too. It's a side project for them.
How do you keep the site G-rated?
Hurley: Well, we say no nudity, obscenity, profanity, or violence.
Chen: More important, it's the community of users themselves. They feel like they've built it up, so they want to try to keep it clean. They let us know when there's content that shouldn't be there, and we take it down.
The important question: How are you going to make money?
Hurley: We're going to sell sponsorships and direct advertisements. But we are building a community, and we don't want to bombard people with advertising.
Chen: If we wanted to, we could instantly turn this into $10 million in revenue per month by running pre-rolls [short video ads] on the videos. But at the same time, we're going to make sure that whatever revenue model we've built is going to be something that's accepted by the users.
Hurley: We're building relationships with studios, networks, and labels because they're looking for ways to reach new audiences, and we have a great platform and a great stage to make that happen.
But it sounds as if you're not ready to talk about any deals in the works. Have you hired a salesperson yet?
Hurley: Not yet. We're really trying to develop something that works for our community first.
source:http://money.cnn.com/magazines/fortune/fortune_archive/2006/05/15/8376860/index.htm
Light's Most Exotic Trick Yet: So Fast it Goes Backwards
In the past few years, scientists have found ways to make light go both faster and slower than its usual speed limit, but now researchers at the University of Rochester have published a paper today in Science on how they've gone one step further: pushing light into reverse. As if to defy common sense, the backward-moving pulse of light travels faster than light.
Confused? You're not alone.
"I've had some of the world's experts scratching their heads over this one," says Robert Boyd, the M. Parker Givens Professor of Optics at the University of Rochester. "Theory predicted that we could send light backwards, but nobody knew if the theory would hold up or even if it could be observed in laboratory conditions."
Boyd recently showed how he can slow a pulse of light to less than the speed of an airplane, or speed it up beyond its usual breakneck pace, using exotic techniques and materials. But he's now taken what was once just a mathematical oddity—negative speed—and shown it working in the real world.
"It's weird stuff," says Boyd. "We sent a pulse through an optical fiber, and before its peak even entered the fiber, it was exiting the other end. Through experiments we were able to see that the pulse inside the fiber was actually moving backward, linking the input and output pulses."
So, wouldn't Einstein shake a finger at all these strange goings-on? After all, this seems to violate Einstein's sacred tenet that nothing can travel faster than the speed of light.
"Einstein said information can't travel faster than light, and in this case, as with all fast-light experiments, no information is truly moving faster than light," says Boyd. "The pulse of light is shaped like a hump with a peak and long leading and trailing edges. The leading edge carries with it all the information about the pulse and enters the fiber first. By the time the peak enters the fiber, the leading edge is already well ahead, exiting. From the information in that leading edge, the fiber essentially 'reconstructs' the pulse at the far end, sending one version out the fiber, and another backward toward the beginning of the fiber."
Boyd is already working on ways to see what will happen if he can design a pulse without a leading edge. Einstein's theory predicts that the entire faster-than-light and reverse-light phenomena will disappear. Boyd is eager to put that prediction to the test.
So How Does Light Go Backwards?
Boyd, along with Rochester graduate students George M. Gehring and Aaron Schweinsberg, and undergraduates Christopher Barsi of Manhattan College and Natalie Kostinski of the University of Michigan, sent a burst of laser light through an optical fiber that had been laced with the element erbium. As the pulse exited the laser, it was split into two. One pulse went into the erbium fiber and the second traveled along undisturbed as a reference. The peak of the pulse emerged from the other end of the fiber before the peak entered the front of the fiber, and well ahead of the peak of the reference pulse.
But to find out if the pulse was truly traveling backward within the fiber, Boyd and his students had to cut back the fiber every few inches and re-measure the pulse peaks when they exited each pared-back section of the fiber. By arranging that data and playing it back in a time sequence, Boyd was able to depict, for the first time, that the pulse of light was moving backward within the fiber.
To understand how light's speed can be manipulated, think of a funhouse mirror that makes you look fatter. As you first walk by the mirror, you look normal, but as you pass the curved portion in the center, your reflection stretches, with the far edge seeming to leap ahead of you (the reference walker) for a moment. In the same way, a pulse of light fired through special materials moves at normal speed until it hits the substance, where it is stretched out to reach and exit the material's other side [See "fast light" animation].
Conversely, if the funhouse mirror were the kind that made you look skinny, your reflection would appear to suddenly squish together, with the leading edge of your reflection slowing as you passed the curved section. Similarly, a light pulse can be made to contract and slow inside a material, exiting the other side much later than it naturally would [See "slow light" animation].
To visualize Boyd's reverse-traveling light pulse, replace the mirror with a big-screen TV and video camera. As you may have noticed when passing such a display in an electronics store window, as you walk past the camera, your on-screen image appears on the far side of the TV. It walks toward you, passes you in the middle, and continues moving in the opposite direction until it exits the other side of the screen.
A negative-speed pulse of light acts much the same way. As the pulse enters the material, a second pulse appears on the far end of the fiber and flows backward. The reversed pulse not only propagates backward, but it releases a forward pulse out the far end of the fiber. In this way, the pulse that enters the front of the fiber appears out the end almost instantly, apparently traveling faster than the regular speed of light. To use the TV analogy again—it's as if you walked by the shop window, saw your image stepping toward you from the opposite edge of the TV screen, and that TV image of you created a clone at that far edge, walking in the same direction as you, several paces ahead [See "backward light" animation].
"I know this all sounds weird, but this is the way the world works," says Boyd.
About the University of Rochester
The University of Rochester (www.rochester.edu) is one of the nation's leading private universities. Located in Rochester, N.Y., the University's environment gives students exceptional opportunities for interdisciplinary study and close collaboration with faculty. Its College of Arts, Sciences, and Engineering is complemented by the Eastman School of Music, Simon School of Business, Warner School of Education, Laboratory for Laser Energetics, and Schools of Medicine and Nursing.
source:http://www.scienceblog.com/cms/lights-most-exotic-trick-yet-so-fast-it-goes-backwards-10590.html
Intel claims 40% performance gain with new Core 2 Extreme processor
Los Angeles (CA) - Up until this year, the enthusiast gamer has been, by definition, a PC user - or, in deference to Apple, a "computer user." This year, thanks in large part to Sony, that's changing. From a marketing perspective, the enthusiast is the person who is more willing to spend money and adopt early. The money an early adopter might invest in a high-end, PlayStation 3-based gaming console is roughly what he might spend on a high-end computer, if not necessarily the top of the line.
For that matter, the PlayStation 3 is essentially a computer, albeit without a keyboard and mouse. But it does, after all, run Linux, as do a growing number of Intel-based PCs. So the comparison is inevitable: Of the two platforms - the PC and the PS3 - which is the better investment from purely a gaming perspective? Intel, of course, has a tremendously biased opinion on this subject. But in the light of more substantive competition than it has ever faced before, will Intel's perennial arguments continue to hold up? Or will subtle changes to those arguments shed some light as to where Intel wants to go with its push into gaming?
The company rented out a conference room to itself at this year's E3 Expo, where it showed off Core 2 Extreme-based desktop and mobile PCs that were clearly cranking out pixels at faster speeds than their predecessors - according to Intel, about 40% faster. Humphrey Cheung and Scott Fulton of TG Daily spoke with David Tuhy, Intel's general manager for its desktop products division; and Jodi Geniesse, the company's communications manager for mobile marketing, about some of the underlying issues governing Intel's new performance message - including lower power - and how it plays out to a new breed of customer who may be moving up from a generation not of PCs, but of consoles.
TG Daily: There are two top-of-the-line processors that are being tested here at E3 for performance. One of them is the Cell. The other one is in this room. Brass-tacks-wise, from the gamer's perspective, why is the Core 2 Extreme better?
David Tuhy: You almost have to look at it from a system level...Cell, which is an IBM and Sony collaboration, is a very, very dedicated processor specifically to their console frame, highly optimized for the pixel processing. Obviously, we're putting our thing into a PC, which is an open environment, and we're trying to advance on standard operating systems, to run games on top of this. So we have to worry about a lot more...
Intel's general manager for desktop platforms, David Tuhy, showing off one example of a portable desktop Viiv PC.
In the Cell environment, it's a Cell processing engine plus a third-party graphics vendor. It's highly optimized to that workload for giving the gaming experience to the console. I think they've done a great job; they've actually got a three-operand flow through their architecture, we actually have two. They can do multiply/accumulates based on their architecture, where we do our multiply through SSE [Streaming SIMD Extensions], and on our Core 2 Extreme, we double the SSE performance. It's actually got a true 128-bit SSE engine, which we market as "Advanced Media Processing."
The PC version of that is a CPU plus a graphics card, sometimes two separate, sometimes more. We have a system out here which has our top-of-the-line microprocessor, the Core 2 (the combined brand for the next-gen CPUs formerly code-named "Conroe" and "Merom"), with a quad Nvidia system for graphics. It's a different price point: It's going after the enthusiast gamers, but it can draw a heck of a lot of polygons. [I don't know off the top of my head] the number of polygons it can draw versus a Cell, but I think it's going to be higher, because there's a lot more bandwidth on the quad system than on the Cell system.
TG Daily: Up to this point, for the last 15 years, the argument in favor of PC gaming has been, "Hey, we have the performance lead. We can put gamers into the experience." Surely, Intel is reliant upon second parties like Nvidia and ATI to help you out getting [polygons drawn], but even that's still part of the Intel architecture. How does that improve and evolve with Core 2 Extreme?
DT: Specifically, we have taken a very different approach with Core than we have taken with any other architecture before on the desktop. We're leveraging what we've learned with the mobile architecture, that we learned back with Centrino, and we improved that in the desktop and the server architecture with 64 bits, virtualization technology, floating-point engines - we bill them together as "Core Engine." So for us, Core 2 is the biggest move forward in terms of an architecture advance that we've had in the last five years...in terms of net gain, both in terms of its performance - which is averaging 40% more in desktop - and also, at the same time, lower power. Usually, that's not what happens for us. Generally, when our performance goes up, so goes the power. We went a different way with a very power-optimized architecture.
The second thing we did is, we didn't require new instructions. We chose to optimize the current instruction set and give it more resources, so the SSE engine is a true 128-bit SSE engine, the cache now has advanced dynamics, it's a four-wide, out-of-order [instruction queue] machine. So several things we've done in the architecture where you can basically take an existing game and it just runs faster. Which is great, because it can plug and play with existing stuff.
Does there really have to be parity between PCs and consoles?
TG Daily: Yesterday at the Microsoft conference, Bill Gates unveiled Live Anywhere, giving people the idea of multi-platform gaming - where some people on a network are playing a game on an Xbox 360, and others are playing the same game on a PC. I'm scratching my head and thinking, doesn't this give the PC player an unfair advantage?
George Alfs, PR manager, Intel: Who said that's unfair?
TG Daily: Well, unfair in my favor, which is not bad.
DT: You get that from two dimensions. You get that from the rendering and the detail in the content, the depth of it and the physics, and you get it from [the fact that] these quad graphics systems demand more performance. These are not $299 systems. These are enthusiast systems. That's one thing that's different. Now, you're talking $599 plus an HDTV, you're getting into $1,000. But what we've seen anyway is, the PC has surpassed the PlayStation 2 in terms of its net polygons. Game consoles have taken a big move forward, and certainly on paper, they can generate a lot more polygons with that product than we can with a single graphics product. I think some of these quad-graphics systems can get there now. You take them and multiply them out by Moore's Law, and unless they do it in console about every two years, these PC systems will go past them...Once you get to that, you can ask yourself, "You know what, if the games are moving forward, if the quality...is getting more 3D-like, it's not getting that jittery stuff that gives people headaches, people who have a PC may have an advantage, because they get a lot more smooth experience and a faster response rate."
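Tuhy's "multiply them out by Moore's Law" argument is simple compounding. A toy illustration follows; the two-year doubling period and the 2x console head start are illustrative assumptions for the sketch, not Intel's figures.

```python
def relative_perf(years, doubling_period=2.0):
    """Performance relative to today, assuming Moore's-Law-style doubling."""
    return 2 ** (years / doubling_period)

# A console fixed at launch with, say, a 2x edge over the contemporary PC
# is matched after one doubling period and overtaken after that.
console_edge = 2.0
for years in (0, 2, 4, 6):
    pc = relative_perf(years)
    status = "PC ahead" if pc > console_edge else "console ahead or even"
    print(f"{years} yrs: PC at {pc:.0f}x -> {status}")
```

This is the crux of Tuhy's claim: a console's hardware is frozen for its multi-year lifetime, so any launch-day advantage erodes at the rate the PC platform compounds.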
You look at a first-person shooter game like Day of Defeat and you see the shadow of the guy coming around the corner, those types of realism effects, if you get a better-quality display and a better-quality engine, those things give you a bit of an advantage, because the other guy doesn't see that shadow as clearly on his background as people are coming around the corner.
TG Daily: Consoles can get their performance because they can wring the best performance of the graphics, the sound, all the different components, and put them into a single box. Do you see Intel maybe one day coming out with their own console? You guys already do integrated graphics. You already have network stuff. I don't see where it would be that big of a jump to maybe do your own console.
DT: No, we haven't made any plans to make a console. Over half of [the job of maintaining a game console] is ecosystem - engagements, licensing, how you can get people to put the appliance in...almost a negative margin. And then to sell titles. In terms of silicon technology, for us, we see it. But to be able to do that other half, that's a big deal. That's something that takes a lot more consideration.
We're very happy with the PC model. I think a lot of people try to say PC gaming is dead, which is totally wrong. PC gaming continues to evolve...The openness of the PC allows a base of innovation from the Internet side and the software side, that just won't stop.
TG Daily: Not in the last two decades have game programmers really had to seriously consider the nuts and bolts of the microprocessor. There haven't been assembly programmers for games for years. The development engines allow developers to use high-level languages and graphical environments. So game developers don't really have a one-on-one meeting with the processor at any point in time. And Intel's message today seems to be, dual-core plus hyperthreading. If I'm a game developer, what are four cores going to get me?
DT: We did demonstrate a quad-core architecture in a server environment, which is extremely valuable for servers, because they can do online transaction processing, which is very parallel, a lot of queries. We've been working with [over 20 partnered] gaming companies to do general threading. Dual-core is, basically, an evolution of hyperthreading in the client space. Quad-core is, how many more parallel workloads can you do? Turns out that media, for sure, is an extremely parallel workload. You could take the screen and divide it into four. It's a very parallelizable workload. What we've been finding is that physics and AI also respond to multithreading. We certainly know that the graphics processors do a majority of the rasterization, the vertex and pixels, but even they use the processors a lot for setup, for [counting and preparing] the polygons...so there's still a graphics part of the processor. And that, as we know, is a very parallel workload, because of the nature of rasterization.
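Tuhy's "divide the screen into four" example maps naturally onto one worker per core. The sketch below illustrates that partitioning in miniature; the frame representation and the per-pixel `shade` function are stand-ins invented for the example, not Intel's or any engine's actual code.

```python
# Sketch of quadrant-parallel frame processing: split a frame into four
# regions and hand each to its own worker, as Tuhy describes for quad-core.
from concurrent.futures import ThreadPoolExecutor

def shade(pixel):
    # Stand-in per-pixel workload (e.g. lighting or post-processing).
    return min(255, pixel + 10)

def process_region(frame, rows, cols):
    # Process one rectangular region independently of the others.
    return [[shade(frame[r][c]) for c in cols] for r in rows]

def render_parallel(frame):
    h, w = len(frame), len(frame[0])
    halves_r = (range(0, h // 2), range(h // 2, h))
    halves_c = (range(0, w // 2), range(w // 2, w))
    quads = [(r, c) for r in halves_r for c in halves_c]  # four quadrants
    with ThreadPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(lambda q: (q, process_region(frame, *q)), quads))
    # Stitch the four processed quadrants back into one frame.
    out = [[0] * w for _ in range(h)]
    for (rows, cols), block in results:
        for i, r in enumerate(rows):
            for j, c in enumerate(cols):
                out[r][c] = block[i][j]
    return out
```

The design point is the one Tuhy makes: because each quadrant touches disjoint pixels, the workers need no synchronization beyond the final join, which is what makes screen-space work so "parallelizable."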
We have run some stuff internally to see what some things might look like, like a Winmark 3D 2006 on a quad-core, and the scores go up significantly. So we know there's inherent value there. We're working with [partner game developers] on threading, so they know that they can scale with the number of threads that come into the system. Because once you've launched a process, as long as you can launch eight of them, they can take their time or, if they each have a core, they can all go through.
TG Daily: Is there a value proposition for the casual gamer to go to multicore?
DT: You know, that's funny, because the more resource-starved a person's PC is, the more you actually benefit from having the dual-core. A lot of people have things going on in the background of their PC. They'll have the game that they're playing in front of them, and then people have a little firewall, or virus protection software, or they'll have little accessories or popups going on in the background. You'll [often] see 50 processes running on a PC today. So what we're finding is, all those background tasks can really be bothersome to someone when they're trying to game, because it interrupts them. So they'll turn all that off. The busiest gamers will get everything out of the Start Menu, every single thing off of their control bar, so they don't get interrupted. Well, dual-core helps that a lot. It takes care of those processes in the background, relieves you, and balances out your system. We're seeing that in the gaming world, and in the corporate world too. People will turn off their virus protection software that I.T. has put on there, because it's so bothersome. Their media slows down, and imagine someone in your corporation turning off their virus protection software. Not a good deal.
[Photo caption: Intel's Jodi Geniesse explains her company's concept of handheld gaming to TG Daily's Humphrey Cheung.]

Jodi Geniesse: We went from Centrino [last] December, to January introducing our first dual-core. We had about a 70% performance increase there. Nine months later, we're talking about shipping on Merom. So from December to August, we're roughly doubling the performance...with a 28% reduction in power. That's an awesome engineering feat. Gaming performance today, versus last year, isn't comparable.
TG Daily: It seems to me, with the evolution of the mobile platform in gaming...in a sense, you're creating a virtual console for yourself.
JG: Yeah. We're starting to see more segmentation in the [mobile] market, moving towards more gaming-specific laptops. These days, Dell and Toshiba, these are the big guys introducing segment-specific gaming laptops. I think the trend - and you see larger OEMs starting to do the same - is towards a mobile gaming form factor.
TG Daily: What aspect of multicores will benefit gaming in the future? I hear a lot about speech recognition...How will multicore help with that?
DT: There's a general category of workloads that we've been watching for quite a while...voice recognition being part of it, visual recognition being part of it. Data mining on the desktop: I actually run a lot of searches on my own desktop for stuff that I know I have. Recognition, identifying things, converting things to different languages, data mining - being able to search through all your photos and find pictures of your kid, or a wedding. We see that happening with virtual workloads, the most popular case being ray tracing, which is the future generation of how people will be doing these games. Today, it's rasterization; but if you look at Hollywood, a lot of the new movies are being made using ray tracing, which is a very different algorithm for rendering the content. That's how they get so much realism into their [films]. That's a very [parallel] workload. For a CPU guy like me, my benefit is that I can jump very quickly. My graphics controller is great at taking a block of data, and doing XORs, comparisons like mad, and then moving that data and drawing it out. So recognition, mining, and synthesis - those workloads are the key to how multicore will evolve. It's financial, it's professional, and it's consumer. It's all three of those.
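The reason ray tracing is "a very parallel workload," as Turner says, is that each pixel's ray is computed independently of every other pixel's. The toy tracer below (not any real renderer's API; all names are hypothetical) fires one camera ray per pixel at a single sphere; since no call to `trace` touches shared state, the per-pixel loop could be handed to any number of cores.

```python
# Toy ray tracer: one independent ray-sphere intersection test per
# pixel. A unit sphere sits at z = -3; the camera is at the origin
# looking down -z. Returns 255 for a hit, 0 for a miss.
import math

def trace(px, py, width=32, height=32):
    # Map the pixel to normalized screen coordinates in [-1, 1].
    x = (px + 0.5) / width * 2 - 1
    y = (py + 0.5) / height * 2 - 1
    # Normalized ray direction through the pixel.
    n = math.sqrt(x * x + y * y + 1.0)
    dx, dy, dz = x / n, y / n, -1.0 / n
    # Ray-sphere test: origin - center = (0, 0, 3), radius 1.
    b = dz * 3.0                 # dot(direction, origin - center)
    c = 3.0 * 3.0 - 1.0          # |origin - center|^2 - radius^2
    disc = b * b - c             # discriminant of the quadratic
    return 255 if disc >= 0 else 0

# Every pixel is independent, so this comprehension could just as
# easily be a pool.map over cores.
image = [[trace(px, py) for px in range(32)] for py in range(32)]
```

Rays through the middle of the screen hit the sphere and shade white; rays toward the corners miss and stay black, producing a disc in the center of the image.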
You said something about the way gaming is developed today. There's actually a gaming engine - the id engine, Half-Life, Valve - and it's really those folks that we need to work most closely with, and we do, to be able to get their engines to respond to multicore, dual-core. That's a big chunk of our research also, to work with those leaders in that technology, and figure out where they want to go, and make our technology intercept it.
source:http://www.tgdaily.com/2006/05/11/intel_makes_its_core2extreme_gaming_case/
Tech Workers of the World, Unite
I was raised in a union house. My father was a passionate supporter of the labor movement and I've never forgotten that the union helped feed and clothe me when I was a kid.
Most of my professional life has been spent as a member of a union (The Newspaper Guild) and I'm under no illusion that the working class would be better off had things been left to the largesse of the bosses, or to the vagaries of a cutthroat, free-market meritocracy. Regardless of the industry, virtually all of the workplace comforts and benefits we take for granted today exist solely because of battles fought and won long ago by once-powerful unions.
Forty years ago, nearly one private-sector worker in three belonged to a union. Today, that number has dwindled to around 10 percent and there's little to suggest that a revival is nigh. Although unions remain fairly strong in the blue-collar world, that world is shrinking. (Can you say "technology"?) Consequently, labor unions don't wield the political clout they used to.
It's sad to see the anemic state of organized labor in this country today. Worse, it kills me to admit that, to a large degree, the erosion of the labor movement is the fault of the unions themselves. Their refusal or inability to change with the times, to keep the movement relevant in the face of globalization and the digital conversion -- the so-called new economy -- has been disastrous.
Disastrous, I might add, for union members and nonunion workers alike.
Just as the Democratic Party has largely ceded the battlefield to Republican stridency in recent years, so, too, has organized labor wilted before an economy where the unrestrained market rules all. The result is unsurprising: The rich get richer, the shareholder is valued more than the employee, jobs are eliminated in the name of bottom-line efficiency (remember when they called firing people "right-sizing"?) and the gulf between the rich and the working class grows wider every year.
You see this libertarian ethos everywhere, but nowhere more clearly than in the technology sector, where the number of union jobs can be counted on one hand. Tech is the Wild West as far as the job market goes and the robber barons on top of the pile aim to keep it that way. They'll offshore your job to save a few bucks or lay you off at the first sign of a slump, but they're the first to scream, "You're stifling innovation!" at any attempt to control the industry or provide job security for the people who do the actual work.
source:http://www.wired.com/news/columns/theluddite/0,70858-0.html
Dell Cheating on the Direct-Sales Model?
source:http://hardware.slashdot.org/article.pl?sid=06/05/11/1654257
Microsoft sides with Nintendo in fight vs Sony
LOS ANGELES (Reuters) - Microsoft Corp. (MSFT.O) sided with rival Nintendo Co. Ltd. (7974.OS) on Wednesday in the fight to unseat video game leader Sony Corp. (6758.T), saying many consumers will choose to buy both of their machines for the price of one PlayStation 3.
The bad blood between Sony and Microsoft in the fight for dominance in the nearly $30 billion video game industry has escalated over the last few days, with both sides trading barbs at the E3 Expo, the video game industry's annual trade show.
Microsoft entered the next-generation game console market first with its Xbox 360 last November.
Sony aims to extend its market leadership with its upcoming PS3, while Nintendo plans to offer a new game machine called Wii in the fourth quarter.
"Tell me why you would buy a $600 PS3?" Peter Moore, a Microsoft vice president, said in an interview. "People are going to buy two (machines). They're going to buy an Xbox and they're going to buy a Wii ... for the price of one PS3."
Microsoft predicted on Tuesday it will have 10 million Xbox 360 consoles in the market before Sony launches the PS3. The high-end Xbox 360 sells for $399, but it does not include a built-in high-definition DVD video player that comes with Sony's PS3.
Sony plans to sell a premium PS3 model for $599 when it debuts in North America on November 17, and Nintendo has not yet disclosed pricing for Wii.
Wii comes equipped with motion sensitive controllers to allow users to mimic the motion of wielding a sword or swinging a tennis racket.
Moore then turned pitchman for Nintendo's Wii, the latest offering from the Japanese company that once dominated the video game industry.
"People will always gravitate toward a competitively priced product -- like what I believe Wii will be -- with innovative new designs and great intellectual property like Mario, Zelda and Metroid," Moore told Reuters.
Sony currently dominates the worldwide video game market with a 66 percent share, while Microsoft and Nintendo each hold 17 percent, according to Strategy Analytics.
"We have 100 percent market share of the next-generation, and their job is to take that from us," said Moore.
"When I think on everything that we've got going right now that is real versus what Sony promises to do six, seven months from now, obviously we feel very good about where we stand."
LOST IN TRANSLATION
Despite Microsoft's head start with the Xbox 360, the software giant still faces an uphill climb in Sony and Nintendo's home turf.
Microsoft received a tepid response to its Xbox 360 debut in Japan and demand fell short of expectations during last year's holiday season when it sold about 100,000 machines.
The company introduced a competitively priced console in Japan, but some of its game titles did not appeal to Japanese gamers. Moore expects upcoming role-playing games like "Lost Odyssey" and "Blue Dragon" from the creator of the popular "Final Fantasy" series to do well in Japan.
"Quite frankly," said Moore, "if we're sitting here a year from now and things continue to fall flat, then we might say 'we don't know what to do anymore."'
source:http://www.washingtonpost.com/wp-dyn/content/article/2006/05/11/AR2006051100612.html