Wednesday, April 12, 2006

Top 10 best jobs

MONEY Magazine and Salary.com researched hundreds of jobs, considering their growth, pay, stress levels and other factors. These careers ranked highest.

source:http://money.cnn.com/magazines/moneymag/bestjobs/

The 2 Megapixel Story

Camera phones are fast replacing regular phones. The vengeance with which manufacturers are integrating cameras into handsets has ensured that even handsets aimed at business users, such as the Nokia 9300i and the O2 Atom/Mini, incorporate some pretty heavy-duty cameras.

Camera integration has in fact reached such a stage that any self-respecting phone incorporates at least a megapixel camera. The cutting-edge feature to have, though, is the 2-megapixel variety. The question is: is it really worth the extra money you pay for it? Without getting into model-by-model comparisons, I am questioning the entire range of 2-megapixel camera phones. Are they really worth it?

For the most part, no.

Despite the cameras being 2-megapixel, the image quality isn't really that impressive. Whether you take the K750i, the W800i (essentially the same phone), the N90, Samsung's D600, LG's P7200 or even the new RAZR, the photographs are not yet clear enough to replace my need for a digicam. Sure, one of these lets me take better shots when I am not carrying my camera, but the results still don't rival a dedicated camera's.

Even if we set aside the image quality, manufacturers tell us we should carry 2-megapixel phones to capture life's moments as they come at us without notice. I disagree, for a simple reason: all these phones take so long to get into camera mode that by the time you are ready to click, the moment is long gone. Take the N90 or the P7200, for example: it can take between 15 and 30 seconds for the phone to get into camera mode. So much for capturing "sudden" moments…

The next hindrance relates to usage. How many of you actually take enough pictures to warrant a 2-megapixel camera? By enough pictures, I mean specifically pictures you would like to keep on your PC, take color printouts of and/or share with friends and family via e-mail.

I kept a 2-megapixel camera phone with me for three months. In those months, I took all of 20 pictures, and if I hadn't taken them, I really wouldn't have missed much.

So what it comes down to is this: some of you will argue that you shoot a lot more photos than I do, or that you don't mind paying the extra money manufacturers are charging, but the fact is that the latest generation of camera phones holds no value, at least from a photography standpoint.

They are slow, they are expensive, and the camera, while great for a phone, is nowhere near something I would want to rely on. Why not purchase a regular phone and a Nikon S1, S5 or Sony T9? They are compact enough not to make your pockets bulge and will definitely give you far better results.

Manufacturers need to do a few things to make a convert out of me. First, give me a faster interface. While phone operating systems have become richer, the hardware hasn't changed much, and it shows. The situation deteriorates even further the minute you get into camera mode, and that's simply unacceptable.

Second, give me longer battery life. If I can only take 30-odd photos before the battery requires a recharge, then I can neither use the camera nor stay in touch using my phone.

And finally, if the camera is on a phone, I want it to be PictBridge compatible, or at least give me some good printers that print wirelessly over Bluetooth so that I can cut the PC out altogether.

It's great to see manufacturers making strides toward improving camera phones and their overall quality, but there's no value in these gadgets that warrants such steep price tags.

source:http://www.cooltechzone.com/Departments/Featured_Story/The_2_Megapixel_Story_200604122285/


Google Voice Search May be Coming Soon

"The master of text-based search could be looking to lend a voice to Internet users everywhere, or so it appears based on Google's latest patent. Patent #7,027,987, issued today by the US Patent and Trademark Office, covers a 'Voice interface for a search engine.'"

source:http://slashdot.org/articles/06/04/12/0342216.shtml

An overview of virtualization technology

Virtualization is a hot topic in the enterprise space these days. It's being touted as the solution to every problem from server proliferation to CPU underutilization to application isolation. While the technology does indeed have many benefits, it's not without drawbacks.

With all the new vendors entering the space, you have a range of choices. It can be tough to pick the right one for your needs. VMware has been around as long as any of the other vendors and continues to deliver new products, including free ones like VMware Player and Server. Microsoft bought its way into the virtualization game several years ago when it purchased Connectix and its Virtual PC product. Microsoft has since delivered a Virtual Server product and a newer version of the Virtual PC product.

Xen is an open source project started at the University of Cambridge. It has been adopted by a number of different Linux distributions, including Novell's SuSE Linux Enterprise Server. Novell is putting all its eggs in the Xen basket, hoping that Xen will attract more interest in the company's enterprise server product.

While you can run Novell's NetWare operating system (OS) on any of the VMware products, you won't get the same level of performance as you will when running it under Xen on the next version of Open Enterprise Server (OES) due out in early 2007. Novell is investing lots of effort in optimizing Xen specifically for running a virtualized copy of NetWare on top of Linux. The company's goal is to provide its customers with a migration path over to the Linux platform without giving up NetWare.

Technical differences
While many technical details of virtualization are similar across products, the implementations take different approaches to the same underlying problems. The three architectures are labeled single OS image, full virtualization and para-virtualization.

Products following the single-OS-image approach include Virtuozzo, Linux-VServer and Solaris Zones. This method groups user processes into resource containers and manages their access to physical resources. While this approach can scale well, it is hard to get strong isolation between the different containers.

In the full virtualization camp you'll find vendors like VMware and Microsoft, along with the open source project QEMU. With this approach the entire operating system and applications are virtualized as a guest OS operating on top of the host OS. The primary advantage to this approach is that you can run any number of different guest OSes on a single host. On the downside, the x86 architecture does not lend itself to efficient virtualization.

Para-virtualization is the name for the technique whereby you modify the guest operating system to make low-level calls directly to the virtualization layer, rather than trapping its privileged instructions. Xen and User-mode Linux (UML) use this approach. Advantages include performance, scalability and manageability. Para-virtualization also helps in taking advantage of the newer AMD and Intel CPUs that have hardware virtualization capabilities.
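The difference between the last two approaches can be sketched as a toy model: under full virtualization the hypervisor silently traps and emulates the unmodified guest's privileged operations, while a para-virtualized guest has been changed to call the hypervisor explicitly. Everything below is illustrative pseudocode-in-Python, not taken from any real hypervisor.

```python
# Conceptual sketch (not real hypervisor code) contrasting full
# virtualization with para-virtualization. All names are illustrative.

class Hypervisor:
    def __init__(self):
        self.log = []

    # Full virtualization: the guest runs unmodified; a privileged
    # instruction traps into the hypervisor, which decodes and emulates it.
    def trap_and_emulate(self, instruction):
        self.log.append(f"trap: emulated {instruction}")

    # Para-virtualization: the modified guest kernel invokes the
    # hypervisor directly, skipping the trap-and-decode step.
    def hypercall(self, operation):
        self.log.append(f"hypercall: {operation}")

hv = Hypervisor()

# An unmodified guest issuing a privileged instruction (full virtualization):
hv.trap_and_emulate("write page table entry")

# A para-virtualized guest making an equivalent request as an explicit call:
hv.hypercall("update page table mapping")

print("\n".join(hv.log))
```

The trap path is what makes full virtualization on x86 costly, which is why the article notes that the architecture "does not lend itself to efficient virtualization."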

What's best?
The answer depends on what you're trying to accomplish. If you're a developer looking for a flexible way to test your application in multiple environments, you'll probably want to go with either Virtual PC or VMware Workstation. If you want to use Linux as your host OS, you'll have to go with VMware, since Virtual PC won't run on a Linux host.

If you're trying to solve one of the server-based issues like consolidation or application isolation, you'll want to go with a server solution. The decision then comes down to your key requirements. Cost, performance, platform independence and OS support are typical non-negotiable requirements in many shops. Supportability is another major concern, though it can be set aside with a vendor like Novell standing behind an option like Xen.

An open source solution will win the cost battle almost every time, until you look at what it takes to implement and where you can turn for support. Most enterprise IT organizations want more than Internet newsgroups and IRC channels for technical support. That points to a solution such as Novell's that mixes the best of both worlds.

source:http://searchopensource.techtarget.com/tip/1,289483,sid39_gci1179320,00.html


Tiny Flyer Navigates Like Fly

April 11, 2006— An ultralight autonomous aircraft that mimics the navigational abilities of a fly could one day become a real fly on the wall.

The 10-gram microflyer, being developed by a team of researchers led by Dario Floreano at the Swiss Federal Institute of Technology in Lausanne, has a 36-centimeter (14-inch) wingspan.

But it could one day be shrunk to insect size and used for search and rescue.

"A lot of groups are taking inspiration from insects but none of them have been able to reach that with an indoor flying system," said research scientist and project leader, Jean-Christophe Zufferey.

Autonomous indoor flight presents scientists with particular technological challenges that nature has already overcome.

"Indoor environments are really tough," said Erik Steltz, a PhD candidate in electrical engineering at the University of California, Berkeley. "There are so many things to bang off of I believe this is the best approach out there to do indoor guidance for aircraft."

For example, in order to zip around indoor obstacles — walls, corners, bookcases, furniture, ceilings, etc. — a flyer needs to see the objects and have the brain power to steer away.

On a more conventional robot, this typically requires powerful computing resources, high-resolution cameras, or some kind of distance sensor such as a laser range finder system.

But these components take up precious weight and the heavier an aircraft, the faster it must fly to stay aloft, making indoor navigation all the more challenging.

A fly navigates using its large, compound eyes, which let it see almost an entire field of view at once. Its optic lobes contain motion-sensitive neurons that respond to images moving across the retinas.

Those moving images, the so-called optic flow, combine data that the insect perceives as it flies straight, and data it senses from other motions such as turning, bobbing, or tilting side to side.

The visual data that comes as the insect is flying straight — for example, that a bookcase in its field of view is getting bigger — automatically contains information about the distance to that object.
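That claim can be made concrete with a little arithmetic: the ratio of an object's angular size to its rate of expansion gives the time remaining before contact, and with a known flight speed, time to contact translates into distance. The sketch below uses made-up numbers for the object size, approach speed and frame interval; it is an illustration of the principle, not the researchers' actual algorithm.

```python
import math

# Looming sketch: as a flyer approaches an object of fixed size, the object's
# angular size grows; angular size divided by its growth rate estimates the
# time remaining before contact. All numbers below are illustrative.

object_width = 0.8   # meters (e.g. a bookcase), assumed
speed = 1.5          # approach speed in m/s, assumed
dt = 0.1             # time between two "frames", assumed

def angular_size(distance):
    """Angular size (radians) of the object seen from a given distance."""
    return 2 * math.atan(object_width / (2 * distance))

d1, d2 = 6.0, 6.0 - speed * dt            # distance at two successive frames
theta1, theta2 = angular_size(d1), angular_size(d2)

# time-to-contact estimate: angular size / rate of expansion
tau = theta2 * dt / (theta2 - theta1)
print(f"estimated time to contact: {tau:.2f} s (true: {d2 / speed:.2f} s)")
```

The estimate comes out close to the true value, which is why a straight-flying insect gets distance information "for free" from the growing image alone.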

In fact, flies tend to navigate in relatively straight lines until they get too close to an obstacle. Then they make a quick 90° turn away from the obstruction and continue flying straight again.
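The straight-line-plus-saccade strategy just described is simple enough to simulate. The toy controller below moves an agent in a straight line inside a 7 x 7-meter room (the size used in the experiment) and makes a quick 90-degree turn whenever a wall gets too close. The speed and trigger distance are assumed values; nothing here comes from the researchers' actual control code.

```python
import math

# Toy "fly straight, then saccade" controller in a 7 x 7-meter room.
ROOM = 7.0       # room size in meters, as in the experiment
SPEED = 0.1      # distance traveled per step (assumed)
TRIGGER = 0.5    # saccade when a wall would be closer than this (assumed)

def distance_to_nearest_wall(x, y):
    return min(x, y, ROOM - x, ROOM - y)

def step(x, y, heading):
    """Advance one step; make a 90-degree saccade if a wall looms too close."""
    nx = x + SPEED * math.cos(heading)
    ny = y + SPEED * math.sin(heading)
    if distance_to_nearest_wall(nx, ny) < TRIGGER:
        heading += math.pi / 2               # quick 90-degree turn away
        nx = x + SPEED * math.cos(heading)
        ny = y + SPEED * math.sin(heading)
    return nx, ny, heading

trace = []
x, y, heading = 3.5, 3.5, 0.3                # start at the center
for _ in range(5000):
    x, y, heading = step(x, y, heading)
    trace.append((x, y))
```

Run for a few thousand steps, the agent circles the room without ever hitting a wall, which is the essential behavior the microflyer reproduces with optic flow instead of an explicit wall-distance oracle.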

Flies also possess two organs, called halteres, which help them fly without flipping over.

Zufferey and his colleagues incorporated technology into their robot that mimics all of these things.

To mimic the fly's vision, the researchers installed two tiny, low-resolution cameras, one over each wing. A microchip-sized gyroscope keeps the microflyer stable.

Onboard signal processing and control software give the autonomous vehicle its insect-like behavior.

In their most recent experiment, the researchers tested the aircraft in a 7 x 7-meter (23 x 23-foot) room with walls painted in vertical strips of black and white.

The microflyer navigated on its own for nearly five minutes.

Zufferey and his colleagues are currently working to reduce the size of the aircraft to the size of a housefly and to give it the ability to adjust its altitude.

source:http://dsc.discovery.com/news/briefs/20060410/flybot_tec.html


Wal-Mart Controls Modern Game Design?

"That Wal-Mart smiley face is looking pretty evil now that Allen Varney has explained how much influence they have on virtually every modern game: 'Publisher sales reps inform Wal-Mart buyers of games in development; the games' subjects, titles, artwork and packaging are vetted and sometimes vetoed by Wal-Mart. If Wal-Mart tells a top-end publisher it won't carry a certain game, the publisher kills that game. In short, every triple-A game sold at retail in North America is managed start to finish, top to bottom, with the publisher's gaze fixed squarely on Wal-Mart, and no other.'"

source:http://games.slashdot.org/article.pl?sid=06/04/11/1759205

Pentium computers vulnerable to cyberattack

Security experts warn of that and other risks at CanSecWest/core06

VANCOUVER, British Columbia —The built-in procedure that Intel Pentium-powered computers use to blow off their digital steam could put users in hot water by making the machines vulnerable to cyberattacks, computer security researchers announced at the CanSecWest/core06 conference last week.

When the processor begins to overheat or encounters other conditions that could threaten the motherboard, the computer interrupts its normal operation, momentarily freezes and stores its activity, said Loïc Duflot, a computer security specialist for the French government’s Secretary General for National Defense information technology laboratory.

Cyberattackers can take over a computer by appropriating that safeguard to make the machine interrupt operations and enter System Management Mode, Duflot said. Attackers then enter the System Management RAM and replace the default emergency-response software with custom software that, when run, will give them full administrative privileges.

Every computer that runs on x86 chip architecture may be vulnerable to this attack, including the millions of computers that the U.S. government and industry use, said Dragos Ruiu, the conference organizer. He is a Canadian computer security consultant for businesses, governments and the U.S. military.

CanSecWest is an informal annual gathering for hard-core code gurus who create the software that businesses and governments use. The conference presented the latest in what hackers — both helpful and malicious — are doing in IT security, said Eric Byres, a member of the research faculty at the British Columbia Institute of Technology.

A growing number of cyberattacks are targeting Web applications, an area of concern widely discussed during the conference. That ties in to the rapid spread of voice-over-IP technology.

“Have vendors even heard of Web application security?” asked Nicolas Fischbach, senior manager for network engineering security at COLT Telecom, who gave a presentation on VOIP security issues.

VOIP vendors are so driven to beat the competition to market and include new features that they have no idea how to write secure Web applications, he said. He predicted it will take years for such applications to be made secure retroactively.

Among the numerous topics presented at the conference, IPv6 has the most long-term significance because it will probably still be in use 50 years from now, Ruiu said.

IPv6 has 128-bit addresses, a huge step up from IPv4’s 32-bit addressing scheme. That means IPv6 could provide millions of unique addresses for each person on Earth, along with “every toaster, door and window,” said van Hauser, an alias for the security team leader at n.runs, a German security company, and founder of The Hacker’s Choice, a hacker group.
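The size of that step up is easy to check with a quick calculation; "millions per person" is, if anything, a vast understatement. The world-population figure below is the rough 2006 estimate of 6.5 billion.

```python
# Back-of-the-envelope check of the IPv6 address-space claim.
# Assumption: world population of roughly 6.5 billion (the 2006 estimate).

ipv4_total = 2 ** 32        # 32-bit IPv4 address space
ipv6_total = 2 ** 128       # 128-bit IPv6 address space
population = 6_500_000_000

per_person = ipv6_total // population

print(f"IPv4 addresses total: {ipv4_total}")      # about 4.3 billion
print(f"IPv6 addresses total: {ipv6_total:.2e}")  # about 3.4e38
print(f"IPv6 addresses per person: {per_person:.2e}")
```

The per-person share works out to more than 10^28 addresses, enough for every toaster, door and window many times over.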

With all those IP addresses, IPv6 resists most traditional worm attacks that rely on randomly finding active addresses, said van Hauser. IPv6 users must enable IP security protocols, he warned, because without them, attackers can use IPv6’s hierarchical structure to get immediate access to Domain Name Servers and other critical system components.

source:http://www.fcw.com/article94010-04-10-06-Print

