Monday, February 06, 2006

Taking On the Database Giants

Can open-source upstarts compete with Oracle, IBM, and Microsoft? It's an uphill battle, but customers are starting to look at the alternatives

Rick Herman is awash in digital data. He's vice-president for business and legal affairs at Sony Online Entertainment, the division of Sony (SNE) that handles Internet gaming. It's his job to ensure that the company's databases manage that information properly and on budget. To do that, Herman has traditionally relied on one of the three main database vendors: Oracle (ORCL), IBM (IBM), or Microsoft (MSFT). But lately, he's doing a bit more shopping around.

It's not that the big three don't do a good job. There's a reason they have 85% of the $15 billion database market, notes Herman. But as you would expect in any market dominated by a few players, Herman is on the lookout for better prices and more flexibility to tailor database features for Sony's unique needs.

OPENING MINDS.
Herman's search has taken him in an unexpected direction. He's spending a lot of time evaluating databases built around the open-source software that's disseminated and developed freely over the Internet. Sony, like most big companies, has been conservative when it comes to open source.

But that has changed since Linux, the open-source operating system, started making big inroads with servers, the computers that run Web sites and corporate networks. Those gains have let companies see firsthand the benefits of open source, which include lower costs and more control over the code.

Switching to an open-source database can slash costs for one of the most expensive segments of the software budget by as much as 90%. "If you had told us four or five years ago we would be considering these types of products at the rate we are, I would have looked at you like you were insane," he says. "But open source isn't going away, and I'm pretty excited about it."

So now Herman and executives like him are the spoils in what's shaping up to be a heated round of database wars. On one side are the defending champions -- Oracle, IBM, and Microsoft -- against a ragtag bunch of coders and some more organized corporate ventures, all going to market in different ways, but all trying to take down the Big Three using the power of open source.

NEW PLAYERS.
This isn't the first time tech titans have defended the database turf. Oracle, now the world's No. 1 database vendor, was the clear winner in a round in the 1980s. It succeeded largely due to better marketing and execution. In the early 1990s, amid a shift in technology, a new raft of nimble competitors such as Sybase (SY) again tried to knock Oracle back on its heels -- and largely failed.

This time around, the dominant players still won't cede ground easily, especially Oracle. It's the longtime market leader with a 41% share of the biggest part of the market, relational databases, which Oracle pioneered in the late 1970s. Growth is slowing -- hence Oracle's $19 billion investment in applications -- but revenue from databases is still its lifeblood.

Open-source databases come in a wide variety of flavors. It's not like when Linux fought the server wars and the choice was simply Linux against a proprietary incumbent. Now it's proprietary software against the likes of Postgres, Ingres, or MySQL.

Of these, Sweden's MySQL has the most momentum. It has been at the open-source database game for a decade, but only in the last few years has that hard work started to pay off. Its free database has been downloaded 100 million times since it was released. A souped-up version, 5.0, was released last October and has been downloaded more than 4 million times.

ORACLE ON BOARD?
But staggering as those numbers may seem, very few translate to paying customers. Of every 1,000 downloads, one database is actually implemented, and only one customer ends up paying for service and support. MySQL's annual sales are just $40 million. Still, revenue has been doubling year-over-year, and some think MySQL could be the next big open-source company to go public. "We think it would make sense," says MySQL Chief Executive Marten Mickos. "But are we in a hurry? No."

The market got considerably more crowded in November. Ingres, a loser in the first round of database wars, was spun out of Computer Associates (CA). The open-source community around Postgres, based on old Ingres code, published a new release of its database software. And a startup using the Postgres code, EnterpriseDB, started making hay in the market after launching in August.

Even Oracle announced a free scaled-down version of its database for developers to play with, though the code isn't open. The reaction of MySQL's Mickos? Told you so. "We said a long time ago that most databases will go open-source once people realize that we are not just a glorified file system and that people would jump on this train," he says.

ASSEMBLY REQUIRED.
Each upstart is going at the market differently. MySQL freely admits that its database is totally different from the high-end offerings of IBM and Oracle. It's trying to be the Ikea of the database world: cheap, needs some assembly, but has a sleek, modern design and does the job. "If you are equipping some very fine room for your old relatives, maybe you buy antique furniture. But everyone else uses Ikea," Mickos says.

Even though some contend that its cheap databases are putting pressure on prices, MySQL is also expanding the market, bringing powerful technology to small businesses or niche divisions of big companies that never could have afforded a database before. MySQL largely relies on customers who have downloaded the database to contact it when they want support, cutting out the costly direct sales force that most enterprise software companies employ.

Ingres, on the other hand, looks more like a traditional software company and is going after Oracle head-on. It was purchased out of CA by private-equity firm Garnett & Helfrich Capital and is being run by Terry Garnett until a permanent chief executive is hired. Garnett is no stranger to the database world. In the early 1990s, he started work at Oracle on the same day his wife quit the company to head development at scrappy up-and-comer Sybase.

GET A LIFE.
This time Garnett is the one with the scrappy upstart. Ingres is private, flush with cash, and already boasts a sizable base of paying customers. His first priority is keeping them happy. Nearly half of his team members are ex-Oracle folks, who are well versed in what made Oracle the victor the first time around.

Bowater, a $3 billion paper and pulp manufacturer, says it's happy to stick with Ingres. J. Tyler McGraw, Bowater's staff database administrator, says Oracle is too complex. "I administrate over 40 databases across the company, and I couldn't do it on Oracle," he says. "I took an Oracle class one time, and I thought, 'How do you guys have a life?'" The big question is whether Garnett & Co. can expand that base. A big first hurdle is rebuilding the Ingres brand, Garnett says.

Then there's EnterpriseDB, a company built around Postgres. Its total downloads to date are only a bit bigger than what MySQL sees in a given day. However, some say EnterpriseDB has the most innovative business model of the three: Its products look and feel exactly like Oracle's, so switching requires almost no downtime to retrain database administrators.

"THROWING ROCKS."
One thing all three have in common: They're all tuned in to the kinds of rumblings that Sony Online Entertainment's Herman had: The databases work fine, but as data volume grows, so do the checks to Oracle, IBM, or Microsoft. Many users aren't clamoring for more features, and some don't even use the bells and whistles they already paid for. They would happily trade some to get their hands on the source code and a better deal. "We're all throwing rocks at the same empire and trying to get customers to see that by default you don't go with this one vendor," Garnett says.

Oracle sharply disagrees. Robert Shimp, its vice-president for technology marketing, says his customers prize innovation, and that's a game Oracle won in the 1990s and continues to excel at. He also argues that customers want to buy more software from fewer vendors, to simplify their lives. That's why Oracle has been aggressively building its position in other parts of business software, such as middleware and applications, through purchases of Siebel, PeopleSoft, J.D. Edwards, and others.

Who's right? Analysts say both are. One group of customers wants the latest in databases and will pay up for it. And -- as the runaway downloads of MySQL show -- another group is hungry for a different way.

CATCHING ON?
The wild card in all of this will be whether big, successful tech companies get behind the upstarts. Linux hit prime time only when IBM, Oracle, and others got behind it, rewriting their software to make it compatible and convincing worried CIOs that it was robust and reliable enough to entrust their business to it.

A company such as SAP (SAP) could be pivotal. The German software giant is locked in an applications war with Oracle, but the bulk of companies running SAP applications run them on Oracle databases. So even when SAP wins an application deal, it's often making money for its archrival. That doesn't sit well with ultracompetitive SAP, which already has a burgeoning partnership with MySQL. Closer ties there could mean more SAP applications on MySQL databases. Elsewhere, Red Hat (RHAT) has endorsed both MySQL and Postgres, as did Sun Microsystems (SUNW) last November.

Microsoft has never been a friend to the open-source movement, and don't expect much backing from IBM or Oracle either. As the latest round of the database wars heats up, they have far too much to lose.

source:http://www.businessweek.com/technology/content/feb2006/tc20060206_918648.htm

PayPal vs Google(Buy)

While Google Chief Executive Eric Schmidt confirmed in press accounts that the company was building a payment service, Mr. Schmidt also denied it would directly compete with PayPal. Mr. Schmidt said Google didn't intend to offer a "person-to-person, stored-value payments system," which many people consider a description of PayPal's service. Mr. Jordan (PayPal's chief) says he and his team immediately "dissected the wording" of Google's statements. He says he doesn't believe Mr. Schmidt... There's also a more in-depth WSJ article about the service.

source:http://slashdot.org/articles/06/02/06/1429254.shtml

Understanding memory usage on Linux

This entry is for those people who have ever wondered, "Why the hell is a simple KDE text editor taking up 25 megabytes of memory?" Many people are led to believe that many Linux applications, especially KDE or Gnome programs, are "bloated" based solely upon what tools like ps report. Whether that is true depends on the program, but in general it is not -- many programs are much more memory efficient than they seem.

What ps reports
The ps tool can output various pieces of information about a process, such as its process ID, current running state, and resource utilization. Two of the possible outputs are VSZ and RSS, which stand for "virtual size" and "resident set size" and are commonly used by geeks around the world to see how much memory processes are taking up.

For example, here is the output of ps aux for KEdit on my computer:

USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
dbunker 3468 0.0 2.7 25400 14452 ? S 20:19 0:00 kdeinit: kedit

According to ps, KEdit has a virtual size of about 25 megabytes and a resident size of about 14 megabytes (both numbers above are reported in kilobytes). It seems that most people like to randomly choose to accept one number or the other as representing the real memory usage of a process. I'm not going to explain the difference between VSZ and RSS right now but, needless to say, this is the wrong approach; neither number is an accurate picture of what the memory cost of running KEdit is.

Why ps is "wrong"
Depending on how you look at it, ps is not reporting the real memory usage of processes. What it is really doing is showing how much real memory each process would take up if it were the only process running. Of course, a typical Linux machine has several dozen processes running at any given time, which means that the VSZ and RSS numbers reported by ps are almost definitely "wrong". In order to understand why, it is necessary to learn how Linux handles shared libraries in programs.

Most major programs on Linux use shared libraries to facilitate certain functionality. For example, a KDE text editing program will use several KDE shared libraries (to allow for interaction with other KDE components), several X libraries (to allow it to display images and handle copying and pasting), and several general system libraries (to allow it to perform basic operations). Many of these shared libraries, especially commonly used ones like libc, are used by many of the programs running on a Linux system. Thanks to this sharing, Linux can use a great trick: it loads a single copy of a shared library into memory and uses that one copy for every program that references it.

For better or worse, many tools don't take this very common trick into account; they simply report how much memory a process uses, regardless of whether that memory is shared with other processes. Two programs could therefore use a large shared library and yet have its size count towards both of their memory usage totals; the library is being double-counted, which can be very misleading if you don't know what is going on.

Unfortunately, a perfect representation of process memory usage isn't easy to obtain. Not only do you need to understand how the system really works, but you need to decide how you want to deal with some hard questions. Should a shared library that is only needed for one process be counted in that process's memory usage? If a shared library is used by multiple processes, should its memory usage be evenly distributed among the different processes, or just ignored? There isn't a hard and fast rule here; you might have different answers depending on the situation you're facing. It's easy to see why ps doesn't try harder to report "correct" memory usage totals, given the ambiguity.

Seeing a process's memory map
Enough talk; let's see what the situation is with that "huge" KEdit process. To see what KEdit's memory looks like, we'll use the pmap program (with the -d flag):

Address Kbytes Mode Offset Device Mapping
08048000 40 r-x-- 0000000000000000 0fe:00000 kdeinit
08052000 4 rw--- 0000000000009000 0fe:00000 kdeinit
08053000 1164 rw--- 0000000008053000 000:00000 [ anon ]
40000000 84 r-x-- 0000000000000000 0fe:00000 ld-2.3.5.so
40015000 8 rw--- 0000000000014000 0fe:00000 ld-2.3.5.so
40017000 4 rw--- 0000000040017000 000:00000 [ anon ]
40018000 4 r-x-- 0000000000000000 0fe:00000 kedit.so
40019000 4 rw--- 0000000000000000 0fe:00000 kedit.so
40027000 252 r-x-- 0000000000000000 0fe:00000 libkparts.so.2.1.0
40066000 20 rw--- 000000000003e000 0fe:00000 libkparts.so.2.1.0
4006b000 3108 r-x-- 0000000000000000 0fe:00000 libkio.so.4.2.0
40374000 116 rw--- 0000000000309000 0fe:00000 libkio.so.4.2.0
40391000 8 rw--- 0000000040391000 000:00000 [ anon ]
40393000 2644 r-x-- 0000000000000000 0fe:00000 libkdeui.so.4.2.0
40628000 164 rw--- 0000000000295000 0fe:00000 libkdeui.so.4.2.0
40651000 4 rw--- 0000000040651000 000:00000 [ anon ]
40652000 100 r-x-- 0000000000000000 0fe:00000 libkdesu.so.4.2.0
4066b000 4 rw--- 0000000000019000 0fe:00000 libkdesu.so.4.2.0
4066c000 68 r-x-- 0000000000000000 0fe:00000 libkwalletclient.so.1.0.0
4067d000 4 rw--- 0000000000011000 0fe:00000 libkwalletclient.so.1.0.0
4067e000 4 rw--- 000000004067e000 000:00000 [ anon ]
4067f000 2148 r-x-- 0000000000000000 0fe:00000 libkdecore.so.4.2.0
40898000 64 rw--- 0000000000219000 0fe:00000 libkdecore.so.4.2.0
408a8000 8 rw--- 00000000408a8000 000:00000 [ anon ]
... (trimmed) ...
mapped: 25404K writeable/private: 2432K shared: 0K

I cut out a lot of the output; the rest is similar to what is shown. Even without the complete output, we can see some very interesting things. One important thing to note about the output is that each shared library is listed twice; once for its code segment and once for its data segment. The code segments have a mode of "r-x--", while the data is set to "rw---". The Kbytes, Mode, and Mapping columns are the only ones we will care about, as the rest are unimportant to the discussion.

If you go through the output, you will find that the lines with the largest Kbytes number are usually the code segments of the included shared libraries (the ones that start with "lib" are the shared libraries). What is great about that is that they are the ones that can be shared between processes. If you factor out all of the parts that are shared between processes, you end up with the "writeable/private" total, which is shown at the bottom of the output. This is what can be considered the incremental cost of this process, factoring out the shared libraries. Therefore, the cost to run this instance of KEdit (assuming that all of the shared libraries were already loaded) is around 2 megabytes. That is quite a different story from the 14 or 25 megabytes that ps reported.
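
If you want a similar figure without reading pmap's output by eye, recent kernels expose per-mapping accounting in /proc/<pid>/smaps (available since 2.6.14). The sketch below is my own illustration, not something from the original article: it adds up the Private_Clean and Private_Dirty fields to estimate how much resident memory a process shares with nobody else, which should land in the same neighborhood as pmap's "writeable/private" total.

// private_rss.cpp -- a rough sketch, not part of the article's tooling.
// Sums the Private_Clean and Private_Dirty fields of /proc/<pid>/smaps
// to estimate how much resident memory a process shares with no other
// process. Assumes a Linux kernel new enough to provide smaps (2.6.14+).
// Build: g++ -o private_rss private_rss.cpp
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

int main(int argc, char* argv[]) {
    if (argc != 2) {
        std::cerr << "usage: " << argv[0] << " <pid>" << std::endl;
        return 1;
    }
    std::string path = std::string("/proc/") + argv[1] + "/smaps";
    std::ifstream smaps(path.c_str());
    if (!smaps) {
        std::cerr << "could not open " << path << std::endl;
        return 1;
    }
    long private_kb = 0;
    std::string line;
    while (std::getline(smaps, line)) {
        // Field lines look like "Private_Dirty:      1234 kB".
        if (line.compare(0, 8, "Private_") == 0) {
            std::istringstream fields(line);
            std::string label;
            long kb = 0;
            fields >> label >> kb;
            private_kb += kb;
        }
    }
    std::cout << "private (unshared) resident memory: "
              << private_kb << " kB" << std::endl;
    return 0;
}

Running it against the KEdit process from the earlier ps listing (./private_rss 3468) should report a couple of megabytes, not 14 or 25 -- the same story pmap tells.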

What does it all mean?
The moral of this story is that process memory usage on Linux is a complex matter; you can't just run ps and know what is going on. This is especially true when you deal with programs that create a lot of identical child processes, like Apache. ps might report that each Apache process uses 10 megabytes of memory, when the reality might be that the marginal cost of each Apache process is 1 megabyte of memory. This information becomes critical when tuning Apache's MaxClients setting, which determines how many simultaneous requests your server can handle (although see one of my past postings for another way of increasing Apache's performance).
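
To make that concrete with made-up numbers: if a server has 1 gigabyte of memory to spare and you trust the 10-megabyte-per-process figure from ps, you would cap MaxClients near 100; if the true marginal cost is closer to 1 megabyte per child on top of a shared base, the same machine could comfortably serve several hundred clients.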

It also shows that it pays to stick with one desktop's software as much as possible. If you run KDE for your desktop, but mostly use Gnome applications, then you are paying a large price for a lot of redundant (but different) shared libraries. By sticking to just KDE or just Gnome apps as much as possible, you reduce your overall memory usage due to the reduced marginal memory cost of running new KDE or Gnome applications, which allows Linux to use more memory for other interesting things (like the file cache, which speeds up file accesses immensely).

source:http://virtualthreads.blogspot.com/2006/02/understanding-memory-usage-on-linux.html

Using cell phones to track employees

Advances in mobile phone tracking technology are turning British firms into cyber sleuths as they keep a virtual eye on their staff, vehicles and stock.

In the past few years, companies that offer tracking services have seen an explosion in interest from businesses keen to take advantage of technological developments in the name of operational efficiency.

The gains, say the converted, are many, ranging from knowing whether workers have been "held up" in the pub rather than in a traffic jam, to being able to quickly locate staff and reroute them if necessary.

Not everybody is happy about being monitored, however, and civil rights group Liberty says the growth of tracking raises data privacy concerns.

Kevin Brown, operations director of tracking firm Followus, said there was nothing covert about tracking, thanks to strict regulations.

"An employee has to consent to having their mobile tracked. A company can't request to track a phone without the user knowing," he said. "Under government rules we send random alerts to each phone we track, informing the user they are being monitored."

All that is needed to trace a mobile phone is a computer with an Internet connection. Once a phone is activated for tracking, it becomes a mobile electronic tag and its approximate position can be followed using the service provider's Web site.

Although the service's launch in 2003 drew a flurry of interest from private individuals suspicious about whether their partners really were working late at the office, the would-be sleuths were quickly disappointed.

"You can forget about borrowing your partner's phone and 'consenting' to being tracked because the random alerts will blow that ruse," said Brown.

As well as wanting to make sure staff are working when and where they are supposed to, many firms say they are increasingly concerned about employee safety.

"Some businesses want to keep an eye on their staff. Some feel they have an obligation to know where staff are in case of emergencies," said Brown. He said Followus, launched in 2003, now has 50,000 subscribers and the number was growing by 5,000 a month.

It tracks cell phone SIM cards with accuracy that varies depending on the density of phone masts--in city centers the technology can pinpoint a phone to within a hundred meters, while in rural locations the margin might be several miles.

The most obvious application of the technology is for freight and delivery firms, but there has also been interest among small businesses that have tradesmen or sales staff on the road.

Andrew Overton at Verilocation said many of his company's 60,000 subscribers, mostly small businesses, wanted to know where their workers were for security reasons and for better asset management.

"There is increasing awareness about the importance of knowing where your staff are in case of incidents like the July London bombings. Knowing where your nearest employee is to a customer is also important. It allows a company to improve efficiency."

Overton said tracking also allowed bosses to check whether workers were taking the quickest route to a job or whether the expenses they submitted matched the miles they had driven.

Civil rights concerns
Not everyone is so enthusiastic about the growth of tracking.

Civil rights group Liberty said there could be privacy and human rights issues surrounding the use of tracking, particularly given the unequal relationship between employee and employer.

"There could well be worries that staff feel coerced into agreeing to be monitored. The technology is neutral, it's the way it is used that is the problem," said Liberty's Jen Corlew.

She said the development of tracking was worrying because it was being driven by the marketplace and not by workers' rights.

"We are already seeing an ebbing away of employee rights and we at Liberty will be keeping a close eye on this area to see if companies who do monitor their staff are complying with the regulations," she said.

Logistics expert Richard Wildings said keeping track of staff and equipment could produce significant cost benefits to companies if they used the information effectively.

"There are benefits in service enhancements--providing a better service to customers and all the attendant advantages that can bring, and also operational gains from managing people and assets better," said Wildings, a professor of supply chain risk management.

According to Wildings, a company that knows where its staff are and can work out whether they will make appointment dates and then communicate with customers will win out over those that do not.

"Giving customers transparency of where their delivery or tradesman is in the supply chain enhances the value of what a company can offer customers," said Wildings from Britain's Cranfield University, a post-graduate institution that specializes in business and logistics.

Transparency builds trust which, in turn, saves cost.

"Customers who don't trust their suppliers can over-order, or hold extra inventory, or shop around for alternatives."

Operationally, companies that use tracking can gain by optimizing their staff.

"If you know where vehicle or employee is and a customer calls you, you have the opportunity to reroute."

Wildings said large-scale truckers have been using similar techniques for years, but using expensive satellite navigation equipment.

"Mobile phone tracking is far cheaper and produces similar business benefits," he said.

source:http://news.zdnet.com/2100-1035_22-6035317.html


Dark matter comes out of the cold


[Image: The VLT (ESO) - the British team used 23 nights of observing time on the VLT]
Astronomers have for the first time put some real numbers on the physical characteristics of dark matter.

This strange material that dominates the Universe but which is invisible to current telescope technology is one of the great enigmas of modern science.

That it exists is one of the few things on which researchers have been certain.

But now an Institute of Astronomy, Cambridge, team has at last been able to place limits on how it is packed in space and measure its "temperature".

"It's the first clue of what this stuff might be," said Professor Gerry Gilmore. "For the first time ever, we're actually dealing with its physics," he told the BBC News website.

Science understands a great deal about what it terms baryonic matter - the "normal" matter which makes up the stars, planets and people - but it has struggled to comprehend the main material from which the cosmos is constructed.

'Magic volume'

Astronomers cannot detect dark matter directly because it emits no light or radiation.

Its presence, though, can be inferred from the way galaxies rotate: their stars move so fast they would fly apart if they were not being held together by the gravitational attraction of some unseen material.

Such observations have established that this dark material makes up about 80-85% of the matter in the Universe.

Now, the Cambridge team has provided new information with its detailed study of 12 dwarf galaxies that skirt the edge of our own Milky Way.

Using the biggest telescopes in the world, including the Very Large Telescope facility in Chile, the group has made detailed 3D maps of the galaxies, using the movement of their stars to "trace" the impression of the dark matter among them and weigh it very precisely.

With the aid of 7,000 separate measurements, the researchers have been able to establish that the galaxies contain about 400 times the amount of dark matter as they do normal matter.

"The distribution of dark matter bears no relationship to anything you will have read in the literature up to now," explained Professor Gilmore.

"It comes in a 'magic volume' which happens to correspond to an amount which is 30 million times the mass of the Sun.

"It looks like you cannot ever pack it smaller than about 300 parsecs - 1,000 light-years; this stuff will not let you. That tells you a speed actually - about 9km/s - at which the dark matter particles are moving because they are moving too fast to be compressed into a smaller scale.

"These are the first properties other than existence that we've been able determine."

Knowledge advance

The speed is a big surprise. Current theory had predicted dark matter particles would be extremely cold, moving at a few millimetres per second; but these observations prove the particles must actually be quite warm (in cosmic terms) at 10,000 degrees.

The most likely candidate for dark matter material is the so-called weakly interacting massive particle, or Wimp.

[Image: Computer-generated image of the LHC tunnel as it will look on completion (CERN) - future research in particle accelerators may yield more clues]
Scientists believe these are relic particles produced in the Big Bang.

They are predicted by certain theoretical extensions to the accepted description of matter and forces, the Standard Model of Fundamental Particles and Interactions. But also their presence would go a long way to explaining the structure and geometry of the Universe we observe.

Professor Bob Nichol, from the Institute of Cosmology and Gravitation at the University of Portsmouth, described the Cambridge work as "awesome".

"If this temperature for the dark matter is correct, then it has huge implications for direct searches for these mysterious particles (it seems [science] may be looking in the wrong place for them) and for how we thought the galaxies and clusters of galaxies evolve in the Universe.

"Having 'hotter' dark matter makes it harder to form the smallest galaxies, but does help to make the largest structures. This result will generate a lot of new research."

Big neighbours

Experimental crystal detectors placed at the bottom of deep mines aim to record the passage of these hard-to-grasp dark matter particles through normal matter.

Researchers would hope also that future experiments in particle accelerators will give them greater insight into the physics of dark matter.

[Image: Andromeda (Martin/ESO/NASA/ESA) - Andromeda is no longer the heavyweight in the local Universe]
The Cambridge efforts have produced an additional, independent result: the detailed study of the dwarf galaxies has allowed the scientists to weigh our own galaxy more precisely than ever before.

"It turns out the Milky Way is more massive than we thought," said Professor Gilmore.

"It now looks as though the Milky Way is the biggest galaxy in the local Universe, bigger even than Andromeda. It was thought until just a few months ago that it was the other way around."

The Cambridge University team expects to submit the first of its results to a leading astrophysics journal in the next few weeks.


Google delists BMW Germany for foul play

05 February 2006 - Google has flexed its muscles and dropped BMW Germany from its search engine following the German car manufacturer’s attempts to artificially boost its popularity ranking.

The move is likely to send shockwaves through the Internet industry over fears that one company has such power and influence over a website's access to the public.

The delisting was reported by Matt Cutts, a software engineer at Google, who works to stop websites tricking the system by featuring hidden text or different content from what the website visitor sees.

In his blog, Cutts wrote that the methods used by BMW violated the search engine's guidelines, and that a second company, camera maker Ricoh (ricoh.de), will be removed shortly for similar reasons.

"Don't deceive your users or present different content to search engines than you display to users," Cutts said in his blog entry.

The delisting will mean that searching for terms like "BMW" or "BMW Germany" on Google will not return a direct link to the car company's German website, bmw.de, but instead the global site.

Moreover, the PageRank of bmw.de, the score Google's algorithm assigns to every page on the web as a sort of popularity ranking, has been reset to zero.

Many big publishers and website owners enlist the help of Search Engine Optimisation (SEO) experts to help improve search appeal in search engines, and while most methods are acceptable, some are deemed to be unethical.

These so-called black hat tactics are commonly used by gambling or pornography sites.

BMW is thought to be one of the highest profile companies to have a website blacklisted by Google.

source:http://www.pocket-lint.co.uk/news.php?newsId=2404

3D Microscopy of Fossils Embedded in Solid Rock

[Image: A 650-million-year-old fossil from Kazakhstan. Top: optical image of the fossil cyanobacterium. Middle: confocal optical image of the same fossil. Bottom left: close-up of a section of the confocal optical image. Bottom right: Raman chemical image of the same boxed region]

UCLA Scientists See and Analyze 650-Million-Year-Old Fossils Inside Rocks in Three Dimensions – a First, With Implications for Finding Life on Mars



UCLA paleobiologist J. William Schopf and colleagues have produced 3-D images of ancient fossils — 650 million to 850 million years old — preserved in rocks, an achievement that has never been done before.

If a future space mission to Mars brings rocks back to Earth, Schopf said the techniques he has used, called confocal laser scanning microscopy and Raman spectroscopy, could enable scientists to look at microscopic fossils inside the rocks to search for signs of life, such as organic cell walls. These techniques would not destroy the rocks.

"It's astounding to see an organically preserved, microscopic fossil inside a rock and see these microscopic fossils in three dimensions," said Schopf, who is also a geologist, microbiologist and organic geochemist. "It's very difficult to get any insight about the biochemistry of organisms that lived nearly a billion years ago, and this (confocal microscopy and Raman spectroscopy) gives it to you. You see the cells in the confocal microscopy, and the Raman spectroscopy gives you the chemistry.

"We can look underneath the fossil, see it from the top, from the sides, and rotate it around; we couldn't do that with any other technique, but now we can, because of confocal laser scanning microscopy. In addition, even though the fossils are exceedingly tiny, the images are sharp and crisp. So, we can see how the fossils have degraded over millions of years, and learn what are real biological features and what has been changed over time."

His research is published in the January issue of the journal Astrobiology, in which he reports confocal microscopy results for the ancient fossils. (He published Raman spectroscopy 3-D images of ancient fossils in 2005 in the journal Geobiology.)

Since his first year as a Harvard graduate student in the 1960s, Schopf had the goal of conducting chemical analysis of an individual microscopic fossil inside a rock, but had no technique to do so, until now.

"I have wanted to do this for 40 years, but there wasn't any way to do so before," said Schopf, the first scientist to use confocal microscopy to study fossils embedded in such ancient rocks. He is director of UCLA's Institute of Geophysics and Planetary Physics Center for the Study of Evolution and the Origin of Life.

Raman spectroscopy, a technique used primarily by chemists, allows you to see the molecular and chemical structure of ancient microorganisms in three dimensions, revealing what the fossils are made of without destroying the samples. Raman spectroscopy can help prove whether fossils are biological, Schopf said. This technique involves a laser from a microscope focused on a sample; most of the laser light is scattered, but a small part gets absorbed by the fossil.

Schopf is the first scientist to use this technique to analyze ancient microscopic fossils. He discovered that the composition of the fossils changed; nitrogen, oxygen and sulfur were removed, leaving carbon and hydrogen.

Confocal microscopy uses a focused laser beam to make the organic walls of the fossils fluoresce, allowing them to be viewed in three dimensions. The technique, first used by biologists to study the inner workings of living cells, is new to geology.

The ancient microorganisms are "pond scum," among the earliest life, much too small to be seen with the naked eye.

Schopf's UCLA co-authors include geology graduate students Abhishek Tripathi and Andrew Czaja, and senior scientist Anatoliy Kudryavtsev. The research is funded by NASA.

Schopf is editor of "Earth's Earliest Biosphere" and "The Proterozoic Biosphere: A Multidisciplinary Study," companion books that provide the most comprehensive knowledge of more than 4 billion years of the earth's history, from the formation of the solar system 4.6 billion years ago to events half‑a‑billion years ago.

source:http://www.newsroom.ucla.edu/page.asp?RelNum=6796


Smoking out photo hoaxes with software

Dartmouth College professor Hany Farid is no fan of Josef Stalin, but he acknowledges that the photo retouching done during the Soviet era was top notch.

"That was impressive work. I've seen some of the originals," Farid said. The Soviets just didn't airbrush their victims out, he added. They painted in new backgrounds on the negatives.

Farid's interest in photo retouching isn't just historical. The professor of computer science and applied mathematics runs the university's Image Science Group, which has emerged as one of the chief research centers in the U.S. for developing software to detect manipulation in digital photographs.

While some of the group's software is now used by the FBI and large media organizations such as Reuters, a version written in Java will come out soon that will be easier to use and thereby allow more police and media organizations to sniff out fraud. The current software is written in Matlab, a numerical computing environment.

"I hope to have a beta out in the next six months," Farid said. "Right now, you need someone who is reasonably well-trained to use it."

Photo manipulation is a lot more common than you might think, according to L. Frank Kenney, an analyst at Gartner. That Newsweek cover of Martha Stewart on her release from prison? It's Martha's head, but a model's body. Some people believe hip hop artist Tupac Shakur remains alive, in part because of the images that have cropped up since his reported death in 1996.

Although it's difficult to estimate the size of the market for fraud detection tools, the demand is substantial, according to Kenney.

"How much is the presidency of a country worth, or control of a company? People tend not to read the retractions," he said. "Once the stuff is indelibly embedded in your memory, it is tough to get out."

The Journal of Cell Biology, a premier academic journal, estimates that around 25 percent of manuscripts accepted for publication contain at least one image that has been "inappropriately manipulated" and must be resubmitted. That means it has been touched up, although in the vast majority of cases, the author is only trying to clean the background and the changes do not affect the scientific efficacy of the results. Still, around 1 percent of accepted articles contain manipulated images that do significantly affect the results, said executive editor Mike Rossner. Those papers get rejected.

"Our goal is to have an accurate interpretation of data as possible," Rossner said. "These (images) are (of) things like radioactivity detected on a piece of X-ray film."

Law enforcement officials have also had to turn to the software to prosecute child pornographers. In 2002, the Supreme Court in Ashcroft v. Free Speech Coalition overturned parts of the Child Pornography Prevention Act for being overly broad, ruling that only images of actual minors, and not computer-generated simulations, are illegal.

Since that decision, a common defense has become that the images found on a hard drive are artificially created.

"The burden is now on the prosecution. These cases used to be slam dunks," Farid said.

How it works
Fraud detection software for images essentially searches for photographic anomalies that the human brain ignores or can't detect.

Humans, for instance, ignore lighting irregularities in two-dimensional images. While the direction of light can be re-adjusted in 3D images from video games, it is difficult to harmonize in 2D photographs. The light in the famous doctored photo that puts Sen. John Kerry next to actress Jane Fonda at a protest rally actually comes from two different directions.

"The lighting is off by 40 degrees," Farid said. "We are insensitive to it, but computers detect it."

Although modern researchers have documented in clinical studies how readily humans filter out lighting incongruities, 15th-century painters were aware of the way humans process images and exploited that knowledge to create seemingly realistic lighting effects that would have been nearly impossible to replicate in real life.

"The lighting is totally bizarre in some Renaissance paintings," he said.

The software also seeks out areas in photographs where applications like Adobe Photoshop fill in pixels. Every time the photos get mashed together, some modification of one or both of the images is required. Sometimes one person is blown up in size while a second might be rotated slightly. These changes leave empty pixels in the frame.

Photo-retouching applications use probability algorithms to fill in those pixels with colors and imagery and thus make them look realistic. Conversely, Farid's software employs probability to ferret out which of these fringe pixels are fill-ins.

"We're asking, from a mathematical and statistical perspective, can you quantify the manipulation," he said. "There are statistical correlations that don't occur naturally."

The quality of forgeries and touch-up jobs varies widely, but it continually improves. Farid gets consulting requests all the time. Some people call him to see if a photo of an item on eBay has been retouched. Others want advice on the genuineness of photos from online dating services. The Image Science Group has also collaborated with the Metropolitan Museum of Art in New York to determine if certain drawings were actually made by Flemish painter Bruegel or were forgeries.

One of the most recent celebrated cases of fraud--South Korean scientist Hwang Woo-suk's claim that he cloned stem cells--actually didn't need specialized software. Spots and artifacts in the background visible to the naked eye showed that the images of cells that came from the supposedly cloned dog were duplicated, Farid noted.

Farid's interest in fraud detection is somewhat random. As a post-doctoral student at MIT seven years ago, he was meandering through the library looking for something to read. He grabbed the Federal Rules of Evidence, a compendium of laws governing the admission of evidence in trials in federal court.

The rules, at the time, allowed digital images of original photographs to be admitted in court as long as they accurately reflected the original. The footnotes that accompany the rules, however, acknowledged that manipulation was a problem and that government did not yet have a way to deal with it.

When Farid started researching the scientific literature, he found little on fraud detection in digital imagery.

Will you be able to get a copy of the Java-based version of the Image Science Group's applications? Probably not. One of the dilemmas of this type of software is that the more widespread the distribution, the more chance forgers will exploit it to their advantage. Police organizations and news media outlets will likely get access to the application, but he's still unsure of how far he will extend distribution beyond that.

And although Farid charges a fee when asked to serve as a consultant, the software will be made freely available under an open-source license. He doesn't even have plans to form a company around his work. A significant amount of the research, after all, was funded by federal grants.

"Taxpayers," he said, "are paying me to do this research and it needs to go back out."

source:http://news.com.com/Smoking+out+photo+hoaxes+with+software/2100-1008_3-6033312.html


Window to the Heart: New Eye Exam Spots Disease Risk

Some say the eyes are the window to the soul, but an Australian medical researcher says they are the window to the heart and beyond.

Tien Wong of the Center for Eye Research Australia at the University of Melbourne has shown in several large-scale studies that abnormalities of the blood vessels in the retina can be used to predict patients' risk for diabetes, hypertension (or high blood pressure), stroke and heart disease.

These four disorders are some of the most common causes of death, hospitalization and disability in the developed world. But the ability to predict them is limited.

In plain sight

The retina is the membrane lining the back of the eyeball; it receives light from the lens and converts it into signals that travel to the brain and result in vision.

Wong's approach involves analyzing digital photographs of patients' retinas and studying them to find narrowing or ballooning of the small blood vessels. Systemic diseases—those that affect several organs or the whole body—such as hypertension, diabetes, AIDS, Graves' disease, lupus, atherosclerosis, multiple sclerosis, rheumatoid arthritis, and sickle cell anemia often cause changes in the eye that can show up as red dots or small blood clots.

Blood vessels of the eyes are so predictive because they are part of the brain's vascular system, so they share anatomical features and respond similarly to stress and disease, Wong said.

In fact, eyes are so transparent compared to the rest of the body that they are the only organ that allows physicians to directly see blood vessels. The digital photography approach is non-invasive—no blood is taken, no incisions are made, no probes in orifices. It takes just a few seconds.

Wong has shown that retinal abnormalities are a good predictor of whether a patient will develop high blood pressure or die of cardiac disease in the next 10 years.

"My hope is that one day, retinal imaging will be able to provide an additional means to stratify risk and help identify people who may benefit from early lifestyle changes and preventive therapies,” Wong told LiveScience.

Eyes on the Internet

The idea that the eye is a window to the human body has been around for more than a century, but Wong has figured out how to make precise and quantifiable predictions for illness based on retinal abnormalities that can be used as a standard by all doctors.

Wong sees a future in which physicians include retinal data in making treatment decisions. First, they'll need to arrive at a common classification system for diagnosing retinal abnormalities.

Ultimately, Wong and his colleagues, who now are setting up a Retinal Vascular Imaging Center in Melbourne, plan to develop a Web-based system to which doctors can upload digital images of patients' retinas. The system will report back the extent of a patient's cardiovascular disease.

It remains to be seen, though, how useful the system will be and with how many diseases it may prove helpful.

Emily Chew, a medical researcher at the National Eye Institute in Bethesda, said she was not surprised by Wong's findings relating retinopathy with diabetes.

"It is important for all persons with diabetes to have regular eye exams (annually) and for those over 65 to have eye exams on a yearly basis to detect any eye disease that maybe treatable, Chew said in an email interview.

However, Chew said eye exams will pick up only a small percentage of the population that has other systemic diseases, and "one would not screen with eye exams for systemic diseases."


source:http://www.livescience.com/humanbiology/060203_eye_heart.html

Ultra-Stable Software Design in C++?

"I need to create an ultra-stable, crash-free application in C++. Sadly, the programming language cannot be changed due to reasons of efficiency and availability of core libraries. The application can be naturally divided into several modules, such as GUI, core data structures, a persistent object storage mechanism, a distributed communication module and several core algorithms. Basically, it allows users to crunch a god-awful amount of data over several computing nodes. The application is meant to primarily run on Linux, but should be portable to Windows without much difficulty." While there's more to this, what strategies should a developer take to insure that the resulting program is as crash-free as possible?
"I'm thinking of decoupling the modules physically so that, even if one crashes/becomes unstable (say, the distributed communication module encounters a segmentation fault, has a memory leak or a deadlock), the others remain alive, detect the error, and silently re-start the offending 'module'. Sure, there is no guarantee that the bug won't resurface in the module's new incarnation, but (I'm guessing!) it at least reduces the number of absolute system failures.

How can I actually implement such a decoupling? What tools (System V IPC/custom socket-based message-queue system/DCE/CORBA? my knowledge of options is embarrassingly trivial :-( ) would you suggest should be used? Ideally, I'd want the function call abstraction to be available just like in, say, Java RMI.

And while we are at it, are there any software _design patterns_ that specifically tackle the stability issue?"
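
One minimal sketch of the restart idea, for what it's worth: run each module in its own process under a small supervisor that waits for the child to die and re-launches it. The code below is my own illustration, not a recommendation from the post; the module_main function, the simulated random "crash", and the restart limit are all placeholders, and a real system would exec a separate module binary and re-establish its IPC channel (System V queues, Unix-domain sockets, CORBA, or whatever is chosen) before declaring the module healthy again.

// supervisor.cpp -- a minimal POSIX sketch of the "silently re-start the
// offending module" idea. The module here is just a function run in a
// forked child so the example stays self-contained; a real design would
// exec a separate binary and talk to it over an IPC channel.
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

// Stand-in for a module such as the distributed communication layer;
// it "crashes" at random to give the supervisor something to do.
int module_main() {
    std::srand(static_cast<unsigned>(std::time(0)) ^ getpid());
    std::cout << "[module " << getpid() << "] starting up" << std::endl;
    sleep(1);                       // pretend to do some work
    if (std::rand() % 2 == 0) {
        std::cout << "[module " << getpid() << "] simulated crash" << std::endl;
        abort();                    // stands in for a segfault or a tripped watchdog
    }
    std::cout << "[module " << getpid() << "] clean exit" << std::endl;
    return 0;
}

int main() {
    const int max_restarts = 5;     // don't loop forever on a hard bug
    for (int attempt = 0; attempt < max_restarts; ++attempt) {
        pid_t pid = fork();
        if (pid == 0) {
            // Child: the module gets its own address space, so a crash
            // here cannot take down the supervisor or the other modules.
            _exit(module_main());
        }
        int status = 0;
        waitpid(pid, &status, 0);
        if (WIFEXITED(status) && WEXITSTATUS(status) == 0) {
            std::cout << "[supervisor] module finished normally" << std::endl;
            return 0;
        }
        std::cout << "[supervisor] module died (attempt " << (attempt + 1)
                  << "), restarting" << std::endl;
    }
    std::cout << "[supervisor] giving up after repeated failures" << std::endl;
    return 1;
}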

source:http://ask.slashdot.org/article.pl?sid=06/02/05/0119223

Biologists build better software, beat path to viral knowledge

WEST LAFAYETTE, Ind. — Insight into the workings of previously inscrutable viruses has been made possible by a team of biologists whose improvements to computer software may one day contribute to the fight against viral disease.

[Image: Epsilon 15, a virus that infects the bacterium Salmonella (full caption below)]

With a few deft lines of computer code, Purdue University's Wen Jiang and his research group have created a powerful new tool for lab research that should allow scientists to obtain high-resolution images of some of the world's smallest biological entities — the viruses. Too minuscule to be usefully observed with many conventional imaging devices, viruses' internal structures must often be viewed with microscopes that require sophisticated computer control to make sense of the tiny objects. Advances in the field often come to those who can create the best custom software, and Jiang's team has done just that, opening up for observation a group of viruses that scientists previously could not get a bead on.

As the team reports in the cover article of this week's (Feb. 2) edition of Nature, the researchers have used their methods to examine one such virus that attacks bacteria.

"While before we could only see virus parts that were symmetric, we can now see those that have non-symmetric structures, such as portions of the one our paper focuses on, the Epsilon 15 virus that attacks salmonella," said Jiang, who recently joined Purdue's College of Science as an assistant professor of biology. "This software will enable a substantial expansion of what we can see and study. We remain limited to observing those viruses that are identical from one individual viral particle to the next — which, sadly, is still only a small portion of the viral species that are out there. But it is a major step forward toward our goal of seeing them all."

Jiang conducted the work while at Baylor College of Medicine with that institution's Juan Chang, Joanita Jakana and Wah Chiu, as well as the Massachusetts Institute of Technology's Peter Weigele and Jonathan King.

Developing the software package enabled the team to examine the Epsilon 15 virus, a "bacteriophage" that infects the salmonella bacterium, and to resolve features as small as 9.5 angstroms across — less than a billionth of a meter. Until now, the high-resolution device, called a cryo-electron microscope, used to examine such objects could only examine the virus's outer shell.

"Many teams were able to determine the shell's configuration because it is a highly symmetric, regular 20-sided shape. But to do so, they essentially had to pretend the rest of the virus didn't exist," Jiang said. "The trouble is that its structure is a lot more complicated than that. It has a tail and an internal genome made up of strands of tightly coiled DNA that are essential to the virus's function. We literally didn't have the whole picture of what tools Epsilon 15 uses to infect its host."

The newly revealed components of the viral particle possess qualities surprising to researchers accustomed to seeing only symmetric viruses up close.

"Epsilon 15's tail, for example, has six 'spikes' in it, but they aren't arranged in a neat hexagonal ring. They're highly deviant," Jiang said. "Because they're so off-kilter, only two of the spikes actually grasp the shell surface. It's probably not very exciting news to anyone who doesn't look at these things for a living, but what it shows us is that the viral world holds many unexpected secrets, and if we're going to unlock them, we need to see them first."

Probing the innards of the virus also revealed that it possesses a core, the existence of which the researchers did not suspect and the function of which they can as yet only guess at. Jiang said his team suspects the core helps ease the release of the DNA coil into the bacterium, an event akin to shooting a spool of twine attached to a grappling hook across a wall at high velocity. But he said the impact of the team's research would likely be felt more by people who have wanted a tool to look at other viruses rather than, say, doctors with salmonella patients.

"So why do this study in the first place, if all it's doing is helping academics increase their own knowledge?" Jiang asked rhetorically. "It's not a simple answer, but the bottom line is, you have to solve the easy problems before you can attempt the hard ones whose answers have more immediate practical use. But where we might be able to go once we've taken these comparatively easy steps is quite tantalizing.

"Phages, for example, are useful to know about because they attack bacteria, and bacteria are staging a worrisome comeback in human health terms because they are growing resistant to our antibiotics — sometimes faster than medicine can keep up. We need a new way to attack bacteria once they mutate, and if we can employ phages to do our work for us, it could be a great advance for medicine."

Phages that attack bacteria are harmless to humans, Jiang said, and for each bacterial species, including those that cause human disease, nature has evolved several phages designed to infect it specifically.

"Phage therapy as an antibacterial weapon was an idea that was introduced in the early 20th century, but it fell by the wayside as antibiotics came to the fore," Jiang said. "It is possible that as we learn more about how viruses work on the molecular level, their promise as a medical tool will finally come to fruition. Until then, software will be the key to focusing our technological eyes, and teams like ours must keep improving it."

This work was supported in part by the National Institutes of Health and the Robert Welch Foundation.

Jiang is associated with Purdue's Markey Center for Structural Biology, which consists of laboratories that use a combination of cryo-electron microscopy, crystallography and molecular biology to elucidate the processes of viral entry, replication and pathogenesis.


GRAPHIC CAPTION:
Pictured are images of Epsilon 15, a virus that infects the bacterium Salmonella. From the left-side cross section of the viral particle's interior, obtained with an advanced magnifier called a cryo-electron microscope, a team including Purdue structural biologist Wen Jiang was able to generate the right-side computer graphic highlighting the salient features of the virus. Scientists have had difficulty resolving the internal features of viruses with non-symmetric components such as Epsilon 15, but Jiang's team made improvements to the computer software used to process the electron microscopy images, an advance that should make many other such viruses available for medical researchers to study. (Graphic courtesy of Nature magazine/Jiang Laboratories)

A publication-quality photo is available at http://news.uns.purdue.edu/images/+2006/jiang-salmonella.jpg


ABSTRACT

Structure of epsilon15 bacteriophage reveals genome organization and DNA packaging/injection apparatus

Wen Jiang, Juan Chang, Joanita Jakana, Peter Weigele, Jonathan King and Wah Chiu

The critical viral components for packaging DNA, recognizing and binding to host cells, and injecting the condensed DNA into the host are organized at a single vertex of many icosahedral viruses. These component structures do not share icosahedral symmetry and cannot be resolved using a conventional icosahedral averaging method. Here we report the structure of the entire infectious Salmonella bacteriophage epsilon15 determined from single-particle cryo-electron microscopy, without icosahedral averaging. This structure displays not only the icosahedral shell of 60 hexamers and 11 pentamers, but also the non-icosahedral components at one pentameric vertex. The densities at this vertex can be identified as the 12-subunit portal complex sandwiched between an internal cylindrical core and an external tail hub connecting to six projecting trimeric tailspikes. The viral genome is packed as coaxial coils in at least three outer layers with 90 terminal nucleotides extending through the protein core and the portal complex and poised for injection. The shell protein from icosahedral reconstruction at higher resolution exhibits a similar fold to that of other double-stranded DNA viruses including herpesvirus, suggesting a common ancestor among these diverse viruses. The image reconstruction approach should be applicable to studying other biological nanomachines with components of mixed symmetries.


source:http://news.uns.purdue.edu/UNS/html4ever/2006/060201.Jiang.salmonella.html


Is it best to expect the worst?

Expecting the worst may not make you feel any better when faced with a disappointment, say psychology researchers who have tested the age-old advice.

Most people believe that mentally preparing for the worst outcome in an examination or race will soften the disappointment if we flunk or flop - and heighten the joy if we succeed. But the idea has rarely been put on scientific trial.

Margaret Marshall of Seattle Pacific University and Jonathon Brown of the University of Washington, Seattle, did just that. They first asked more than 80 college students to fill in questionnaires that measured their general emotional outlook on life - whether bright or gloomy. The students then practised a set of moderately difficult word-association puzzles on a computer. Based on this, they rated how well they expected to perform on a second set of such problems.

The team then gave half the students problems that were slightly easier than the first set, while half were given more difficult puzzles. This ensured that the students' performances would either exceed, or fall short of, their expectations. Afterwards, the subjects filled in a questionnaire to measure their emotional reaction, such as how disappointed or ashamed they felt.

Students who expected to do badly, the researchers found, actually felt worse when they messed up than those who predicted they would do well but similarly botched their test.
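For readers curious about the shape of that comparison, the sketch below uses SciPy's independent-samples t-test on made-up placeholder scores; the numbers, group sizes and choice of test are illustrative assumptions, not Marshall and Brown's data or analysis.

# Illustrative only: placeholder numbers, not the study's data.
# Question: among students who all fell short on the second puzzle set,
# did those with low expectations report feeling worse afterwards than
# those with high expectations?
import numpy as np
from scipy import stats

# Hypothetical self-reported negative-emotion scores (higher = felt worse),
# collected after a failed test.
failed_low_expectations = np.array([7.1, 6.8, 7.5, 6.9, 7.3, 7.0])
failed_high_expectations = np.array([5.9, 6.2, 5.6, 6.0, 6.3, 5.8])

# One simple way to compare the two groups is an independent-samples t-test.
t_stat, p_value = stats.ttest_ind(failed_low_expectations, failed_high_expectations)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

A higher mean in the low-expectation group would correspond to the pattern the researchers report: bracing for the worst did not soften the blow.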

This suggests that gloomy expectations could actually exacerbate the wretchedness felt when a person fails. The old advice "doesn't work", agrees psychology researcher Thomas Gilovich of Cornell University in Ithaca, New York, whose interests include optimism and pessimism. "You're just making yourself miserable."

Sunny side up

The study, published in Cognition and Emotion, suggests that a person's reaction to disappointment or failure is determined mainly by their general outlook on life. Those who expect to succeed tend to have a sunnier stance all round, the researchers say. If they fall short of their goals, they are likely to look on the bright side and still think they have done reasonably well.

These people, who see the world through rose-tinted spectacles, also tend to deny responsibility for their poor performance. Marshall and Brown showed this in a second part of the study, in which students were also asked whether they felt their test performance was a reflection of their ability. The 'rose-tinted' group who did badly in the test tended to believe that it was not.

Conversely, people who have low expectations tend to have a glum take on life and may be less mentally equipped to deal with disappointment. If they don't make the grade, they take it to heart and tend to blame themselves.

It may be difficult for a person to cushion the blow of failure by trying to brighten their natural temperament, Brown says. Based on his earlier research, he says that the best way a person can deal with a setback is by writing it off as unimportant. "People need to be strong enough to learn that failure is not bad," he says.

The dark side

At least in some cases, negative thinking could still work to a person's advantage.

Anticipating failure at a forthcoming mathematics test or public talk, for example, is thought to help some anxious people motivate themselves to study harder and avert their dismal prophecy. Psychologists call such individuals 'defensive pessimists'.

Conversely, there could also be detrimental consequences to perpetually expecting the best, says Julie Norem of Wellesley College, Massachusetts, who studies the psychological strategies people use to pursue goals. Those who continually brush off their failures at the office might be overlooking the larger picture - such as the fact that they are about to be fired.

Sadly, this means there is no simple advice about whether we ought to expect the worst. This study "is part of a very large puzzle", Norem says.

source:http://www.nature.com/news/2006/060130/full/060130-13.html

5 careers: Big demand, big pay

If you're in one of the jobs listed here, you may be able to negotiate a sweet pay hike for yourself when changing employers.

NEW YORK (CNNMoney.com) – Recent surveys show that a lot of people are itching to find new jobs and human resource managers are expecting a lot of movement - both signs that employers may need to sweeten the pot.

There also have been predictions that the labor market may start to tilt in favor of job seekers due to a shortage of skilled workers.

CNNMoney.com talked with specialists at national staffing and recruiting firm Spherion to find out which job-hunting workers today are sitting in the catbird seat when it comes to negotiating better pay.

Below is a list of in-demand workers in five arenas.
Accounting

Thanks to Enron and the Sarbanes-Oxley Act of 2002, those who have a few years of corporate auditing experience working for a large public accounting firm can negotiate a sweet package for themselves when they change jobs.

That applies whether they're leaving the accounting firm to go work for a corporation or if they're seeking to return to the public accounting firm from an auditing job at an individual company.

College graduates with an accounting degree but not yet a CPA designation might make between $35,000 and $45,000 a year, or up to $50,000 in large cities like New York. After a couple of years they can command a substantial pay hike if they move to a large company as an internal staff auditor or to a smaller company as a controller. At that point, their salary can jump to anywhere from $50,000 to $75,000.

The expectation is that they will obtain their CPA designation.

If they choose to return to a public accounting firm as an audit manager after a couple of years at a corporation they can earn a salary of $70,000 to $85,000.
Sales and marketing

The healthcare and biomedical fields offer some handsome earnings opportunities for those on the business side.

Business development directors, product managers and associate product managers working for medical device makers, for instance, can do quite well for themselves if they develop a successful track record managing the concept, execution and sales strategy for a medical device before jumping ship.

Typically, they have an MBA in marketing plus anywhere from two to three years' experience at the junior end to between five and eight years' experience at the more senior levels. That experience ideally will be in the industry where they're seeking work.

An associate product manager might make a base salary of $55,000 to $75,000. A product manager can make a base of $75,000 to $95,000, while a business development director may make $120,000 to $160,000. Those salaries don't include bonuses.

A business development director seeking a vice president position could boost their base to between $150,000 and $200,000, depending on whether the new company is a risky start-up or an established device maker.
Legal

Intellectual property attorneys specializing in patent law and the legal secretaries who have experience helping to prepare patent applications are highly desirable these days.

The most in demand are those lawyers with not only a J.D. but also an advanced degree in electrical and mechanical engineering, chemical engineering, biotechnology, pharmacology or computer science.

Even those patent lawyers who just have an undergraduate degree in those fields have a leg up.

Patent lawyers working for a law firm might make $125,000 to $135,000 to start or about $90,000 if they work for a corporation that's trying to get a patent or to protect one they already have. With a couple of years' experience, they can expect a 10 percent jump or better when they get another job.

Legal secretaries, meanwhile, might make $65,000 at a law firm or $55,000 at a corporation. Should they choose to move to a new employer, they can command close to a 10 percent bump in pay.
Technology

Two tech jobs in high demand these days are .NET (dot net) developers and quality assurance analysts.

Developers who are experts in .NET, Microsoft's software development platform, can make between $75,000 and $85,000 a year in major cities when they're starting out. If they pursue a job at a company that seeks someone with a background in a given field (say, a firm looking for a .NET developer experienced in using software related to derivatives) they might snag a salary hike of 15 percent or more when they switch jobs.

Those who work in software quality management, meanwhile, might make $65,000 to $75,000 a year and be able to negotiate a 10 percent to 15 percent jump in pay if they switch jobs.
Manufacturing and engineering

Despite all the announced job cuts in the automotive industry, quality and process engineers, as well as plant managers certified in what's known as "Lean Manufacturing" techniques, are hot commodities.

The same applies to professionals in similar positions at other types of manufacturers.

One Lean Manufacturing technique is to use video cameras to capture the manufacturing process. A quality engineer will analyze the tapes to identify areas in the process that create inefficiencies or excess waste, both in terms of materials and workers' time.

Process and manufacturing engineers might make between $65,000 and $75,000. With an LM certification and a few years' experience, they can command pay hikes of between 15 percent and 20 percent if they choose to switch jobs.

A plant manager making between $90,000 and $120,000 may expect to get a 10 percent raise or more.

source:http://money.cnn.com/2006/02/03/pf/pay_hike_jobseeker/index.htm?cnn=yes

Rumours mount over Google's internet plan

Google is working on a project to create its own global internet protocol (IP) network, a private alternative to the internet controlled by the search giant, according to sources who are in commercial negotiation with the company.

Last month, Google placed job advertisements in America and the British national press for "Strategic Negotiator candidates with experience in...identification, selection, and negotiation of dark fibre contracts both in metropolitan areas and over long distances as part of development of a global backbone network".

Dark fibre is a remnant of the late-1990s internet boom, when American web companies laid fibre optic cables in preparation for high-speed internet delivery. Following the downturn in the technology sector during the early 2000s, the installation process for many of these networks was left incomplete. The result is a usable network of cables spread across the United States that has never been switched on. By purchasing the dark fibre, Google would in effect be able to acquire a ready-made internet network that it could control.

Late last year, Google purchased a 270,000 sq ft telecom interconnection facility in New York. It is believed that from here, Google plans to link up and power the dark fibre system and turn it into a working internet network of its own.

It was also reported in November that Google was buying shipping containers and building data centres within them, possibly with the aim of using them at significant nodes within the worldwide cable network. "Google hired a pair of very bright industrial designers to figure out how to cram the greatest number of CPUs, the most storage, memory and power support into a 20- or 40-foot box," Robert Cringely wrote. "The idea is to plant one of these puppies anywhere Google owns access to fiber, basically turning the entire Internet into a giant processing and storage grid."

Google has long been rumoured to be planning to launch a PC to retail for less than $100. The Google computers are likely to be low-grade machines that require a connection to Google to be able to perform functions such as word processing and spreadsheet manipulations. While using the computers, it is understood that consumers will be shown personalised advertising from the company's AdWords network.

The various reports prompted analysts at Bear Stearns to note last year: "We think Google could be experimenting with new hardware endeavours that could significantly change potential future applications by Google, creating another advantage for Google over its competitors. Investors may currently underappreciate Google as a potential hardware company."

The technology industry has also been alive with talk that Google's $100 machines will be less like a standard home PC and more like a television: in effect, one of the first convergent devices between the internet and television. While offering the standard PC applications, the "Google Cube" will also offer interactive content from a variety of sources while retaining Google branding and displaying Google advertising.

A leading content provider, who did not wish to be named, told Times Online: "We are in discussions with Google to provide content for their alternative internet service, to be distributed through their Google Cube product. As far as I'm aware they have been conducting negotiations with a number of other players in our marketplace to provide quality content to their users."

However, industry insiders fear that the development of a network of Google Cubes powered over a Google-owned internet network will greatly increase the power that Google wields over online publishers and internet users.

Should Google successfully launch an alternative network, it would be theoretically possible for the company to block competitors' websites and allow users to access only those sites that have paid Google to be shown to its users.

However, the move towards providing equipment for as little as £60 will prove popular with home users and even governments, who will welcome the spread of the internet to homes that could not previously afford the initial cost of purchasing PCs.

Contacted by Times Online today, a spokesperson for Google denied that it had any such plans, before adding its customary rider: "It's Google's policy not to comment on speculation concerning products before they are launched."

Benjamin Cohen is a regular contributor to Times Online, writing about the internet and commerce. He is the CEO of pinknews.co.uk


source:http://business.timesonline.co.uk/article/0,,9075-2023600,00.html
