Monday, March 27, 2006
How OS X Executes Applications
Being a long-time UNIX user, I generally have a common set of tools that I work with while trying to troubleshoot system problems. More recently, I have been developing software that adds Apple's OS X to the list of supported operating systems; and unlike traditional UNIX variants, OS X does not support many of the tools that relate to loading, linking and executing programs.
For example, when I come across library relocation problems, the first thing I do is run ldd on the executable. The ldd tool lists the dependent shared libraries that the executable requires, along with their paths if found. On OS X though, here's what happens when you try to run ldd.
evil:~ mohit$ ldd /bin/ls
-bash: ldd: command not found
Not found? But it's on all the common UNIX flavours. I wonder if objdump works.
$ objdump -x /bin/ls
-bash: objdump: command not found
Command not found. What's going on?
The problem is that, unlike Linux, Solaris, HP-UX, and many other UNIX variants, OS X does not use ELF binaries.
To get a list of dependencies for an executable on OS X, you need to use otool.
evil:~ mohit$ otool /bin/ls
otool: one of -fahlLtdoOrTMRIHScis must be specified
Usage: otool [-fahlLDtdorSTMRIHvVcXm] object_file ...
-f print the fat headers
-a print the archive header
-h print the mach header
-l print the load commands
-L print shared libraries used
-D print shared library id name
-t print the text section (disassemble with -v)
-p start dissassemble from routine name
-s print contents of section
-d print the data section
-o print the Objective-C segment
-r print the relocation entries
-S print the table of contents of a library
-T print the table of contents of a dynamic shared library
-M print the module table of a dynamic shared library
-R print the reference table of a dynamic shared library
-I print the indirect symbol table
-H print the two-level hints table
-v print verbosely (symbolicly) when possible
-V print disassembled operands symbolicly
-c print argument strings of a core file
-X print no leading addresses or headers
-m don't use archive(member) syntax
evil:~ mohit$ otool -L /bin/ls
/bin/ls:
/usr/lib/libncurses.5.4.dylib (compatibility version 5.4.0, current version 5.4.0)
/usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 88.0.0)
Much better. I can see that /bin/ls references two dynamic libraries. The filename extensions, though, don't look at all familiar.
I'm quite sure that many UNIX / Linux users have had similar experiences while working on OS X systems, so I decided to write a little on what I have learnt so far about OS X executable files.
The Mach-O Executable File Format
In OS X, all files containing executable code, e.g., applications, frameworks, libraries, kernel extensions etc., are implemented as Mach-O files.
Mach-O is a file format and an ABI (Application Binary Interface) that describes how an executable is to be loaded and run by the kernel. To be more specific, it tells the OS:
- Which dynamic loader to use.
- Which shared libraries to load.
- How to organize the process address space.
- Where the function entry-point is, and more.
Mach-O is not new. It was originally designed for the NeXTstep operating system, which ran on Motorola 68000-series processors. It was later adapted to x86 systems with OpenStep.
How Mach-O Files are Organized
Mach-O files are divided into three regions: a header, a load commands region, and the raw segment data. The header and load commands regions describe the features, layout and other characteristics of the file, while the raw segment data region contains ranges of bytes that are referenced by the load commands.
To investigate and examine the various parts of Mach-O files, OS X comes with a useful program called otool, located in /usr/bin. In the following sections, we will use otool to learn more about how Mach-O files are organized.
The Header
To view the Mach-O header of a file, use the -h parameter of the otool command.
evil:~ mohit$ otool -h /bin/ls
/bin/ls:
Mach header
magic cputype cpusubtype filetype ncmds sizeofcmds flags
0xfeedface 18 0 2 11 1608 0x00000085
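The seven columns in that output map one-to-one onto the fields of the mach_header structure declared in /usr/include/mach-o/loader.h. Reproduced from memory as a reference sketch (so check your own copy of the header), the 32-bit version looks roughly like this:
struct mach_header {
    uint32_t      magic;       /* mach magic number identifier */
    cpu_type_t    cputype;     /* cpu specifier */
    cpu_subtype_t cpusubtype;  /* machine specifier */
    uint32_t      filetype;    /* type of file */
    uint32_t      ncmds;       /* number of load commands */
    uint32_t      sizeofcmds;  /* size of all the load commands */
    uint32_t      flags;       /* flags */
};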
The first thing specified in the header is the magic number. The magic number identifies the file as either a 32-bit or a 64-bit Mach-O file. It also identifies the endianness of the CPU that it was intended for. To decipher the magic number, have a look at /usr/include/mach-o/loader.h.
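The constants to look for are roughly these (quoted from memory, so verify them against your own headers). The 0xfeedface in the output above is MH_MAGIC, the 32-bit magic number in the byte order of the machine that produced the file:
#define MH_MAGIC    0xfeedface  /* the 32-bit mach magic number */
#define MH_CIGAM    0xcefaedfe  /* MH_MAGIC with the bytes swapped */
#define MH_MAGIC_64 0xfeedfacf  /* the 64-bit mach magic number */
#define MH_CIGAM_64 0xcffaedfe  /* MH_MAGIC_64 with the bytes swapped */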
The header also specifies the target architecture for the file. This allows the kernel to ensure that the code is not run on a processor-type that it was not written for. For example, in the above output, cputype is set to 18, which is CPU_TYPE_POWERPC, as defined in /usr/include/mach/machine.h. From these two entries alone, we can infer that this binary was intended for 32-bit PowerPC based systems.
Sometimes binaries can contain code for more than one architecture. These are known as Universal Binaries, and generally begin with an additional header called the fat_header. To examine the contents of the fat_header, use the -f switch of the otool command.
The cpusubtype attribute specifies the exact model of the CPU, and is generally set to CPU_SUBTYPE_POWERPC_ALL or CPU_SUBTYPE_I386_ALL.
The filetype signifies how the file is to be aligned and used. It usually tells you if the file is a library, a standard executable, a core file etc. The filetype above equates to MH_EXECUTE, which signifies a demand paged executable file. Below is a snip from /usr/include/mach-o/loader.h that lists the different file-types as of this writing.
#define MH_OBJECT 0x1 /* relocatable object file */
#define MH_EXECUTE 0x2 /* demand paged executable file */
#define MH_FVMLIB 0x3 /* fixed VM shared library file */
#define MH_CORE 0x4 /* core file */
#define MH_PRELOAD 0x5 /* preloaded executable file */
#define MH_DYLIB 0x6 /* dynamically bound shared library */
#define MH_DYLINKER 0x7 /* dynamic link editor */
#define MH_BUNDLE 0x8 /* dynamically bound bundle file */
#define MH_DYLIB_STUB 0x9 /* shared library stub for static */
/* linking only, no section contents */
The next two attributes refer to the load commands region, and specify the number and size of the commands. And finally, we have flags, which specifies various features that the kernel may use while loading and executing Mach-O files.
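The flags value of 0x00000085 in the header above can be decoded against the MH_* bits in loader.h. As I recall them (again, check your own headers), the relevant bits are the following, which would make 0x85 = MH_NOUNDEFS | MH_DYLDLINK | MH_TWOLEVEL: the file has no undefined symbols, is input for the dynamic linker, and uses two-level namespace bindings.
#define MH_NOUNDEFS  0x1   /* the object file has no undefined references */
#define MH_DYLDLINK  0x4   /* the object file is input for the dynamic linker */
#define MH_TWOLEVEL  0x80  /* the image uses two-level name space bindings */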
Load Commands
The load commands region contains a list of commands that tell the kernel how to load the various raw segments within the file. They basically describe how each segment is aligned, protected and laid out in memory. To see the list of load commands within a file, use the -l switch of the otool command.
evil:~/Temp mohit$ otool -l /bin/ls
/bin/ls:
Load command 0
cmd LC_SEGMENT
cmdsize 56
segname __PAGEZERO
vmaddr 0x00000000
vmsize 0x00001000
fileoff 0
filesize 0
maxprot 0x00000000
initprot 0x00000000
nsects 0
flags 0x4
Load command 1
cmd LC_SEGMENT
cmdsize 600
segname __TEXT
vmaddr 0x00001000
vmsize 0x00006000
fileoff 0
filesize 24576
maxprot 0x00000007
initprot 0x00000005
nsects 8
flags 0x0
Section
sectname __text
segname __TEXT
addr 0x00001ac4
size 0x000046e8
offset 2756
align 2^2 (4)
reloff 0
nreloc 0
flags 0x80000400
reserved1 0
reserved2 0
[ ___SNIPPED FOR BREVITY___ ]
Load command 4
cmd LC_LOAD_DYLINKER
cmdsize 28
name /usr/lib/dyld (offset 12)
Load command 5
cmd LC_LOAD_DYLIB
cmdsize 56
name /usr/lib/libncurses.5.4.dylib (offset 24)
time stamp 1111407638 Mon Mar 21 07:20:38 2005
current version 5.4.0
compatibility version 5.4.0
Load command 6
cmd LC_LOAD_DYLIB
cmdsize 52
name /usr/lib/libSystem.B.dylib (offset 24)
time stamp 1111407267 Mon Mar 21 07:14:27 2005
current version 88.0.0
compatibility version 1.0.0
Load command 7
cmd LC_SYMTAB
cmdsize 24
symoff 28672
nsyms 101
stroff 31020
strsize 1440
Load command 8
cmd LC_DYSYMTAB
cmdsize 80
ilocalsym 0
nlocalsym 0
iextdefsym 0
nextdefsym 18
iundefsym 18
nundefsym 83
tocoff 0
ntoc 0
modtaboff 0
nmodtab 0
extrefsymoff 0
nextrefsyms 0
indirectsymoff 30216
nindirectsyms 201
extreloff 0
nextrel 0
locreloff 0
nlocrel 0
Load command 9
cmd LC_TWOLEVEL_HINTS
cmdsize 16
offset 29884
nhints 83
Load command 10
cmd LC_UNIXTHREAD
cmdsize 176
flavor PPC_THREAD_STATE
count PPC_THREAD_STATE_COUNT
r0 0x00000000 r1 0x00000000 r2 0x00000000 r3 0x00000000 r4 0x00000000
r5 0x00000000 r6 0x00000000 r7 0x00000000 r8 0x00000000 r9 0x00000000
r10 0x00000000 r11 0x00000000 r12 0x00000000 r13 0x00000000 r14 0x00000000
r15 0x00000000 r16 0x00000000 r17 0x00000000 r18 0x00000000 r19 0x00000000
r20 0x00000000 r21 0x00000000 r22 0x00000000 r23 0x00000000 r24 0x00000000
r25 0x00000000 r26 0x00000000 r27 0x00000000 r28 0x00000000 r29 0x00000000
r30 0x00000000 r31 0x00000000 cr 0x00000000 xer 0x00000000 lr 0x00000000
ctr 0x00000000 mq 0x00000000 vrsave 0x00000000 srr0 0x00001ac4 srr1 0x00000000
The above file has 11 load commands located directly below the header, numbered 0 to 10.
The first four commands (LC_SEGMENT), numbered 0 to 3, define how segments within the file are to be mapped into memory. A segment defines a range of bytes in the Mach-O binary, and can contain zero or more sections. We will talk more about segments later.
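For reference, each LC_SEGMENT command is described by a segment_command structure, whose fields line up with the vmaddr, vmsize, fileoff, maxprot and related values that otool printed above. This is the 32-bit layout as I remember it from /usr/include/mach-o/loader.h, so treat it as a sketch and double-check against your own copy:
struct segment_command {
    uint32_t  cmd;          /* LC_SEGMENT */
    uint32_t  cmdsize;      /* includes sizeof section structs */
    char      segname[16];  /* segment name, e.g. __TEXT */
    uint32_t  vmaddr;       /* memory address of this segment */
    uint32_t  vmsize;       /* memory size of this segment */
    uint32_t  fileoff;      /* file offset of this segment */
    uint32_t  filesize;     /* amount to map from the file */
    vm_prot_t maxprot;      /* maximum VM protection */
    vm_prot_t initprot;     /* initial VM protection */
    uint32_t  nsects;       /* number of sections in segment */
    uint32_t  flags;        /* flags */
};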
Load command 4 (LC_LOAD_DYLINKER) specifies which dynamic linker to use. This is almost always set to /usr/lib/dyld, which is the default OS X dynamic linker, as the output above shows.
Commands 5 and 6 (LC_LOAD_DYLIB) specify the shared libraries that this file links against. These are loaded by the dynamic loader specified in command 4.
Commands 7 and 8 (LC_SYMTAB, LC_DYSYMTAB) specify the symbol tables used by the file and the dynamic linker respectively. Command 9 (LC_TWOLEVEL_HINTS) contains the hint table for the two-level namespace.
And finally, command 10 (LC_UNIXTHREAD), defines the initial state of the main thread of the process. This command is only included in executable files.
Segments and Sections
Most of the load commands mentioned above make references to segments within the file. A segment is a range of bytes within a Mach-O file that is mapped directly into virtual memory by the kernel and the dynamic linker. The header and load commands regions are considered the first segment of the file.
A typical OS X executable generally has five segments:
- __PAGEZERO : Located at virtual memory address 0 and has no protection rights. This segment occupies no space in the file, and causes accesses to NULL pointers to crash immediately.
- __TEXT : Contains read-only data and executable code.
- __DATA : Contains writable data. These sections are generally marked copy-on-write by the kernel.
- __OBJC : Contains data used by the Objective C language runtime.
- __LINKEDIT : Contains raw data used by the dynamic linker.
The __TEXT and __DATA segments may contain zero or more sections. Each section consists of specific types of data, e.g., executable code, constants, C strings etc.
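Each section is described by a section structure that immediately follows its segment_command in the load commands region; the fields match the sectname, segname, addr, size and related values in the otool -l output earlier. Again roughly from memory (32-bit version in /usr/include/mach-o/loader.h):
struct section {
    char     sectname[16]; /* name of this section, e.g. __text */
    char     segname[16];  /* segment this section goes in */
    uint32_t addr;         /* memory address of this section */
    uint32_t size;         /* size in bytes of this section */
    uint32_t offset;       /* file offset of this section */
    uint32_t align;        /* section alignment (power of 2) */
    uint32_t reloff;       /* file offset of relocation entries */
    uint32_t nreloc;       /* number of relocation entries */
    uint32_t flags;        /* section type and attributes */
    uint32_t reserved1;    /* reserved */
    uint32_t reserved2;    /* reserved */
};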
To see the contents of a section, use the -s option with the otool command.
evil:~/Temp mohit$ otool -sv __TEXT __cstring /bin/ls
/bin/ls:
Contents of (__TEXT,__cstring) section
00006320 00000000 5f5f6479 6c645f6d 6f645f74
00006330 65726d5f 66756e63 73000000 5f5f6479
00006340 6c645f6d 616b655f 64656c61 7965645f
00006350 6d6f6475 6c655f69 6e697469 616c697a
__SNIP__
To disassemble the __text section, use the -tv switch.
evil:~/Temp mohit$ otool -tv /bin/ls
/bin/ls:
(__TEXT,__text) section
00001ac4 or r26,r1,r1
00001ac8 addi r1,r1,0xfffc
00001acc rlwinm r1,r1,0,0,26
00001ad0 li r0,0x0
00001ad4 stw r0,0x0(r1)
00001ad8 stwu r1,0xffc0(r1)
00001adc lwz r3,0x0(r26)
00001ae0 addi r4,r26,0x4
__SNIP__
Running an Application
Now that we know what a Mach-O file looks like, let us see how OS X loads and runs an application.
When you run an application, the shell first calls the fork() system call. fork() creates a logical copy of the calling process (the shell) and schedules it for execution. This child process then calls the execve() system call, providing the path of the program to be executed.
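A minimal sketch of that fork()/execve() sequence in plain C, using /bin/ls as an arbitrary target program (this only illustrates the pattern described above; there is nothing Mach-O-specific in it):
/* run_ls.c: fork a child, then replace it with /bin/ls */
#include <stdio.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/wait.h>

int main(void)
{
    pid_t pid = fork();                 /* logical copy of the calling process */

    if (pid == 0) {
        /* child: replace our address space with the Mach-O image of /bin/ls */
        char *argv[] = { "ls", "-l", NULL };
        char *envp[] = { NULL };
        execve("/bin/ls", argv, envp);
        perror("execve");               /* only reached if the exec failed */
        _exit(1);
    } else if (pid > 0) {
        int status;
        waitpid(pid, &status, 0);       /* the "shell" waits for its child */
    } else {
        perror("fork");
        return 1;
    }
    return 0;
}
The interesting work happens inside execve(): the child's old address space is discarded and rebuilt from the Mach-O file, which is exactly what the next paragraph describes.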
The kernel loads the specified file, and examines its header to verify that it is a valid Mach-O file. It then starts interpreting the load commands, replacing the child process's address space with segments from the file.
At the same time, the kernel also executes the dynamic linker specified by the binary, which proceeds to load and link all the dependent libraries. After it binds just enough symbols that are necessary for running the file, it calls the entry-point function.
The entry-point function is usually a standard function statically linked in from /usr/lib/crt1.o at build time. This function sets up the runtime environment for the process and then calls the executable's main() function.
The application is now running.
The Dynamic Linker
The OS X dynamic linker, /usr/lib/dyld, is responsible for loading dependent shared libraries, importing the various symbols and functions, and binding them into the current process.
When the process is first started, all the linker does is import the shared libraries into the address space of the process. Depending on how the program was built, the actual binding may be performed at different stages of its execution.
- Immediately after loading, as in load-time binding.
- When a symbol is referenced, as in just-in-time binding.
- Before the process is even executed, using an optimization technique known as pre-binding.
If a binding type is not specified, just-in-time binding is used.
An application can only run when all the symbols and segments from all the different object files can be resolved. In order to find libraries and frameworks, the standard dynamic linker, /usr/lib/dyld, searches a predefined set of directories. To override these directories, or to provide fallback paths, the DYLD_LIBRARY_PATH or DYLD_FALLBACK_LIBRARY_PATH environment variables can be set to a colon-separated list of directories.
Finally
As you can see, executing a process in OS X is a complex affair, and I have tried to cover as much as is necessary for a useful debugging session.
To learn more about Mach-O executables, otool, and the OS X kernel in general, here is a list of references that I would recommend:
- Mac OS X ABI Mach-O File Format Reference
- Executing Mach-O Files
- Overview of Dynamic Libraries
- The otool man page
- The dyld man page
- /usr/include/mach/machine.h
- /usr/include/mach-o/loader.h
source:http://0xfe.blogspot.com/2006/03/how-os-x-executes-applications.html
# posted by dark master : 3/27/2006 10:32:00 AM
Prime numbers get hitched
In their search for patterns, mathematicians have uncovered unlikely connections between prime numbers and quantum physics. Will the subatomic world help reveal the elusive nature of the primes?
In 1972, the physicist Freeman Dyson wrote an article called "Missed Opportunities." In it, he describes how relativity could have been discovered many years before Einstein announced his findings if mathematicians in places like Göttingen had spoken to physicists who were poring over Maxwell's equations describing electromagnetism. The ingredients were there in 1865 to make the breakthrough—only announced by Einstein some 40 years later.
It is striking that Dyson should have written about scientific ships passing in the night. Shortly after he published the piece, he was responsible for an abrupt collision between physics and mathematics that produced one of the most remarkable scientific ideas of the last half century: that quantum physics and prime numbers are inextricably linked.
This unexpected connection with physics has given us a glimpse of the mathematics that might, ultimately, reveal the secret of these enigmatic numbers. At first the link seemed rather tenuous. But the important role played by the number 42 has recently persuaded even the deepest skeptics that the subatomic world might hold the key to one of the greatest unsolved problems in mathematics.
Prime numbers, such as 17 and 23, are those that can only be divided by themselves and one. They are the most important objects in mathematics because, as the ancient Greeks discovered, they are the building blocks of all numbers—any of which can be broken down into a product of primes. (For example, 105 = 3 x 5 x 7.) They are the hydrogen and oxygen of the world of mathematics, the atoms of arithmetic. They also represent one of the greatest challenges in mathematics.
As a mathematician, I've dedicated my life to trying to find patterns, structure and logic in the apparent chaos that surrounds me. Yet this science of patterns seems to be built from a set of numbers which have no logic to them at all. The primes look more like a set of lottery ticket numbers than a sequence generated by some simple formula or law.
For 2,000 years the problem of the pattern of the primes—or the lack thereof—has been like a magnet, drawing in perplexed mathematicians. Among them was Bernhard Riemann who, in 1859, the same year Darwin published his theory of evolution, put forward an equally-revolutionary thesis for the origin of the primes. Riemann was the mathematician in Göttingen responsible for creating the geometry that would become the foundation for Einstein's great breakthrough. But it wasn't only relativity that his theory would unlock.
Riemann discovered a geometric landscape, the contours of which held the secret to the way primes are distributed through the universe of numbers. He realized that he could use something called the zeta function to build a landscape where the peaks and troughs in a three-dimensional graph correspond to the outputs of the function. The zeta function provided a bridge between the primes and the world of geometry. As Riemann explored the significance of this new landscape, he realized that the places where the zeta function outputs zero (which correspond to the troughs, or places where the landscape dips to sea-level) hold crucial information about the nature of the primes. Mathematicians call these significant places the zeros.
Riemann's discovery was as revolutionary as Einstein's realization that E = mc². Instead of matter turning into energy, Riemann's equation transformed the primes into points at sea-level in the zeta landscape. But then Riemann noticed that it did something even more incredible. As he marked the locations of the first 10 zeros, a rather amazing pattern began to emerge. The zeros weren't scattered all over; they seemed to be running in a straight line through the landscape. Riemann couldn't believe this was just a coincidence. He proposed that all the zeros, infinitely many of them, would be sitting on this critical line—a conjecture that has become known as the Riemann Hypothesis.
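For the mathematically inclined, here is the object the article keeps gesturing at, stated in symbols. These are the standard textbook definitions rather than anything spelled out in the original piece: the zeta function, Euler's product formula linking it to the primes, and the critical line on which Riemann conjectured all the non-trivial zeros lie.
\[
\zeta(s) \;=\; \sum_{n=1}^{\infty} \frac{1}{n^{s}} \;=\; \prod_{p\ \text{prime}} \frac{1}{1 - p^{-s}},
\qquad
\text{Riemann Hypothesis: } \operatorname{Re}(s) = \tfrac{1}{2} \text{ for every non-trivial zero } s.
\]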
But what did this amazing pattern mean for the primes? If Riemann's discovery was right, it would imply that nature had distributed the primes as fairly as possible. It would mean that the primes behave rather like the random molecules of gas in a room: Although you might not know quite where each molecule is, you can be sure that there won't be a vacuum at one corner and a concentration of molecules at the other.
For mathematicians, Riemann's prediction about the distribution of primes has been very powerful. If true, it would imply the viability of thousands of other theorems, including several of my own, which have had to assume the validity of Riemann's Hypothesis to make further progress. But despite nearly 150 years of effort, no one has been able to confirm that all the zeros really do line up as he predicted.
It was a chance meeting between physicist Freeman Dyson and number theorist Hugh Montgomery in 1972, over tea at Princeton's Institute for Advanced Study, that revealed a stunning new connection in the story of the primes—one that might finally provide a clue about how to navigate Riemann's landscape. They discovered that if you compare a strip of zeros from Riemann's critical line to the experimentally recorded energy levels in the nucleus of a large atom like erbium, the 68th atom in the periodic table of elements, the two are uncannily similar.
It seemed the patterns Montgomery was predicting for the way zeros were distributed on Riemann's critical line were the same as those predicted by quantum physicists for energy levels in the nucleus of heavy atoms. The implications of a connection were immense: If one could understand the mathematics describing the structure of the atomic nucleus in quantum physics, maybe the same math could solve the Riemann Hypothesis.
Mathematicians were skeptical. Though mathematics has often served physicists—Einstein, for instance—they wondered whether physics could really answer hard-core problems in number theory. So in 1996, Peter Sarnak at Princeton threw down the gauntlet and challenged physicists to tell the mathematicians something they didn't know about primes. Recently, Jon Keating and Nina Snaith, of Bristol, duly obliged.
There is an important sequence of numbers called "the moments of the Riemann zeta function." Although we know abstractly how to define it, mathematicians have had great difficulty explicitly calculating the numbers in the sequence. We have known since the 1920s that the first two numbers are 1 and 2, but it wasn't until a few years ago that mathematicians conjectured that the third number in the sequence may be 42—a figure greatly significant to those well-versed in The Hitchhiker's Guide to the Galaxy.
It would also prove to be significant in confirming the connection between primes and quantum physics. Using the connection, Keating and Snaith not only explained why the answer to life, the universe and the third moment of the Riemann zeta function should be 42, but also provided a formula to predict all the numbers in the sequence. Prior to this breakthrough, the evidence for a connection between quantum physics and the primes was based solely on interesting statistical comparisons. But mathematicians are very suspicious of statistics. We like things to be exact. Keating and Snaith had used physics to make a very precise prediction that left no room for the power of statistics to see patterns where there are none.
Mathematicians are now convinced. That chance meeting in the common room in Princeton resulted in one of the most exciting recent advances in the theory of prime numbers. Many of the great problems in mathematics, like Fermat's Last Theorem, have only been cracked once connections were made to other parts of the mathematical world. For 150 years many have been too frightened to tackle the Riemann Hypothesis. The prospect that we might finally have the tools to understand the primes has persuaded many more mathematicians and physicists to take up the challenge. The feeling is in the air that we might be one step closer to a solution. Dyson might be right that the opportunity was missed to discover relativity 40 years earlier, but who knows how long we might still have had to wait for the discovery of connections between primes and quantum physics had mathematicians not enjoyed a good chat over tea.
Marcus du Sautoy is professor of mathematics at the University of Oxford, and is the author of The Music of the Primes (HarperCollins).
source:http://www.seedmagazine.com/news/2006/03/prime_numbers_get_hitched.php
# posted by dark master : 3/27/2006 10:31:00 AM
Drilling into a hot volcano
Land of steam: Iceland sits atop the Atlantic's mid-ocean ridge system
Geologists in Iceland are drilling directly into the heart of a hot volcano.
Their $20m project will lead to boreholes that could ultimately yield 10 times as much geothermal power as any previous project.
It is hoped the endeavour will also reveal more about the nature of mid-ocean ridges where new ocean floor is created.
Twenty years ago, geologist Gudmundur Omar Friedleifsson had a surprise when he lowered a thermometer down a borehole.
"We melted the thermometer," he recalls. "It was set for 380C; but it just melted. The temperature could have been 400 or even 500."
Speaking in the first of a new series on BBC Radio 4, called Five Holes in the Ground, he describes how this set him thinking about how much energy it might be possible to extract from Iceland's volcanic rocks.
At depth, the groundwater is way over 100C, but the pressure keeps it liquid. As Dr Friedleifsson puts it: "On the surface, you boil your egg at 100 degrees; but if you wanted to boil your egg at a depth of 2,500m, it would take 350."
Splitting floor
The landscape on the Reykjanes Ridge in southwest Iceland seems like an alien world.
There are pools of boiling mud and the hiss of steam escaping from fissures. There are also signs of industry - past, present and future - with an abandoned salt factory, working geothermal power stations and a big new drilling rig.
It is also an area of great natural beauty. Down on the shore, crashing Atlantic breakers are exposing fresh cliffs of pillow basalt, volcanic lava that has erupted under the sea and been rapidly quenched so that it forms features that look a bit like black toothpaste squeezed from a giant tube.
This is a young landscape. The most recent eruptions here occurred in the 13th Century and there could be new ones at any time.
Iceland is unusual geologically in that it exists above the ocean at all. It stands on the mid-ocean ridge system, the longest mountain range on the planet. This range runs around the world's oceans like the seam on a tennis ball.
It is here that new ocean floor is created as the continents drift apart.
For the most part, it is deep under the sea; it is the place where hydrothermal vents and their "black smokers" belch out super-heated water and dissolved minerals.
But Iceland stands on an additional plume of volcanic mantle rock that has lifted it above the Atlantic and made it accessible to geologists.
Hydrogen future
Some 90% of all homes in Iceland are heated by geothermal energy; and a number of power stations are also producing electricity from steam at around 240C, extracted from boreholes between 600 and 1,000m deep.
But now, the plan is to go much deeper. Omar Friedleifsson of the Iceland Geosurvey is leading the consortium of energy companies in the Iceland Deep Drilling Project.
Last year, they drilled down to a depth of 3,082m and since then have been conducting flow tests.
Later this year, they will put a pressure lining into their borehole and drill on down to more than 4km deep.
At that depth, they hope to encounter what is called supercritical water: water that is not simply a mixture of steam and hot water but a single phase which can carry much more energy.
Engineers on the project have calculated that increasing the temperature by 200 degrees and the pressure by 200 Bar will mean that, for the same flow rate, the energy extracted from such a borehole will go up from 5MW to 50MW.
Power station manager Albert Albertsson predicts that, by the end of the century, "Iceland could become the Kuwait of the North". The vision is to use this cheap and carbon-free energy to split water, to yield hydrogen that could be despatched around the world in tankers.
But interest in the Iceland Deep Drilling Project is not solely for energy production. Geologists have never had the chance before to penetrate the volcanic heart of a mid-ocean-ridge geothermal system and there is much they would like to learn.
As they get deeper, bore teams will change from the rotary drill, which produces rock fragments but can drill up to 200m per day, to a slower drill that produces useful core samples.
The project wants to study the geology, the energy flow and the chemical environment at great depth.
Blue Lagoon: A perfect place for a dip, especially during winter
Albert Albertsson, at the nearby power station, likes to think of the energy as just a part of an integrated system.
Iceland's volcanic rocks are highly fractured and so, below about 50m, there is plenty of water.
For the next 40m or thereabouts, it is fresh drinking water, topped up by Iceland's generous rainfall. Below that, the water is salty; the ocean has managed to seep in.
However, it is the really deep supercritical water that is also laden with dissolved minerals. Mr Albertsson believes he may also be able to extract precious metals, such as copper, silver and gold from the water.
After the water has gone through his turbines, it is still at about 40C. Some of that excess energy is used for district heating and for horticulture in greenhouses.
It also warms one of Iceland's biggest tourist attractions: the Blue Lagoon, a vast outdoor lake which, even in March, greets bathers with the temperature of a hot bath.
There are supposed benefits from the silica rich water with its faint smell of sulphur, and the white silica mud is exported for health and beauty treatments.
Mr Albertsson told the BBC that he himself is a regular visitor.
"For me, the ideal time to take a dip is in the middle of winter, in the middle of the night, looking up at the stars and the Aurora Borealis, the Northern Lights."
Five Holes in the Ground is broadcast this coming week on BBC Radio 4, starting on Monday 27 March at 1445 GMT / 1545 BST. You can listen again to the programmes on the website once they have been broadcast.
source:http://news.bbc.co.uk/2/hi/science/nature/4846574.stm
# posted by dark master : 3/27/2006 10:29:00 AM
Swedish Mathematician Lennart Carleson Wins Abel
"Sci Tech is reporting that Swedish mathematician Lennart Carleson has won the Abel Prize on Thursday for proving a 19th century theorem on harmonic analysis. His theorems have been helpful in creating iPod. Prof Carleson's major contributions have come in two fields - the first has subsequently been used in the components of sound systems and the second helps to predict how markets and weather systems respond to change. One of Carleson's many triumphs was settling a conjecture that had remained unsolved for over 150 years. He showed that every continuous function (one with a connected graph) is equal to the sum of its Fourier series except perhaps at some negligible points."
source:http://science.slashdot.org/science/06/03/27/0548252.shtml
# posted by dark master : 3/27/2006 10:27:00 AM
Bring Home the Biotech Bacon
SAN FRANCISCO -- A microscopic worm may be the key to heart-friendly bacon.
Geneticists have mixed DNA from the roundworm C. elegans and pigs to produce swine with significant amounts of omega-3 fatty acids -- the kind believed to stave off heart disease.
Researchers hope they can improve the technique in pork and do the same in chickens and cows. In the process, they also want to better understand human disease.
"We all can use more omega-3 in our diet," said Dr. Jing Kang, the Harvard Medical School researcher who modified the omega-3-making worm gene so it turned on in the pigs.
Kang is one of 17 authors of the paper appearing Sunday in an online edition of the journal Nature Biotechnology.
The cloned, genetically engineered pigs are the latest advance in the agricultural biotechnology field, which is struggling to move beyond esoteric products such as bug-repelling corn and soy resistant to weed killers.
Hoping to create healthier, cheaper and tastier products that consumers crave, Monsanto of St. Louis and its biotech farming competitors like DuPont are developing omega-3-producing crops that yield healthier cooking oils. Kang said 30 academic laboratories are now working with his omega-3 gene, presumably pursuing similar projects.
"Consumers have responded pretty positively when asked their opinion of food modified to improve food quality and food safety, just as long as the taste isn't altered negatively," said Christine Bruhn, director of the Center for Consumer Research at the University of California, Davis.
Earlier experiments have succeeded in manipulating animals' fat content but most never made it out of the lab because of taste problems.
While boosting Omega-3s doesn't decrease the fat content in pigs, the fatty acids are also important to brain development and may reduce the risk of Alzheimer's disease and depression. The American Heart Association recommends at least two weekly servings of fish, particularly fatty fish like trout and salmon, which are naturally high in omega-3s.
People already eat genetically engineered soy beans in all manner of processed food, but biotech companies run into what bioethicists call the "yuck factor" when they begin tinkering with animals.
The Food and Drug Administration has never approved food derived from genetically engineered animals. Unlike crops, the FDA treats such animals as medicine and requires extensive testing before approval.
"We understand that this research is in the very early stages," FDA spokeswoman Rae Jones said. "This technology will not likely reach meat counters for many years."
The FDA is still considering Massachusetts company Aqua Bounty Technologies' application to market a salmon genetically engineered to grow faster, the only such request pending with the agency. Aqua Bounty began its federal application process about nine years ago and there is no indication when the FDA will rule.
In the meantime, the researchers of the latest project said they will use their genetically engineered pigs to study human disease, especially heart conditions.
source:http://www.wired.com/news/wireservice/0,70504-0.html?tw=rss.index
# posted by dark master : 3/27/2006 10:26:00 AM
Revolutionary jet engine tested
A new jet engine designed to fly at seven times the speed of sound appears to have been successfully tested. The scramjet engine, the Hyshot III, was launched at Woomera, 500km north of Adelaide in Australia, on the back of a two stage Terrier-Orion rocket.
Once 314km up, the Hyshot III fell back to Earth, reaching speeds analysts hope will have topped Mach 7.6 (9,000km/h).
It is hoped the British-designed Hyshot III will pave the way for ultra fast, intercontinental air travel.
An international team of researchers is presently analysing data from the experiment, to see if it met its objectives.
The scientists and engineers had just six seconds to monitor its performance before the £1m engine crashed into the ground.
Rachel Owen, a researcher from UK defence firm Qinetiq, which designed the scramjet, said it looked like everything had gone according to plan.
The vehicle had followed a "nominal trajectory" and landed 400km down the range, Ms Owen said.
A scramjet - or supersonic combustion ramjet - is mechanically very simple. It has no moving parts and takes all of the oxygen it needs to burn hydrogen fuel from the air.
This makes it more efficient than a conventional rocket engine as it does not need to carry its own oxygen supply, meaning that a vehicle using one could potentially carry a larger payload.
As the engine continues its downward path the fuel in the scramjet ignites automatically. This experiment was expected to start working at a height of 35km.
However scramjets do not begin to work until they reach five times the speed of sound.
At this speed the air passing through the engine is compressed and hot enough for ignition to occur. Rapid expansion of the exhaust gases creates the forward thrust.
Making sure the flight happens correctly is incredibly difficult, according to Dr Allan Paull, project leader of the Hyshot programme at the University of Queensland.
"You are dealing with extremes of conditions. You're working out on the edge and with a lot of the stuff no-one has ever tried before," he told the BBC News website. "You've got to expect things to go wrong".
'Flying times cut'
The test was the first of three test flights planned for this year by the international Hyshot consortium.
It will be followed soon by the test flight of another Hyshot engine designed by the Japanese Aerospace Exploration Agency (Jaxa). This will be followed in June by the launch of an engine that will fly at Mach 10, designed by the Australian Defence Science and Technology Organisation (DSTO).
Scramjets do not work until they reach five times the speed of sound
The first Hyshot engine was launched in 2001 but the test flight failed when the rocket carrying the engine flew off course.
The Hyshot tests will bring the idea of a commercial scramjet one step closer to reality.
In the first instance, these would probably be used to launch satellites into low-Earth orbit but many have speculated that they could also allow passenger airlines to fly between London and Sydney in just two hours.
Although this vision may be many years off, it was given a huge boost when Nasa successfully flew its X-43A plane over the Pacific Ocean in 2004. The unmanned aircraft flew at 10 times the speed of sound, a new world speed record.
SCRAMJET ENGINE TEST
1. Two-stage rocket lifts the scramjet engine to altitude of 330km
2. Rocket free-falls back to Earth, reaching speeds of Mach 8
3. Experiment takes place at Mach 7.6 between 35-23km from ground and lasts 6 seconds
source:http://news.bbc.co.uk/2/hi/science/nature/4832254.stm
# posted by dark master : 3/27/2006 10:25:00 AM
Here's an Idea: Let Everyone Have Ideas
LIKE many top executives, James R. Lavoie and Joseph M. Marino keep a close eye on the stock market. But the two men, co-founders of Rite-Solutions, a software company that builds advanced — and highly classified — command-and-control systems for the Navy, don't worry much about Nasdaq or the New York Stock Exchange.
Instead, they focus on an internal market where any employee can propose that the company acquire a new technology, enter a new business or make an efficiency improvement. These proposals become stocks, complete with ticker symbols, discussion lists and e-mail alerts. Employees buy or sell the stocks, and prices change to reflect the sentiments of the company's engineers, computer scientists and project managers — as well as its marketers, accountants and even the receptionist.
"We're the founders, but we're far from the smartest people here," Mr. Lavoie, the chief executive, said during an interview at Rite-Solutions' headquarters outside Newport, R.I. "At most companies, especially technology companies, the most brilliant insights tend to come from people other than senior management. So we created a marketplace to harvest collective genius."
That's a refreshing dose of humility from a successful C.E.O. with decades of experience in his field. (Mr. Lavoie, 59, is a Vietnam War veteran and an accomplished engineer who has devoted his career to military-oriented technologies.)
Most companies operate under the assumption that big ideas come from a few big brains: the inspired founder, the eccentric inventor, the visionary boss. But there's a fine line between individual genius and know-it-all arrogance. What happens when rivals become so numerous, when technologies move so quickly, that no corporate honcho can think of everything? Then it's time to invent a less top-down approach to innovation, to make it everybody's business to come up with great ideas.
That's a key lesson behind the rise of open source technology, most notably Linux. A ragtag army of programmers organized into groups, wrote computer code, made the code available for anyone to revise and, by competing and cooperating in a global community, reshaped the market for software. The brilliance of Linux as a model of innovation is that it is powered by the grass-roots brilliance of the thousands of programmers who created it.
According to Tim O'Reilly, the founder and chief executive of O'Reilly Media, the computer book publisher, and an evangelist for open source technologies, creativity is no longer about which companies have the most visionary executives, but who has the most compelling "architecture of participation." That is, which companies make it easy, interesting and rewarding for a wide range of contributors to offer ideas, solve problems and improve products?
At Rite-Solutions, the architecture of participation is both businesslike and playful. Fifty-five stocks are listed on the company's internal market, which is called Mutual Fun. Each stock comes with a detailed description — called an expect-us, as opposed to a prospectus — and begins trading at a price of $10. Every employee gets $10,000 in "opinion money" to allocate among the offerings, and employees signal their enthusiasm by investing in a stock and, better yet, volunteering to work on the project. Volunteers share in the proceeds, in the form of real money, if the stock becomes a product or delivers savings.
Mr. Marino, 57, president of Rite-Solutions, says the market, which began in January 2005, has already paid big dividends. One of the earliest stocks (ticker symbol: VIEW) was a proposal to apply three-dimensional visualization technology, akin to video games, to help sailors and domestic-security personnel practice making decisions in emergency situations. Initially, Mr. Marino was unenthusiastic about the idea — "I'm not a joystick jockey" — but support among employees was overwhelming. Today, that product line, called Rite-View, accounts for 30 percent of total sales.
"Would this have happened if it were just up to the guys at the top?" Mr. Marino asked. "Absolutely not. But we could not ignore the fact that so many people were rallying around the idea. This system removes the terrible burden of us always having to be right."
Another virtue of the stock market, Mr. Lavoie added, is that it finds good ideas from unlikely sources. Among Rite-Solutions' core technologies are pattern-recognition algorithms used in military applications, as well as for electronic gambling systems at casinos, a big market for the company. A member of the administrative staff, with no technical expertise, thought that this technology might also be used in educational settings, to create an entertaining way for students to learn history or math.
She started a stock called Win/Play/Learn (symbol: WPL), which attracted a rush of investment from engineers eager to turn her idea into a product. Their enthusiasm led to meetings with Hasbro, up the road in Pawtucket, and Rite-Solutions won a contract to help it build its VuGo multimedia system, introduced last Christmas.
Mr. Lavoie called this innovation an example of the "quiet genius" that goes untapped inside most organizations. "We would have never connected those dots," he said. "But one employee floated an idea, lots of employees got passionate about it and that led to a new line of business."
The next frontier is to tap the quiet genius that exists outside organizations — to attract innovations from people who are prepared to work with a company, even if they don't work for it. An intriguing case in point is InnoCentive, a virtual research and development lab through which major corporations invite scientists and engineers worldwide to contribute ideas and solve problems they haven't been able to crack themselves.
InnoCentive, based in Andover, Mass., is literally a marketplace of ideas. It has signed up more than 30 blue-chip companies, including Procter & Gamble, Boeing and DuPont, whose research labs are groaning under the weight of unsolved problems and unfinished projects. It has also signed up more than 90,000 biologists, chemists and other professionals from more than 175 countries. These "solvers" compete to meet thorny technical challenges posted by "seeker" companies. Each challenge has a detailed scientific description, a deadline and an award, which can run as high as $100,000.
"We are talking about the democratization of science," said Alpheus Bingham, who spent 28 years as a scientist and senior research executive at Eli Lilly & Company before becoming the president and chief executive of InnoCentive. "What happens when you open your company to thousands and thousands of minds, each of them with a totally different set of life experiences?"
InnoCentive, founded as an independent start-up by Lilly in 2001, has an impressive record. It can point to a long list of valuable scientific ideas that have arrived, with surprising speed, from faraway places. In addition to the United States, the top countries for solvers are China, India and Russia.
Last month, InnoCentive attracted a $9 million infusion of venture capital to accelerate its growth. "There is a 'collective mind' out there," Dr. Bingham said. "The question for companies is, what fraction of it can you access?"
That remains an unanswered question at many companies, whose leaders continue to rely on their own brainpower as the key source of ideas. But there's evidence that more and more top executives are recognizing the limits of their individual genius.
Back at Rite-Solutions, for example, one of the most valuable stocks on Mutual Fun is the stock market itself (symbol: STK). So many executives from other companies have asked to study the system that a team championed the idea of licensing it as a product — another unexpected opportunity.
"There's nothing wrong with experience," said Mr. Marino, the company's president. "The problem is when experience gets in the way of innovation. As founders, the one thing we know is that we don't know all the answers."
source:http://www.nytimes.com/2006/03/26/business/yourmoney/26mgmt.html?ex=1301029200&en=0d90ed5116e769d0&ei=5090&partner=rssuserland&emc=rss
# posted by dark master : 3/27/2006 10:24:00 AM
U.S. Planning Base on Moon To Prepare for Trip to Mars
HOUSTON -- For the first time since 1972, the United States is planning to fly to the moon, but instead of a quick, Apollo-like visit, astronauts intend to build a permanent base and live there while they prepare what may be the most ambitious undertaking in history -- putting human beings on Mars.
President Bush in 2004 announced to great fanfare plans to build a new spaceship, get back to the moon by 2020 and travel on to Mars after that. But, with NASA focused on designing a new spaceship and spending about 40 percent of its budget on the troubled space shuttle and international space station programs, that timetable may suffer.
Still, NASA's moon planners are closely following the spaceship initiative and, within six months, will outline what they need from the new vehicle to enable astronauts to explore the lunar surface.
"It's deep in the future before we go there," said architect Larry Toups, head of habitation systems for NASA's Advanced Projects Office. "But it's like going on a camping trip and buying a new car. You want to make sure you have a trailer hitch if you need it."
Scientists and engineers are hard at work studying technologies that don't yet exist and puzzling over questions such as how to handle the psychological stress of moon settlement, how to build lunar bulldozers and how to reacquire what planetary scientist Christopher P. McKay of NASA's Ames Research Center calls "our culture of exploration."
The moon is not for the faint of heart. It is a lethal place, without atmosphere, pelted constantly by cosmic rays and micrometeorites, plagued by temperature swings of hundreds of degrees, and swathed in a blanket of dust that can ruin space suits, pollute the air supply and bring machinery to a screeching halt.
And that says nothing about the imponderables. Will working in one-sixth of Earth's gravity for a year cause crippling health problems? What happens when someone suffers from a traumatic injury that can't be treated by fellow astronauts? How do people react to living in a tiny space under dangerous conditions for six months?
"It's like Magellan. You send them off, and maybe they come back, maybe they don't," said planetary scientist Wendell W. Mendell, manager of NASA's Office for Human Exploration Science, during an interview at the recently concluded Lunar and Planetary Science Conference here. "There's a lot of pathologies that show up, and there's nobody in the Yellow Pages."
In some ways, the moon will be harder than Mars. Moon dust is much more abrasive than Mars dust; Mars has atmosphere; Mars has more gravity (one-third of Earth's); Mars has plenty of ice for a potential water supply, while the moon may have some, but probably not very much.
Still, the moon is ultimately much more forgiving because it is much closer -- 250,000 miles away, while Mars is 34 million miles from Earth at its closest point. If someone needs help on the moon, it takes three days to get there. By contrast, Mars will be several months away even with the help of advanced -- and as yet nonexistent -- propulsion systems.
Not having to pay as dearly for mistakes is one key reason why the moon is an integral part of the Bush initiative. The other, as even scientists point out, is that if the United States does not return to the moon, others will.
"The new thing is China, and they've announced they're going to the moon. The Europeans want to go; the Russians want to go; and if we don't go, maybe they'll go with the Chinese," Mars Institute Chairman Pascal Lee said in an interview. "Could we bypass the moon and go to Mars while India and China are going to the moon? I don't think so."
Bush's 2004 "Vision for Space Exploration," by calling for a lunar return and a subsequent Mars mission, set goals, which, if achieved, would keep the United States in the forefront of space exploration for decades.
Since then, mishaps and delays with the space shuttle and the space station programs have shrunk both the moon research budget and the rhetoric promoting the mission.
Instead, NASA Administrator Michael D. Griffin has focused agency attention and resources on the design and construction of a new "crew exploration vehicle" and its attendant rocketry -- the spacecraft that will push U.S. astronauts once again beyond low Earth orbit.
Despite the moon's current low profile, however, NASA continues to plan a lunar mission and to promote the technological advances needed to achieve it. Toups, one of the moon program's designers, said NASA envisions that a lunar presence, once achieved, will begin with two-to-four years of "sorties" to "targeted areas."
These early forays will resemble the six Apollo lunar missions, which ended in 1972. "You have four crew for seven to 10 days," Toups said in a telephone interview. "Then, if you found a site of particular interest, you would want to set up a permanent outpost there."
The south pole is currently the top target. It is a craggy and difficult area, but it is also the likeliest part of the lunar surface to have both permanent sunlight, for electric power, and ice, although many scientists have questions about how much ice there is. Without enough water, mission planners might pick a gentler landscape.
Site selection will mark the end of what McKay calls Apollo-style "camping trips." "There's got to be a lot more autonomy, so we keep it simple," McKay said. "We're going to be on Mars for a long time, and we have to use the moon to think in those terms."
The templates, cited frequently by moon mavens, are the U.S. bases in Antarctica, noteworthy for isolation, extreme environment, limited access, lack of indigenous population and no possibility of survival without extensive logistical support.
"The lunar base is not a 'colony,' " Lee said. " 'Colonization' implies populating the place, and that's not on the plate. This is a research outpost."
Once planners choose a base, the astronauts will immediately need to bring a host of technologies to bear, none of which currently exist. "Power is a big challenge," Toups said. Solar arrays are an obvious answer, but away from the poles 14 days of lunar sunlight are followed by 14 days of darkness, so "how do you handle the dormancy periods?"
Next is the spacesuit. Apollo suits weighed 270 pounds on Earth, a relatively comfortable "felt weight" of 40 to 50 pounds on the moon, but an unacceptable 102 pounds on Mars. "You can't haul that around, bend down or climb hills," Lee said. "Somehow we have to cut the mass of the current spacesuit in half."
And the new suit, unlike the Apollo suits or the current 300-pound shuttle suit, is going to have to be relatively easy to put on and take off, and to be able to withstand the dreaded moon dust.
After three days, Apollo astronauts reported that the dust was causing the joints in their suits to jam, "and we're not talking about three outings," Lee said of the next moon missions. "We're talking about once a week for 500 days -- between 70 and 100 spacewalks."
Dealing with dust is also a major concern in building shelters on the lunar surface. Toups said it might be possible to harden the ground by microwaving it, creating a crust "like a tarp when you're camping." Otherwise, the dust pervades everything, and prolonged exposure could even lead to silicosis.
Dust also makes it virtually impossible to use any kind of machinery with ball bearings. Civil engineer Darryl J. Calkins, of the Army Corps of Engineers Cold Regions Research and Engineering Laboratory, warned that the combination of dust, low gravity, temperature swings and the high cost of flying things to the moon is going to define the lunar tool kit in unforeseen ways.
"You can't put a diesel up there; you can't put a 20,000-pound bulldozer up there; and none of our oils or hydraulic fluids are going to survive," Calkins said in a telephone interview. "We may have to go back to the 19th century to find appropriate tools -- use cables, pulleys, levers."
And even then, it will be difficult to level a base site and haul away the fill because there's not enough gravity to give a tractor adequate purchase. Instead, Calkins envisions a device that can "scrape and shave" small amounts of soil and take it away bit by bit.
But in the end, "you have to learn how to do it, with real people," McKay said. "This is hard, but we can learn it. And if we do it right on the moon, we will be able to answer my ultimate question: Can Mars be habitable? I think the answer is 'yes.' "
source:http://www.washingtonpost.com/wp-dyn/content/article/2006/03/25/AR2006032500999_pf.html
# posted by dark master : 3/27/2006 10:23:00 AM
The New Wisdom of the Web
"In a cover story, Newsweek takes a look at the new wave of start-ups cashing in on the next stage of the Internet by Putting The 'We' in Web. Sites built on user-generated content like YouTube, Flickr, MySpace, Digg and Facebook have all taken a page from Tom Sawyer's playbook, engaging the community to do their work, prompting Google CEO Eric Schmidt to suggest he finds MySpace more interesting than Microsoft."
source:http://slashdot.org/article.pl?sid=06/03/26/1424224
# posted by dark master : 3/27/2006 10:21:00 AM
The Low-Down on Dellienware
The number of people who began to cry into their milk when it was announced that Dell had bought Alienware was a bit sad. "Ohh, this'll be the end of Alienware, Dell will screw it up" and suchlike, followed by coy winks and nudges about the AMD connection and Michael Dell's personal image ambitions... it's all a bit of a storm in a teacup, no?
For one, despite the "Will they/Won't they?" speculation in recent weeks, the move itself makes perfect sense in an "it's inevitable" sort of way. Dell - the company and the personality behind it - has been desperate to push itself in the consumer end of the business, in which they've not been doing quite as well as expected. Dell has an image and a mindshare problem with gamers, and the obvious way to sort this out is simply to buy someone for whom the coin is flipped the opposite way.
As Alienware's vice president of marketing Mark Vena put it to me on Friday, "Alienware is branded like a BMW or a Porsche, while others are branded as Hondas and Toyotas. These are still great brands, but there's that difference."
Alienware, being the volume and branded leader in the high-end gaming field, is the obvious choice for Dell to acquire. Alienware has provided the most robust competition to Dell's own XPS range; the company has a cool image and plenty of mindshare. They're also of the same mind as Dell where selling is concerned, and Alienware follows much the same model as Dell; not selling in shops for example.
The cost of buying this makeover lock, stock and barrel is hardly more than pocket change for Dell - something they've outright said since the deal was announced, though exact figures haven't been revealed. In all the figures we can look at, however, Dell is like the Red Army compared to Alienware's merry band of partisans - the consumer end makes up 15% of Dell's business, and the company had sales of $55.9 billion last year, compared to Alienware's measly $200 million. Dell wants that end of the pool as a growth and image area, not because it needs it to survive.

Since it's the shallower end of the pool, what happens when a massive Blue Whale like Dell jumps in? Well not much for the moment, I should imagine. For one, Dell has to hold off on doing anything to or with Alienware for a good 30 - 60 days so that US regulators can take a look at the deal. That means that we're going to have a month or two of speculation, but business will continue as normal.
Thereafter, Alienware's direct competition, the Falcons and Voodoos of this world, can expect to feel the squeeze simply because Alienware will have Dell's experience, expertise and perhaps direct support in getting their machines to their customers. Alienware is pretty efficient as it is, but the machine that is Dell is not to be underestimated in its ability to dredge every last ounce of efficiency out of the order-to-delivery process.
According to Vena, Dell will be handling Alienware's suppliers. "It really simplifies suppliers' lives to only have to deal with the one person. Plus Dell gets a lot of allocation of new products such as graphics cards, and all of this can help bring down our lead time from ordering to delivery." At the moment, depending on what you order, you can wait between three and six weeks for your machine to arrive, especially with the delays caused by scarcity of new products. "We hope to bring that down to two weeks in most cases," Vena told me.
Dell can also lean in to make Alienware a more flexible company, allowing it to do more in areas such as financing thanks to Dell's massive financial backing. "We can now do things we simply couldn't do before with our current volumes," said Vena, and this will certainly begin to make Alienware a more attractive offering to many. In the present order of things, even Alienware can't afford to back the kind of financing for its ultra-expensive systems that Dell can. Now they have the best of both worlds.
How does AMD fit into the Dell/Alienware picture?
How Dell will influence the internal structure of Alienware remains to be seen, and it's this which would have the most impact on the world at large. We don't have much of a track record to go by, as Dell generally doesn't do acquisitions. While the brass at both companies are assuring their employees that after this week "everything will remain exactly the same, only different," it wouldn't be the first time if, in a few months, we saw a major cull of management at Alienware, and anything from a new direction to the complete absorption of everything bar the brand and the case designs into the Dell machine itself.
Vena says that some executives have agreed to multi-year engagements to ensure that the company transitions smoothly, although his use of the word "some," coming as it does from an experienced marketing type, might give some ammunition to speculative types in the audience.
Personally, I doubt that in the short to medium-term Dell will do anything so drastic. It mainly comes back to their image problems - Dell knows that, for the most part, the (consumer) buyers of their office machines and more generic boxes don't care about any of that. They see the advertisements, they see the price tag, and they buy. The audience that Alienware and its peers sell to, and which Dell has largely failed to conquer with its XPS range, is the tech-literate audience who are constantly reading sites such as Tom's Hardware.

They pay attention to the tech world, and nine out of ten people who have bought, or will buy, an Alienware or similar machine knew within hours that Dell had bought Alienware, because they always have one eye on the industry. Similarly, if Dell gobbles up Alienware whole, they will know perfectly well what they'll be buying into and will make purchasing decisions accordingly.
Alienware will get more competitive, and put the squeeze on its peers, but in the world of high-end gaming for the tech-literate I can't see Alienware becoming the undisputed Number 1, as, if nothing else, gamers don't like to see a monopoly and would take their money elsewhere if they thought Alienware were about to become a Dell in terms of market position.
As for the ramifications of this deal on Dell itself, and the wider tech world, we can't go forward without mentioning AMD. Dell has been flirting like a 14-year-old girl with the company on and off for many moons now, and certainly buying a company that deals quite heavily in AMD chips cannot go unnoticed. One ramification of Dell possibly gobbling up Alienware whole at some point in the future would be to end the debate and bring the company into the AMD camp.
I've learned better than to say "Dell is definitely switching to AMD now!" So I won't. What Dell does with regard to AMD via its new satellite gaming company will remain as much a mystery as it was in January, around CES time, when we were told - and many once again excitedly proclaimed - that the dirty deal would be done by March.
One thing we do know is that Dell will be handling all of Alienware's suppliers, which Vena told us includes AMD. He was cagey about what that means specifically, though it does leave an obvious open channel between AMD and Dell that didn't exist before.
In the backroom world of cloak and dagger, anything could be happening. Intel might be falling over themselves to make sure Dell remains onside. Or they could have assented to the deal so long as Dell itself stays firmly in the Intel camp. Or Dell could have already signed a deal with AMD months ago, and we'll hear nothing about it for months more. We could chase these mice around inside our skulls forever. Personally, I prefer to do more productive things with my time and then come what may.
What about Dell's own XPS range? At the same time as we heard about the acquisition we got Dell's Renegade XPS, which is far more expensive (for a stock price) than anything even Alienware is producing. Perhaps it's just chaff thrown up to distract us, but Dell could do worse than use Alienware to highlight just how great its brand-spanking-new, balls-to-the-wall XPS range is. The XPS could become the real Porsche, while the Alienware machines come to be seen as more garden variety.
"Dell will be keeping their XPS range, and we see it as offering the customer more choice," Vena told us. This is interesting, as Dell just outlaid a wad of cash to keep its own competition. The bets are now on as to whether, or perhaps when, Alienware's head honco gets ticked off at not being allowed to do whatever he wants and storms off, regardless of agreements, as can happen in these acquisitions.
Personalities may clash there, as Vena told us that "Michael Dell really drove the acquisition from the beginning and has been very involved." Now we're not ones on this end to make crude comments about someone just wanting to be like Steve Jobs (Have I said too much?), but if, for example, the big personality that is Michael Dell decides that he wants to be seen as the man behind Alienware, there could be ructions in the boardroom. Toys have been known to be hurled from prams in similar situations.
Only time will tell if Dell will be the end of Alienware as we know it, though I'm personally betting on the companies converging, but still maintaining those vital two degrees of separation.
source:http://www.tgdaily.com/2006/03/25/opinion_low_down_on_dellienware/
# posted by dark master : 3/27/2006 10:10:00 AM
0 comments 
Could Ethiopian skull be missing link?
Scientists believe find could link Homo erectus and modern man
ADDIS ABABA, Ethiopia (AP) -- Scientists in northeastern Ethiopia said Saturday that they have discovered the skull of a small human ancestor that could be a missing link between the extinct Homo erectus and modern man.
The hominid cranium -- found in two pieces and believed to be between 500,000 and 250,000 years old -- "comes from a very significant period and is very close to the appearance of the anatomically modern human," said Sileshi Semaw, director of the Gona Paleoanthropological Research Project in Ethiopia.
Archaeologists found the early human cranium five weeks ago at Gawis in Ethiopia's northeastern Afar region, Sileshi said.
Several stone tools and fossilized animals including two types of pigs, zebras, elephants, antelopes, cats, and rodents were also found at the site.
Sileshi, an Ethiopian paleoanthropologist based at Indiana University, said most fossil hominids are found in pieces but the near-complete skull -- a rare find -- provided a wealth of information.
"The Gawis cranium provides us with the opportunity to look at the face of one of our ancestors," the archaeology project said in a statement.
Homo erectus, which many believe was an ancestor of modern Homo sapiens, is thought to have died out 100,000 to 200,000 years ago.
The cranium dates to a time about which little is known -- the transition from African Homo erectus to modern humans. The fossil record from Africa for this period is sparse and most of the specimens poorly dated, project archaeologists said.
The face and cranium of the fossil are recognizably different from those of modern humans, but bear unmistakable anatomical evidence that it belongs to the modern human's ancestry, Sileshi said.
"A good fossil provides anatomical evidence that allows us to refine our understanding of evolution. A great fossil forces us to re-examine our views of human origins. I believe the Gawis cranium is a great fossil," said Scott Simpson, a project paleontologist from Case Western Reserve University School of Medicine at Cleveland, Ohio.
Scientists conducting surveys in the Gawis River drainage basin found the skull in a small gully, the project statement said.
"This is really exciting because it joins a limited number of fossils which appear to be evolutionary between Homo erectus and our own species Homo sapiens," said Eric Delson, a paleoanthropologist at Lehman College of the City University of New York, who was not involved in the discovery but has followed the project.
Homo erectus left Africa about 2 million years ago and spread across Asia from Georgia in the Caucasus to China and Indonesia. It first appeared in Africa between 1 million and 2 million years ago.
Between 1 million and perhaps 200,000 years ago, one or more species existed in Africa that gave rise to the earliest members of our own species Homo sapiens -- between 150,000 and 200,000 years ago.
Delson said the fossil found in Ethiopia "might represent a population broadly ancestral to modern humans or it might prove to be one of several side branches which died out without living descendants."
source:http://www.cnn.com/2006/WORLD/africa/03/25/missing.link.ap/index.html?section=cnn_topstories
# posted by dark master : 3/27/2006 10:10:00 AM
0 comments 
Microsoft's Not So Happy Family
"Reports from Redmond are that Microsoft Employees are not happy with the double delay of Windows and Office being pushed back into 2007. EETimes is reporting that some Microsoft employees are calling for the termination of several top managers Including Brian Valentine, Jim Allchin, and Steve Ballmer for the delay debacle. The report references a blog by Who da'Punk, an anonymous Microsoft employee who asks, where's the accountability for failure? So far the blog entry has generated over 350 comments from Microsoft insiders and outsiders."
source:http://slashdot.org/article.pl?sid=06/03/26/0555255
# posted by dark master : 3/27/2006 10:08:00 AM
0 comments 
Terrorist 007, Exposed
For almost two years, intelligence services around the world tried to uncover the identity of an Internet hacker who had become a key conduit for al-Qaeda. The savvy, English-speaking, presumably young webmaster taunted his pursuers, calling himself Irhabi -- Terrorist -- 007. He hacked into American university computers, propagandized for the Iraq insurgents led by Abu Musab al-Zarqawi and taught other online jihadists how to wield their computers for the cause.
Suddenly last fall, Irhabi 007 disappeared from the message boards. The postings ended after Scotland Yard arrested a 22-year-old West Londoner, Younis Tsouli, suspected of participating in an alleged bomb plot. In November, British authorities brought a range of charges against him related to that plot. Only later, according to our sources familiar with the British probe, was Tsouli's other suspected identity revealed. British investigators eventually confirmed to us that they believe he is Irhabi 007.
The unwitting end of the hunt comes at a time when al-Qaeda sympathizers like Irhabi 007 are making explosive new use of the Internet. Countless Web sites and password-protected forums -- most of which have sprung up in the last several years -- now cater to would-be jihadists like Irhabi 007. The terrorists who congregate in those cybercommunities are rapidly becoming skilled in hacking, programming, executing online attacks and mastering digital and media design -- and Irhabi was a master of all those arts.
But the manner of his arrest demonstrates how challenging it is to combat such online activities and to prevent others from following Irhabi's example: After pursuing an investigation into a European terrorism suspect, British investigators raided Tsouli's house, where they found stolen credit card information, according to an American source familiar with the probe. Looking further, they found that the cards were used to pay American Internet providers on whose servers he had posted jihadi propaganda. Only then did investigators come to believe that they had netted the infamous hacker. And that element of luck is a problem. The Internet has presented investigators with an extraordinary challenge. But our future security is going to depend increasingly on identifying and catching the shadowy figures who exist primarily in the elusive online world.
The short career of Irhabi 007 offers a case study in the evolving nature of the threat that we at the SITE Institute track every day by monitoring and then joining the password-protected forums and communicating with the online jihadi community. Celebrated for his computer expertise, Irhabi 007 had propelled the jihadists into a 21st-century offensive through his ability to covertly and securely disseminate manuals of weaponry, videos of insurgent feats such as beheadings and other inflammatory material. It is by analyzing the trail of information left by such postings that we are able to distinguish the patterns of communication used by individual terrorists.
Irhabi's success stemmed from a combination of skill and timing. In early 2004, he joined the password-protected message forum known as Muntada al-Ansar al-Islami (Islam Supporters Forum) and, soon after, al-Ekhlas (Sincerity) -- two of the password-protected forums with thousands of members that al-Qaeda had been using for military instructions, propaganda and recruitment. (These two forums have since been taken down.) This was around the time that Zarqawi began using the Internet as his primary means of disseminating propaganda for his insurgency in Iraq. Zarqawi needed computer-savvy associates, and Irhabi proved to be a standout among the volunteers, many of whom were based in Europe.
Irhabi's central role became apparent to outsiders in April of that year, when Zarqawi's group, later renamed al-Qaeda in Iraq, began releasing its communiqués through its official spokesman, Abu Maysara al-Iraqi, on the Ansar forum. In his first posting, al-Iraqi wrote in Arabic about "the good news" that "a group of proud and brave men" intended to "strike the economic interests of the countries of blasphemy and atheism, that came to raise the banner of the Cross in the country of the Muslims."
At the time, some doubted that posting's authenticity, but Irhabi, who was the first to post a response, offered words of support. Before long, al-Iraqi answered in like fashion, establishing their relationship -- and Irhabi's central role.
Over the following year and a half, Irhabi established himself as the top jihadi expert on all things Internet-related. He became a very active member of many jihadi forums in Arabic and English. He worked on both defeating and enhancing online security, linking to multimedia and providing online seminars on the use of the Internet. He seemed to be online night and day, ready to answer questions about how to post a video, for example -- and often willing to take over and do the posting himself. Irhabi focused on hacking into Web sites as well as educating Internet surfers in the secrets to anonymous browsing.
In one instance, Irhabi posted a 20-page message titled "Seminar on Hacking Websites" to the Ekhlas forum. It provided detailed information on the art of hacking, listing dozens of vulnerable Web sites to which one could upload shared media. Irhabi used this strategy himself, uploading data to a Web site run by the state of Arkansas, and then to another run by George Washington University. This stunt led many experts to believe -- erroneously -- that Irhabi was based in the United States.
Irhabi used countless other Web sites as free hosts for material that the jihadists needed to upload and share. In addition to these sites, Irhabi provided techniques for discovering server vulnerabilities, in the event that his suggested sites became secure. In this way, jihadists could use third-party hosts to disseminate propaganda so that they did not have to risk using their own web space and, more importantly, their own money.
As he provided seemingly limitless space captured from vulnerable servers throughout the Internet, Irhabi was celebrated by his online followers. A mark of that appreciation was the following memorandum of praise offered by a member of Ansar in August 2004:
"To Our Brother Irhabi 007. Our brother Irhabi 007, you have shown very good efforts in serving this message board, as I can see, and in serving jihad for the sake of God. By God, we do not like to hear what hurts you, so we ask God to keep you in his care.
You are one of the top people who care about serving your brothers. May God add all of that on the side of your good work, and may you go careful and successful.
We say carry on with God's blessing.
Carry on, may God protect you.
Carry on serving jihad and its supporters.
And I ask the mighty, gracious and merciful God to keep for us everyone who wants to support his faith.
Amen."
Irhabi's hacking ability was useful not only in the exchange of media, but also in the distribution of large-scale al-Qaeda productions. In one instance, a film produced by Zarqawi's al-Qaeda, titled "All Is for Allah's Religion," was distributed from a page at www.alaflam.net/wdkl.
The links, uploaded in June 2005, provided numerous outlets where visitors could find the video. In the event that one of the sites was disabled, many other sources were available as backups. Several were based on domains such as www.irhabi007.ca or www.irhabi007.tv, indicating a strong involvement by Irhabi himself. The film, a major release by al-Qaeda in Iraq, showed many of the insurgents' recent exploits compiled with footage of Osama bin Laden, commentary on the Abu Ghraib prison, and political statements about the rule of then-Iraqi Interim Prime Minister Ayad Allawi.
Tsouli has been charged with eight offences, including conspiracy to murder, conspiracy to cause an explosion, conspiracy to cause a public nuisance, conspiracy to obtain money by deception and offences relating to the possession of articles for terrorist purposes and fundraising. So far there are no charges directly related to his alleged activities as Irhabi on the Internet, but given the charges already mounted against him, it will probably be a long time before the 22-year-old is able to go online again.
But Irhabi's absence from the Internet may not be as noticeable as many hope. Indeed, the hacker had anticipated his own disappearance. In the months beforehand, Irhabi released his will on the Internet. In it, he provided links to help visitors with their own Internet security and hacking skills in the event of his absence -- a rubric for jihadists seeking the means to continue to serve their nefarious ends. Irhabi may have been caught, but his online legacy may be the creation of many thousands of 007s.
feedback@siteinstitute.org
Rita Katz is the author of "Terrorist Hunter" (HarperCollins) and the director of the SITE Institute, which is dedicated to the "search for international terrorist entities." Michael Kern is a senior analyst with the institute.
source:http://www.washingtonpost.com/wp-dyn/content/article/2006/03/25/AR2006032500020_2.html
# posted by dark master : 3/27/2006 10:05:00 AM
0 comments 
New data transmission record - 60 DVDs per second
March 24, 2006: As the world's internet traffic grows relentlessly, faster data transmission will logically become crucial. To enable telecommunications networks to cope with the phenomenal surge in data traffic as the internet population moves past a billion users, researchers are focusing on new systems to increase data transmission rates, and it's not surprising that the world data transmission record is continually under threat. Unlike records where human physical capabilities limit new records to incremental growth, when human ingenuity is the deciding factor, extraordinary gains are possible. German and Japanese scientists recently collaborated to achieve just such a quantum leap, obliterating the world record for data transmission. By transmitting a data signal at 2.56 terabits per second over a 160-kilometer link (equivalent to 2,560,000,000,000 bits per second, or the contents of roughly 60 DVDs), the researchers bettered the old record of 1.28 terabits per second held by a Japanese group. By comparison, the fastest high-speed links currently carry data at a maximum of 40 Gbit/s, or more than 60 times slower.
"You transmit data at various wavelengths simultaneously in the fiber-optic networks. For organizational and economic reasons each wavelength signal is assigned a data rate as high as possible", explains Prof. Hans-Georg Weber from the Fraunhofer Institute for Telecommunications, Heinrich-Hertz-Institut HHI in Berlin, who heads a project under the MultiTeraNet program funded by the Federal Ministry of Education and Research.
A few weeks ago the scientist and his team established a new world record together with colleagues from Fujitsu. Data is transmitted in fiber-optic cables using ultrashort pulses of light and is normally encoded by switching the laser on and off: a pulse represents a binary 1, no pulse a 0. You therefore have two light-intensity states with which to transmit the data. The Fraunhofer researchers have now managed to squeeze more data into a single pulse by packing four, instead of the previous two, data states into a light pulse using phase modulation.
"Faster data rates are hugely important for tomorrow's telecommunications", explains Weber. The researcher assumes the transmission capacity on the large transoceanic traffic links will need to increase to between 50 and 100 terabits per second in ten to 20 years. "This kind of capacity will only be feasible with the new high-performance systems."
source:http://www.gizmag.com/go/5396/
# posted by dark master : 3/27/2006 10:04:00 AM
0 comments 
Sony stops making original PS
With the PS3 launch looming, Japanese electronics giant shuts down production of the console that brought it to the industry in the first place.
Sony entered the global gaming industry with the Japanese release of the original PlayStation in December of 1994. More than 11 years and two more consoles later, the electronics giant has pulled the plug on its first console, ceasing all production of PS units. Last September, Sony announced that the original PlayStation and its PSOne redesign had surpassed the 100 million units sold milestone.
Despite the news, Sony representatives noted today that the end of production does not necessarily mean the end of availability. PS hardware and software are still selling in countries around the world.
Even if original PlayStation systems and games are becoming slim pickings in the US, gamers likely won't be going without for long. As part of its PlayStation Business Briefing 2006, Sony last week announced that it is working on an emulator that would allow gamers to play PS titles on the PSP. Details haven't been released yet, but Sony Computer Entertainment president Ken Kutaragi said the games would be digitally distributed.
While the PlayStation's decade-plus production run is certainly impressive, it isn't the longest life span for a gaming console. Atari released its Video Computer System (more popularly known as the Atari 2600) in 1977 and didn't officially pull the plug on it until 1991.
source:http://www.gamespot.com/news/6146549.html
# posted by dark master : 3/27/2006 10:03:00 AM
0 comments 
