At a remote field camp near the North Pole, American researchers have been fishing for answers.
"We've seen the change concentrated at the pole over the last 15 years,'' says one of the researchers, Jamie Morrison, with the University of Washington.
Each April, scientists funded by the National Science Foundation take advantage of a three-week window -- when there's 24-hour daylight and temperatures hover around freezing -- to probe the polar ice pack, which has been thinning and retreating for several decades now.
"What I'm becoming more convinced of is that conditions of ice here really do affect the global atmospheric circulation."
In other words, less ice means more areas of warm water, and pockets of warm air that are helping to change global temperatures and wind patterns. As a result, one new study suggests that desperately needed storms will be steered away from the drought-plagued American West, drying it by as much as 30 percent more and increasing the wildfire danger.
And what's triggered the change in the Arctic?
"You've got a mix of a sort of el nino for the arctic, and a global warming signal,'' says Morrison.
The talk of global warming is a double-edged sword for skeptics who don't believe in it but are nevertheless intrigued by the possibilities a warmer Arctic may bring. One possibility is an open-water shortcut over the top of the world that may already be emerging.
About 1,000 miles away, in the tiny Inuit village of Resolute Bay, Canada, life may be in for a big change. Some climatologists believe that when the village's kindergarteners reach middle age, in just 50 years, the normally ice-choked bay will be part of a summer-long ice-free Northwest Passage.
"It cuts thousands and thousands of miles off the sea route between Europe and the Orient, and would be irresistible to commerce," says Dennis Conlon, with the Office of Naval Research. Conlon wrote a recently declassified document exploring the military implications of a watery arctic.
Conlon says an open Northwest Passage would expose America to new threats.
"We'd have to counter terrorism, we'd have to protect assets, we'd have to perform search and rescue. Every function the Navy performs, we'd have to perform in the Arctic."
It's no longer a matter of speculation -- the Arctic is changing. And if the climate change is prolonged, it will have a global impact. The question on the table is whether man can or will do anything about it.
source:http://www.cbsnews.com/stories/2004/05/31/eveningnews/printable620417.shtml
# posted by dark master : 12/09/2005 09:40:00 AM
0 comments 
The Web Hypertext Application Technology Working Group's approach to improving HTML
Level: Introductory
Edd Dumbill (edd@usefulinc.com), Chair, XTech Conference
06 Dec 2005
In this two-part series, Edd Dumbill examines the various ways forward for HTML that Web authors, browser developers, and standards bodies propose. This series covers the incremental approach embodied by the WHATWG specifications and the radical cleanup of XHTML proposed by the W3C. Additionally, the author gives an overview of the W3C's new Rich Client Activity. Here in Part 1, Edd focuses primarily on two specifications being developed by WHATWG: Web Applications 1.0 (HTML5) and Web Forms 2.0.
HTML isn't a very good language for making Web pages. However, it has been a very good language for making the Web.
HTML's ease of learning and browsers' view source capability have bootstrapped the Web's popularity in an amazing way. The World Wide Web Consortium's (W3C) involvement in standardizing HTML has ensured that Web browsers all implement the same dialect, more or less. The emergence of CSS, and the corresponding growth of standards-based Web design as best practice, have also averted HTML chaos and led to a better Web experience for users and developers alike.
This much you probably know. The resulting Web has probably made a positive impact on your life or business. Yet the fact remains, HTML isn't a very good language. Why, for instance, does HTML have headings H1 through H6? Who ever seriously used a six-level-deep heading hierarchy? And why, in this era of 3D-accelerated graphics cards and sophisticated user interfaces, are Web pages limited to clunky text boxes and radio buttons for user input?
No surprise, then, that various groups are pushing again to develop HTML in a way that lets Web publishing and Web applications use more of the technology that's available in modern user interfaces. Who are these people? Broadly speaking, they fall into three groups. The first are those who use today's technology to make a difference. This is what the Asynchronous JavaScript and XML (Ajax) buzz is about: using JavaScript and the browser's XMLHttpRequest object to create dynamic user interfaces. The effects can be wonderful, but this is not a standard way to move forward.
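Reduced to its essentials, the Ajax pattern this buzz refers to looks something like the sketch below. The URL and element id are illustrative, not taken from the article; 2005-era Internet Explorer would need an ActiveXObject fallback instead.

```html
<div id="out">Loading...</div>
<script type="text/javascript">
// Fetch a resource asynchronously and update the page without a reload.
var req = new XMLHttpRequest();
req.onreadystatechange = function () {
  // readyState 4 means the request is complete
  if (req.readyState === 4 && req.status === 200) {
    document.getElementById("out").innerHTML = req.responseText;
  }
};
req.open("GET", "/data.txt", true);   // true = asynchronous
req.send(null);
</script>
```

The point of the pattern is that the request happens in the background; the page stays interactive while the response is pending.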
The other two groups focus on future improvements. The W3C promotes XHTML 2.0, based on the requirements of a broad vendor base -- not just desktop browser makers. XHTML 2.0 is seen as a radical step. In contrast, the Web Hypertext Application Technology Working Group (WHATWG) promotes a set of incremental specifications, which evolve HTML to add the most immediately required functionality into the browser. Some WHATWG features are already implemented in Apple's Safari browser and Mozilla Firefox 1.5. (See Resources for more on W3C and WHATWG.)
These articles will examine the work of the latter two groups: W3C and WHATWG. Ajax has been covered elsewhere in developerWorks (see Resources). While no standards war has erupted yet on the scale that brought HTML into the W3C in the first place, these two organizations are not always in agreement as to where HTML should go. I'll explain and evaluate both approaches.
WHATWG, HTML 5, and Web Forms 2.0
As their Web page states, WHATWG is a "loose unofficial collaboration of Web browser manufacturers and interested parties who wish to develop new technologies designed to allow authors to write and deploy Applications over the World Wide Web." Two terms are of particular interest here: WHATWG's main players make browsers (Mozilla, Opera), and the focus of their improvements is towards creating Web applications.
WHATWG's figurehead specification is code-named HTML5, but is known more properly as Web Applications 1.0 (see Resources). HTML5 is intended to preserve backward compatibility with the current HTML standard, HTML 4.01, and also with XHTML 1.0, the XML version of HTML. The specification sustains both the HTML and XHTML strands of W3C HTML, although it notes that implementations may choose not to.
In addition to HTML5, the Web Forms 2.0 specification (see Resources) seeks to address many of the annoyances that developers find with the current state of HTML forms. Today's forms omit many basic features found in regular desktop applications, such as validation and richer widgets.
So what's inside HTML5? In short, a lot. The Web Applications 1.0 specification is an evolving beast, and some of the features mentioned are more fully developed than others. Here's a 30,000-foot flyover of the new features:
- New layout elements, including a calendar control, an address card, a flexible datagrid, gauges and progress meters, drag and drop, and menus
- Programming extensions to the Document Object Model (DOM), including server-sent DOM events
- A formalization of the de facto standard XMLHttpRequest object, the centerpiece of Ajax communication
- Dynamic bitmap graphics through the canvas element
You can see the heritage of many of these in features implemented as one-offs with JavaScript on the Web today. Indeed, the recent rise in popularity of Ajax toolkits has led to a proliferation of widgets such as gauges, calendars, and so on.
canvas
Traditional implementation of HTML5 features -- that is, as part of a Web browser -- is restricted today to just a few of the technologies mentioned above. The most well-known among these is the canvas element, which both Firefox 1.5 and Apple's Safari browser have implemented.
While the W3C's Scalable Vector Graphics (SVG) language already provides a way to show in-browser illustrations, canvas takes a different approach. Rather than implement a declarative document, like SVG, it provides a blank surface for JavaScript to draw upon. This model of imperative graphics owes much more to OpenGL-style drawing than to the declarative Web.
Figure 1 shows a screenshot from a simple canvas demo. (See Resources -- you can view it with Apple's Safari browser or a pre-release of Firefox 1.5.) When the user moves the mouse over the shapes, the shapes slowly enlarge. Among other things, the demo shows that all the necessary ingredients for implementing user interfaces -- drawing, user input events, and timers -- are in place.
Figure 1. Screenshot of interactive canvas demo
canvas applications are already one step closer to their obvious conclusion (games!) with the implementation of a simple 3D maze, as illustrated in Figure 2. (See Resources for a link to the actual maze.)
Figure 2. Screenshot of simple maze game
To get a flavor of how you can program canvas, look at some simple code. Listing 1 shows a self-contained example, the end result of which is shown in Figure 3.
Listing 1. Simple canvas example
Figure 3. Output of Listing 1
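A minimal self-contained canvas page along the lines the text describes (this is an illustrative sketch, not the original listing) might look like:

```html
<!DOCTYPE html>
<html>
<head><title>canvas sketch</title></head>
<body>
<canvas id="c" width="200" height="150"></canvas>
<script type="text/javascript">
// Imperative drawing: each call paints pixels directly onto the surface;
// there is no retained scene graph as there would be with SVG.
var ctx = document.getElementById("c").getContext("2d");
ctx.fillStyle = "rgb(200, 0, 0)";
ctx.fillRect(10, 10, 80, 60);               // a solid red rectangle
ctx.beginPath();
ctx.arc(140, 60, 30, 0, Math.PI * 2, true); // a circle outline as a path
ctx.fillStyle = "rgba(0, 0, 200, 0.5)";
ctx.fill();                                 // filled semi-transparent blue
</script>
</body>
</html>
```

Everything is driven from script: to animate, you simply redraw the surface on a timer.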
As canvas does not present any declarative semantics, it is likely that it will have more applications in the user interface implementation area than anywhere else. An intriguing scope for canvas is as a prototyping ground for new browser interface elements and features. The best example of this is Antoine Quint's partial implementation of SVG using canvas (see Resources). Using Quint's method, to render an HTML file with embedded SVG, you add a couple of lines that import his JavaScript SVG renderer. Figure 4 shows the familiar tiger image rendered using this method.
Figure 4. SVG tiger image rendered using JavaScript and the canvas element
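The "couple of lines" that import such a renderer might look like the following; the file names are hypothetical, as the article does not give the actual script URLs:

```html
<!-- Hypothetical file names: the renderer script(s) Quint distributes -->
<script type="text/javascript" src="svg-parser.js"></script>
<script type="text/javascript" src="svg-canvas-renderer.js"></script>
```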
Time will show whether canvas is useful in a mainstream Web setting. Its functionality is somewhat akin to that of Java applets, yet its JavaScript interface makes it much easier to use and to interface with other browser elements.
Web Forms 2.0
The WHATWG forms specification's version number indicates its intent to build on the specification of forms in HTML4. In contrast to the Web Applications (HTML5) specification, it is in a mature state. Web Forms 2.0's scope is also more limited -- focusing directly on improving the form widgets available in the browser.
What does this new revision of forms add? Among other things:
- Validation constructs to allow the browser to do more checking before the form is submitted. New attributes include required, min, and max.
- DOM support for validity, with a validity attribute for form elements, and a new invalid event.
- Control over auto-completion for form elements, allowing document authors to indicate whether the browser should remember field values and offer to autocomplete them. Predefined values can be passed with the list attribute.
- An autofocus attribute to indicate which form element should receive input focus when the document is loaded.
- An inputmode attribute that allows the hinting of appropriate language input modes for text-holding form elements.
- File upload control improvements, including specifying the expected file type and limits on file size.
- Repetition of templated form elements.
- New types of input controls: datetime, number, range, email, and url. Addition of patterns for restricting input values.
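Rolled into markup, several of these features might be sketched as follows. Treat this as illustrative: exact attribute names and behavior depended on the draft and on browser support at the time, and the form's action URL is hypothetical.

```html
<form action="/register" method="post">
  <input type="email" name="mail" required autofocus>
  <input type="number" name="age" min="13" max="120">
  <input type="datetime" name="meeting">
  <input type="url" name="homepage">
  <input type="text" name="zip" pattern="[0-9]{5}">
  <input type="submit" value="Register">
</form>
```

The browser itself refuses to submit the form until the required, range, and pattern constraints are satisfied, with no author-written JavaScript validation.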
Web Forms is a more consistent specification than HTML5, and is already finding some implementation:
- Beta releases of Opera 9 include Web Forms 2.0 support
- An open-source Web Forms project has a DHTML+Behaviors implementation for Internet Explorer
The W3C's answer to the next generation of forms is XForms (see Resources). XForms differs from Web Forms 2.0 in that it develops a new model of browser-server interaction, based on passing XML documents. By contrast, Web Forms 2.0 is an incremental update to the existing form models intended to make current browser forms more usable. The two specifications address different needs, though obviously share some commonality. In the words of the Web Forms specification:
[T]his specification attempts to add some of the functionality of XForms with a minimum impact on the existing, widely implemented forms model. Where appropriate, backwards compatibility, ease of authoring, and ease of implementation have been given priority over theoretical purity.
Other implementations
canvas is the main WHATWG feature with browser implementation. The rest of HTML5 is still at an early stage, and may never be implemented in its entirety.
However, the Web Applications and Web Forms specifications are taking on a new significance that probably wasn't anticipated when they first took shape. In recent months, several full-featured projects for developing user interface toolkits for Web applications have emerged. These implementations use HTML plus JavaScript technology, or Flash. Many of them might well take the obvious view that it is pointless to reinvent the wheel, and look to aspects of WHATWG specifications to standardize, for example, their form implementations.
Conclusion
The Web Forms 2.0 specification, by virtue of the obvious need and the completeness of the specification, stands a good chance of receiving implementation and making its way into an accepted standard. Indeed, Web Forms 2.0 has been submitted to the W3C for comment, having the effect of being a position statement and building block from the WHATWG collaborators.
However, it is difficult to glean a coherent view of the future of HTML from the WHATWG specifications alone. Some parts merely describe current innovations -- XMLHttpRequest, canvas -- while others seem vague and lack the same drive from implementers. Additionally, the motivation for HTML5 is mainly desktop, application-centric use. A great deal of HTML is now found on non-PC devices, and that is in need of direction too.
Some of the richer ideas specified in HTML5 might now be made obsolete by the rise of Ajax-based browser interface toolkits. Why should developers be content with the restricted set of widgets specified in a document when they have an extensible toolkit to play with? It might well be that richer Web interfaces are standardized more by the market than by committee.
I'm glad to see progress made in describing commonly implemented but as-yet-unstandardized technologies such as canvas and XMLHttpRequest, and I hope that this will promote the interoperability of these important features. To move browser technology forward, HTML5 needs more clarity, and it would benefit from being divided into three specifications, covering available-now, available-soon, and imagineering features.
Resources
source:http://www-128.ibm.com/developerworks/library/x-futhtml1/?ca=dgr-lnxw01FutureHTML
# posted by dark master : 12/09/2005 09:38:00 AM
0 comments 
A few months ago we installed a burglar alarm in our house. The company sent a trustworthy employee to do the installation, and he set the whole thing up for us. With sensors all over the house, it even knows when someone opens a door, and can sense the difference between our dog and a burglar – well, that’s what the man told us, and after all, who are we to disagree? Along with the system comes an impressive panel that allows us to switch it on and off and do all kinds of clever things. It’s all described in the excellent documentation, although we cannot understand how to use it. The biggest challenge is changing the code to get access to switch it off. We did try it once but gave up. Fortunately the helpful engineer set it all up for us, so we don’t need to worry – or should we worry?
So whenever we go on holiday and switch it on, the friendly company that also monitors our house knows that we are away, and the taboo topic between my wife and me is the code – is the trustworthy employee still employed? We simply don’t know.
Well, it seems that we’re not alone in this concern about unchanged codes. It seems that many if not all IT Auditors, CSOs, and IT security staff live daily with the fear of the “never expiring password” being exposed. It is the unspoken taboo – the wide-open back door in every corporate network today. It is virtually certain that there is not a single business-critical application in your company that isn’t wide open. Do you ever wonder how it is that information such as credit card details, personal data, and intellectual property always seems to be so vulnerable? You would think that companies had adequate security precautions to stop this happening, and yet it continues to be a problem.
So where is this wide open back door? In every one of your applications.
When, for example, a user accesses a Web-based application through a Portal, behind the scenes an awful lot of activity takes place to present the information to the user. This information is stored on systems and databases in your organisation. In order to access these resources, the Portal uses service accounts created on the systems to access the data.
The challenge of securing, managing and sharing the service accounts becomes a major overhead issue for IT departments and application managers in your organisation. The Service Account Passwords that enable applications to communicate with each other must also be managed as they present one of the biggest security backdoors.
In order for these applications to get access to data, they have to “logon” to the systems and applications that store the data, and since the credentials to log on live in the application, they are embedded in the code. Since it is clearly impractical to rewrite applications on a regular basis just to change the user ID and password, the result is that the user ID and password never change. So what’s the big deal, you might ask? Well, there are a number of things.
Firstly, you have the problem of the never expiring password on a system which is accessible by administrators and anyone else who might have privileged access. The problem is more acute when a company is relying on hosting services from a third party. Your applications are accessing valuable business-critical data thousands of times a day, using the same user ID and password. In fact, there might very well be hundreds of applications all accessing it with the same credentials. And since the applications do not have any integrated security such as VPN technology, the passwords to these accounts are often stored in clear text (not encrypted), and thus become visible to developers, support staff, and anyone who has access to the application code.
Secondly, because these passwords are often hard-coded within the applications or scripts, a reset of a Service Account password becomes a complex process involving changes to application code, compilation, and in some cases a long process of transferring the code from development to QA to production. In some cases this change might result in, or require, downtime for the application – a scenario that is unacceptable in cases of confidential information.
Thirdly, auditing is virtually impossible. The credentials embedded in the application, although in theory only accessible to the application, can actually be used by any developer who has access to the code. So if, for example, a person were to log in using the credentials, it would be impossible to discover this through a simple audit check.
Finally, the most serious aspect is that this user ID and password are known by developers and support staff and can be used for personal access to the resources. And in many cases today those credentials are known by off-shore developers who have been contracted to develop the applications for your organisation. So access to your business data is ultimately in the hands of developers who may be thousands of miles away.
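A minimal sketch of the anti-pattern the last few paragraphs describe; every name here (account, host, database) is hypothetical:

```javascript
// Anti-pattern sketch: service-account credentials embedded in application code.
// Because the password lives in the source, rotating it means editing,
// recompiling, and redeploying the application, so in practice it never changes.
const serviceAccount = {
  user: "portal_svc",        // the never expiring service account
  password: "S3cret!2005"    // clear text, visible to anyone who can read the code
};

function buildConnectionString(host, db) {
  // The same embedded credentials travel with every connection the app opens.
  return "Server=" + host + ";Database=" + db +
         ";User Id=" + serviceAccount.user +
         ";Password=" + serviceAccount.password + ";";
}

console.log(buildConnectionString("db01.example.com", "orders"));
```

A digital vault inverts this arrangement: the application fetches the credential from the vault at run time, so the secret never appears in source code or in clear text on disk.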
It is likely that your organisation has gone to unprecedented efforts to secure your access as a user, using all kinds of innovative technology from tokens to digital certificates, while at the same time forgetting – or possibly choosing to ignore – that unauthorized personnel, including ex-employees, MSP staff, and off-shore developers, have the keys to open up your most valuable assets.
The good news is that there are solutions available that will allow you to once and for all face up to this unspoken taboo and eliminate the threat. The solution is digital vaulting technology. With it, no organisation today needs to remain exposed to risks in this area. Regardless of the platform, the technology is available today to ensure that your applications never again require a never expiring password – but the first step in solving the problem is to face up to the unspoken taboo in your organisation and do something about it.
source:http://www.net-security.org/article.php?id=879
# posted by dark master : 12/09/2005 09:35:00 AM
0 comments 
By Dawn Kawamoto
http://news.com.com/Unpatched+Firefox+1.5+exploit+made+public/2100-1002_3-5987401.html Story last modified Thu Dec 08 08:54:00 PST 2005
Exploit code for the latest version of open-source browser Firefox was published Wednesday, potentially putting users at risk of a denial-of-service attack.
The exploit code takes advantage of a bug in the recently released Firefox 1.5, running on Windows XP with Service Pack 2. Firefox, which debuted over a year ago, has moved swiftly to capture 8 percent of the browser market.
The latest Firefox flaw exists in the history.dat file, which stores information from Web sites users have visited with the Firefox 1.5 browser, according to a posting on the Internet Storm Center, which monitors online threats.
"If the topic of a page is crafted to be long enough, it will crash the browser each time it is started after going to such a page," according to the Internet Storm Center posting. "Once this happens, Firefox will be unable to be started until you erase the history.dat file manually."
In testing Firefox 1.5 on a system without McAfee security software running, the browser would stall and not respond to the user's mouse, said Johannes Ullrich, chief research officer for the Sans Institute, which runs the Internet Storm Center.
"Users have to kill out of the browser and start over again. This stalled browser creates a DOS (denial of service) condition," Ullrich said.
Packet Storm, the security group that initially published the proof-of-concept exploit code, noted that in addition to the potential denial-of-service attack that could follow a buffer overflow, systems may also be subject to a malicious execution of code.
Ullrich, however, said while the potential may exist, it has not been proven either way that malicious code could be executed.
Mozilla Foundation, which released Firefox, said it was not able to confirm the browser would crash or be at risk of a DOS attack, after visiting certain Web sites. And Mozilla has not received any reports from users of such a problem, said Mike Schroepfer, vice president of engineering for Mozilla Corp.
He added that Firefox 1.5 can be sluggish on its next start-up, due to a bug related to the history.dat file, but said it is not a security problem.
"We have gotten no independent verification that it crashes (Firefox), but there have been a lot of attempts to try," Schroepfer said.
Correction: This story incorrectly stated the affiliation of Mike Schroepfer, Mozilla's results in verifying the Firefox 1.5 flaw, and the nature of the problem. Schroepfer is vice president of engineering with Mozilla Corp., and Mozilla has not been able to verify its browser can crash and lead to a denial-of-service condition. The problem itself was not a security vulnerability but actually a flaw in the browser.
# posted by dark master : 12/09/2005 09:34:00 AM
0 comments 