Friday, July 01, 2005
Slashdot | Science's 125 Big Questions
Slashdot | Science's 125 Big Questions: "Friday July 01, @07:40PM
'To celebrate its 125th anniversary, Science is running a series of articles on the 125 Questions of Science. The top 25 each link to an article exploring the subject of the question in depth. Included are such questions as: Are we alone in the Universe? What are the limits of conventional computing? How did cooperative behavior evolve?'"
# posted by dark master : 7/01/2005 09:14:00 PM
0 comments 
2005 Looks Like Record Year for Net Growth
In the July 2005 survey we received responses from 67,571,581 sites. The gain of 2.76 million hostnames from June is the second-largest monthly increase in the history of our survey, as 2005 continues to shape up as a historic year for Internet growth. The only larger gain was a 3.3 million hostname increase in March 2003, which ended months of stagnation and kicked off 30 consecutive months of positive growth for the Web.
Factors in the dramatic growth include:
- Increasing use of the Internet by small businesses as web sites and online storefronts become more affordable.
- The explosive growth of weblogs, a growing number of which are purchasing domains for branding purposes.
- Speculation in the market for domain names, buoyed by rising resale prices and the ability to generate revenue via pay-per-click advertising on parked domains.
- Strong sales of online advertising, especially keyword-based contextual ads that support business models for both domain parking and commercial weblogs.
The Internet has added 10.7 million hostnames in the first seven months of the year. Barring a dramatic slowdown, 2005 should easily exceed the record growth of 16 million hostnames in 2000.
Total Sites Across All Domains August 1995 - July 2005

Top Developers (all sites)
Developer | June 2005 | Percent | July 2005 | Percent | Change
Apache | 45,172,895 | 69.70 | 47,030,635 | 69.60 | -0.10
Microsoft | 13,131,361 | 20.26 | 13,871,645 | 20.53 | 0.27
Sun | 1,849,471 | 2.85 | 1,842,812 | 2.73 | -0.12
Zeus | 580,844 | 0.90 | 608,121 | 0.90 | 0.00

Top Developers (active sites only)
Developer | June 2005 | Percent | July 2005 | Percent | Change
Apache | 20,680,650 | 69.98 | 21,313,596 | 69.81 | -0.17
Microsoft | 6,708,549 | 22.70 | 6,959,561 | 22.80 | 0.10
Zeus | 219,798 | 0.74 | 238,881 | 0.78 | 0.04
Sun | 220,726 | 0.75 | 226,001 | 0.74 | -0.01

Totals for Active Servers Across All Domains, June 2000 - July 2005
Sun is the sum of sites running SunONE, iPlanet-Enterprise, Netscape-Enterprise, Netscape-FastTrack, Netscape-Commerce, Netscape-Communications, Netsite-Commerce & Netsite-Communications.
Microsoft is the sum of sites running Microsoft-Internet-Information-Server, Microsoft-IIS, Microsoft-IIS-W, Microsoft-PWS-95, & Microsoft-PWS.
Platform groupings are here.
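The counts above are derived from the Server header each site returns, grouped into the developer buckets described in the footnotes. The following is a minimal, hypothetical sketch of that classification step (the grouping strings merely echo the footnotes above; this is not Netcraft's actual code or methodology):

```python
# Hypothetical sketch: bucket a host by its Server response header.
import urllib.request

MICROSOFT = ("Microsoft-IIS", "Microsoft-Internet-Information-Server", "Microsoft-PWS")
SUN = ("SunONE", "iPlanet-Enterprise", "Netscape-Enterprise", "Netscape-FastTrack")

def developer_of(url: str) -> str:
    """Fetch a site's Server header and map it to a developer bucket."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        server = resp.headers.get("Server", "unknown")
    if server.startswith("Apache"):
        return "Apache"
    if server.startswith(MICROSOFT):
        return "Microsoft"
    if server.startswith(SUN):
        return "Sun"
    if server.startswith("Zeus"):
        return "Zeus"
    return server  # anything else would be counted under "Other"

print(developer_of("http://www.example.com/"))
```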
source:http://news.netcraft.com/archives/2005/07/01/july_2005_web_server_survey.html
# posted by dark master : 7/01/2005 09:13:00 PM
0 comments 
New tax for broadband customers?
Many broadband customers will pay new universal service taxes akin to those on their telephone bills if Congress bows to suggestions from rural legislators.
The suggestions came as lawmakers started debating changes to the Telecommunications Act of 1996, which created the framework for the Universal Service Fund, overseen by the Federal Communications Commission.
The USF currently collects a fixed percentage of revenues from long-distance, wireless, pay phone and telephone companies so that it can pass on subsidies to low-income customers, high-cost areas, and rural health care providers, schools and libraries. Most companies come up with their share, set for this quarter at 10.2 percent, by charging their customers a fee.
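As a rough illustration of that pass-through, here is a minimal sketch of the arithmetic; only the 10.2 percent contribution factor comes from the article, and the bill amount is invented:

```python
# Hypothetical sketch of the USF pass-through fee described above.
# The 10.2% contribution factor is the quarterly figure cited in the article;
# the interstate charge amount below is made up for illustration.
USF_CONTRIBUTION_FACTOR = 0.102

def usf_line_item(interstate_charges: float) -> float:
    """Fee a carrier might add to a bill to recover its USF contribution."""
    return round(interstate_charges * USF_CONTRIBUTION_FACTOR, 2)

print(usf_line_item(30.00))  # 3.06 -> roughly $3.06 on $30 of interstate charges
```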
The USF should continue to be "industry funded," but the base of contributors should be expanded to "all providers of two-way communications, regardless of technology used, to ensure competitive neutrality," a bipartisan coalition of rural legislators said in a June 28 letter to the U.S. House of Representatives Energy and Commerce Committee, which will be drafting the rewrites. That means companies providing broadband services such as VoIP over telephone wires would also have to pay into the fund.
"We need to ensure government policies protect the infrastructure that makes advanced services, including broadband, possible and available to everyone in the United States," said the letter, signed by 62 House members.
"If our residents are to be competitive in today's fast-paced, technology-driven global marketplace, our communities will require affordable high-speed, high-capacity access to data and information over the Internet," Rep. John Peterson, R-Penn., co-chairman of the Congressional Rural Caucus, said at a press conference held the day the letter was released. "If the private sector is either unwilling or unable to provide that service at an affordable price, we'll find a way to provide it for ourselves."
The wireless industry applauded the proposed change "since wireless consumers are significant and disproportionate payers into the universal service and intercarrier compensation systems," Steve Largent, CEO of CTIA - The Wireless Association, said in a statement.
But Randolph May, a senior fellow at the Progress and Freedom Foundation, a market-oriented think tank, said policy-makers should be cautious before making any changes. Broadband access, he said, is getting cheaper and more widely available.
"It's not clear that any subsidies are needed," May said. "But if policy-makers want to provide some subsidies, they should be, in my view, carefully targeted to low-income people that really need them."
The Universal Service Fund in recent years has faced allegations of waste, fraud and abuse. The FCC announced in June a formal inquiry into its management.
# posted by dark master : 7/01/2005 09:12:00 PM
0 comments 
Slashdot | MMOGs Reaching For Casual Gamers
Slashdot | MMOGs Reaching For Casual Gamers: "Friday July 01, @06:23PM
The Guardian Gamesblog has a nice bit of commentary up today discussing the push for MMOGs to connect with casual gamers. Announcements of Massive games on the next generation of consoles have been fast and furious, but skeptics seem to feel casual gamers may not make the leap. Indeed, even veteran MMOG players have difficulty with the genre, as a recent AFKGamer column on how to deal with Grind illustrates. From the Guardian article: 'Still, in order to be a viable entity on a home console unit - competing directly with the likes of GTA, Super Mario and FIFA - things will have to change. Some may call it dumbing down, but the product must be created with the consumer in mind. Personally, while I consume my fair share, I'm still only primarily interested in them from an academic perspective, as resources of human sociability in online space'"
# posted by dark master : 7/01/2005 09:12:00 PM
0 comments 
Government To Fix Identity Theft?
"With nearly 50 million identities compromised in the last 6 months, the powers that be are gearing up to fix the problem. 'Prosecutors and privacy experts say that what America needs is a coordinated national strategy. While 15 states require companies to tell consumers if their data has been compromised, there's still no national law.' A new study joins a host of other statistics -- some private, some government-sponsored -- attempting to quantify the size of the ID theft problem. There is no universal agreement on the size of the problem, on the way to count the victims, or even on how to define identity theft."
source: http://yro.slashdot.org/article.pl?sid=05/07/01/205221&tid=158&tid=187&tid=219
# posted by dark master : 7/01/2005 09:11:00 PM
0 comments 
Swedes curb rampant downloading
Swedes can no longer freely download copyrighted material
Sweden has outlawed the downloading of copyrighted movies, games and music in an attempt to curb rampant piracy. About 10% of Swedes freely swap music, games and films on their computers, one of the highest rates in the world.
With no law banning file-sharing, Sweden had become a hotbed of piracy where films, music and software were readily swapped.
But experts believe the law will change little and that Swedes will remain rampant downloaders.
Pirate haven
Prior to the law coming into force, Sweden was the only European nation that let people download copyrighted material for personal use.
As a result many Swedes, thanks also to the availability of cheap high-speed net access, were committed downloaders. It is estimated that about 900,000 Swedes regularly downloaded movies, games and music.
The law was drawn up to bring Sweden into line with EU directives and is also part of a wider crackdown on net piracy.
It comes a day after the US Attorney General's office announced an 11-nation operation to catch and shut down net piracy groups.
But, say experts, the habit of downloading is likely to be hard to break.
"There is nothing that indicates that (the pirates) would change their behaviour," said Henrik Ponten, a spokesman for Antipiratbyran, a Swedish anti-piracy agency funded by film studios and game makers.
"A law in itself changes nothing," he said.
No fear
Antipiratbyran estimates that one in every 2,000 Swedes has received a letter telling them that they are making pirated material available from their computer. In other nations the ratio is one in every 7,000.
The change in the law was popular with most Swedish politicians. But the nation's Justice Minister said that chasing pirates would only be a priority for the police if files were being downloaded in massive quantities.
Before the new law was passed, it was only illegal to make copyrighted material available to others via the net, whereas downloading the content was allowed.
The older law is set to be tested later this year during the trial of a 27-year-old Swede, charged with illegally making a Swedish movie available from his home computer.
Mr Ponten said if the man were fined it would send a signal to many that they could continue downloading with little fear of the consequences.
Antipiratbyran's letter writing campaign has led it to being reported to Sweden's data protection agency for flouting privacy laws by tracking people down via their net address.
As a result the data protection agency has said Antipiratbyran must stop sending out letters.
"The situation in Sweden is completely unique, with this kind of counter-reaction," said Mr Ponten.
"The forces that are fighting to keep this illegal behaviour are incredibly strong."
source:http://news.bbc.co.uk/1/hi/technology/4642373.stm
# posted by dark master : 7/01/2005 09:10:00 PM
0 comments 
Copyright and the law - Rip. Mix. Burn.
Rip. Mix. Burn.
Jun 30th 2005
From The Economist print edition
Media companies are jubilant at a Supreme Court judgment, but Congress should take them on
As usual, America's Supreme Court ended its annual term this week by delivering a clutch of controversial decisions. The one that caught the attention of businessmen, and plenty of music lovers, was a ruling concerning the rampant downloading of free music from the internet.
Nine elderly judges might have been forgiven for finding the entire subject somewhat baffling. In fact, their lengthy written decisions on the case betray an intense interest, as well as a great deal of knowledge. Moreover, they struck what looks like the best available balance under current laws between the claims of media firms, which are battling massive infringements of their copyrights, and tech firms, which are keen to keep the doors to innovation wide open (see article).
This case is only the latest episode in a long-running battle between media and technology companies. In 1984, in a case involving Sony's Betamax video recorder, the Supreme Court ruled that technology firms are not liable if their users infringe copyright, provided the device is “capable of substantial non-infringing uses”. For two decades, this served as a green light for innovations. Apple's iTunes, the legal offspring of illegal internet file-sharing, is among the happy results. But lately, things have turned against the techies. In 2000, a California court shut down Napster, a distributor of peer-to-peer (P2P) file-sharing software. It had, the court decided, failed to stop copyright violations (though the firm relaunched as a legal online-music retailer).
In its ruling this week, the court unanimously took the view that two other P2P firms, Grokster and StreamCast, could be held liable if they encourage users to infringe copyrights. The vast majority of content that is swapped using their software infringes copyrights, which media firms say eats into their sales. Although the software firms argued they should not be responsible for their customers' actions, the court found that they could be sued if they actually encouraged the infringement, and said that there was evidence that they had done so. On the other hand, the court did not go as far as media firms demanded: they wanted virtually any new technology to be vulnerable to legal action if it allowed any copyright infringement at all.
Both the entertainment and technology industries have legitimate arguments. Media firms should be able to protect their copyrights. And without any copyright protection of digital content, they may be correct that new high quality content is likely to dry up (along with much of their business). Yet tech and electronics firms are also correct that holding back new technology, merely because it interferes with media firms' established business models, stifles innovation and is an unjustified restraint of commerce. The music industry is only now embracing online sales (and even experimenting itself with P2P) because rampant piracy has demonstrated what consumers really want, and forced these firms to respond.
The Supreme Court tried to steer a middle path between these claims, and did a reasonable job. But the outcome of the case is nevertheless unsatisfactory. That's not the court's fault. It was struggling to apply a copyright law which has grown worse than anachronistic in the digital age. That's something Congress needs to remedy.
In America, the length of copyright protection has increased enormously over the past century, from around 28 years to as much as 95 years. The same trend can be seen in other countries. In June Britain signalled that it may extend its copyright term from 50 years to around 90 years.
This makes no sense. Copyright was originally intended to encourage publication by granting publishers a temporary monopoly on works so they could earn a return on their investment. But the internet and new digital technologies have made the publication and distribution of works much easier and cheaper. Publishers should therefore need fewer, not more, property rights to protect their investment. Technology has tipped the balance in favour of the public domain.
A first, useful step would be a drastic reduction of copyright back to its original terms—14 years, renewable once. This should provide media firms plenty of chance to earn profits, and consumers plenty of opportunity to rip, mix, burn their back catalogues without breaking the law. The Supreme Court has somewhat reluctantly clipped the wings of copyright pirates; it is time for Congress to do the same to the copyright incumbents.
source:http://www.economist.com/printedition/displayStory.cfm?Story_ID=4128994
# posted by dark master : 7/01/2005 09:06:00 PM
0 comments 
We Don't Need the GPL Anymore
ESR: "We Don't Need the GPL Anymore"
by
Federico Biancuzzi 06/30/2005
Recently, during FISL (Fórum Internacional de Software Livre) in Brazil, Eric Raymond gave a keynote speech about the open source model of development in which he said, "We don't need the GPL anymore. It's based on the belief that open source software is weak and needs to be protected. Open source would be succeeding faster if the GPL didn't make lots of people nervous about adopting it." Federico Biancuzzi decided to interview Eric Raymond to learn more about that.
Why did you say we don't need the GPL anymore?
It's 2005, not 1985. We've learned a lot in the last 20 years. The fears that originally led to the reciprocity stuff in GPL are nowadays, at least in my opinion, baseless. People who do what the GPL tries to prevent (e.g., closed source forks of open source projects) wind up injuring only themselves. They trap themselves into competing with a small in-house development group against the much larger one in the parent open source project, and failing.
I read that you studied marketing to promote OSI. I'm wondering whether the fact that Linux is more famous than BSD could be related to the "virality" of the GPL. If every time you build a product based on Linux you must release the source code, it's clear that everyone will know that it's based on Linux. And so the buzz keeps growing: another product based on Linux! This doesn't happen with BSD-licensed code. For example, I read that the new portable console Sony PSP includes some code from NetBSD. How do we know? Only because of the advertising clause in a simple text file that someone discovered on their web site. But that is different from the way Sony announced that PlayStation 3 will natively run Linux. They proudly wrote a press release and started a big buzz. What is your opinion?
I believe you overestimate the marketing benefits of saying "Linux Inside!"; most technology consumers don't care, and Sony knows that. I think Sony is announcing Linux support because they know it's the most likely option to attract developers. NetBSD is a worthy project, but, let's face it, the fan base for it simply is not large enough to justify spending marketing effort to recruit them.
More generally, I don't think the GPL is the principal reason for Linux's success. Rather, I believe it's because in 1991 Linus was the first person to find the right social architecture for distributed software development. It wasn't possible much before then because it required cheap internet; and after Linux, most people who might otherwise have founded OS projects found that the minimum-energy route to what they wanted was to improve Linux. The GPL helped, but I think mainly as a sort of social signal rather than as a legal document with teeth.
Some time ago there was a monetary offer to get a Linux snapshot under BSD license. Would you have accepted?
I wouldn't have had the right to accept Jeff Merkey's offer; I'm not Linus. But supposing I were ... hmmm ... it's interesting that you asked me this, because I've exchanged email with Linus about a not entirely dissimilar set of questions having to do with how we cope if a court breaks the GPL.
I think the answer to your question is no. I wouldn't worry about a closed fork harming the community; I think that's a self-punishing form of idiocy. Linus probably still disagrees with me on this; he's always been more of a GPL fan than I am. What I would feel is something I know for certain Linus does--an obligation not to violate the expectations of the kernel contributors without better reason than a few dollars in my pocket.
Linus has told me he will relicense the kernel only in the case of an emergency that makes it necessary. I think in his shoes I would have the same policy.
Linux (the kernel) comes under GPL version 2. Quoting Linus's note:
Also note that the only valid version of the GPL as far as the kernel is concerned is _this_ particular version of the license (ie v2, not v2.2 or v3.x or whatever), unless explicitly otherwise stated.
Is it certain that Linus will adopt GPL v3?
It depends. I know for a fact that he is concerned that GPL 3.0 will overreach.
Is there any risk of a fork over this? One Linux kernel led by Linus under GPL v2 and another led by someone else under GPL v3?
I think not. The technical culture of the Linux kernel group doesn't seem to me to be composed of licensing fanatics.
The GPL includes a clause that automatically shifts the license terms to any new version of the license itself. Isn't this a Trojan horse?
No, because the clause says "at your option." That is, the person receiving GPL v2 software gets to choose whether v2 or the later version applies.
But this means:
- The user will choose which particular version of GPL he prefers, not the original software author.
- This moving target (GPL 2.0, 2.1, 2.x, 3.x) could be chosen opportunistically in the event of a lawsuit.
How can this undefined condition be a good thing?
The user chooses only for copies of the software still under GPL 2. The author is free to change his license to 3.0.
The condition you call "undefined" means users cannot have rights they might have been counting on summarily yanked out from under them. Which is a good thing: the GPL was supposed to be about guaranteeing the users' rights, after all.
You take code under GPL and then you are free to choose every single day of the week a different version of that license because you don't have to write that number anywhere. Shouldn't the people working on the GPL v3 remove this clause or require licensors to write somewhere the exact version number of the GPL "receiving" developers have chosen?
Why? What problem would this solve?
It certainly wouldn't do anything for the users, who would be losing options rather than gaining them.
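For reference, the "at your option" mechanism discussed above comes from the notice that GPL v2 suggests authors attach to their files; the kernel's COPYING file, per the Linus note quoted earlier, strikes that option. A minimal illustration, shown as comments in a hypothetical source file (the v2-only wording is paraphrased, not the kernel's exact text):

```python
# Standard GPL v2 notice -- the "or any later version" grant discussed above:
#
#   This program is free software; you can redistribute it and/or modify
#   it under the terms of the GNU General Public License as published by
#   the Free Software Foundation; either version 2 of the License, or
#   (at your option) any later version.
#
# A version-2-only variant, in the spirit of the kernel note quoted earlier,
# names one exact version and gives the recipient no later-version option:
#
#   This program is free software; you can redistribute it and/or modify
#   it under the terms of version 2 of the GNU General Public License
#   (and no other version) as published by the Free Software Foundation.
```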
Quoting from Insider Hints at GPL Changes:
The current version of the GPL, which was last updated in 1991, fails to trigger the open source license if a company alters the code, but does not distribute its software through a CD or floppy disk.... But the rule does not apply to companies that distribute software as a service, such as Google and eBay, or even dual-license companies like Sleepycat.... "That means a bunch of innovation is being taken out. This is an important problem for us working on the new GPL to get right."
Wouldn't expanding its virality to software distributed as a service be a dangerous path?
I never really thought virality was a good idea in the first place, so I have to see this as a step in the wrong direction. I've used GPL on many of my own projects, but more as a gesture of solidarity with the majority in the community than because of any attachment to the GPL itself.
The pros and cons of "viral licensing" is something I've been thinking about a lot recently. As far back as 1998, I suspected that allegiance to the GPL is actually evidence that open source developers don't really believe their own story. That is, if we really believe that open source is a superior system of production, and therefore that it will drive out closed source in a free market, then why do we think we need infectious licensing? What do we think we gain by punishing defectors?
Stronger virality punishes defectors more effectively, but also has more tendency to scare people away from joining the open source community in the first place. Where the optimum point is all depends on how important punishing defectors really is relative to the economic pressures in favor of open source. My current belief is that the free market will do quite a good job of punishing defectors on its own; thus, increasing virality is a bad move.
Mind you, in saying this I'm not defending or excusing anybody who is cheating on the GPL's terms now. Whether or not viral licensing is a good idea for the community may be debatable, but software authors who choose it have a right to expect that those terms will be obeyed.
On gpl-violations.org there are some examples of companies that used GPLed code for their products, but "did not make any source code offering or include the GPL license terms with their products."
Most of them signed a "Declaration to Cease and Desist from further distributing their product without adhering to the license terms," while one of them refused to sign the declaration and so "the netfilter/iptables project was compelled to ask the court for a preliminary injunction," banning the distribution of that product unless the company "complies with all obligations imposed by the GNU GPL."
Why does this happen? Don't business companies understand GPL terms? Or maybe they try to cheat, hoping that nobody will discover them?
It's probably both. That is, some companies cheat and others blunder. There are probably some that have both cheated and blundered.
But if they prefer to keep the source code undisclosed, why don't they choose BSD-licensed code?
The blunderers don't know what they're doing. The cheaters may not know of BSD equivalents of what they want.
Maybe because they chose to put Linux in their embedded products just because it's one of the current buzzwords in IT and not because it is technically better than BSD systems?
I can't read the minds of blunderers and cheaters, and would not want to immerse myself in their thinking if I could.
Is managing a successful software company really possible, providing all your code under GPL?
Cygnus ran that way at a profit for years. I think Red Hat does today, though I'm not certain about the status of Cygwin.
But Red Hat doesn't release all of its software as open source.
I believe they make their real money on a product (Red Hat Enterprise Linux) that is completely GPLed. If you know differently, can you tell me what other licenses they are using?
It seems that Red Hat is selling its GNU/Linux distribution under a sort of user license that limits the freedom No. 2 provided by the GPL. The short version of the story, as I was told, is that if I buy a CD/DVD with the last Red Hat version and I make an ISO from that and put that online, I'll get sued.
The same thing happens with computer magazines. They cannot include any Red Hat CD because the term Red Hat is a trademark or something like that, and they don't let the magazine use it without permission. And obviously they don't give you that permission. Magazines must use Fedora and never say Red Hat.
Excuse me while I fire up a browser and research this a bit... Ahhh... right, if you republish a RHEL CD in either form, you could get sued for illegal use of the embedded trademarks. I think I just found the user license in question.
So the answer to your question is yes... Red Hat is a demonstration that you can have a profitable business based on entirely GPL code. You may have to play some interesting tricks with trademark law to do it, though. As I understand it now, what Red Hat has done is legally blocked republication of its entire RHEL distribution even though any component part is still GPLed and therefore freely redistributable.
Damn, that's clever and sneaky. I like it. It serves everybody: Red Hat gets a fence around its product, but all the community objectives of open source licensing are still met.
Do you consider this behavior coherent?
Yes. It makes logical and even ethical sense.
Isn't one of the community objectives of open source licensing the possibility to share the code? So how can the fact that every single piece is still under GPL and thus redistributable, but if I take the whole CD/ISO it will be covered by the US trademark laws, be acceptable?
That's the beauty of it. The possibility of sharing the code is unaffected--what you can't "share" is Red Hat's integration work and branding.
Some time ago, Linux abandoned BitKeeper because the company behind it didn't like the reverse-engineering effort done by Andrew Tridgell. Larry McVoy, BitKeeper's primary author, said, "You can compete with me, but you can't do so by riding on my coattails. Solve the problems on your own, and compete honestly. Don't compete by looking at my solution."
This is not the first reverse-engineering effort to write an open source tool that uses a proprietary protocol; just think of Samba or Gaim.
Is freedom more important than innovation?
The question is falsely posed. Freedom and innovation are not opposites; they are intimately connected. Proprietary protocols and the monopolies they spawn prevent innovation. Open standards encourage it.
McVoy is engaging in a rhetorical deception, one he copied from Microsoft. He wants you to believe that innovation will stop if software vendors can't collect secrecy rents. But this is pure, unadulterated bullshit. Is innovation in automobile designs stopped because auto designers can look at each other's engines?
Open source designers think up and deploy more new ways to do things every week than the proprietary industry can manage in six months. The reason is lower process friction--when you want to experiment, you don't have to argue with a boss or risk being shot down because your company thinks it might be too disruptive of their existing product line. You just do it.
Freedom is the oxygen of innovation, not its enemy.
So why didn't Andrew develop a new protocol and make it an open standard? I mean, he chose to write a free tool to handle a proprietary protocol. He chose freedom, not innovation. Only after the company stopped supporting the free client did they start a new project (Git).
Tridge did this because what he needed was a tool to extract BitKeeper metadata from the kernel archive--data which had been put in BitKeeper but which McVoy did not own. Writing a new protocol would have been beside the point--what he wanted was to get the metadata out of jail.
Everyone is free to choose where to put his own data. Linus chose BitKeeper years ago. So the root of the problem is that someone chose a proprietary application from the very beginning.
That's right. But Linus's choice should not constrain either Linus or other developers from getting at their own data.
If I don't want to have my data trapped in a proprietary format, I will avoid those proprietary applications. What is the sense of using the proprietary application and then asking for an open source tool to access the data?
You own your data, even when it's trapped in a proprietary application. There is a term in American law, conversion, for the act of refusing to give back property of others that has been entrusted to you for safekeeping. This is probably illegal wherever you live, too, and when proprietary vendors trap your data and refuse to let you get at it except through their application, they may be committing a crime.
In the open source world, we think this gives you a right to do whatever is necessary to retrieve it. The wrong, if there is one, isn't in creating open source rescue tools as Tridge did; it is on the part of proprietary vendors who refuse to provide facilities for export to an open, fully documented dump format.
In fact, in this case McVoy did provide such an export format. All Tridge did was figure out the magic words that tell the BitKeeper server to dump that information to the client. His rescue tool--I've seen it--is a trivial script. So there was no wrongdoing on either side here, just McVoy shooting his mouth off because Tridge figured out how to use a poorly documented feature.
If open standards are the final goal, why should people write open source software to support proprietary protocols?
To support users who don't want their data to be trapped in proprietary applications under vendor control.
So why did those users choose proprietary applications at first?
Sometimes they might have no choice. Sometimes they might believe that choosing a proprietary application is the best choice. It doesn't really matter which of these is the case. Either way, they have a right to get their data out of jail once they've realized their mistake.
But if we keep writing open source software for people "trapped" in proprietary formats or applications, people will keep choosing them just because there is an "emergency exit door"!
Shouldn't we use this "trap" to attract people to open source software directly? If we keep accepting and helping people with proprietary formats, why should they change software and formats?
Because the software they're using is, in general, badly engineered, inflexible, and full of security holes. By supporting proprietary formats, we reduce the friction costs of migrating out of that world.
What type of relation do you see between the open source world and standards such as IEEE standards and de facto standards?
A very intimate and symbiotic one: each requires the other. Most people already understand one direction of the dependence--for open source to flourish, strong open standards for communications protocols and data formats are essential.
What a lot of people have not figured out yet--though the W3C and IETF are moving in this direction--is that open standards depend on open source reference implementations. It is extremely difficult to specify everything that needs to be specified in a technical standard; in some notorious cases, such as the way Microsoft perverted Kerberos, vendors have used that underspecification as a loophole to create implementations that locked in their customers. Open source reference implementations are the only known way to prevent such abuse.
Do you think that organizations such as W3C and IETF should actively verify and enforce that protocols and formats proposed by companies not be patented? I'm thinking of Cisco and the VRRP story....
Yes. A "standard" with blocking patents is a dangerous trap.
The upcoming GPL version should cover some new aspects such as patents. Do you think that this could worry businesses more than its virality and make lots of people even more nervous about adopting it?
It depends on how sweeping the new clauses are. Without seeing them, I can't really evaluate the possibility.
Some big companies announced that they will share their patents to cover Linux and some other open source projects. I don't understand how this can be a good thing. If every open source developer is against software patents, shouldn't we try to boycott the system? What is the sense if we say, "We are against software patents, but if any big companies patents things for us, it's better"?
I don't hear anybody saying this. The open source community is not asking companies to patent anything, merely accepting those grants when they happen. It's not like we could stop them, really.
I don't understand your last sentence. If someone gives you a present, you will probably accept. But if you are against something and the present is related to that "something," shouldn't you refuse the present?
I didn't hear anyone saying, "Thanks, IBM, we don't want your patents because we are against this system and we don't want to support it or being part of it." Don't you think they should have?
Why do that, rather than encouraging IBM in an action that subverts the system?
I don't think IBM's behavior subverts the system. I think they made that move to get the press talking about them and Linux, and to tell clients that they will be covered by IBM lawyers if needed. However, the point is that since Linux people didn't refuse this move, they now have an entanglement with IBM patents. In doing so, they sent a message that is clearly not "we are against patents."
I don't agree with that interpretation. Think back to the days of the Cold War. If the Soviets let some dissidents out of jail, would you have told them "No, put those people back in jail" because you thought accepting that action meant supporting the Gulag?
It's a strange example.
Just follow the logic.
Let's get back to the GPL. Aren't you concerned that you'll stir up a political mess by saying we don't need it? Some people might think you're trying to wreck GPL 3.0 before it launches.
No. I've actually been making the argument since 1998 that the GPL is rationally justified only if open source is an inferior system of production. But back then, the evidence that open source is a superior system was much less clear. I believed it, having written the first analytical argument for that proposition in The Cathedral and the Bazaar in late 1996. I already knew what that implied about the superfluousness of the GPL when I wrote the paper--but until there was a lot more real-world evidence, I knew the argument would be difficult for GPL fans to hear.
In fact, it's political considerations that kept me quiet for a while. For a long time, I judged that any harm the GPL might be doing was outweighed by the good. For some time after I stopped believing that, I didn't see quite enough reason to fight with the GPL zealots. I'm speaking up now because a couple of curves have intersected. It has become more apparent how much of an economic advantage open source development has, and my judgment of the utility of the GPL has fallen.
Yes, many people will view this as heresy. Fine--it's part of my job to speak heresy in ways other people might feel afraid to do. If there is any better use for being famous and respected than using that status to question orthodoxy, I haven't found it yet.
Federico Biancuzzi manages the BSD section of the Italian magazine Linux&C. As a freelancer, he writes for ONLamp, LinuxDevCenter, and NewsForge.
source: http://www.onlamp.com/pub/a/onlamp/2005/06/30/esr_interview.html
# posted by dark master : 7/01/2005 08:51:00 PM
0 comments 
BitTorrent Whiz Extolled Piracy?
BitTorrent programmer Bram Cohen may be in legal jeopardy after the discovery on Wednesday of an old agenda buried on his website saying he creates programs to "commit digital piracy."
The polemic would have been of little interest a week ago. But on Monday, the Supreme Court ruled that the intent behind a file-sharing program can be a decisive factor in determining whether the creator can be sued for its users' copyright infringement.
Cohen said the agenda was written years before he started work on BitTorrent, and that it was written as a parody of other manifestos.
"I wrote that in 1999, and I didn't even start working on BitTorrent until 2001," Cohen said. "I find it really unpleasant that I even have to worry about it."
Undated and less than 200 words long, Cohen's "Technological Activist's Agenda" says he creates and gives away software in furtherance of laissez-faire political objectives.
"I further my goals with technology," the manifesto reads. "I build systems to disseminate information, commit digital piracy, synthesize drugs, maintain untrusted contacts, purchase anonymously and secure machines and homes."
In a unanimous ruling penned by Justice David Souter, the Court found that file-sharing software companies Grokster and StreamCast Networks can be sued because "the record is replete with evidence" showing the companies took steps to encourage infringement. The case has been returned to the lower courts for trial.
Cohen has never publicly encouraged piracy, and he has consistently maintained that he wrote BitTorrent as a legitimate file-distribution tool. That would seem to make him and his budding company, BitTorrent, safe under the Grokster ruling.
But legal experts worry the newly discovered manifesto extolling "digital piracy" could put him on less certain legal ground.
"Before I saw the manifesto, it always seemed clear to me that he's had a very clean record," said Mark Schultz, a law professor at Southern Illinois University Law School. "A good lawyer will try to nail him to the wall with that, and any other statements they can find. It's circumstantial evidence of intent. It's not a slam dunk but it hurts his case a little."
Cohen said although he contributed to an earlier peer-to-peer tool, MojoNation, the text wasn't alluding to that system either.
Rather, his agenda was intended as a parody of the "Cypherpunk's Manifesto," a 1993 text that guided the pro-privacy cryptographic coders movement.
Indeed, the "Technological Activist's Agenda" doesn't strictly reflect his actual views on intellectual property, or, for that matter, anonymity or recreational drug use, he said.
"That was written in a combative confrontation style; I wasn't really talking about anything," said Cohen. "It was a reaction-getting thing.... I think it's pretty clear the way that was written is that it was written in voice. It was an exaggerated character speaking it."
Cohen said he's unhappy that the Supreme Court's decision is forcing him to confront something he wrote more than five years ago.
"The way they talked about intent is so vague that it can cause people to pay attention to things that they wrote years and years ago, having nothing to do with what they're doing right now," Cohen said.
"Anybody who thinks that they might produce technology at some point in the future that might be used for piracy has to watch everything that they say," he added.
Fred von Lohmann, senior staff attorney with the Electronic Frontier Foundation, which represented StreamCast Networks in the recent case, said Cohen has a good point.
"I don't think it's anything that Bram needs to worry about but the Supreme Court seems to think that everything is relevant to the discussion," he said. "It raises the question of (whether) anything you've ever said can be used as evidence against you later."
But von Lohmann said if the Motion Picture Association of America wanted to go after Cohen, it would have done it a long time ago.
"I've heard them say they are not interested in going after Bram," von Lohmann said. "I take them at their word on that."
Ashwin Navin, BitTorrent's chief operating officer, said the company isn't worried about the manifesto's legal implications, but is afraid that some may seize upon it as proof that the company is pro-piracy.
"Our worry is that journalists will pull this out of context," said Navin. "It's not a business problem. It's more of a PR problem."
Kori Bernards, a spokeswoman for the Motion Picture Association of America, said "we want to work with people like Bram and others to come up with a solution to the problem of illegal trading of copyrighted material.... With regards to (what he said about) digital piracy, we hope he's changed his mind."
source:http://www.wired.com/news/digiwood/0,1412,68046,00.html?tw=rss.TOP
# posted by dark master : 7/01/2005 10:35:00 AM
0 comments 
Microsoft settles IBM antitrust claims
Microsoft and IBM have settled outstanding legal claims stemming from the U.S. government's antitrust case against Microsoft in the mid-1990s.
The agreement, announced Friday, will result in a $775 million payment to IBM and a $75 million credit toward Microsoft software.
In the course of the U.S. Justice Department's antitrust suit against the software giant, the government claimed that IBM suffered from Microsoft's discriminatory pricing and overcharging practices, according to a Microsoft statement released Friday.
The settlement resolves those antitrust claims, as well as others related to IBM's OS/2 operating system and SmartSuite desktop application suite.
"IBM is pleased that we have amicably resolved these longstanding issues," Ed Lineen, senior vice president and general counsel for IBM, said in the statement.
The pact does not cover claims for alleged harm to IBM's server hardware or server software business.
As part of the settlement, IBM has agreed to not seek monetary damages related to server products for two years and not make server-related claims involving events prior to June 30, 2002, according to Microsoft's statement.
Microsoft has sought to resolve all of the ongoing legal processes against the company, including antitrust claims, over the past few years. In the statement, Brad Smith, Microsoft's general counsel and senior vice president, said the IBM resolution is a "significant step toward achieving that goal."
In November 2003, Microsoft and IBM entered into "tolling agreements," which extended the statute of limitations on IBM's antitrust claims against Microsoft without litigation. With those agreements set to expire in July, the companies spent the last two months devising a settlement.
source:http://news.com.com/Microsoft+settles+IBM+antitrust+claims/2100-1014_3-5771535.html?tag=nefd.top
# posted by dark master : 7/01/2005 10:31:00 AM
0 comments 
100 million go online in China
Millions of Chinese go online via internet cafes
The number of internet users in China has risen above 100 million for the first time, according to reports in the country's state media. Only the US now has more web surfers as young and old Chinese take to the internet in record numbers.
The figure is expected to grow rapidly in the next few years.
China's economic boom is behind the dramatic rise as increasing personal wealth means more people are able to buy computers and go online.
Great Firewall
But the Chinese authorities are less in love with the net. The government regularly tries to block access to material it considers pornographic or politically subversive.
Only last week, the authorities threatened to shut down websites and blogs that failed to register with regulators in a new campaign to tighten controls on what the public can see online.
The so-called Great Firewall of China is constantly being breached as citizens and the authorities play a cat and mouse game with the flow of information.
Of the 100 million net users, about 30 million have broadband.
Mobile phone usage is also on the rise, gaining about 60 million new users each year.
There are now 358 million mobile phone users in China, and mobile services make up 44.6% of China's telecom business.
source: http://news.bbc.co.uk/1/hi/technology/4630867.stm
# posted by dark master : 7/01/2005 10:29:00 AM
0 comments 
Slashdot | Planet Discovered with a Massive Core
Slashdot | Planet Discovered with a Massive Core: "Friday July 01, @11:29AM
'A collaboration of astronomers has discovered a possible 'Rosetta Stone' of planetary formation, reported by San Francisco State University and the Subaru Observatory. The new planet, orbiting a G-star like our Sun (HD 149026), weighs roughly the same as Saturn, yet is significantly smaller in diameter. Planetary modeling suggests that the planet's core alone must be about 70 times more massive than Earth, indicating the possible existence of a solid metallic core inside the planet. Like the rocky planet discovered earlier, the finding of this dense-core planet may lead to a better understanding of how rocky planets form in the Universe.'"
# posted by dark master : 7/01/2005 10:29:00 AM
0 comments 
Rival Pop-Up Ads Legal for Authorized Adware
By Anick Jesdanun
AP
06/30/05 9:15 AM PT
Adware companies do not break trademark laws when they use a retailer's Web address to trigger coupons and other ads for rivals' products, a federal appeals court has found.
The 2nd U.S. Circuit Court of Appeals becomes the nation's highest court to rule on a fundamental practice of adware companies that serve up pop-up and other ads based on sites users visit. Lower courts around the country had issued conflicting opinions.
In 1-800 Contacts's lawsuit against adware provider WhenU.com, the appeals court likened WhenU's ads to retail stores that place generic competitors next to brand-name products.
Though the case did not directly address consumer frustration over adware, which often gets onto computers without their owners' full knowledge, the court said it viewed WhenU's ads as authorized.
The ruling may not, however, fully apply to many of the trademark disputes involving adware companies or such search companies as Google (Nasdaq: GOOG) that target ads based on search terms, including brand names.
# posted by dark master : 7/01/2005 10:27:00 AM
0 comments 
Slashdot | Copyright Issues in the Mainstream
Slashdot | Copyright Issues in the Mainstream: "Friday July 01, @10:19AM
'Recently, the Supreme Court of the U.S. ruled on a momentous topic, the Grokster case (as covered on Slashdot). It turns out, however, it's not just geeks who are taking notice, and we're not the only ones who think things are getting ridiculous. The Economist has a great story on the subject, noting among other things, that if the cost of publishing had come down with the internet, perhaps the amount of protection needed to encourage publishing is less as well.' From the article: 'Both the entertainment and technology industries have legitimate arguments. Media firms should be able to protect their copyrights. And without any copyright protection of digital content, they may be correct that new high quality content is likely to dry up (along with much of their business). Yet tech and electronics firms are also correct that holding back new technology, merely because it interferes with media firms' established business models, stifles innovation and is an unjustified restraint of commerce.'"
# posted by dark master : 7/01/2005 10:27:00 AM
0 comments 
646-Pound Catfish Netted in Thailand

Fishermen in northern Thailand have netted a fish as big as a grizzly bear, a 646-pound Mekong giant catfish, the heaviest recorded since Thai officials started keeping records in 1981. The behemoth was caught in the Mekong River and may be the largest freshwater fish ever found.
"It's amazing to think that giants like this still swim in some of the world's rivers," said Dr. Zeb Hogan, a WWF Conservation Science fellow and leader of a new World Wildlife Fund (WWF) and National Geographic Society project to identify and study all freshwater fish over 6 feet long or 200 pounds. "We've now confirmed now that this catfish is the current record holder, an astonishing find."
The fish was caught and eaten in a remote village in Thailand along the Mekong River, home to more species of giant fish than any other river. Local environmentalists and government officials negotiated to release the record-breaking animal so it could continue its spawning migration in the far north of Thailand, near the borders of Thailand, Laos, Myanmar and China (an area also known as the "Golden Triangle"). But the fish, an adult male, later died. The species is declining, which fishermen in the region blame on upstream dams and environmental deterioration. The specimen is the largest giant catfish ever recorded; it is listed by the Guinness Book of World Records as the largest freshwater fish.
The Mekong giant catfish is Southeast Asia's largest and rarest fish and the focus of Dr. Hogan's project along with about two-dozen other species around the world such as the giant freshwater stingray, the infamous dog-eating catfish, the dinosaur-like arapaima, and the Chinese paddlefish - all of which remain contenders for the title of the world's largest fish. Long shots for the title include caviar-producing sturgeon, goliath Amazon catfish, giant lungfish, razor-toothed gars, massive cods, and Mongolian salmon.
"I'm thrilled that we've set a new record, but we need to put this discovery in context: these giant fish are uniformly poorly studied and some are critically endangered. Some, like the Mekong giant catfish, face extinction," continued Dr. Hogan. "My study of giant freshwater fish is showing a clear and global pattern: the largest fish species are disappearing. The challenge is clear: we must find methods to protect these species and their habitats. By acting now, we can save animals like the Mekong giant catfish from extinction."
The Mekong River Basin is home to more species of massive fish than any river on Earth. It is also the most productive fishery in the world, generating $1.7 billion each year. Fish from the Mekong are the primary source of protein for the 73 million people that live along the river.
source: http://www.scienceblog.com/catfish.html?q=node/8320
# posted by dark master : 7/01/2005 10:24:00 AM
0 comments 
Net Pioneer Wants New Internet
One of the fathers of the internet wants to be a daddy again.
David Clark, who led the development of the internet in the 1970s, is working with the National Science Foundation on a plan for a whole new infrastructure to replace today's global network.
The NSF aims to put out a request for proposals in the fall for plans and designs that could lead to what Clark called a "clean slate" internet architecture. Those designs, Clark said, could be tested on the National LambdaRail, the nationwide optical network that researchers are using to experiment with new networking technologies and applications.
Two NSF program directors in the agency's Networking Technology and Systems program refused to speak on the record about the $200,000 grant the agency gave Clark to explore his "clean slate" internet idea. Nor would they comment on a broader initiative taking shape at the NSF, of which Clark said his research is a component.
But Clark hinted that the agency is poised to take a leading role in developing new internet technologies.
"There are (program directors) at the NSF who are willing to rally the academic community," said Clark. "They are saying, 'Let's break some eggs.'"
Clark, who served as chief protocol architect for the government's internet development initiative in the 1980s, wants researchers to re-imagine the infrastructure that connects computer users around the world.
The problem with today's internet, according to Clark, is that its 30-year-old design, which allowed for the development of exciting new applications (the world wide web, e-commerce, file sharing, you name it), is now stifling further growth.
A new architecture could allow for ubiquitous embedded wireless communications devices and sensors. It could also provide for more secure and convenient forms of commerce. A super-high-speed internet could even allow people a world apart to collaborate inside elaborate 3-D virtual arenas, a process called tele-immersion.
As for today's internet, new applications and protocols meant to address security issues and wireless and ubiquitous devices may not be enough to solve its underlying problems.
"Systems rigidify over time," said Clark. "Each of those incremental changes has interactions with the others. And each is harder to add than the last one. After a while, the effort-to-success ratio (becomes untenable)."
Another internet founding father, however, questioned whether the academic community really needs to start talking about building a new internet from scratch.
"Anything you can do all at once, you could do with incremental changes," said Robert Kahn, who helped design the architecture for Arpanet, the precursor to the internet. (Kahn is now president of the Corporation for National Research Initiatives.)
Even Clark agrees with those who say the internet currently serves most of its users quite well. But he said applications and technologies introduced incrementally to the existing system, such as those springing from engineering working groups and the Internet2 research consortium, cannot solve the internet's fundamental architectural problems.
"The idea then was to build a cost-effective network 10 times faster than what we had at the time," said Clark. "But Internet2 is not architecturally different than the internet.”
Clark, a senior research scientist at MIT, said he will use his NSF grant to talk with other researchers this summer who could potentially submit proposals for new internet designs.
Clark, in the abstract that got him the grant, asks the question, "Can the research community devise a fresh, new design for an internet -- a design that takes into account both the wisdom in the original design and what has been learned since, a design that takes into account the requirements the network now faces and those we can predict in the future -- and demonstrate a network with sufficient appeal and merit that we might persuade the world to move to it?"
Clark said he would like to see two things addressed in any replacement for the current internet. The first is a coherent security architecture. The second is a healthy economic infrastructure for network service providers, who will need a bigger piece of the pie in the new internet than the one they are getting now if they are going to help pay for building it.
Clark is arranging a workshop this summer to bring together network architects and computer security specialists. He said he wants to encourage security specialists to think more about architecture, rather than simply their next anti-virus software upgrade.
"Look at phishing and spam, and zombies, and all this crap," said Clark. "Show me how six incremental changes are going to make them go away."
source: http://www.wired.com/news/infostructure/0,1377,68004,00.html?tw=wn_6techhead
# posted by dark master : 7/01/2005 10:19:00 AM
0 comments 
The 12-minute Windows heist

There is a 50 percent chance your unprotected Windows PC will be compromised within 12 minutes of going online, says security vendor Sophos.
Highlighting the increasing speed of online attacks in research covering the last six months of virus activity, the vendor said the news was mostly grim. Authors of malware such as spam, viruses, phishing scams and spyware increased both the volume and sophistication of their assaults, releasing almost 8,000 new viruses in the first half of 2005 and increasingly teaming up in joint ventures to make money. The new-virus figure is up 59 percent on the same period last year.
"With financial gain rather than notoriety becoming more of a motivation, spammers and virus writers have been drawn together with more traditional criminal elements," said Sophos Australia and New Zealand senior technical consultant Sean Richmond.
While the usual virus culprits like Zafi-D, Netsky-P and Sober-N came under the spotlight, Sophos said growth in Trojan attacks -- where malicious software allows a remote attacker to gain backdoor access to a PC -- was perhaps the most significant development in the malware-creation field. "Sophos has seen a three-fold increase in the number of key-logging Trojans so far this year," the company said. "Trojans are delivered to target organisations via e-mail attachments or links to Web sites. They are often used by remote hackers to steal privileged information, and very often to launch further attacks."
But Sophos made it clear the news wasn't all bad. "Businesses in Australia and New Zealand mostly have it right when it comes to protecting their desktops, servers and gateways," said Richmond. "On the other hand, we've seen significant numbers of unprotected home computers become zombies for spammers."
Richmond praised the Australian telecommunications regulator for its recent move to press charges against Perth-based alleged spammer Wayne Mansfield. Mansfield is one of Australia's most notorious Internet marketeers and stands accused of sending at least 56 million -- mostly unsolicited -- e-mails in the period after the Spam Act was enacted in April 2004.
Events further afield also caught Sophos' attention, as it highlighted several recent prosecutions of virus and privacy-related Internet crime. One dealt with the impending trial of German teenager Sven Jaschan, who has admitted writing the Netsky and Sasser worms, while another involved the arrest of a Cypriot man who was spying on a 17-year-old girl via her own Webcam. "Four United Kingdom phishers were also jailed this week," said the company.
source: http://www.zdnet.com.au/news/security/0,2000061744,39200021,00.htm
# posted by dark master : 7/01/2005 10:17:00 AM
0 comments 
