Jim Barton of TiVo has an article in the ACM Queue magazine discussing the genesis of the TiVo platform and the decisions they made when creating it. I particularly noticed this quote about their choice to use FOSS software for their platform:
We wanted to avoid dependence on outside software suppliers at all times by having control of each and every line of source code. This would ensure that TiVo would have full control of product quality and development schedules. When the big bug hunt occurred, as it always does, we needed the ability to follow every lead, understand every path, and track every problem down to its source.
This reminds me of the event in 1994 which drove me to FOSS. (more…)
The Wall Street Journal reports today that Sun's "longtime leader, Scott McNealy, is stepping down as chief executive and named the company's president, Jonathan Schwartz, as his replacement."
Schwartz has previously stated that all Sun's software will be open-source within a few years.
Peter Carbone, the acting CTO of Nortel, recently presented at the Carleton University conference on Competing with Open Source Software. In his presentation he identifies a six-level maturity model for FOSS use by industry, ranging from level 0 (denial) to level 5 (aggressive). He identifies Sun, along with IBM, as an example of a top-level FOSS user at the "aggressive" level: a company that uses FOSS as a strategic competitive tool.
"Mr. McNealy, 51 years old, has come under fire in recent years for the company's erratic profits and what was seen as a reluctance to cut costs," according to the Journal. "Sun has struggled to find a consistent growth formula for its server business since the Internet bubble collapsed."
With Schwartz's strategic use of FOSS it will be interesting to see what he does with the top line as well as the bottom line.
FOSS has become much more than just a cost-cutting tool in business. For example, Forbes recently reported on Digium's Mark Spencer:
In a research park outside the low-key bustle of downtown Huntsville, Ala., Mark Spencer finishes his barbecue and resumes wreaking havoc on the multibillion-dollar phone equipment business. […] Spencer is the inventor of Asterisk, a free software program that establishes phone calls over the Internet and handles voicemail, caller ID, teleconferencing and a host of novel features for the phone. With Asterisk loaded onto a computer, a decent-size company can rip out its traditional phone switch, even some of its newfangled Internet telephone gear, and say good-bye to 80% of its telecom equipment costs.
I assume Schwartz is likewise looking at disrupting the server industry status quo by using FOSS.
I've been spending a lot of time recently trying to understand why corporations seem to have trouble collaborating over a commons-based source-code base when the FOSS community barely breaks a sweat doing the same thing.
The record seems to show that adopting "best practices" software methodologies, like a full-blown waterfall methodology with traceable requirements analysis, code reviews, metrics…the whole nine yards…doesn't prevent such anti-collaborative social misbehavior as rampant "clone and own".
Also, some of the tools commonly used in corporate development, like centralized source-code management tools, seem to be purpose-designed to inhibit collaboration.
The University of California, Irvine, Institute for Software Research has been studying the social aspects of FOSS development for the last few years. I've just started poring over their papers.
The most interesting quote so far:
In contrast to the world of classic software engineering, open software development communities do not seem to readily adopt or practice modern software engineering or requirements engineering processes. Perhaps this is no surprise. However, these communities do develop software that is extremely valuable, generally reliable, often trustworthy, and readily used within its associated user community.
Indeed! Since FOSS projects flout all the "best practices" and still achieve the holy grail of "extremely valuable, generally reliable, often trustworthy, and readily used" software, why don't corporations co-opt the FOSS methods?
More about this later…
Today's Wall Street Journal had an article entitled "The Inside View: Employee blogs can put a human face on companies. But that's not always a good thing." (April 3, 2006 page R7)
The question is, do companies really want an open communication path between their employees and their customers?
For a scary exercise:
Probably the most interesting comment in this story is that "[one expert] suggests limiting blogging to lower-level engineers and product experts". The concern is that "top executives who blog may not seem believable."
I've written before about the end-to-end principle. Today a colleague sent me a link about Van Jacobson's network channels. I noticed this quote: "The key to better networking scalability, says Van, is to get rid of locking and shared data as much as possible, and to make sure that as much processing work as possible is done on the CPU where the application is running. It is, he says, simply the end-to-end principle in action yet again." Yet again?
This got me to thinking of all the places this principle shows up. Examples:
- The Network Neutrality issue where it is argued that innovation occurs at the ends of the network, and thus the center or core should be agnostic to all traffic types.
- The so-called "long tail" effect, where an end-to-end principle allows small consumers to find small suppliers and transact business that would otherwise not be possible.
- Politics, where Tip O'Neill famously said "all politics is local": it occurs at the ends of the channels connecting politicians and their constituents.
I guess the fact that it is called a "principle" should be a clue, but it is startling how many places you find it if you go a-lookin'.
Yet, if it is truly a "force of nature", then why do so many in the telecom business work so hard bucking this force by trying to build networks that provide complicated services, like IMS? Insurgent services (like, say, Google Talk) are ignoring the "intelligent core" and instead creating edge services depending on a mostly dumb network.
Take a simple service like, say, encryption: the end-to-end principle would predict that the best place to encrypt transmitted data is at the ends of the channel. (My understanding of data security principles would cause me to reach the same conclusion.) Would the market accept a network-provided encryption service?
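To make the encryption example concrete, here is a minimal sketch of end-to-end encryption over a dumb network. The cipher construction (a SHA-256-derived XOR keystream) is a toy of my own choosing for illustration, not a secure algorithm, and all the function names are hypothetical. The point is architectural: all cryptographic work happens at the two endpoints, while the core merely forwards opaque bytes it cannot read or usefully transform.

```python
import hashlib

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream from a shared key and nonce.
    # Toy construction for illustration only -- NOT cryptographically secure.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def endpoint_encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    # Encryption happens at the edge, before anything touches the network.
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

# XOR stream ciphers are symmetric: decryption is the same operation.
endpoint_decrypt = endpoint_encrypt

def dumb_network_forward(packet: bytes) -> bytes:
    # The "intelligent core" reduced to its end-to-end role:
    # move opaque bytes without inspecting or modifying them.
    return packet

key, nonce = b"shared-secret", b"call-42"
message = b"meet at noon"

wire = dumb_network_forward(endpoint_encrypt(key, nonce, message))
assert wire != message                                # opaque on the wire
assert endpoint_decrypt(key, nonce, wire) == message  # recovered only at the edge
```

Notice that a network-provided encryption service would invert this picture: the core would have to hold the keys and see the plaintext, which is exactly what both the end-to-end principle and basic data-security practice argue against.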
The end-to-end mindset leads to wildly different system architectures than does an "intelligence in the core" mindset, hence many of the battles we see today.
All the world can be divided into openists and deregulationists. At least according to Tim Wu, as argued in his paper The Broadband Debate: A User's Guide.
I recently attended a lecture at the Harvard Berkman Center where Mr. Wu described this regulatory divide. (Notable attendees included Scott Bradner of IETF fame and David Isenberg, author of The Rise of the Stupid Network.)
Central to this debate is the so-called end-to-end principle, which argues that technical and economic forces tend to push services to the edge of the network. Those who believe this thesis tend to be openists. Those who reject it tend to be deregulationists. I've learned, however, that there is a big difference between Internet types and Telco types as to what is the core and what is the edge of a network. That's a topic I have much more to say about in the future! From my point of view, both sides are "regulationists" in the sense that they wish to establish, by regulation, policies that favor their views. This is important because network "open access" regulations are currently progressing through the FCC rulemaking process.