Talk for Manchester LUG (March 2000 meeting)
I'd like to divide this talk into three sections:
Please excuse me if I begin with a little autobiography. Until
January 2000 I was a teacher and had been all my life. In the Autumn
of 1997 the school in which I was Deputy Head, the Hellenic College
of London (a small independent school for Greek children),
decided that it was time to install a network. As it happened, at that
time there was an ex-student called Vassilis who was helping us out
with some part-time teaching after having finished his PhD. I knew
that he was very knowledgeable about IT in general and discussed with
him ways in which we could set up the network. In particular I asked
him whether it could be done with Linux.
At this point my only experience with Linux was that I had once set up Slackware to dual boot on my machine at home, but had done nothing serious with it and had removed it. Vassilis was interested in the idea and as he didn't have much to do, he started looking into it. As a result of his research and advice, we decided that we would set up a network based on a FreeBSD server, which would connect to the Internet by ISDN, run samba and a POP mail server, and use natd to allow internal machines to access the Internet. Vassilis began setting up a machine with FreeBSD and in the meantime we had the building cabled. Very soon afterwards, Vassilis was offered a very well-paid job with Andersen Consulting. So I was left holding the baby...
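The planned gateway can be sketched in a few lines of FreeBSD configuration. This is only an illustrative sketch, assuming a kernel built with the IPDIVERT option and an ISDN interface called isdn0 (both assumptions, not details from our actual setup):

```sh
# /etc/rc.conf -- minimal natd gateway sketch
gateway_enable="YES"      # forward packets between interfaces
firewall_enable="YES"     # load ipfw, which natd needs for its divert rule
firewall_type="OPEN"      # allow everything; tighten this in real use
natd_enable="YES"         # start natd at boot
natd_interface="isdn0"    # the external (ISDN) interface -- an assumed name
```

With something like this in place, the internal machines simply use the FreeBSD box as their default gateway and natd rewrites their traffic onto the ISDN link.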
I had to learn about Unix rather quickly: I was determined that we would go on in the way that we had planned, but my knowledge at that time was distinctly limited. I once again installed Linux at home and began to learn something about it while continuing to play with FreeBSD at school. I learned quickly and became (well, slightly) obsessed. I soon decided that Linux would be a better bet for the school system because it looked at the time as if it might be tricky to get ISDN working with FreeBSD. So I changed over to Linux (SuSE actually) and soon had a working network connected to the outside world, though initially only serving the management and staff.
In January 1999 I was at the BETT educational IT show at Olympia in London. I noticed that there was no Linux or Unix of any kind there at all. By this time I was convinced of the benefits of Linux to schools and even considered whether it would be possible or practical, off my own bat or with the help of a LUG, to hire a space there the following year to promote Linux to teachers.
By January 2000 I was working for SuSE and was at the BETT show on our stand (having suggested before I joined that this would be a useful place to be). There was an enormous amount of interest from teachers and school network managers -- we were rushed off our feet keeping up with the questions from them. Most of those who stopped at our stand had heard of Linux, but didn't really know what it was. Others had not heard of it at all.
I compare this to the situation regarding people's awareness of the Internet in 1994 (of which more later): in 1994 I had to continually explain to people what the Internet was -- in 1995 I didn't need to explain . . . . At next year's BETT I predict that it won't be necessary to explain to many people what Linux is.
At BETT it was fun to tell people what's in the box -- show them a running system and then watch their reaction to the answer to their question ``and how much does it cost?''
As a result of the interest that we found at BETT, we decided to run a direct promotion to schools whereby they could obtain a full copy of SuSE for half price. In the first month of the promotion 215 schools had taken up the offer: over 6% of the schools which received the circular.
We then set up a mailing list for the schools to discuss their use of Linux and to get informal help from us. It is open to all schools, not just those using SuSE. This has been ticking over quite nicely -- ideally more people would ask questions, but I can understand why quite a lot prefer to `lurk'. [If anyone is interested in joining this list, the address is in the handout.]
This varies: there is a complete range from `it's still on the shelf
unopened' to schools where Linux is already in use as a server and
desktop. There are one or two schools which are using X-terminals
running off a server. I know of one school which gives students a
Windows desktop by running Windows under VMWare and displaying it on
diskless X-terminals, which is an interesting idea. Most of the schools who
have copies are somewhere in between: a keen teacher or network
manager is playing with it, seeing what Linux can do, and has maybe
connected it to the network, got samba running ...
Schools are particularly interested in using samba as an alternative file server for Windows machines on the network. Switching to Linux on the desktop is slightly problematic because of software applications requirements.
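A minimal smb.conf of the kind schools typically start with might look like this. The workgroup, share name, path and group here are purely illustrative assumptions, not a recommended configuration:

```ini
; smb.conf -- minimal sketch of a Samba file server for Windows clients
[global]
   workgroup = SCHOOL        ; must match the Windows clients' workgroup (assumed name)
   security = user
   encrypt passwords = yes   ; needed for Windows 98/NT clients

[pupils]
   path = /home/shares/pupils   ; an assumed path
   writable = yes
   valid users = @pupils        ; a Unix group -- again an assumed name
```

The Windows machines then see the Linux box in Network Neighbourhood just as they would an NT file server.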
In this connection it is worth talking about some peculiarities of software usage in schools: there is a peculiar kind of hypocrisy whereby what teachers say they use computers for in their teaching differs from what they actually do. There is a strong pressure on teachers to `show' that they are using `educational software' in their lessons as part of their subject teaching. This pressure comes from inspectors, heads of department and others: I have seen weekly lesson planning sheets with a box labelled `use of IT', which is expected to be filled in. The reality, though, is that a large proportion of this `educational software' is of very poor quality, is bought unseen, and is not often used. What people are actually doing is word processing, working with images and spreadsheets, and using the Internet. So the reality and the theory of educational computer use don't quite correspond.
Another peculiarity of schools' use of IT is that although cash is short, many schools buy hardware at very inflated prices from particular suppliers who have, let's say, a very strong `psychological hold' on the market. This has been achieved partly by supplying Windows NT networks with additional specially designed configuration tools for schools, possibly insulating the administrator still further from the reality, but allowing easy remote support. In my view the psychological hold is maintained by the fact that those responsible are to some extent kept in ignorance of how their systems work. Of course this is partly because those in schools who are responsible for computer systems are very often pressed for time and not very interested in how anything works -- they just want it to work. But it certainly means that schools in many cases are getting less hardware and less software for their money than they could be. However, with Linux there is a virtuous circle: once you begin to see that you are in control, and that you can really begin to understand how things work, your attitude changes.
Another problem for schools is that in larger schools, while there often actually is someone with a real interest in and knowledge of IT who is running the department or the network, that person is not necessarily the same person who makes purchasing and planning decisions.
I predict that the proportion of schools using Linux will continue to increase. We shall be continuing to help schools as much as we can for a mixture of altruistic and self-interested motives -- we have just sent out another mailing to all UK secondary schools offering SuSE 6.4 at a special price.
Anyone can help by talking to their local school: telling them about Linux, offering a little bit of help. The learning curve is steepest at the beginning. In my experience people who have started and become interested don't look back, and that includes teachers. So if you are involved with a local school, I would urge you to offer useful help and advice . . . . The advice of course can consist of just telling people what they could be doing with Linux and getting them interested.
You might also be interested to know that the UK Unix Users' Group has started a Schools Liaison Group: this is in its very early stages so far, but we at SuSE hope to work with them on more ways to interest schools in using Linux and on ways of offering them help.
I think that discussion of open source has sometimes concentrated very
closely on software in general and Linux in particular and has
neglected wider areas where the same factors which have brought about
the success of Linux have been operating. Open standards of one kind
or another have been crucial in the growth of computing in general and
the Internet in particular.
The earliest computers didn't even have a standard representation of letters by bytes: ASCII was an agreed open standard without which very little would have been achieved. More interestingly, and this is something which is often overlooked: what was the reason for the dominance of the `standard' PC -- the so-called `IBM-compatible' Intel PC as opposed to other types of microcomputer on the market at the same time? It's quite clear that it was the fact that it became an `open hardware standard'. The way that this happened was that the functions of the BIOS of the IBM machine were copied in a way that did not infringe IBM's intellectual property. Manufacturers were then able to produce cloned hardware which was functionally equivalent to IBM's hardware.
The rest we know: MS-DOS, Windows, the dominance of the market by one particular hardware and software platform. But it was the emergence of an effectively open hardware standard which set off this process. It was simply the fact that it was an open standard which enabled it to take over. The moral of this story will be obvious: we had an open hardware standard -- it took over. We now have an emerging open software standard.
A very interesting book on the history of the PC industry is `Accidental Empires' or, to give it its full title: Accidental Empires: How the Boys of Silicon Valley Make Their Millions, Battle Foreign Competition and Still Can't Get a Date by Robert X Cringely. A version of this was made into a four-part TV series a year or two ago (Triumph of the Nerds, PBS and Channel 4). He also writes a weekly article on the web which is well worth reading. [The URL is in the handout]
Another example of the existence of an open standard having an explosive effect is the growth of the World Wide Web itself. This quote is from Clay Shirky (http://www.shirky.com):
In the 36 months from January of 1993 to December of 1995, HTML went from being an unknown protocol to being the pre-eminent tool for designing electronic interfaces, decisively displacing almost all challengers and upstaging online services, CD-ROMs, and a dozen expensive and abortive experiments with interactive TV, and it did this while having no coordinated centre, no central R&D effort, and no discernible financial incentive for the majority of its initial participants.
The reason that this happened (and this was the period I mentioned earlier -- the gap between the time when most people didn't know what the Internet was and the time when nearly everybody did) -- the reason that this happened was simple: the availability of a simple open standard for displaying text and graphics across the Internet.
Consider how effortless it would have been for Tim Berners-Lee or Marc Andreessen to treat the browser's ``View Source...'' as a kind of debugging option which could have been disabled in any public release of their respective browsers, and imagine how much such a `hidden source' choice would have hampered this feedback loop between designers. Instead, with this unprecedented transparency of the HTML itself, we got an enormous increase in the speed of design development.
Consider too what would have happened if there had been browser and content encoding systems which were proprietary. A web or webs based on these would not have taken off so quickly, because there would have been a cost of entry and competition between standards. Even if one such system had taken over as a de facto standard it would have seriously held up the development of the Internet.
Linux is the OS of the Internet age -- the fact that we inhabit a new age is gradually making itself apparent to the mass of people. Linux has come into being as a result of the changed economic and cultural circumstances of the Internet age. It will succeed because of its quality and because it is an open standard.
Now I'd like to say something about the particular economic and cultural circumstances which made Linux possible, and quote a bit from the writings of Eric S Raymond.
When I first got to know about free software and open source my reaction was, I think a common one: I felt instinctively that this was too good to last. The `baddies' would take over -- the golden age would come to an end and selfishness would triumph. And the reason for that instinctive reaction is the experience we all have of how free access to shared resources leads to the destruction of those resources. For instance in my experience if you run a `kitty' system in a shared house to purchase food, there's never any cheese left in the fridge. Technically this is often known as the `tragedy of the commons'.
ESR expresses this very well in the Magic Cauldron:
Again, we must first demolish a widespread folk model that interferes with understanding. Over every attempt to explain cooperative behaviour there looms the shadow of Garrett Hardin's Tragedy of the Commons.
Hardin famously asks us to imagine a green held in common by a village of peasants, who graze their cattle there. But grazing degrades the commons, tearing up grass and leaving muddy patches, which re-grow their cover only slowly. If there is no agreed-on (and enforced!) policy to allocate grazing rights that prevents overgrazing, all parties' incentives push them to run as many cattle as quickly as possible, trying to extract maximum value before the commons degrades into a sea of mud.
Most people have an intuitive model of cooperative behaviour that goes much like this.
[. . . ]
When people reflexively apply this model to open-source cooperation, they expect it to be unstable with a short half-life. Since there's no obvious way to enforce an allocation policy for programmer time over the Internet, this model leads straight to a prediction that the commons will break up, with various bits of software being taken closed-source and a rapidly decreasing amount of work being fed back into the communal pool.
In fact, it is empirically clear that the trend is opposite to this. The breadth and volume of open-source development (as measured by, for example, submissions per day at Metalab or announcements per day at freshmeat.net) is steadily increasing. Clearly there is some critical way in which the ``Tragedy of the Commons'' model fails to capture what is actually going on.
Part of the answer certainly lies in the fact that using software does not decrease its value. Indeed, widespread use of open-source software tends to increase its value, as users fold in their own fixes and features (code patches).
In this inverse commons, the grass grows taller when it's grazed on.
This, together with actually reading, thinking about and understanding the GPL, was a revelation to me: it, together with the rest of ESR's writings, helped me more than anything else to understand why the open source model was workable.
Another interesting book which throws some light on this (and which doesn't mention software once) is Robert Axelrod's book `The Evolution of Cooperation'.
This outlines a very precise theory of how cooperative behaviour sometimes develops in circumstances where it might seem unlikely. (One interesting example he gives is the cooperation which grew up between units of the British and German armies during the First World War as they faced each other across no-man's land: tacit understandings developed whereby each side would try to do as little harm to the other as it could between major offensives.) What is important for cooperation is the conditions: there are conditions which encourage cooperative behaviour and others which make it rapidly unravel. Cooperative behaviour develops in an evolutionary manner when the right conditions exist. What ESR has done is to identify various facts about the development of software which make it work better under cooperative conditions, as well as facts about the psychology, anthropology and motivation of hackers which explain why they behave in the ways that they do.
Talking about evolution in relation to the development of software is more than an analogy.
There are only two ways we know of to make extremely complicated things. One is by engineering, and the other is evolution. And of the two, evolution will make the more complex.
-- Danny Hillis, quoted in `Out of Control'.
Almost all complex technology has developed by a process of evolution: this applies to such things as cars and aeroplanes as much as to software. Evolution works through natural selection: it works faster and more efficiently in a challenging environment. In open source development more people are potentially (and probably actually) seeing and debugging the code (`given enough eyeballs, all bugs are shallow'), which is another way of saying that it is exposed to a wider and more challenging environment. It is exposed to a wider environment in another sense also: software which can be freely used gets more use in the long run and so encounters more circumstances in which bugs can appear and be corrected (the grass grows taller when it's grazed on). It is not an accident that the programs which power the Internet such as apache, sendmail, postfix and so on are open source projects. The fact that they have evolved in a `hostile environment' (by which I mean an environment where they are constantly put to the test) accounts for their robustness. You can compare this to the super-rats which have evolved immunity to warfarin.
In another interesting paper on the evolutionary nature of the development of open-source software, Ko Kuwabara compares the age-old controversy over the evolution of the eye -- which goes something like this: `how could such a perfect instrument have developed through evolution? It must have been created once and for all by an all-powerful God' -- with the development of an operating system. On the one hand, you have Windows NT: Microsoft began the development of Windows NT in 1986, with a `God-like' centralised design methodology. On the other, Linus Torvalds started work on Linux in 1991, allowing an evolutionary process to build on a basic structure which he had created. The analogy is clear, and shows how evolution delivers the goods more quickly.
Kuwabara also points out:
Fred Brooks, an authority on software engineering, has found that, due exactly to rising costs of coordination, production time increases exponentially with the number of developers. In the corporate world, the Brooks' Law has more or less become a truism. In the case of Linux, however, there has never existed centralised organisation to mediate communication between Torvalds and the thousands of contributors, nor are there project teams with prescribed tasks and responsibilities, to which individual contributors are specifically assigned. Instead, from the beginning, it has been left to each person to decide what to work on at the moment, even at the potential risk of coordination difficulties. And yet, for all that, the Linux project has proceeded at a remarkable speed, with updates and version releases on a daily and weekly basis at times.
Clearly Brooks' law does not apply in the case of Linux because of the different method of organisation. Perhaps it would be possible for closed source software developers to learn something from the way Linux has grown; however, on the evolutionary analogy, when the source is closed and only available to members of a team within a company, there may not be a challenging enough environment for evolution to operate efficiently.
Kuwabara uses the concepts of complexity theory to analyse the growth of Linux in an essay which is quite long and somewhat academic but well worth reading.
Eric Raymond's analysis of Microsoft's economic position:
I've talked about ESR's analysis of the open source phenomenon already: I would urge anyone to read his stuff. In this month's Linux Journal there was an interesting small interview with him in which he aired some thoughts about the future of Microsoft based on economic considerations which are well worth reading. He claims that Microsoft relies heavily on the upward trend in its share price in two ways. Firstly, its talented employees are locked in to the company through stock options, and when the price begins to fall, they will take their money and go before it falls further. Secondly, according to Raymond, Microsoft actually makes a large proportion of its money by gambling on the value of its own stock. For the stock price to continue to rise, sales must continue to rise, and this is becoming harder and harder to achieve because the total market for their products is not growing fast enough.
So quite apart from the threat from Linux, he argues that Microsoft is likely to run into serious problems simply based on economics. The fall in the price of hardware is another consideration which makes the cost of a proprietary operating system a much larger proportion of the total price of a computer.
SuSE 6.4 will be released at the end of this month in German and a couple of weeks later in English. It will contain kernel 2.2.14 with USB support for more mice, keyboards and scanners and support for memory up to 3.5 GB. It will also contain the Reiser journaling file system. There is an improved graphical installer YaST2, and an improved version of SaX. There will of course be support for more graphics cards. XFree 4.0 will be included in the unsorted directory: XFree 3.3.6 will be used for the standard installation. SaX2 will also be included in unsorted. XFree 4.0 came out just as the final version of 6.4 was being put together: too late to be properly tested and integrated into the system. In XFree 4.0 there is just one X server: drivers for particular hardware are added in a modular way.
For the first time there will be a version of SuSE Linux for PPC, which will run, for instance, on iMac hardware, so now there are SuSE versions for three different hardware platforms: PC, PPC and Alpha-AXP.
Looking further ahead: The version following 6.4 will, it is hoped (depending on the development time-scales for the projects concerned) contain kernel 2.4 and KDE 2.0.
Mozilla:
As you probably know, it was Netscape's decision to open the source of Netscape and start the Mozilla project which focused a lot of interest on open source in general. The current version is called milestone 14; the next one (milestone 15) will be the first beta. I have been running milestone 14 and it looks very good, though as you would expect for alpha software it is not entirely stable. Mozilla has a special licence, however: it is open but not actually GPL. The licence is called the NPL.
Apache continues to be an enormous success story for open source.
The May 1999 WWW server site survey by Netcraft found that over 57% of the web sites on the Internet are using Apache (over 60% if Apache derivatives are included), thus making it more widely used than all other web servers combined.
As we said earlier, most people are not aware of the extent to which the Internet which they use every day is so thoroughly dependent on open source software. This is well worth saying and repeating when people express scepticism about free software. Apache, sendmail, postfix, bind: all these are open-source and are involved in almost every use of the Internet by anybody. And of course it was the open standards and protocols laid down in the RFCs which enabled the Internet to come about in a technical sense.
Gnome is still developing very quickly. An improved version known as Helix Gnome will form the basis of Gnome version 2 which will come out before the end of this year.
VMWare 2.0 is now available. For those who are not familiar with VMWare, this is a commercial piece of software which creates a virtual PC in which you can install another operating system. So you can run VMWare on Linux and install another operating system within it. There is also a version for Windows NT. You can download the Linux version and a 30-day licence from the VMWare web site. It's fun to play with, even if you don't need to use it. If you try it out, be sure also to install the `VMWare tools' to get good graphics performance. I enjoyed setting up SuSE inside SuSE. More seriously, we use it at the office for trying out and looking at other distributions and operating systems. For those who need to run Windows on top of Linux this is a fool-proof way, unlike wine, which will run some things . . . . As I mentioned earlier, with a powerful server and VMWare, you can give users their own Windows desktops on thin clients running as X-terminals and booting off the server (a `Citrix'-like solution). An open source project to create software with the same capabilities is also under way.
KDE 2 and KOffice:
KDE 2.0 and the KOffice are in a fairly advanced state of development, though when they are actually due to be released I'm not sure. If the KOffice does all the things it is expected to do it will be very interesting and powerful.
AbiWord is another open source word processor which is now in a fairly advanced state of development. Eventually there are plans to create an office suite.
In this connection it's worth also mentioning StarOffice, which is currently at version 5.1a. Version 5.2 is expected soon. As you probably know, this is distributed free of charge by Sun Microsystems and is an office suite which is available for various platforms including Linux.
The Gimp was probably the first open source end-user desktop application to better its commercial counterparts in terms of quality. (`It out-Photoshops Photoshop.') It is a very powerful piece of image-processing software indeed. We are still waiting for this to happen in the area of office suites. The Gimp is extensible through its own scripting language.
As you probably know, this is expected soon.
I haven't tried this, but people who dual-boot may be interested in being able to read their ext2 partitions from Windows.