Why Linux is failing on the desktop

I should’ve known better. I wrote a post a few days ago detailing my frustration with Linux, and suggested (admittedly in very indelicate terms) that the global effort to develop Linux into an alternative to general-use desktop OSes such as Windows and OS X was a waste of resources. I have absolutely no idea how 400 people (most of them apparently angry Linux fans, based on extrapolation from the comments) managed to find their way to the article within hours of me posting it. I think they must have a phone tree or something. Nonetheless, I should’ve been more diplomatic. So, as penance, I will here attempt to write a more reasonable post better explaining my skepticism of desktop Linux, and will even try to offer some constructive suggestions. I’m sure this post will get no comments, in keeping with the universal rule of the Internet that the amount of attention a post receives is inversely proportional to the thought that went into it.

Before starting, let’s just stipulate something supported by the facts of the marketplace: desktop Linux has been a miserable failure in the OS market. If you’ve consumed so much of the purple Kool-Aid prose of the desktop Linux community that you can’t accept that, you might as well quit reading now. I think every year for the past decade or so has been “The Year Linux Takes Off.” Except it’s actually going down in market share at this point.

As I pointed out in my first post (perhaps a bit rudely), this isn’t just a bad performance, it’s a tragic waste of energy. Can you imagine the good that could’ve been done for the world if these legions of programmers hadn’t spent over a decade applying their expertise (often for free) on a failure like desktop Linux? For one, they could’ve made a lot of money with that time and donated it to their favorite charities, assuming they were as hellbent on not making money as they appear to have been. And two, it might have been nice to see what useful things they would’ve produced had they done something somebody was actually willing to pay for, as opposed to trying to ram desktop Linux down the collective throat of the world. You know, sometimes the evil capitalistic market does useful things, like keeping people from wasting their time.

Open Source community projects put innovation and scale at odds. If an Open Source project is to be large, it must rely on the input of a huge distributed network of individuals and businesses. How can a coherent vision arise for the project in such a situation? The vacuum left by having no centralized vision is usually filled by the safe and bland decision to just copy existing work. Thus, most large-scale Open Source efforts are aimed at offering an open alternative to something, like Office or Windows, because no vision is required, just a common model to follow. This is not to say that innovation is not found in the Open Source community, but it is usually on the smaller scale of single applications, like Emacs or WordPress, that can grow from the initial seed of a small group’s efforts. The Linux kernel is a thing of beauty, and is actually a small, self-contained project. But the larger distribution of a desktop OS is another matter, and here we find mostly derivative efforts.

An OS is only as good as the software written for it. One of the great things about Open Source is that there is a tremendous power in being able to take an existing project and spawn off a new one that fixes a few things you didn’t like. While this is fine for an application, it’s problematic for a piece of infrastructure software expected to serve as a reliable, standard substrate for other software. Any Linux application requiring low-level access to the OS will have to be produced in numerous versions to match all the possible distros and their various revisions. See OpenAFS for an example of how ridiculously messy this can get. For apps, do you support GNOME or KDE or both, or just go, as many do, for the lowest common denominator? And supporting hardware-accelerated 3D graphics or video is the very definition of a moving target. There are multiple competing sound systems, and none of them is nearly as clean or simple as what’s available on Windows or the Mac. The result is usually a substandard product relative to what can be done on a more standardized operating system. Compare the Linux version of Google Earth or Skype to the Windows version of the same to see what I’m talking about. (That is, if you can even get them working at all with your graphics and sound configuration.)

Pointing the finger at developers doesn’t solve the problem. To begin with, for the reasons I’m explaining in this essay, some of the poor quality of Linux software is the fault of the Linux community diluting its effective market share with too many competing APIs. Even without this aggravating factor, developers just can’t justify spending significant time and money maintaining a branch of their software for an OS that has less than 3% market share. Because of this, the Linux version of commercial software is often of much lower quality than the Windows or Mac version. A prime example of this is the ubiquitous and required Flash Player. It consistently crashes Firefox. Is it Adobe’s fault? Maybe. But when this happens to somebody, do you think they know to blame Adobe or Firefox or just Linux? Does it even matter? It’s just one more reason for them to switch back to Windows or the Mac. And for the record, why should Adobe bother to make a good version of Flash Player for a platform with no stability and few users?

The solution to all of this is easy to state, but hard to enforce. (There are downsides to freedom.) Somehow the fractious desktop Linux community must balance the ability to innovate freely with the adoption of, and adherence to, standards. Given its low market share, Linux has to be BETTER than Windows as a development target, not just as good or worse. However, one of the problems with Linux seems to be a certain arrogance on the part of its developers, who consider applications as serving Linux, and not the other way around. An OS is only as good as the programs written for it, and perhaps the worst thing about Linux is that it hinders the development of applications by constantly presenting a moving target, and requiring developers to spend too much time programming and testing for so many variants.

It’s downright laughable that an OS with single-digit market share would further dilute its market share by having two competing desktops. Yeah, I know KDE and GNOME are supposedly cooperating these days. But (a) it’s probably too late, (b) it’s not perfect and (c) even if you disagree that it dilutes effective market share, it still dilutes the development effort to maintain two desktop code bases. For God’s sake, somebody kill KDE and make GNOME suck less! Yeah, I know that’s never going to happen. That’s why the title of this essay is what it is.

For all Microsoft gets wrong, it does understand one crucial thing: developers are the primary customer of an operating system. It may advertise to consumers, but it knows that at the end of the day it is developers whom it serves. The Linux community just doesn’t get this.

Unfortunately, I don’t have much hope of desktop Linux ever becoming sufficiently standardized. If the focus shifts to making Linux friendly to applications and their developers, the distributions must become so standardized as to render them effectively consolidated, and the desktop frameworks so static as to negate much of their Open Source character. For Linux to become more developer friendly, it would have to essentially become a normal operating system with a really weird economic model.

OS development actually requires physical resources. The FOSS movement is based on the idea that information is cheap to distribute, and thus developing it is a tremendous leverage of human capital. People write some software, and the world perpetually benefits with little marginal cost. That works beautifully for applications, but OS development, especially desktop OS development, requires tremendous continuous resources to do correctly. For one, you need tons of machines of different configurations and vintages on which to test, which costs money. And you need a large team of people to run all those tests and maintain the machines. Any respectable software company dedicates a tremendous amount of its budget to QA, fixing stupid little bugs that happen to come up on obscure hardware configurations. Linux just can’t achieve the quality control of a commercial OS. And that’s probably why, when I “upgraded” from Gutsy to Hardy, my machine no longer completes a restart on its own. Maybe this will get fixed when the planets align and somebody with the same motherboard as me who also knows how the hell to debug this runs into the same problem, but I’m starting to get weary of this, and apparently I’m not alone, based on the declining desktop Linux market share.

The lack of control of open source software makes Linux vulnerable to the weakest link. One of the biggest criticisms of my previous post was that I shouldn’t judge Linux by Ubuntu. Fine. But if we are to take the measure of each distribution as a product unto itself, then so we must measure their market shares as such. It is unfair to speak of the “adoption of desktop Linux” in sweeping terms, and then whenever something goes wrong with one distro, to conveniently speak of that distribution as not representative.

If one desktop Linux distribution gets bad press, it taints the entire community. This may seem unfair, but it’s the bed one makes when one decides to avoid having to actually write the majority of one’s own software. You can’t expect people to spend effort figuring out if something that goes wrong with Red Hat is really due to something specific to their implementation or something endemic to Linux.

Operating systems confer little advantage to the free-as-in-beer aspect of Linux. An operating system is, in some ways, one of the best bargains in software. The ubiquity of the need for OS software makes it highly efficient to develop, with the overhead costs negligible on a per-unit basis. For their scope and breadth, paying about $100 for Windows or OS X is quite a bargain, and that cost is easily justified if either saves you even a minute of time each day versus a free alternative. Thus, the OS market is naturally highly price-inelastic; being free is a negligible advantage for a product which you use so often and which has such an impact on your life.
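The minute-a-day claim is easy to check with a back-of-envelope calculation (the $25/hour figure is an assumption, purely for illustration):

```python
# Value of saving one minute per day over a three-year OS lifespan,
# compared against a ~$100 license fee.
minutes_saved_per_day = 1
days = 3 * 365
hourly_wage = 25.0  # assumed wage; adjust to taste

hours_saved = minutes_saved_per_day * days / 60   # 18.25 hours
dollar_value = hours_saved * hourly_wage          # $456.25
print(f"{hours_saved:.2f} hours saved, worth ${dollar_value:.2f}")
```

Even at modest wages, the saved time is worth several times the license fee, which is the sense in which the sticker price is a negligible factor.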

So, does Linux save people time, or does it just offer the illusion of saving money? In the context of individual users, I’d argue it offers no advantage for most people to switch. Again, the biggest determinant is the difference in the quality of applications. In my experience, applications written for Windows and Mac OS X outperform those written for Linux, both in terms of robustness and usability. Maybe this is due to differing efforts aimed at each platform, or maybe it’s due to inherent difficulties in Linux as a development platform, as mentioned in the previous section. Either way, money is not the issue for anybody with any serious use for their computer.

The most compelling proof of the above is the fact that while desktop Linux has never seen widespread adoption (and may even be moving backwards), Linux as a server platform was a resounding success. Applications such as Apache and PHP are fundamentally good old-fashioned UNIX applications, and do not suffer from the death of a thousand APIs or all of the other mistakes made by the desktop Linux community. Furthermore, this is an arena where the scales of operation make the free-as-in-beer aspect of Linux actually worthwhile, as opposed to a fool’s coupon. The market is certainly backing this up. In 2003, analysts were calling for Linux to reach 20% of the desktop market by 2008, up from 3%. It’s now 2008, and Linux is still less than 3% of the desktop market (but it’s serving more web pages than any other OS).

Community Open Source projects are most successful, and most needed, when they address a niche software need where overhead development costs drive prices up. A good example of this is Adobe Illustrator. Inkscape and similar programs don’t have to be anywhere near as good as Illustrator to make sense for average users, given the tremendous cost of Adobe products and the relative infrequency of their use for the average person. But just as a professional artist would be foolish to “save” money by going with Inkscape, a professional of any type is foolish for switching to Linux purely for the cost savings. One hour on the phone to tech support will erase the advantage. It is only because Windows is so horribly engineered that people can even begin to make the case that Linux is comparable in usability to Windows. Which brings me to my next point…

Linux won’t always be able to rely on the dominant OS being terrible. Offering something comparable in usability to Windows is nothing to be proud of, and simply sucking less than Microsoft may be easy, but it’s not a sustainable business model. Eventually, either Microsoft will get its act together, or far more likely, Apple will finally achieve market share critical mass and developers will flock to it. Apple has been making great strides, and in my opinion, has been improving OS X at a much faster rate than Linux has been improving. As they grow their market share, I expect this to accelerate. At some point, the Linux community will have to offer something more than ideas largely taken from Windows and OS X, and provide motivation to switch other than animosity for Microsoft.

The desktop Linux community lacks the ability to undergo significant rewrites. When Apple decided to use OpenStep as the basis for OS X, they saw fit to make a lot of changes to its Unix underpinnings, including a complete rewrite of the graphics model. Vista involves a complete retooling of the Windows driver model, allowing things like graphics driver installations without a restart. Apple and MS have the resources and organization to handle such labor intensive clean slate approaches. In my opinion, the Linux community just doesn’t have the ability to handle projects of such scale, at least not within a time frame even remotely competitive with Apple or MS.

When you’ve got labor divided as finely as it is in the Linux community, the general modus operandi tends to be patching and mods. However, there is only so much that can be done with X11. I think people are so impressed with what has been done with the 20 year old graphics system, that they forget to be embarrassed that you can’t even make trivial changes to the graphics configuration without requiring a restart of the X server. If you want proof of this, look no further than the 50+ pages of discussion on the Ubuntu forums on how to get dual monitors working.

The truth is that Linux is impressive, but it’s not as recent an accomplishment as is often claimed. The majority of what is usually called Linux was already there in the form of GNU and X11 (hence the FSF’s hopeless insistence on everybody calling it GNU/Linux), which has been under development in one way or another for around 20 years. But having been given a quick head start by usurping the GNU project (the fair perils of Open Source, which must cause RMS no end of internal conflict), progress will not be so quick in the future. While MS and Apple were able to start from scratch on their graphics systems, Linux struggles to get compositing graphics to work reliably. This is completely understandable. I’m frankly amazed at what they’ve been able to do with extensions to X. But there is no extra credit given for effort, and Linux is falling further and further behind on the desktop as the task of grafting modern features onto an aging architecture gets harder and harder. Worse, but predictably, what people do work on is the fun stuff like Compiz, and not the boring, unsexy infrastructure.

• • •

My suggestions. If you’ve read this far, you might as well hear my harebrained proposals for the problems outlined above. Most of them deal with making Linux a better platform for developers.

  1. Take a hint from Apple and don’t try to work on every possible system configuration. Agree as a community on a small list of supported hardware. Maybe, if you standardize on something, a single hardware vendor might be induced to give a damn and write decent drivers. I’d suggest that the Linux community adopt Nvidia cards and completely forget ATI.
  2. Focus more on high-quality development tools and standardize the multimedia APIs. Be ruthless. If apps supporting the old OSS sound API have to die, let them die! Pick one goddam printing interface. If stuff breaks or has to be rewritten, so be it. Never forget that applications determine the strength of an OS more than anything else. Worry more about developer convenience than user convenience. When was the last time you heard somebody say they used Windows because they liked it? They use Windows because applications X, Y, and Z work better on it, or are only available on it.
  3. Get all distros to agree on a standard release schedule such that all distros use the same kernel and GNOME/KDE release. Err on the side of longer release times and stable ABIs.
  4. Standardize on one package management scheme. This will allow developers to distribute software more easily, instead of having to maintain 7 different downloads on their distribution page.
  5. If all else fails: focus on the core Unix parts (user-space GNU tools and kernel) that Linux gets right and let somebody develop a closed-source desktop on top of it. Yes, you heard me right. If you change the license of the Linux kernel and use the GNU userland tools as add-ins, I think it might be possible. I’d love to hear people’s opinions on this, especially the legal issues, but my feeling is this is the only way you’re going to get a decent desktop out of Linux. It will take a single company with a profit motive to get Linux to work on the desktop, just as it took Apple to make BSD/OpenStep work for the masses. Control by a single company would address the stable platform issue, as well as getting some of the nastier but needed tasks done that I mentioned earlier.

63 responses to “Why Linux is failing on the desktop”

  1. Linux will fail in the corporate world of large enterprises, and will win in the SMB environment.

    Microsoft is building software applications such as SharePoint, and other applications that the corporate world has to use to compete. So, if you gain on your competitor because you standardized on Vista or its successor, then who cares about cost.

    For the SMB, cost is important, and FOSS will more than do the job. Actually, FOSS will do the job, arriving a few years behind the MS offering but providing the functionality.

    SMB = small to medium businesses. FOSS = free and Open Source.

  2. Hi Leslie,

    Thanks for the comment. I agree with you up to a point. I’m still not convinced, however, that even the ‘S’iest of SMBs really sees their desktop OS costs as a real obstacle. Are you talking about server OSes? If so, I completely agree, but I made my argument against desktop Linux from the perspective of one person, and if my argument works for one person, then it certainly scales up to a business, especially one that can’t afford a full-time system administrator.

  3. I think your analysis is dead on target.
    You have addressed all the points that are currently preventing Linux from being anything other than a hobbyist’s desktop OS. I think that these issues aren’t likely to be addressed in the future.

    From an end user’s POV, Linux as a desktop OS has so far managed to re-create a typical commercial workstation-grade Unix (AIX, Solaris, HP-UX, IRIX) the way it was in the 1990s, but without the bits which appealed to businesses (great support, hardware & software certifications, etc.). That’s not what I call innovation. As good as these OSes were, none of them ever made sense for home users back then. They would certainly not make sense for home users today.

  4. Incidentally, I saw the first post on tuxmachines, as I expect many others did.

    Just one point really, if 3 billion people live on less than 2 dollars a day, should they save for a legal copy of Windows/OS X, pirate Windows/OS X, use Linux, BSD or use ReactOS?

  5. Hi Darrell:

    If you live on $2 a day, what in god’s name are you going to use to pay for the computer on which to run Linux, and what exactly will you do with it?!? The idea that what kids in the third world really need is a laptop is a conceit of western society that could only come from the minds of people at the MIT Media Lab. (I think children may be the only people willing to listen to them at this point, but that’s only because Nicholas Negroponte smells like expensive chocolate.)

    And everybody is missing the core of my argument, which is that Linux is not free. It takes up the time of millions of people. If you’re worried about the 3 billion people, maybe the several million who contribute to Linux could devote some of that energy and money and time to helping the third world get things like water and sewage infrastructure. High-mindedness will get you nowhere in this argument. Did you forget we’re talking about computers?

    Thanks for the heads up on the link.

  6. Jonathon,

    A well reasoned, thoughtful post. And,

    “I’m sure this post will get no comments, in keeping with the universal rule of the Internet that the amount of attention a post receives is inversely proportional to the thought that went into it.”

    is apparently true.


  7. Hi Bill,

    Thanks for the kind comment. Maybe the secret is to write a reasoned post, but give it a more salacious title. I’ll get it right one of these days.

  8. I would like to modify my response as follows: in the economic world of the United States, the wealth that is there will be a very, very strong barrier to the absolute success of Linux.
    However, in the public domain of schools and other companies (in affluent societies), Linux will supplant Microsoft. And it is well underway.

    In the colleges, the computer science courses are essentially Linux-based, because the source is open, and students can dissect and even improve on the software they are studying. In four to five years, Linux will really take on importance.

    The third-world countries are already standardizing on Linux. We are actually seeing these governments and their schools standardizing on Linux. Regarding the hardware requirements, Linux can run on hardware that was originally built for Windows 98. This is the type of hardware that these countries can afford. There is also the OLPC program that is providing, via donations, one laptop per child. With global knowledge, who knows what platform these third-world countries will be using in 20 years.

    Finally, every Linux distribution can be supported for a fee. That fee is reasonable when compared to the cost of Microsoft licenses. So, again, there is an incentive to standardize on Linux.

    Today Linux is around 2% of software installs. It is getting significant. Let’s ask your question in one year’s time. You may be right that Vista prices will drop, that hardware requirements will become less heavy, and that people are going to do more than email, writing, and Internet work with their computers.

  9. You are right in some ways – but not in others. First, setting up Linux is difficult because of hardware issues. Whatever the zealots may say, until hardware vendors support Linux, it will be difficult to set up. Once it is set up, however, it will run with remarkable stability. After a little initial work, my Zenwalk box has yet to crash – after 7 months.

    “While MS and Apple were able to start from scratch on their graphics systems, Linux struggles to get compositing graphics to work reliably.”

    Uh… what? Considering the number of times I have crashed OS X and Windows, this seems a bit absurd. Also, what exactly does the first part have to do with the second? They seem completely disconnected.

  10. Leslie:

    I must admit, you make a damn good point about Linux enabling use of cheaper low-end and used computers. I hadn’t thought of that. I’d be interested to know what the difference is in requirements to run XP versus GNOME/Linux, or what the relative memory footprints are. I know Vista is a pig, but I’d assumed XP and desktop Linux were comparable. Anyone know?

  11. Colonel:

    Why would a hardware vendor support Linux when it’s twice the effort and 1/45th the market share? I agree there’s a technical solution, just not an economically feasible or pragmatically realistic one. Linux advocates tend to get stuck on “well, there’s a fix for that.” Sometimes that refers to industry-wide issues where Linux folks expect companies to do economically unviable things just because Linux deserves it. And with technical issues at the individual user level, Linux folks expect people to spend economically unrealistic amounts of time figuring out how to fix and set up their computer, and think that as long as there is a solution, there should be no consideration of just how ludicrous and belabored it was. Both share common cause in the notion that Linux, being a politically correct miracle of philanthropy, should be treated with different expectations than Mac OS X or Vista. (I’m not saying you feel this way, I’m simply defending my original thesis.)

    As to your second issue: I didn’t explain this very well. My point was that one of the reasons Mac OS X and Vista can get compositing to work across multiple monitors, even ones of different sizes, so seamlessly is that they started from scratch with their designs. The compositing on Linux is done on top of X11 as a mod, and it’s ugly as hell to get it working on multiple monitors, or configured correctly, if you’re not a geek. I had to do some serious hand editing of xorg.conf to finally get it working on my system, and I’ve never gotten the second monitor to work with compositing. This stuff just worked out of the box with Vista, and my computer was purchased before Vista was released. I’m sure hardware support is some of that, but the hardware support is actually surprisingly good for Linux. Why ATI bothers to write proprietary drivers for 2% of the market is beyond me, but they do, and they work. What is lacking is a remotely sane approach to modern graphics in Linux.
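    For readers who never fought this battle, the hand editing in question looked roughly like the following xorg.conf fragment (a sketch only: the identifiers are illustrative, the options varied by driver and version, and this is not a complete working configuration):

```
Section "Device"
    Identifier "Card0"
    Driver     "fglrx"        # ATI's proprietary driver, mentioned above
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "Card0"
    DefaultDepth 24
    SubSection "Display"
        Depth   24
        # One large virtual desktop spanning both monitors; a typo here,
        # or a size the driver disliked, could leave you staring at a console.
        Virtual 2560 1024
    EndSubSection
EndSection
```

    And every change to a file like this meant restarting the X server before you could see the result.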

    Now, as a geek, I’m willing to mess with Linux because it’s fun and I have severely poor time management skills and a complete inability to prioritize correctly. But if I were a more sane individual, I would be more than happy to fork over $100 to avoid spending hours messing with my computer.

    (For the record, I love Linux. Everybody thinks I hate it just because I think it sucks. First of all, I’m not dumb enough to complain about things I don’t care about. Second, if an engineer likes an OS, that probably means it’s a shitty desktop OS for a regular person.)

    • Linux is not a product. It is a public service.

      Good will towards others is never a wasted effort. The growth of human knowledge – the expansion of science – is far more important for mankind than can be measured or compensated for with money.

      You make the reality of the world you live in – to a significant degree –
      kindness towards others is often remembered and will often make a difference in who people want to associate themselves with.

      The larger dimension of the good that collaboration in working to create a commons that is useful to all – is the hope and opportunity and good will towards mankind as a whole.

      If the only thing important in this life was money – death, war, murder (and realistically much less extreme examples) would lead to a breakdown of society – you can’t place a price on civility.

      I am a “zealot” because I believe in giving something back to the world that can have a permanent impact – that is positive. To help the schools. To give someone a chance to gain employment, to start up a business setting up Linux servers or desktops (yes, desktops, it is possible) in situations in which they might not otherwise be able to be employed – if they are willing to learn – they can both help someone save while earning for themselves a small sliver of the billions of dollars accumulating in Gates and Allen’s coffers.

      But it’s not just about that – it’s about having a sense of doing your part to work together to make something many people can use. Linux is about the users themselves – plus the added benefit of helping others as “gravy.”

      The schools mentioned by another poster above come to mind.

      I’m also a “zealot” because I’m willing to work to help myself, as well as others, have more than one viable option on the desktop.

      Is it “better” than Windows for people that don’t like to learn the ins and outs of it all? What goes on under the hood?

      No – learning Linux isn’t a self-apparent experience. It’s an educational experience.

      That gift – the opportunity for you to learn – is there for you to take or reject.

      Perhaps you are pissed off because you don’t have a choice; your attitude is contributing to that becoming the reality.

      Meanwhile, we, the “zealots” are at least working towards a solution.

      Give me liberty or give me death.

      I think that the “Open Source” people had it wrong in trying to make a business argument out of public service. In the process they overplayed their hand and made a lot of empty promises.

      I don’t speak for anyone else, neither they for me.

      I can understand some anger or [disbelief at what you perceive as the delusional being very vocal while making somewhat empty promises]. Others are pointlessly arrogant. They forget they are working on a public good.

      However – for those of us who are doing our public service, for you and others, you come across as taking a piss on a volunteer beach-cleanup crew, or perhaps the Red Cross.

      James Leone

      • I appreciate your optimism and idealism. But idealism without integrity is useless. Too often people think that good intentions are enough. If truly good intentioned, you will not only care about the vision, but whether or not that vision ever comes to fruition. And if I’m right about desktop Linux being hopeless, then it is getting in the way of human efforts, and consuming resources, that could do more good for the world. There is no philosophy, James, where everything is good and all is happy. Working towards a solution for the greater good is not good enough. That work has to be demonstrably going somewhere, or else those in the effort are guilty of something worse than apathy: getting in the way. I know it seems harsh to criticize those who are attempting to do good, but that is the risk you take when you try to change the world for the better. If you are expecting people to change their lives based on your vision, that is perhaps a good kind of arrogance, but an arrogance nonetheless. With it comes the cost of having to justify it. Desktop Linux has, in no way, shape, or form, justified itself relative to the scale of the effort.

        I don’t expect to convince anybody. If there’s one thing I’ve learned, it’s that one should never underestimate the longevity of a bad idea or the credit given to those who are wrong.

        • “you will not only care about the vision, but whether or not that vision ever comes to fruition. ”

          I agree and I do. How that would happen must be thought out carefully. I do have some ideas; that doesn’t mean they will work.

          Mathematica: the effort that you describe – to create a replacement – how many schools are involved? Can others join you in the effort in a situation like yours?

          You have a skill I wish I had – the ability to program beyond using Bash.

          The more people involved the less work you would have to do to improve it.

          Another point I haven’t mentioned is that I have had a tremendous benefit using Linux on my desktop.

          There are a great many programs I simply don’t have to buy, because I have replacements for them.

          Some are easy and ready, some I had to earn. But the bottom line is that I earned my freedom.

          You can earn it too – it’s your choice really.

          If you were not able to afford Mathematica, but you had the replacement that is not as good, what would you do?

          You probably would work on improving it and solicit others to do so as well.

  12. Jonathan, you make lots of good points and I agree with most of them. Here are the ones I’d like to discuss:

    “However, there is only so much that can be done with X11. I think people are so impressed with what has been done with the 20 year old graphics system, that they forget to be embarrassed that you can’t even make trivial changes to the graphics configuration without requiring a restart of the X server.”

    As I replied to your other Linux post, X being 20 years old doesn’t make it bad. Demanding a restart after configuring is bad, I agree, but that is a design question. I don’t really know much about graphical systems, but I understood that X is really a good and flexible one. If I understood correctly, Windows NT technology, for example, is about 15 years old. Software doesn’t rot like old buildings; it gets used and tested. And what I’ve gathered is that most companies want to affiliate themselves with things (partners, employees, software) that have been around for some time and proven themselves.

    Steve Yegge wrote quite well about systems that require a restart in his blog.


    I must say that Windows XP reboots have bugged me a lot more than X restarts (and eaten up way more of my time). Not that that makes X restarts OK; they still make me rage.

    “Quit parroting Windows and Mac OS X. You can’t beat them by copying them.”

    I think there’s lots of innovation in the FOSS community (and among software developers everywhere), but what we see mostly is what the big boys think the masses want. And right now that’s Mac OS X, Windows XP/Vista, GNOME/KDE/XFCE, etc. Bigger distros rarely ship with Ion or ratpoison by default, and free OSs get most of their new users from XP and OS X, so they understandably try to facilitate the switch.

    “Are movable windows really the best way to present information; How much effort could be saved if we didn’t have to constantly be positioning and sizing windows? Why am I paying for pixels which only show my background picture half the time? Revolutionize the interface, don’t just copy OS X. This will require a small group to break off on their own to get this started, for the reasons I detailed in the main body of this post. In the absence of market share, attract developers by offering a unique platform.”


    Personally I use Ion3 and I can’t understand how people could be productive with non-tiling window managers. Overlapping window management is management by user, I’d rather have software do that.

  13. Heikki:

    Fantastic comment. I’m embarrassed to say I’d never heard of tiling window managers, and I’m really excited to find out that this is an area of active work. Thanks for the links.

    On the X11 age issue, I agree that age does not inherently mean bad when it comes to software. That was probably the software equivalent of an ad hominem attack on my part. I should simply state that X11 has some bad characteristics that OS X and Vista don’t, and point out that it would require something akin to a full rewrite to address them. I think that will eventually happen for the most part, but Linux just isn’t in a position to require developers to support a new graphics API, and so legacy X11 will have to be supported, which makes trouble for developers and hardware vendors. Amazingly, progress is hindered even further by a duplication of efforts between different camps (AIGLX versus XGL).

  14. I just recalled one thing which doesn’t solve the X-restart problem but should make it a bit easier. No one said X needs to be REstarted: you can always start another X session and see if the settings work. This isn’t a real solution, because the original X session will still need to be restarted if the changes are needed there too.

    In a nutshell, to start another X session (ALT-F1 goes to a virtual terminal; usually six of them are started, up to ALT-F6):

    startx -- :1

    Then you can switch between X sessions with CTRL-ALT-F7, CTRL-ALT-F8, and so on.

    Which keys these are depends on the number of virtual terminals, as the first X session is available on the key after the last virtual terminal (with the usual six, the first X session is on F7 and the new one on F8).


  15. In the world of technology in which I work (ERP, Project Management, Manufacturing, Business Intelligence, Business Process Management), I see the entrenched dependence on XP for client software disappearing. What is replacing it is a browser-based interface. An interface that (if you consider Firefox) is operating-system independent.

    What I also experience is the server side of operations becoming more and more Linux based. DB2, Oracle, and the Linux-native PostgreSQL and MySQL products are being deployed as hosts, with the logic behind the databases being PHP driven.
    Coming up shortly is Microsoft’s SQL8, where the database wars will start again. SQL8 will drive clients to their server platform, but to what extent is not known.

    Microsoft is diversifying as fast as it can into business applications, since it is known that the desktop war is a losing proposition. In order to hang onto its dominance, they must get OOXML approved, or else governments will not upgrade to the new Office products. OOXML is a life-and-death must for them.

    In the Linux world, Subversion is used to maintain basic applications common to all distributions. Contrary to what a previous respondent indicated, open source and forums do allow for group collaboration, and for the demonstration of the motto “No one has exclusivity on brains”.

    Last but not least are the virtualisation products, and products such as Wine and CrossOver, and another test product which is a Linux program that executes on XP, all of which make the wars pointless. I wish I could bring the Mac into this correspondence, but since I have no experience with that commercial OS, my feeling is the same: Firefox or another certified browser will provide operating-system freedom for big business.

    The rest of the discussion comes down to preferences and familiarisation.

    In closing, I just want to point out that 50 years ago, almost every car owner had a tool box in his trunk; he changed the oil, oil filter, battery and spark plugs himself. Today, with so many vendors out there, and so many models, and with fantastic reliability, we don’t give two thoughts to bringing the car into a service shop. It will eventually be the same way with your workstation/playstation.

  16. When we do not talk about price, we talk about customization. We cannot change a single piece of MS or OS X software, while we can tune any Linux software to our needs. I know that requirement exists, and so Linux software has room to live.

    I am then “astonished” by your statement that Linux is not a better development platform than Windows. Many people around me use Linux precisely because it is a better development environment, not because it is a better desktop. Look at Windows: there was DOS ASM, then MSC, Turbo C, the Win31 SDK, then Win32s, the Win32 DDK, MFC, then ATL, then .NET and C#, then the new driver kit in Vista. The development platform is *always* changing, and there is no motivation to develop good-enough software on Windows, since the source code will likely be legacy in just a few years.

    But I bet that you can still compile C source code written in 1993 with the newest GCC 4, though the code is 15 years old. For that reason, developers like Linux much better than ordinary users do.

    For 3rd-party software distribution, many Linux users simply choose the source code, and it is the easiest to install because it is universal across all Linux distributions.

    When we do talk about money, then I admit Linux does not have a great advantage. But the smallest advantage is better than NO advantage.

    I live in the 3rd world and I know that living on less than 2 dollars a day is quite common. Believe me, these people need computers too, and they would not even think about paying for Adobe Photoshop and Microsoft Office. While they are poor, they are not stupid: they know that the computer brings them creativity, and they use computers for a lot of useful amusement. This is what I have seen over decades, and I have to say it is true.

  17. Pan Shi Zhu:

    Thanks very much for your comment. I think my essay could probably use the attention of a good editor, because I don’t think my point about linux being a poor development platform was explained very clearly. I love developing on linux. I agree that the GNU toolchain is the best there is. I was talking not about the happiness of the developer but about the final product. I think linux is a rotten platform for developing software that other people have to use, or at least desktop software. Unix was designed for computer scientists, not average users, with software distributed in source code so that local changes could be made as needed. There is a lot of power in that model, and I love it.

    But consider something like Skype. The linux version is not the pleasure to work with that the Windows or Mac version is, and the reason for this is linux. Audio and video support on linux is difficult for two of the reasons I mentioned above: first, there is too little standardization of audio on linux, and second, the market share of linux is such that hardware driver support is spotty and generally done by third parties. This is not a good environment for the developers of Skype to find themselves in, and do you really think that they are given the resources to properly deal with the problems of the linux version?

    As for your cost argument, I have to admit that I may be wrong there. My position was that for people in the third world, getting a computer is the hard and expensive part, and the OS cost is not the problem. If you buy a new computer, the OS is rather cheap as a proportion of the whole computer. If you buy a used computer, or receive one as a donation, you inherit the OS originally purchased.

    Remember, I was ONLY talking about the OS. I agree that the economics are there for free software applications, such as the replacements for Photoshop and Office that you mentioned. In fact, I’ve argued that the great pity about linux is that it takes development resources away from more useful open source projects. My opinion is that the open source community would do more good by focusing on developing free applications for mainstream operating systems.

    I’m not saying kill linux, just quit wasting time trying to make it usable for the masses. It will never happen. Ever.

  18. All that talk could boil down to a simple reason:

    Windows and Mac OS are products. Linux is software.

    Windows and Mac OS include everything: kernel, core system, userland, drivers, common applications.

    Linux is a kernel that you put a GNU userland and X11 on, and then install some GUI like Gnome or KDE – or none of those, and just use the terminal.

    That’s the main difference. If you want to see Linux on the desktop, it should be delivered as a complete, supported product, with a defined vision. All those distros try to deliver it, but how hard is it to deliver a defined vision when you depend on upstream projects that you don’t control?

    In my opinion, Linux’s development model is that of software, not of a product, and as such it will never be as widespread.

  19. Windows is cheap, while Windows software is not; there is much wonderful free software developed for it, such as foobar2000, ExactAudioCopy, and 7-zip, and no commercial product has beaten them so far. Not to mention vim and emacs, which work much better in Linux than in Windows and OS X. You cannot just say all software in Linux is inferior to its Windows counterpart. If the author of a program uses Windows, he will most likely make the Windows version the best one; software developed under Linux runs best in Linux.

    Linux will never be made usable for the masses; I personally regard this as common sense. And I agree that developers should concentrate on application development.

    However, how much effort has been “wasted” on developing Linux itself? I don’t think it is that much.

    Most effort has gone into application development, not the kernel itself. For example, KDE 4 now supports OS X and Windows; this is an indication that the development team regards it as “software”, not “part of Linux”.

    If you mean there has been too much effort “wasted” on making distributions: no, I don’t think so. Ubuntu employees are not making it for free, since they get paid, as are those at SUSE and Red Hat. Would they still do it if they were not paid? They might do something else.

    And yes, there are many other distributions out there for free. Their makers build distributions just for fun. Then I would ask: what do we do in our spare time? Sit in the sunshine? Drink coffee and talk with friends? Go traveling? Those hobbies are obviously a “waste of time” too; should they be prohibited?

    My point: many people develop or distribute Linux just as a hobby, just because they like it; things you think are a waste of time may be the most important things in their lives. If developing something else cannot give them that level of happiness, they will choose not to. No, not even if that something else might be more valuable to the world.

  20. Pan Shi Zhu:

    You make good points. Again, I was being sloppy with my words: I’m not talking about the kernel, which is a major contribution to the world and a beautiful thing. I’m talking about the distros, but I’m also talking about all the ancillary development that goes into the desktop: GNOME, KDE, all the X extensions to make compositing and DRI work (sort of).

    I’m not sure the people working on GNOME or KDE would say they are just doing it for fun. I think they believe they are doing something useful. Maybe they are. I’m just arguing that they could be doing something even more useful.

    Now, since my whole point is not absolute, but based on relative opportunity costs, let me ask you this: what would be more useful to people in the third world, a free OS that replaces Windows XP ($100) or a free scientific computing package that replaces Mathematica ($2500)? What is more useful to developing countries: having pretty windows that wave around when you drag them, or reliable audio so that voice chats will work?

    • Hi Jonathan –

      Hey first let me say that every time you mentioned that you could improve your essay in some way – I liked you more, and not deviously.

      I too am disappointed by the gap between what I hope the Linux desktop could be and what it is.

      A minor point here – I didn’t bother to look – but I’ll bet you someone out there is trying to make a Mathematica replacement.

      What muddies the water and confuses the issue is the idea that Linux is a business opportunity – “dot com” bubble thinking.

      From my viewpoint it’s like trying to sell off the Goodwill as Macy’s. (Not saying you are.)

      And I’m not trying to misframe that on to you when I say – that pondering over KDE and Gnome – why are they separate – is almost akin to pondering the same of Goodwill and Bargain Box.

      (Although in all honesty – I don’t feel KDE 3 is in a quality class with either, but rather a public-service class.) [But I’ll say that about KDE 4 and Gnome, well, just to be more funny than serious. {I am somewhat serious}]

      James Leone

        • If you want Mathematica to be free, free it:

          Well, if you want to sing out, sing out
          And if you want to be free, be free
          ’Cause there’s a million things to be
          You know that there are

          And if you want to live high, live high
          And if you want to live low, live low
          ’Cause there’s a million ways to go
          You know that there are

  21. WRT the users in developing countries issue (I live in one),

    Having a system which works on really low end systems is a godsend. Mathematica isn’t really a question in most places, it’s the office suite which matters. Oh, and ERP type applications.

    Voice chat isn’t exactly relevant (cell phones are dirt cheap). VoIP is nice, but that’s in corporate environments where you don’t want to spend on a physical device.

    Quite a few organisations use Linux desktops, and many are moving towards it here.

  22. How come rhythmbox, banshee, etc. all have no issues playing audio on linux, but Skype does? Is it that rhythmbox et al. are using some undocumented APIs, or is it that the Skype folks are just not doing something right?

    I’ve been using linux for a while, and with every new release, it never feels like the thing is going backwards. So while you do make valid points, I don’t feel frustrated enough to kick it out and go get OS X. And I actually feel comfortable recommending it to folks who have well-defined needs (like small businesses) for which good programs exist, and to people who are willing to make the effort to understand the software distribution model on linux (centralized), etc.

    I live in a third-world country, and earn less than $2 a day. But thanks to FOSS, I know web development, I am learning C/C++, and I consider myself better than the average computer user. I remember the old days when I used Windows, and used to save for months just so I could buy Visual Basic Standard Edition at $110… you can’t imagine my joy when I found out there were other development tools out there, much less an OS which I could use without feeling like Captain Jack Sparrow 🙂

  23. > We’ve already got two mediocre windowing hierarchical-file-system-based operating systems aimed at general purpose use. We don’t need a third.

    I currently use NetBSD and Slackware Linux. Neither of these systems is suitable for the so-called ‘desktop’, but they’re still very useful. This is why we DO need more operating systems. I simply cannot have the control I need over my system with Windows/OS X compared to NetBSD.

    > Desktop Linux is severely hobbled by its UNIX roots and lacks the ability to undergo significant rewrites.

    What’s wrong with the UNIX roots? I use a system that can be traced with little difficulty back to the Unix Time-Sharing System, namely NetBSD. I also sometimes use Slackware Linux, usually considered the most UNIX-like distribution.

    > Are movable windows really the best way to present information; How much effort could be saved if we didn’t have to constantly be positioning and sizing windows?

    The solution has already been mentioned once – tiling window managers. I use dwm, and it’s very hard to go back to traditional window managers. Tiling is vastly more sensible and intuitive.

    • Colonel Crayon – a NetBSD desktop, preinstalled with KDE for web users, is – to my mind – just as ready as Linux.

      Getting NetBSD ready when it arrives in the form of a CD is a good mission-impossible challenge, however.

      Compiling KDE, drinking Cocoa next to the fireplace. Those were the days.

  24. Devdas: Thanks for the feedback. I guess I was presumptuous about what people in developing countries need. Thanks for educating me. I do think, however, that whatever effort goes into GNOME and KDE and all the trappings of desktop GNU/Linux (which I think amounts to a lot more effort than people realize) would be better spent on making OpenOffice better and targeting it at Windows.

  25. nucco: The problem with Skype is that it has to use microphone and video inputs. Total mess. I don’t think they did anything wrong.

  26. Colonel: Nothing wrong with UNIX roots if you want UNIX. I love UNIX. I use it every day. But it’s a huge problem if you want to make an operating system your grandmother can use without having to call you up every day. I applaud your courage in recommending Linux to nontechnical people. I’d be too embarrassed to recommend any Linux distro to anybody who wasn’t an engineer. In fact, I did just install it for a guy who WAS an engineer, and he can’t deal. He keeps coming into my office asking me how to do things. It’s just not ready for the mainstream yet. It never will be. It’s still UNIX.

    Now, Apple has the resources and the guts to take UNIX and make you forget it’s UNIX. But just look at how much they had to chuck, and how much they had to rewrite. That kind of effort will NEVER happen in the Open Source community. Not enough vision, not enough direction, and not nearly enough discipline.

  27. I don’t recommend Linux to ‘desktop’ users, with the possible exception of SUSE. It’s slow, but it works. As for the proverbial grandmother, I think that NetBSD would be perfect – assuming that I set it up for her. The *BSDs are, by and large, server systems, so they’re very stable. Once I got Grandma’s computer set up, she could use it forever.

  28. To begin…I am of the belief that even the antithesis to your ‘most idealistic Birkenstock-wearing Linux hippie’ would recognise:

    • Those who work on Linux probably enjoy doing so.
    • That the exploration of free thought is paramount.

    Why should the Linux community be singled out for “not making money with the time they have spent on Linux” and “donating it to their favourite charities?”

    The vast majority of people within the community spend their personal time on Linux. They choose to code. They choose to test. They choose to struggle, overcome and achieve.

    It is no more a tragedy that these people choose to do what they do than that my neighbours choose to play golf, watch soap operas, or spend their evenings dancing and drinking…. when they could be working on a cure for cancer.

    So, that is the first point. I think you are trying to make a comment about political correctness…. but it appears to me a statement that is anything but politically correct. You are tilting at windmills.

    I may have misread this though, and would appreciate the clarification, because I think this is one of your key points. I share your frustration that as a society we are wasteful and could achieve more. However, we all must learn to respect each other’s personal choices, freedom of thought, and freedom of expression. I very much doubt anyone is being distracted from the secrets of cold fusion by the lure of bash.

    The second point; I think you are trying to compare the incomparable. It is like you are trying to compare a section of a library to a particular book. Or better yet, the graffiti on the wall outside the library to a book. Linux is not a desktop environment. However, the desktop environment aspect of Linux is comparable – so let’s focus on that.

    You are right that Gnome and KDE only serve to confuse the issue. Personally I would like to see a single unified desktop environment supported and developed by the major distributions. Target this criticism at the major distributions. I expect their response would be twofold:

    • That they have very little leverage over the kernel developers
    • That the other desktop environment is in some way inferior.

    So the kernel will continue to support the known-good, the lowest common denominator, and the distributions will have to build a platform and API model on top of that. Meanwhile the divisive issue of the Qt libraries versus GNOME will continue to be unresolved.

    Point three, a coherent vision isn’t necessarily the best thing. (Hail the anarchy!) On a more serious note; more cooks may not mean a better cake, but it almost certainly does mean more cake.

    I firmly believe that the Linux community is building something whose benefit we may not fully see for many generations. If since the dawn of time all men (and women) had spoken the same language, we might have achieved more, but I’d never have had the chance to be seduced by a French accent.

    Point four, on the poor performance of Skype and Google Earth. One wonders if the reason they behave poorly is primarily due to the hardware implications they have to overcome, or because of their own code. No doubt a combination of both is at play.

    However, [in my experience] over the last few years the increase in hardware recognition has been astounding. Whilst the number of desktop users may not be rising, the number of targetable desktops clearly is. Suddenly, I am reminded of Kevin Costner in Field of Dreams.

    I completely agree that there are developers who consider applications as serving Linux, and not the other way around. For a further example of this, read the threads on whether Amarok should be ported to Windows.
    There are many who believe that the primary purpose of an Operating System is not to serve programs, but to ‘Operate a System’, and damn the developers; they’ll have to accommodate us. In the free-software world these folks can be even more militant.

    There is an economic argument about striving for perfection and failing, as opposed to aiming for the best plausible result and achieving a passing mark. In the world of free software there are folks who will always strive for perfection, as it is a matter of liberty for them. A beautiful kernel does not have to be the exception, but can be the motivation. So top of the wish list are an audio model and a video model, right?

    Point five, the upfront cost in dollars and donuts of an Operating System cannot be considered the only cost measure. Many users choose Linux because they don’t want a box that phones home every hour. Was it Tyler Durden who said the things you own end up owning you?

    The point is that there are continuing costs associated with any purchase. There almost certainly is a happy medium between the costs of time and money and the benefit of liberty. Consider how the Ubuntu variants are trying to push towards this with their proprietary-codec-friendly distributions.

    But is it only the geeks that care about libre software? At the moment, but I doubt it will be that way forever. Already we are seeing some governments considering the implications of restrictive software.

    I completely agree with you that Linux won’t always be able to rely on the dominant OS being terrible. Contrary to what appears to be the overriding opinion on the internet, Vista is not bad at all. I expect the next Microsoft release to be even better. Surely it is only a matter of time now before Google is offering us all an online desktop. Which, because it is made by Google, will be seen as cool.

    Still, even though there is nothing else around that can do what compiz does, there is room for improvement. I am sure that as the hardware-recognition issue recedes, the desktop environment will improve.

  29. John: Thanks for the comments. You make some very good points. First, to clarify: I would never begrudge anybody for working on linux because they are enjoying it. [Rereading what I wrote, I see I didn’t state that at all, and I should’ve.] Working on hopeless projects for the fun of it is part of the joy of programming. (I’m working on one of my own right now, in fact: a modern, encrypted GUI version of the beloved old UNIX ‘talk’ function that lets you see what the person is typing live. It’s hopeless because I can’t be bothered to do it correctly and write a server to punch holes through firewalls, so it will only work for people with real internet connections who can establish direct TCP connections.)

    I was taking issue with the oft-stated justification for linux that it’s somehow helping. If somebody tells me they are doing something for the public good, I think that opens them up to argumentation on whether or not it’s really in the public good. And if they are being sincere about it, and not simply patting themselves on the back with sweet nothings, then I’d assume they might be interested in having somebody point out to them the possibility that they are wasting their time. If they really are doing this for society, then their enjoyment of the development is secondary, and I’m certainly right to point out that their time might be better spent on other things. Of course, I may be wrong in my conclusions about the usefulness of desktop linux, but that’s a separate issue, and I’m perfectly willing to have my opinion changed on that (and it already has in some ways). And frankly, I’d rather be wrong on this. If desktop GNU/Linux ever gets its shit together it would make me really and truly happy.

    Furthermore, I’m not suggesting everybody should spend all their time trying to cure cancer. Not by a long shot. I hope I didn’t come off as somebody sanctimoniously trying to insinuate we should all be trying to save the world all the time. When I mentioned political correctness, I was using it as a bludgeon, with the assumption that those I’m criticizing probably do buy into it. I personally don’t. I’m just saying that if somebody dedicates a lot of time to trying to cure cancer, but they suck at it, somebody should probably tell them.

    Having said that, you really make some good points. I have been looking at desktop linux through the timeframe of normal OS development. If the nature of FOSS is that it takes decades longer but eventually achieves something truly great, something that neither Apple nor MS could ever do, then I would wholeheartedly support that. Sadly, I don’t think that’s the case. I think by the time linux gets where it needs to go, the idea of a desktop OS will be obsolete, and the FOSS crowd will go about trying to recreate whatever is de rigueur at the moment, achieving just as little success then as they are now (assuming my brilliant argumentative prose hasn’t killed off the movement entirely by then). So, if Google really provides the online desktop, someday, then I’m sure there will be a FOSS project parroting it, except not as well.

    Though you bring up a really interesting point: if software goes completely into the cloud and truly takes advantage of its capabilities (e.g. massive connectivity to varied sources of data, access to massive parallel computation on-demand), then FOSS is probably just plain dead. Google could open-source its entire damn codebase, and what the hell would I do with it? What good is it unless you have the billion-dollar distributed computation and database infrastructure? We’re already seeing this to some extent: where is the FOSS version of GMail? It can’t exist. I’m not sure my computer can really search 5 GB of junk as quickly as it gets searched in those few milliseconds when I’m borrowing Google’s massive infrastructure. Maybe it could, but my point is that we’re already at the edge, where we’re getting personal applications from the web whose physical assets are what truly enable them.

    Google Earth is perhaps the best example of this. There will never be a FOSS version of that. It’s impossible. Google could give away the code and data to that, and nothing would change. That must be tough for the really hard core free software folks. I wonder if RMS really never uses Google Earth, or if he does furtively at 3 am in his basement, awash in conflicted feelings before taking a three hour shower as he tries to wash off the shame.

    • Where is the FOSS version of GMail?

      I just wanted to ask you (as politely as possible): did you know that Google runs its business almost entirely off of FOSS?


      This is complicated stuff with a lot of nuances. I don’t expect you to know everything. {By your tone though – some might – but I like it friendly}

    • I forgot this:

      “It’s hopeless because I can’t be bothered to do it correctly and write a server to punch holes through firewalls, so it will only work for people with real internet connections who can establish direct TCP connections.)”

      Give Wireshark a try – it will display network traffic.

      I did this and was able to send scans to a multifunction printer.

  30. While I agree with your basic ideas I was surprised to see that you have not even mentioned what I consider to be Microsoft’s single most important advancement for their platform: .NET.

    .NET is the reason we (a small to medium company) have ditched Linux. With .NET we can use better programming languages (F#), better IDEs (Visual Studio with drag and drop GUI designer), have reliable deployment and a vastly larger and wealthier user base.

    In contrast, Linux has some nice languages like Haskell and OCaml but they are completely uninteroperable with each other and with almost all libraries due to the complete absence of a common language run-time. Deployment is unreliable on Linux due to brittle binary interfaces (we pulled a Linux-based product two years ago because it was so unreliable and have never attempted to sell Linux software since). Finally, Linux being good for people who do not earn a significant income makes it bad for software houses like us. Linux doesn’t just have 2% of the desktop market, it has the poorest 2% of desktop users.

    The problem is that .NET is an essential foundational technology that the vast majority of desktop software must be built upon if you are going to be competitive. Desktop software on the Linux platform just keeps building on sand. For example, consider that by far the hardest part of using GTK+ or Qt from any modern language is the native-code interoperability because those core libraries are not even written in modern, safe, garbage collected languages. Moreover, native-code interoperability for high-level code like GUI libraries is a complete non-starter in the context of multicore computing. The only future-proof way to build such software is to have both the libraries and applications running with a concurrent garbage collector like the one in .NET.
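    As an illustration of what that native-code interoperability burden looks like, here is a minimal sketch in Python using the standard ctypes module (my example for illustration, not the commenter’s stack): every function crossing the boundary must have its C signature restated by hand, with nothing checking it at compile time.

    ```python
    import ctypes
    import ctypes.util

    # Locate and load the C math library (libm) at runtime; the fallback
    # name "libm.so.6" assumes a typical glibc-based Linux system.
    libm_path = ctypes.util.find_library("m") or "libm.so.6"
    libm = ctypes.CDLL(libm_path)

    # The caller must restate the C signature by hand: double cbrt(double).
    # Get it wrong and you typically get garbage or a crash, not a type error.
    libm.cbrt.argtypes = [ctypes.c_double]
    libm.cbrt.restype = ctypes.c_double

    print(libm.cbrt(27.0))  # 3.0
    ```

    A binding for a toolkit the size of GTK+ or Qt multiplies this by thousands of functions, which is the maintenance burden being described; a shared run-time makes such declarations unnecessary.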

    That is why I think Linux is completely screwed on the desktop. I can well believe that Linux’s desktop market share has been in decline but the ubiquity of multicore computers is about to really drive it into the ground.

  31. Great article… true words.
    I see all the time how MS tries to attract application developers with their technology stack, and that’s what it is all about. You got it.

  32. @Jon Harrop:
    The fact that you program for .NET probably means you mainly do high-level application stuff, so one can understand your aversion to lower-level languages. But native-code interop? Do you mean writing a program with a combination of more than one programming language? Or using a more modern language with the toolkit? Python? C#? Java?

    I also wonder what multi-core CPUs have to do with GUI toolkits.

    I use linux every day, and aside from well-known issues (like multi-head still being difficult to accomplish), it is stable. If the OS and its associated software were so unreliable, I would have gone insane a long time ago, or switched to something else. {Nvidia and AMD manage to release binary drivers for their hardware. This is an inherently more complicated thing to do than application software, and I don’t see them complaining.}

    If Linux had only the poorest 2% of the desktop market, I’m pretty sure the movement would have waned by now.

    But wait, it’s getting better, and stronger, even in the face of quad-core CPUs and GPUs that can double as a general-purpose CPU… I guess actions speak louder than words.

    Sometimes people make it sound like linux is the only flawed bit of technology out there… Seen bluetooth programming in Windows? Seen the filesystem issues the git developers ran into whilst porting git to Mac OS? Programming the Windows API is a black art… .NET exposes only a subset of that API… I have to bend my C/C++ when using Visual Studio… etc etc.

    Curiously, the challenges I find in linux actually inspire me to empower myself to do something about them, since apparently, the only thing stopping me is my knowledge. That’s what I love about it, and that wouldn’t change even if I were a millionaire.

  33. To Nucco:

    We specialize in scientific computing so we use a mixture of low-level languages like Fortran and assembler for simple performance critical code and high-level languages like C# and F# for interactive technical computing with real-time 2D and 3D visualization. We started out on Linux in 2004 and have since migrated to Windows because the developer tools (e.g. F# and Visual Studio compared to OCaml and Emacs) are vastly superior and the market is vastly more lucrative (e.g. we earn more money from our F# software every day than we did from our Linux software in total ever).

    The common language run-time upon which .NET is based makes it really easy to call libraries written in other .NET languages. For example, .NET provides a wealth of libraries like Windows Presentation Foundation (WPF) that are written in C# but we can call them transparently from F# without having to worry about copying, marshalling, message passing, shared data structures, manual deallocation, type safety, segmentation faults and so on. That is almost impossible when calling between high-level languages on Linux. Moreover, this was the last nail in the coffin of one of our Linux products, Smoke:


    Smoke was a high-performance 2D vector graphics renderer, much faster than Microsoft’s Windows Presentation Foundation. We would have loved to develop it into a competitor for WPF, to give Linux a completely hardware-accelerated desktop environment with high-fidelity graphics. Our productivity using OCaml and OpenGL was incredibly high and we could use the codebase to develop amazing applications really easily. But the project was an abysmal failure for several reasons. Firstly, our compiled OCaml code did not run reliably on customers’ machines due to subtle differences in core libraries and OpenGL drivers across Linux distributions, so we could only sell in source form. Secondly, without a common language run-time, users are forced to write their code in OCaml, and few other people want to do that. So we shelved the original complete application (Presenta) and tried to sell source licenses to Smoke. Then we gradually reduced the price, but we have never sold a single copy of Smoke. Now we are giving Smoke away in the form of compiled OCaml bytecode and people are not even using that, because it is so uninteroperable.

    We clearly have very different observations about the state of Linux. You herald the closed-source graphics card drivers as an example of great commercial code for Linux, but I would hold them up as an example of exactly the opposite. The ATi Linux drivers are so notoriously unreliable that we have always avoided their graphics cards. nVidia’s drivers can be vastly more reliable but are a huge amount of work to install and maintain, and (as our product showed) third-party closed-source software cannot rely upon them to work consistently across Linux distros. Installation of nVidia’s own package requires the user to manually select the version of GCC that their kernel was compiled against. If that isn’t pain enough, this machine here actually requires me to recompile and reinstall the nVidia drivers from scratch by hand every time I reboot or X will not start. So I think graphics drivers are an excellent example of Linux doing an extremely poor job of supporting commercial software, even from major vendors and even when it is of paramount importance for almost all desktop users. If nVidia cannot make their Linux drivers user-friendly, what hope is there for a small software house on the Linux platform?

  34. I work with both XP and linux. I have tried to work with Vista, which I enjoyed for a week, but Vista requires more power than the standard desktop or laptop can provide, so I abandoned it.

    XP has a legacy of rich applications. Sharepoint, a collaborative product from Microsoft, creates a dependency for MS software. That keeps me there.

    The previous author Jon Harrop revealed his company’s frustrations with development under Linux, and I have to say he was right. Linux two years ago was as he experienced.

    But the hardware and software worlds are evolving at a very progressive rate. Two years ago, dual core was an expensive solution for a workstation. Shortly, quad core will be common and affordable.

    As the need for compute power grows, I am forecasting a new architecture, perhaps 64-bit instructions with 128- or 256-bit data buses, to overcome the current cumbersome Intel architecture. I believe Linux would be more scalable.

    As for software development, there is the Eclipse workbench, common to both Linux and Microsoft, which was not available even for linux two years ago.

    And in forecasting Linux’s growth, we should look beyond the USA, which will not be a significant manufacturer of new software. Emerging countries with low wages will do the engineering and development to produce newer functional software, perhaps even a Linux replacement. The Linux market in Eurasia, Africa, South America, India and Russia is growing because foreign countries do not want the USA to hold them hostage to operating systems, to hidden backdoors in encryption code, or to the delivery of OS patches. Microsoft is going to have very major competition from these emerging software engineering countries.

    In my field, ERP systems for manufacturing and big business, this has already happened. The best ERP software is from Germany and India.

    So what am I getting at? My answer is that Linux will expand more quickly than will Microsoft’s desktop market. It will be used where it is a low-cost open source competitor – in cell phones, blackberries, the Eee PC and other products. It will be used ubiquitously in the countries I indicated, because computer science university courses are using Linux as a classroom product. The next generation of IT people will be biased towards Linux.

    Even Microsoft realizes that fact, and if you track Microsoft’s growth strategy, it is to concentrate away from the desktop. Microsoft is trying to provide their own ERP system, and to enter the search engine market to compete with Google and Yahoo. Microsoft is entering vertical markets which until recently they have always ignored.

    So, for now, as far as the desktop is concerned, the buck is in MS and Apple’s pockets. But look at RedHat, Novell, and Ubuntu: three formidable competitors who I predict will collectively own the desktop within the next 5 years.

  35. I tried Redhat 5 back in the day (this is previous to their current numbering, probably 1998 or so) and had trouble installing Netscape and setting up my NetZero dialup connection. Years later (2001?) my brother gave me a copy of Fedora Core 2 (Red Hat) and rather than trying it I ran Knoppix to see how things really were. Success. It replaced Windows ME with flying colors in every way.

    I am truly sorry Linux hasn’t been a better experience for you. I am a Windows SysAdmin but prefer Linux greatly. I am NOT a programmer. Due to your hardware, OS choice, or expectations, Linux is not appropriate for you. You told us on your blog. I also regret the Linux True Believers who so lack in tact and reasoning powers that they think there is One True Way.

    That being said, I will now move on, because your post reflects either a lack of knowledge/experience or is intentionally misleading. xorg.conf is nothing like what you say: it is optimized for human editing and not complicated at all. I do not recommend Linux for anyone’s grandma (yet). They INTENTIONALLY use a ‘flat file’ rather than crazy registry settings for this stuff. I would use Linux even if it was XP without the Registry, because the Windows Registry is ugly, stupid, dangerous, inefficient, inappropriate, awkward, convoluted and unnecessary. I say this as a Windows SysAdmin who has to use regedit and finds it easy but self-destructive. Please, next time you criticize Linux, compliment them on using flat files for configurations (unless you use Gnome, then just say it’s more stable!).

  36. Bang on target. Standardising is the way to go. Every time I try to shift to linux, by the time I finish configuring everything I am dead bored of even hearing anything about linux.

  37. There is room, and a place in the marketplace, for everybody. Microsoft will go where large capital investments and strong patent laws protect it; they will dominate their market segment and their populace with patented, expensive, secretive, restricted software. They belong selling to lazy, indolent, rich American markets where people solve problems by throwing the cash they have at them.

    Linux systems will invade Europe and the rest of the less well-to-do world, where problems are solved by tackling them and wrestling them into workable solutions. European and third-world children will learn the open source languages and, in the long term, innovate in ways far beyond anything the now-dying Microsoft Corp. ever imagined, simply because of the huge number of minds sharing in the development on the internet!

    The Chinese have yet to enter the race! Right now they are content to steal copies of Microsoft’s endeavors, but sooner or later, the brighter among them (and there are a lot of them) will take hold and, probably using Linux if not a ‘cracked’ version of Microsoft software or a methodology of their own, simply outstrip, outproduce and inundate all computers with amazingly clever software at ridiculously low prices for all people.

    Microsoft is at the far end of its business curve and will soon diversify into areas of investment banking et al. Once they find that they can realize a bigger, better profit than they are getting from software, and other software is replacing them in the relatively ‘small potatoes’ software world, they will go in another direction, as many American corporations have in the past. The writing is on the wall; it is only a matter of time . . . a short time I hope, and they will drift away into brokerage houses and high finance, where they belong now.

  38. Uncle B: Are you suggesting that mass amounts of innovative, cheap software is about to flow into the world from Europe and China? Maybe you’re right about China, but did France just get computers last week? I have tremendous respect for European programmers; they are the best in the world. But Europe has had Linux for a while, and over a decade to come up with cheap alternatives to the commercial software that you think plagues us, and it hasn’t happened.

    Since you don’t like finance people so much, I’ll list some other industries that are dependent on commercial software and are ripe for the picking. Academics are addicted to MATLAB and Mathematica, both of which are incredibly expensive. Everybody uses Excel. The visual arts and industry use Photoshop and Illustrator. Not a single free or cheap alternative has been able to mount a viable challenge to these, and it’s as much due to quality as it is proprietary lock-in. Maybe, just maybe, good software development is expensive?!?

    The real problem is that while it may be cheaper to USE linux, it sure as hell isn’t cheaper to develop for it. And THAT’s what matters. (You should try actually reading my post some time. It’s not that bad.)

    The free/cheap software revolution is always juuuuust around the corner, isn’t it?

    • Jonathan – 1 hyena vs 1 lion – the hyena loses. That’s why hyenas circle a lion in large numbers. Then the hyenas eat.

      China has a national goal of software independence. That’s a lot of Hyenas.

      Now – if everyone were able to give me a penny – I’d be very wealthy. Then if I was able to copy those pennies as easily as software can be copied – well – now do you see the writing?

  39. Obviously a huge problem for the Linux desktop is usability. Open source stuff is too often sorely lacking in that department, and I do understand why. It’s just pretty difficult to do it well without investing some money in it. You mostly don’t get normal people to participate in usability testing without paying them.

    • I agree 100%. The OSS community seems to think the only thing you need to build software is a geek and a computer. Having seen the physical testing infrastructure involved with software development at Adobe, for instance, I can see why Linux fails so miserably.

      • First, the OSS community is not the Borg. Secondly, it would be safer to say: a lot of geeky hyenas with a lot of computers.

        But why add the geek part??

  40. I think it can reasonably be deduced that linux and OSS means a lot of different things to different folks, and many of these folks have the resources, skills and/or time to keep it going forward. I don’t think any more argument is necessary.

  41. Personally, I think if a company like google created a unix-based operating system, it wouldn’t fail, because they have the reach and the power to make it popular. Sorry if it’s a little off-topic, but I really think that there should be 1 simple and popular linux.

    I use ubuntu and opensuse, and I think they are too complicated for an average person.

  42. On July 16, 2009, I had to admit to myself that I had given up on the idea of keeping our accounting firm off the Microsoft upgrade treadmill.

    I’ve saved the company the cost of two upgrades from the time we went with Windows 2000. I invested a lot of time to avoid this day – I really hoped that Wine would catch up rapidly and provide application compatibility in Linux.

    Finally – about a week ago a client gave us a backup of a Quickbooks 2009 file – and the copy we had couldn’t open it, so we had to purchase a new version just to open the damn thing. And even though Intuit keeps coming out with new versions of Quickbooks, the quality of our clients’ accounting records is never any better.

    So, having to buy a newer version of Quickbooks Slow every year just to access our client files annoys me to no end. They really don’t change the software significantly each year – but they purposely change the data file format so folks like us have to get a newer version. Meanwhile the program gets slower every year.

    What pisses me off secondly is the fact that anyone could use GnuCash to do everything they need to do to keep a set of books. If they would, both they and our firm would save money.

    The truth is that even though Quickbooks looks easy, I’ve seen some clueless clients completely misunderstand how to use it. So even though it looks straightforward, that doesn’t ensure that the output will be correct.

    It’s not the ease of the interface, but rather, the accounting skill of the user that ultimately determines the accuracy of the information in their records.

    But since most people don’t have faith in free software, or lack the confidence to try Gnucash, I have to pay Intuit $200 every year. It would be much better for both of us, actually, if they just gave me the source code. I’d compile it in Windows 2000 – wait a few days – and it would run in Windows 2000. Scratch that – I’d compile it with Winelib, or in Linux natively if possible. But for all practical purposes, Windows 2000 would give the best chance of success.

    I’d then give Intuit the binaries so they could put it on a CD and sell it to other customers. I’d still pay Intuit for the software – as much as it would irk me to, given I have other tools available (Gnucash) – and that an operating system, is an operating system, is an operating system. Once you can boot into a window manager and the kernel has a way of running apps, everything else is just academic. It’s just a matter of making the effort to get the software to run elsewhere – or giving someone the opportunity to do it for you.

    So as a result of being forced to upgrade, I next have to deal with an odyssey of getting Vista to install. Fry’s is out of the upgrade CD – but they still have 3 copies of the regular version on the shelves, and it’s $100 more expensive than the upgrade. The software is locked up in a cabinet – you have to ask an employee for permission to grab the DVD, walk out of their sight to the large row of cash registers and pay – vs. just grabbing it and paying. Stupid – makes a person feel like they are unimportant.

    Thank you for letting me touch this packaged box of bullshit.

    Since I initially can’t get what I want to save the company $800 overall, I go to the Internet “Microsoft Store” and place an order – but it has no way to separate out a billing and shipping address. It says billing address – you put it in – it says, “We don’t ship to P.O. Boxes.” I say, “Fuck you, it’s a download.” It can’t hear me. I try to place the order a few times – but the order form lacks the ability to have a shipping address that is different from the credit card billing address (which is always needed to approve an order). I ended up alternating the office and credit card address between errors, and to my surprise, the order went through.

    Hours later, I still hadn’t received an email confirmation from the Microsoft store saying “start downloading.” I mean, this should be an automated process. Everyone else can respond reasonably – just not Microsoft.

    A day later, still nothing. Must be the billing address thing, I thought. I buy again – still no instant access, so I call. Half an hour on the phone to have the guy make me repeat the info already given on the web site. “Ok, I’ll release the product.”

    Huh? I thought I was a customer. Being from the Linux world, I’m not used to being treated as if the software I was going to use required an interrogation, with subtle hints communicating that I was a bad person and would continue to be assumed as such.

    10 minutes later I get a call from the Microsoft Gestapo to “verify my order.”

    Let’s see, Bill Gates is a multibillionaire – but he’s got to have so much security around his product that I have to go through all this bullshit. God forbid he loses a penny.

    So I go through another interrogation verifying the information I provided, as if I hadn’t paid them for their product. Keep in mind that it’s a product I don’t even want, or believe should be necessary.

    I start to download the Vista installer. Halfway through the hour-long download, one of the three download files stops and can’t resume. I then download each file one at a time and finally get the damn thing an hour later.

    And get this – the application that decompresses the installation files won’t run in Windows 2000. I later found it won’t run from the DVD of Windows Vista Ultimate upgrade either. Even though Microsoft advertised that Windows 2000 owners are eligible to purchase Vista at the upgrade price. The message I got was: you can buy it – you just can’t use it.

    If I were a good little runt and had paid for Windows XP, the Microsoft Gods would allow me, the piss ant, to decompress the files with their utility. But because I’m not overly concerned about giving them $200 every couple of years, my money’s good, but my satisfaction as a customer is unimportant.

    After fooling around with 7-Zip I was able to decompress only a portion of the download file, and I can even get setup to start – but I can’t get past “Can’t find the install.wim file.”

    After installing Vista – I don’t find it to be an improvement, but instead, a big nightmare. First, it’s taking up not 5000 MB like Windows 2000, but almost 10000 MB of space – the entire size of my partition – which means I can’t install anything. I never imagined an operating system requiring so much room and so much more RAM while providing so little additional benefit besides some eye candy, which isn’t very good. And what did I get for the extra 5000 MB? Nothing really. Just a lot of heartache to install a few hundred megabytes of code from Borland – code that they give away for free.

    Vista doesn’t allow me to do anything more than I could do before with Windows 2000 – or anything new I want to do now. All it’s done that I couldn’t do before is install Quickbooks Pro 2009, which, according to Borland, has a database engine that is compatible with Windows 2000.

    However, I now have a slow, bloated operating system that requires a tremendous amount of tweaking just to equal the performance I had in Windows 2000 on the same hardware. Plus Microsoft purposely broke functionality with Samba, which the Samba developers easily remedied. So I can’t browse the network until I upgrade Samba, because, of course, Microsoft had to mess with that.

    Vista, after the install, started installing updates like mad. I’ve ended up rebooting the computer too many times to count. It’s a huge time suck just to open up some Quickbooks files – files that Intuit could have allowed me to open with Quickbooks 2008 if they had so decided.

    The next insult is the ultimate disrespect for the customer. The best I can say is that it’s the ultimate in stupidity, giving Microsoft the benefit of the doubt.

    I ended up buying two copies of Vista. An important thing to note is that the contents of the CD never change; instead, certain parts of the operating system are installed based upon which key code you enter. They admit that in the error shown below – and this is well known around the Internet.

    I used the CD I got from Fry’s to install Vista over an existing installation of Windows 2000. I used the product code for the upgrade – and I received the following error during the installation, with Windows 2000 sitting on the same partition I was installing to, on the same disk that didn’t have this “problem” when I used a code for a non-upgrade version:

    “The following failure occurred while trying to use the product key

    Description: The software licensing service determined that this specified product key can only be used for upgrading, not for clean installations.”

    I went ahead and installed anyway, not entering the code. After all, I know that the DVD contains all the software needed to do a full install.

    After I installed and tried to activate the product, it said:

    “The Software Licensing Service determined that this specified product key can only be used for upgrading, not for clean installations.”

    Recall from above that the Vista Upgrade will not work in Windows 2000.

    So in other words, Microsoft flat out lied; even though I bought a product that was advertised as an upgrade from Windows 2000, I wasn’t allowed to install the product either from within Windows 2000 or from outside of it.

    I couldn’t install the product unless I installed twice – all because of the “copy protection” measures taken by Microsoft – which included Microsoft not ensuring that the installer functioned properly from within Windows 2000. And again, all of this is not because of any real technical reason that could not have been prevented, but rather, because Microsoft was dishonest in its advertising. Perhaps Microsoft decided not to care about customers who paid $219.95 instead of $399, even though they offered it to me for that price, and perhaps they don’t like the fact that I didn’t buy Windows XP. But what it really is, is disgusting greed and unjustified guilt and suspicion cast upon the public.

    All that extra hassle to make sure poor old billionaire Bill gets paid. Microsoft truly cares about its customers.

    I Googled these issues to look for an answer that I felt was appropriate – because nothing other than Vista accepting the code I paid $219.95 for was appropriate. And in that light, downloading a crack would have been appropriate, because during this entire episode I was treated as if I had stolen the software, when I had actually paid for it. I didn’t find the answer I wanted, but a workaround instead: performing a second installation, which does nothing to prove I qualified for the upgrade. Meanwhile, when I had proof that I qualified, I was told I wasn’t allowed to use the software I paid for.

    I also found some interesting conversations that reflect the psyche of the public when it comes to Microsoft: it’s OK for Microsoft to rip off its customers. Anyone who complains must be “pirating” software, because Microsoft would never do anything dishonest. If it’s not pirating of software, the end user surely is incompetent. The bottom line is that anything you buy from Microsoft is a gift from Microsoft, even though they have made more money than any effort, any work, any suffering should allow. But that’s OK because it’s “The American Way” and “Capitalism” and “Survival of the fittest.”

    From: http://www.vistax64.com/vista-installation-setup/37064-activation-error-0xc004f061.html

    Poster Question:

    “I’m getting an Activation Error Code: 0xC004F061 when I try to activate my
    Vista Home Premium upgrade. I was running XP Pro on this machine, and when I
    was doing the install, the installer said I had to do a clean install from XP
    Pro. Now The error message is giving me the description “The Software
    Licensing Service determined that this specified product key can only be used
    for upgrading, not for clean installations.” Does this mean that even though
    I had a Genuine Windows XP Pro on my computer, I can’t use the upgrade to

    To top it off, it keeps telling me that my 90 day free support call period
    has expired, although I’ve never called it before, and just installed Vista a
    few days ago. Anyone have any ideas how I can fix this issue?”

    First responder:

    “To use an upgrade version you have to start the upgrade from within an
    installed, activated and genuine copy of XP. It will then allow either an
    in place upgrade (for certain upgrade paths) or only a custom installation
    (programs and data are not migrated). There has been published a work
    around to allow the installation of an upgrade version w/o XP being
    installed. Here is one link for it. Use at your own risk.


    If you want to migrate data and settings from the XP installation use the
    Windows Easy Transfer (WET) on the XP installation before the upgrade.”


    “I did start the upgrade from within XP…so it sounds like I’m pretty much
    hosed because I did what the upgrade told me I needed to do? Am I completely
    unable to upgrade from Windows XP Pro to Windows Vista Home Premium?”

    The next set of posts are classic examples of the mindset of Windows users:

    Poster 1:
    1. What will Vista check for?
    2. Will it want a valid CD key for the previous OS?
    3. Will it give you the option of installing the 64 bit version clean? (assuming u have a 64 bit cpu)

    Poster 2:
    As far as I know, the Vista upgrade disks will require verification of Windows XP and then you will have the option to clean install or upgrade.
    We will know more on or about January 30th:)

    Poster 1:
    Thanks for that but any idea about the rest of the post? Clean installs are need when you install a new M/B or your pc crashes etc, what will MS need to verify you have a legit CD/DVD

    Poster 2:
    The EULA is the same as XP now.

    Poster 3:

    You may be able to get away with that if your comp has XP on board or the CD to pop in during verification. However, I don’t approve of the use of a pirated copy of XP:(

    Poster 1:
    Not so much the pirating issue, but does someone actually know the process of upgrading and technically what will happen if you want to upgrade from a fresh install using an upgrade version of Vista? We aren’t really answering the initial post, but don’t get me wrong, I appreciate any input. I just want to know the actual process; see my first post.

    Poster 4:
    I’ve just received my MAPS (Microsoft Action Pack) copy of Vista Business Upgrade. When booting from the DVD the installer asks for the Serial Key which I entered. However the installer refuses to install as “This version must be launched from Windows” So I deleted the key and continued the install selecting BUSINESS as the install at the relevant screen. However when now I try to activate Vista I get the following error code: 0xc004f061 and the text “The Software Licensing Services determined that this specified product key can only be used for upgrading not for clean installations” ***! Upgrades in the past have always allowed you to clean install as long as you had the original disks. So now it looks like I’ll have to install a clean copy of XP just to get this version installed…

    Poster 1:
    I called Microsoft this morning and spoke to their tech support. They said this was the first time they had heard of an activation problem. I got through to a real tech support guy, and he asked me about whether I was installing an upgrade or full version. I told him it was Windows Vista Home Basic, bought at Staples for $99, upgrading from a pre-loaded Windows XP Home on my Dell PC. I explained that the “Upgrade” label on the front of the box said “For users running Microsoft Windows 2000 Professional, Windows XP, or Windows Vista only. Backup and clean install may be required. See back of box for details.*”. The back of the box says the following: “Back up all your files and settings before upgrading. You must perform a clean install of Windows Vista and then reinstall your existing files, settings, and programs, unless you are upgrading from Windows XP SP2 Home Edition.” This would lead anyone to believe that a clean install is possible when upgrading.

    I tried the same things as mentioned above. Booting from the Vista DVD (because clean install was NOT an option when launching from Windows XP), then trying my Product Key at the beginning of the install process, which did not work (“this version must be launched from windows”), skipping it, and choosing “Home Basic” as the version being installed. Then tried to activate from the Control Panel in Vista and received Code 0xC004F061: “The Software Licensing Service determined that the specified product key can only be used for upgrading, not for clean installations”. The guy I talked to at Microsoft said that I would have to reinstall XP on my machine, and do a non-clean upgrade to Vista for my key to work. In other words, a clean install does not appear to be an option for versions of Vista labeled “Upgrade”, even though right below the Upgrade label it says “A clean install may be required”.

    This makes me wonder what would happen if you were not running Windows XP SP2 before upgrading, and were “required” to do a clean install with an “Upgrade” version. If Microsoft does not want clean installs to be an option for Upgrade customers, it should not be listed as a possibility on the FRONT OF THE BOX, the back of the box, the Quick Start Guide, and in the setup dialogs. The right hand has no idea what the left hand is doing in this case. Excuse me, I have to go downgrade to XP now…

    Poster 4:
    We don’t install any illegal copies here, so I can’t speak about that. It never mentions that it does a valid key check, but I imagine that’s done in the background — most downloads from their website even require the check.

    Poster 5:

    Use Vista’s ‘upgrade’ version to clean-install

    The secret is that the setup program in Vista’s upgrade version will accept an installed copy of XP, W2K, or an unactivated copy of Vista itself as evidence of a previous installation.

    This enables you to “clean install” an upgrade version of Vista to any formatted or unformatted hard drive, which is usually the preferred method when installing any new operating system. You must, in essence, install Vista twice to take advantage of this trick. But Vista installs much faster than XP, so it’s quicker than installing XP followed by Vista to get the upgrade price.

    Here’s a simplified overview of the steps that are required to clean-install the upgrade version of Vista:

    Step 1. Boot the PC from the Vista DVD.

    Step 2. Select “Install Now,” but do not enter the Product Key from the Vista packaging. Leave the input box blank. Also, turn off the option Automatically activate Windows when I’m online. In the next dialog box that appears, confirm that you really do want to install Vista without entering a Product Key.

    Step 3. Correctly indicate the version of Vista that you’re installing: Home Basic, Home Premium, Business, or Ultimate.

    Step 4. Select the “Custom (Advanced)” install, not the “Upgrade” install.

    Step 5. Vista copies files at length and reboots itself one or more times. Wait for the install to complete. At this point, you might think that you could “activate” Vista, but you can’t. That’s because you haven’t installed the Vista upgrade yet. To do that, run the DVD’s setup.exe program again, but this time from the Vista desktop. The easiest way to start setup again is to eject and then reinsert the DVD.

    Step 6. Click “Install Now.” Select Do not get the latest updates for installation. (You can check for these updates later.)

    Step 7. This time, do enter the Product Key from the Vista packaging. Once again, turn off the option Automatically activate Windows when I’m online.

    Step 8. On this second install, make sure to select “Upgrade,” not “Custom (Advanced).” You’re not doing a clean install now, you’re upgrading to Vista.

    Step 9. Wait while Vista copies files and reboots itself. No user interaction is required. Do not boot from the DVD when asked if you’d like to do so. Instead, wait a few seconds and the setup process will continue on its way. Some DOS-like, character-mode menus will appear, but don’t interact with them. After a few seconds, the correct choice will run for you automatically.

    Step 10. After you click a button labeled Start in the Thank You dialog box, Vista’s login screen will eventually appear. Enter the username and password that you selected during the first install. You’re done upgrading to Vista.

    Step 11. Within 30 days, you must “activate” your copy of Vista or it’ll lose functionality. To activate Vista, click Show more details in the Welcome Center that automatically displays upon each boot-up, then click Activate Windows now. If you’ve dismissed the Welcome Center, access the correct dialog box by clicking Start, Control Panel, System & Maintenance, System. If you purchased a legitimate copy of Vista, it should quickly activate over the Internet. (You can instead activate by calling Microsoft on the phone, which avoids your PC exchanging information with Microsoft’s server.)
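    The logic behind this two-pass trick can be sketched as a toy model. This is purely hypothetical pseudologic for illustration (the function name and OS labels are my own inventions, not Microsoft’s actual licensing code): an “upgrade” key only activates when setup is launched from an installed copy of Windows, and the loophole is that an unactivated Vista counts.

```python
# Toy model of the activation check implied by the steps above.
# Hypothetical pseudologic for illustration only; not Microsoft's code.

def can_activate(key_type, existing_os):
    """A 'full' key always activates; an 'upgrade' key activates only if
    setup was launched from an already-installed copy of Windows."""
    if key_type == "full":
        return True
    # The loophole: an unactivated Vista counts as an installed copy,
    # so a key-less first install satisfies the check on the second pass.
    return existing_os in {"XP", "W2K", "Vista (unactivated)"}

# Pass 1: boot from DVD, no key, clean install; activation fails (0xC004F061).
print(can_activate("upgrade", None))                  # False
# Pass 2: rerun setup.exe from the new desktop with the key; activation works.
print(can_activate("upgrade", "Vista (unactivated)")) # True
```

    In other words, the first install exists only to manufacture the “previous installation” that the second, keyed pass checks for.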

    Poster 5:
    Thank you for the report about using an upgrade version of Vista. I think Microsoft knows about this but is keeping tight-lipped about it because it means lost revenue. I will not be buying a retail (full) version of Vista, because the above post shows how to install Vista in a clean state using the upgrade version of your choice. Come to think of it, I think I read somewhere there are OEM versions available as well. I’m all for saving money when the top retail version costs $400.00.

  43. This is getting a bit more subjective, but I much prefer the Zune Marketplace. The interface is colorful, has more feeling, and has some cool features like Mixview that lets you quickly see related albums, songs, or other users related to what you’re listening to. Clicking on one of those will center on that item, and another set of “neighbors” will come into view, allowing you to navigate around exploring by similar artists, songs, or users. Speaking of users, the Zune “Social” is also great fun, allowing you to find others with shared tastes and make friends with them. You can then listen to a playlist created based on an amalgamation of what all your friends are listening to, which is also enjoyable. Those concerned with privacy will be relieved to know you can prevent the public from seeing your personal listening habits if you so choose.
