
Anybody use a Linux machine as their main desktop machine?

II2II

Well-known member
Well, my main issue with Linux is that it fails to distinguish itself in many areas. The areas where it does distinguish itself are either irrelevant or only mildly interesting.

A distinguishing feature would be most window managers. There are dozens of window managers out there, but most of them fall into one of two broad categories: traditional overlapping windows or tiled windows. The dozens of window managers are what I would call an irrelevant feature. After all, would someone choose Linux over Windows simply to choose a window manager? Very few would. The two categories of window managers are what I would call an interesting distinguishing feature. Tiling window managers allow you to create a different working environment (and, in my opinion, a better one). Some people will choose Linux because of the interesting bits of differentiation, but on the whole most people will just learn to make do. Simply put, most people don't care enough about operating systems to make the switch.

Mostly, the lack of real differentiation makes me sad. Linux is an open source environment, so anyone can make it into whatever they want. Well, provided that they know enough people to help implement that vision. I suppose that everyone would have their own list of ideal ways to differentiate it, but here's mine:

1) It's open source, so include the source and make that source modifiable without the edit-compile-link cycle. Ideally, it would be modifiable at runtime, and the system would have built-in debugging facilities that can be called up on the fly. Open source, in this context, is really just a moniker that means "you have access to the source code if you really want to go to the trouble". Something like Sugar (the XO's Python-based environment) is closer to my ideal. Smalltalk environments, like Squeak, are an ideal example of open source where you actually want to muck around with the source. Alas, Squeak does not have much supporting software.

2) It prides itself on its command line environment, but the command line (as it stands today) is a barbaric relic of the past, which is why shell scripting is being replaced by scripting in languages like Perl, Python, and Ruby. Not only is the scripting environment of the coveted shell primitive, it also lacks the command editing abilities of other environments (e.g. MPW or Smalltalk). Very few shells implement features that have become commonplace in programmers' text editors, like syntax highlighting.
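One concrete face of the "primitive scripting" complaint can be sketched in plain POSIX shell, where there are no real data structures and even arithmetic has to go through the `$(( ))` expansion (a minimal illustration, not anyone's real script):

```shell
# Summing a handful of numbers in plain POSIX shell: no lists, no
# numeric types, just string variables and arithmetic expansion.
i=0
sum=0
while [ "$i" -lt 5 ]; do
  sum=$((sum + i))   # arithmetic only happens inside $(( ))
  i=$((i + 1))
done
echo "sum of 0..4 is $sum"
```

The same loop is a one-liner with real integers in Perl or Python, which is roughly the complaint being made here.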

 

paws

Well-known member
1) It's open source, so include the source and make that source modifiable without the edit-compile-link cycle. Ideally, it would be modifiable at runtime, and the system would have built-in debugging facilities that can be called up on the fly.
I trust you're ignorant (in any sense of the word) of the technical aspects of this.

Why exactly is it you want to do this to your kernel? I honestly can't see a good use for it...

It prides itself on its command line environment, but the command line (as it stands today) is a barbaric relic of the past.
Only to people who don't know how to use it.

No GUI on Earth can do this:

tar -c /documents/essayonstuff | gzip > essay.tgz; ftp -u ftp://me@myserver/dir/essay-`date +%m%d%H%M`.tgz essay.tgz; rm essay.tgz

(Tars up a folder, gzips it, uploads it to a server with a timestamp in the filename, and deletes the tgz from your hard drive. You can write a GUI app to do that, of course, but you can't chain small apps together with that kind of ease... Automator comes close, I suppose, but it's not quite the same.)

Sure, there are things I find missing from the shell, but it's certainly not a relic.

 

pee-air

Well-known member
A distinguishing feature would be most window managers. There are dozens of window managers out there, but most of them fall into one of two broad categories: traditional overlapping windows or tiled windows. The dozens of window managers are what I would call an irrelevant feature.
You're the first person I've seen call the dozens of window managers a "feature". I suppose it would be a feature if you're of that disposition. The typical desktop user would choose between one of three or four available desktop environments, though (i.e. Gnome, KDE, XFCE, etc.).

After all, would someone choose Linux over Windows simply to choose a window manager?
Very few, if any, would, although a person might choose Linux over Windows if a graphical user interface were not desired.

Simply put, most people don't care enough about operating systems to make the switch.
Conversely, most people wouldn't care enough about an operating system to want to switch from Linux to something else.

Mostly, the lack of real differentiation makes me sad. Linux is an open source environment, so anyone can make it into whatever they want.
Which is how the kernel (Linux) that was never meant to be an operating system became one (GNU/Linux).

Well, provided that they know enough people to help implement that vision.
Linux has come a long way. I don't see the numbers as being a problem.

1) It's open source, so include the source and make that source modifiable without the edit-compile-link cycle. Ideally, it would be modifiable at runtime, and the system would have built-in debugging facilities that can be called up on the fly. Open source, in this context, is really just a moniker that means "you have access to the source code if you really want to go to the trouble". Something like Sugar (the XO's Python-based environment) is closer to my ideal. Smalltalk environments, like Squeak, are an ideal example of open source where you actually want to muck around with the source. Alas, Squeak does not have much supporting software.

2) It prides itself on its command line environment, but the command line (as it stands today) is a barbaric relic of the past, which is why shell scripting is being replaced by scripting in languages like Perl, Python, and Ruby. Not only is the scripting environment of the coveted shell primitive, it also lacks the command editing abilities of other environments (e.g. MPW or Smalltalk). Very few shells implement features that have become commonplace in programmers' text editors, like syntax highlighting.
No comment on your list. It's your list. It's what YOU want. There's no debating what YOU want; only YOU know what you want.

 

II2II

Well-known member
Why exactly is it you want to do this to your kernel? I honestly can't see a good use for it...
Take a look at Squeak. It is a Smalltalk environment which gives you access to all of the code, so that you can modify it in situ and at runtime. The code is there, so it is easy to access. The entire system is designed so that it is easy to modify its internals. It is also open source software. This is, IMHO, the ideal manifestation of the principles of open source software.

Sure, Linux and the associated applications are open source in the legalistic sense. But most people don't really care about the open source bit, because they will never modify the source to their favourite software. This is evident in the sheer number of binary-oriented distributions. This is evident in the distributions that don't even install a C compiler by default. In the end, I would argue that most people only care about open source because they like the idea of unencumbered software.

No GUI on Earth can do this:

tar -c /documents/essayonstuff | gzip > essay.tgz; ftp -u ftp://me@myserver/dir/essay-`date +%m%d%H%M`.tgz essay.tgz; rm essay.tgz
Two can play at that game:

FILE="essay-`date +%m%d%H%M`.tar.bz2"; tar cjf "$FILE" /documents/essayonstuff; scp "$FILE" me@myserver:/dir; rm "$FILE"

You get better compression, secure authentication, encrypted file transfer, and you can use it in a script. As an added bonus, it's shorter to type. Please read the man pages before you make a fool of yourself on a forum. (Sarcasm intended.)
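The same backup reads even better as a small reusable script with basic error handling. This is only a sketch: the source path and remote host are the placeholders from the thread, not real ones, and the upload line is commented out so the snippet runs anywhere.

```shell
#!/bin/sh
# Sketch of the one-liner above as a script. set -e aborts on the first
# failure, so we never delete an archive that wasn't fully created.
set -e
mkdir -p ./essayonstuff                  # stand-in data so the sketch runs anywhere
SRC="${1:-./essayonstuff}"               # directory to back up (placeholder)
FILE="essay-$(date +%m%d%H%M).tar.bz2"   # timestamped archive name
tar cjf "$FILE" "$SRC"
# scp "$FILE" me@myserver:/dir           # placeholder host; uncomment to upload
rm "$FILE"                               # drop the local copy after upload
echo "backed up $SRC"
```

Quoting `"$FILE"` and `"$SRC"` keeps the script safe for paths with spaces, which the bare one-liner was not.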

Of course, the reason you have to go to all of this trouble is because you're using LaTeX or some other text processing system which doesn't automagically encapsulate everything into one file. That would make it easier to just Gmail a copy of your essay to yourself, which is what most people do for the remote backup and version history of important documents. (No sarcasm intended.)

But my point about barbarism has nothing to do with these one line shell masturbations that make us look like wannabe obfuscated coders. My point is that the nature of shells has changed very little over the past two decades. I already gave the example of syntax highlighting, which is a common programming tool to improve readability and to catch common errors. The problem is that you don't get it in your shell (unless you use fish). Shell scripting hasn't changed terribly much either, leaving languages like Perl and Python to pick up the slack.

Or take a look at command histories. I don't know whether we've gone forward or backward on that one. Sure, cursor keys (and other bindings) make it easy to go back and forth, but bang-plus-command-number is far faster. If you remember the number, that is. Problem is, MPW and Smalltalk solved that problem decades ago. They also provided a way to manage the broken history mechanism in today's heavily multitasking environments.
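For readers who haven't used it, the "bang command number" recall mentioned above looks like this in an interactive bash session. It is sketched in comments because history expansion is normally disabled in scripts; the transcript commands are made up for illustration.

```shell
# Transcript of interactive bash history recall (illustration only):
#   $ history | tail -n 3
#     101  tar cjf backup.tar.bz2 docs
#     102  scp backup.tar.bz2 me@myserver:/dir
#     103  history | tail -n 3
#   $ !101     # re-run command number 101
#   $ !!       # re-run the previous command
#   $ !tar     # re-run the most recent command starting with "tar"
msg='bang-style history recall: !N, !!, !prefix'
echo "$msg"
```

The complaint in the post is that this mechanism (numbers you must remember, a single flat list shared across sessions) has barely changed in decades.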

But my fundamental reason for calling shells a relic goes beyond this notion of my personal list of missing features. My reason goes back to the suggestion that shells have barely changed in two decades. That means that they are not evolving to fit the changing face of technology. And in an industry where rapid change is the norm, stagnation will lead to extinction.

 

TylerEss

Well-known member
I use OpenBSD on my AMD64 machine at work; it's a lot like linux but more spartan. Works great for what I need. I only really use my TiBook for web browsing and playing Diablo II. :)
How is the install? Graphical and easy?
It's text-based and easy. :-D It does very little or nothing unless you tell it to; this is both its greatest strength and its greatest shortcoming.

 

paws

Well-known member
Take a look at Squeak.
I have played with it, I think. Maybe the principles are good, but I can't really imagine doing all my daily work inside it.

The entire system is designed so that it is easy to modify its internals. It is also open source software. This is the ideal manifestation, IMHO, of the principles of open source software.
So you're saying making the source code to a programme available is worthless if it does not sacrifice size, speed, stability and portability in order to make it hackable by people who can't read? I really don't know what to say to that. The edit-compile-link cycle is just a technical necessity, I think.

There are quite obvious advantages to free software even for people who don't know how to hack it.

You get better compression, secure authentication, encrypted file transfer, and you can use it in a script. As an added bonus, it's shorter to type. Please read the man pages before you make a fool of yourself on a forum. (Sarcasm intended.)
Thanks for that improved version. I'm not entirely sure how I made a fool of myself, though.

I use FTP instead of SCP because I back up to a remote server where that's the only option. The fact that you can use either only serves to strengthen my point, which was not about my shell programming skills (which I'll be the first to admit are lacking - I do read the man pages, far too often!), but about the use of pipes and output redirection. I'm glad you mention scripting, too, as that's an obvious use for it. I've tried writing some maintenance scripts in Ruby, but for most things I basically end up calling the shell from there instead. There are some bonuses to it, and I'm not saying bash is perfect, but it's hardly barbaric either...
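The pipes-and-redirection point is easy to show with a deliberately tiny, self-contained example (the data and output filename are made up for illustration):

```shell
# Chain small tools with pipes, then capture the result with redirection:
# sort the input lines, keep the first two, and write them to a file.
printf 'cherry\napple\nbanana\n' | sort | head -n 2 > top2.txt
cat top2.txt
```

Each tool does one small job, and the shell composes them; this is the same shape as the tar/gzip/scp chains earlier in the thread, just with trivial data.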

Of course, the reason you have to go to all of this trouble is because you're using LaTeX or some other text processing system which doesn't automagically encapsulate everything into one file.
No, it's because of the way I work. When I'm writing a larger document I'll invariably start in six different places and only connect them at the very end. It makes perfect sense for me to keep those in separate files. I did that when I used Word, too.

Also, there are some advantages to *TeX. The ArabTeX package, for instance, first of all produces very high quality output (more than can be said of most word processors I've seen; typesetting Perso-Arabic text is damn difficult), and can also output both a transliteration and Arabic text from the same input, which is incredibly handy. That's not going to work in WYSIWYG no matter how you spin it. I'm sure the mathematicians and other people who typeset formulas with TeX do it for a reason.

That would make it easier to just Gmail a copy of your essay to yourself, which is what most people do for the remote backup and version history of important documents. (No sarcasm intended.)
Well, as you can see, it's not what I do.

I could zip things up, open a browser and use Gmail, yes. I can even come close to the integration the shell offers with drag-and-drop between the Finder and Mail.app on OS X, but that's beside the point. This was just an example of a concrete use for pipes and output redirection that I've found. It works well for me. I prefer it.

But my point about barbarism has nothing to do with these one line shell masturbations that make us look like wannabe obfuscated coders. My point is that the nature of shells has changed very little over the past two decades.
True. If it's not broken, don't fix it.

And in an industry where rapid change is the norm, stagnation will lead to extinction.
Rapid change? Such as the rapid change of e.g. the x86 ISA or the Windows API? FORTRAN? There are just too many man-hours invested in these things to throw them away (sadly, in those cases). It's the same thing with the shell: too many scripts written, too many people with too much accumulated experience... It's not going to go away any time soon.

 

bmacsys

Well-known member
My mother had a PC running XP. She surfed and paid her bills on it. My nephew installed LimeWire and downloaded some stuff. Whatever it was, it basically hijacked her computer. The computer started flashing porn; it actually shouted obscenities, swear words flashed on the screen, etc. The computer had been upgraded, so to reinstall XP I would have had to call Microsoft. I just said the hell with it, reformatted, and installed Ubuntu. To me it's easier and less hassle than reinstalling Windows.

 

bmacsys

Well-known member
@ II2II and ealex79,
Well, it's not for everyone, but that doesn't negate it as a useful desktop alternative. Otherwise, we wouldn't have people using GNU/Linux (predominantly) or any of the BSDs exclusively on their computers. The same could be said of either Windows or Mac OS years ago, depending on which faction you ask.

BTW, there have been enough threads arguing for and against GNU/Linux on the desktop (or portable). I'll let those speak for themselves in regards to more in-depth arguments.
It has made strides. I remember installing PPC Linux in 1999-2000. It was barely usable.

 

QuadSix50

Well-known member
@ II2II and ealex79,
Well, it's not for everyone, but that doesn't negate it as a useful desktop alternative. Otherwise, we wouldn't have people using GNU/Linux (predominantly) or any of the BSDs exclusively on their computers. The same could be said of either Windows or Mac OS years ago, depending on which faction you ask.

BTW, there have been enough threads arguing for and against GNU/Linux on the desktop (or portable). I'll let those speak for themselves in regards to more in-depth arguments.
It has made strides. I remember installing PPC Linux in 1999-2000. It was barely usable.
Tell me about it.... :p

 

II2II

Well-known member
Take a look at Squeak.
I have played with it, I think. Maybe the principles are good, but I can't really imagine doing all my daily work inside it.
I can't imagine working in it from the perspectives of application support and hardware support. But its construction is fine otherwise.

So you're saying making the source code to a programme available is worthless if it does not sacrifice size, speed, stability and portability in order to make it hackable by people who can't read?
If that's your spin on it, I guess so. Though it is worth pointing out that interpreted and bytecode-based languages are frequently used in order to improve stability. (Keep in mind that interpreters often take care of aspects of memory management that human programmers are poor at.) Interpreted languages are also frequently used to improve portability. A language like C usually links into the system API and is subject to platform irregularities like word size and endianness. Languages like Python, Java, and Smalltalk provide a fairly consistent programming environment across platforms because the virtual machine provides that extra layer of isolation (a meaningful layer in this case) and these languages provide their own libraries.

Yes, there is a penalty with respect to size and speed, since source code is usually less dense than a binary executable. I'm not sure size is relevant, though, for two reasons (ignoring the famous 'but the resources are there' argument). First of all, the size of the binary compared to resources intrinsic to program operation (like regionalization and graphics) is becoming less and less significant. When you consider the supplementary files that are included with a program (like documentation and sample files), the proportion that is code is even less significant. The other reason is that the source code and tailings (intermediate object code) take up space on a binary-oriented system anyway, if you choose to use them. And yeah, that goes back to my assertion that there are very few benefits to open source if you don't use the code.

Thanks for that improved version. I'm not entirely sure how I made a fool of myself, though.
I was being sarcastic.

Also, there are some advantages to *TeX.
I am aware of the advantages of TeX, at least with the LaTeX packages, since I came of age in the sciences. LaTeX also has some critical deficiencies that people try to mask with arguments like "you should be concentrating on logical structure and not formatting." (Which is true, but you eventually need to get it to the printed or virtual page.)

Rapid change? Such as the rapid change of e.g. the x86 ISA or the Windows API? FORTRAN? There are just too many man-hours invested in these things to throw them away (sadly, in those cases). It's the same thing with the shell: too many scripts written, too many people with too much accumulated experience... It's not going to go away any time soon.
Uh, the x86 ISA has been extended over the past 20 years to account for technological change. Think about it: FPUs are now an integral part of every system, when they were an optional component in 1988. SIMD extensions are now common. The largest x86 vulgarity that I'm familiar with, segmentation, was made optional with 32-bit addressing. And while the x86 ISA still exists on the surface, my understanding is that the implementation of the x86 is completely different internally.

Or look at FORTRAN. There are two ways to look at the language. You can either look at it from the perspective of FORTRAN 77, which is a dying language that was maintained solely because a few people had the skillset and there was a lot of legacy code. Or you can look at it from the perspective of FORTRAN 9x/HPF and beyond, which has been adapted to the new realities and extended to address issues that other languages barely touch (like parallelization).

The way I look at it, the Unix shell is in the stage that FORTRAN 77 is in. It's there because a lot of Unix was implemented with it (e.g. initialization scripts) and because a few people still have the skills. But it has very little to attract new blood, because there is no equivalent to FORTRAN 9x. The new blood is going elsewhere, either to the GUI (because there is little incentive to pull them away from it) or to scripting environments like Perl or Python, leaving bash as a tool to manage files and launch programs. Bah!

 

bigD

Well-known member
Until I managed to get a hold of an OS X machine, I ran Ubuntu on my 2.0GHz Athlon box.

It was fast, did everything I needed it to do, and Gnome made a pretty good desktop. But it wasn't *quite* to the point where I could recommend it to a regular computer user. I liked it though, and if it weren't for OS X, would use it as my daily OS.

 

QuadSix50

Well-known member
Until I managed to get a hold of an OS X machine, I ran Ubuntu on my 2.0GHz Athlon box.
It was fast, did everything I needed it to do, and Gnome made a pretty good desktop. But it wasn't *quite* to the point where I could recommend it to a regular computer user. I liked it though, and if it weren't for OS X, would use it as my daily OS.
I have to agree somewhat with what you mention. On my PCs, I use GNU/Linux exclusively because I really do love how it runs there. It also runs great on my iMac G5, but seeing as I already have a working Unix environment in Mac OS X, I'm quite content with Mac OS X on it alone. Maybe once Apple decides to drop support for my iMac in OS X, I'll put a PPC GNU/Linux distribution on it exclusively. I'm sure that by that time, GNU/Linux on the desktop will have progressed even further (assuming that the distros are still supporting the PPC platform by then).

 

chris

Well-known member
Well, I've been using an ASUS Eee PC running eeeXubuntu (2GB RAM, 8GB SDHC) as my main PC for several weeks now. Before that, a desktop (Athlon64 3500+) running various versions of Ubuntu and Debian held that position.

I haven't had any problems thus far, though Ubuntu 5.04 was a bit annoying to set up properly. Not that it really matters now that we're three versions and six releases past it.

 

conceitedjerk

Well-known member
My main machine, a Sun Ultra 5 (440MHz, 1GB RAM, 2 x 80GB HD, DVD-R, SunPCi card), is a dual-boot Solaris 8/Debian "etch" machine. I spend nearly all my time using Debian, as it lets me do everything I want or need.

I only use Solaris 8 when I want to learn, or when I want to use my 400MHz SunPCi card (256MB RAM, 8GB virtual HD, Win2K).

 

TylerEss

Well-known member
I like that Ultra of yours. I'm a network admin and needed a new work box; I very nearly got a 4x450 Ultra 80 but settled for an OpenBSD/AMD64 machine because the Sun is so loud.

I still wonder if I made the right choice. :-D

 

conceitedjerk

Well-known member
I like that Ultra of yours. I'm a network admin and needed a new work box; I very nearly got a 4x450 Ultra 80 but settled for an OpenBSD/AMD64 machine because the Sun is so loud.
I still wonder if I made the right choice. :-D
That's funny - a quad-processor Ultra 80 is likely going to be my next purchase, unless I can acquire a cheap multi-processor Sun Blade machine :)

 

Temetka

Well-known member
Once I acquire a keyboard and mouse I will be setting up my Ultra 10. Currently it has a single 440MHz CPU, 512MB of RAM and a 120GB HD. I need a video card, the SunPCi card and a keyboard/mouse. I can't wait to play with Solaris, as it will be the only OS I install on that machine (minus Win2K on the SunPCi card).

I love OS X and *NIX and have been using various *NIX distros (Linux, NetBSD, FreeBSD) for about 10 years now and am very comfortable in them. I have about 4 months Solaris experience so I am really looking forward to diving into it.

In its current form, my opinion is that Linux as a desktop OS is almost there. I still think a unified window manager and proper detection of installed (and to-be-installed) hardware is what is needed to finish it off. Compiz Fusion is nice, as is Beryl. However, the window manager fragmentation and lack of any decent looking UI is the main thing that holds Linux back. If it wants to succeed in supplanting the Windows market then it needs to look like Windows, run like Windows and have proper hardware support. The UI should be the same across all applications, and things clearly explained in dumb-speak for end users. Currently it's a geek OS and will remain that way until either the Linux community gets off its ass and merges or some super killa distro comes out. One distro to rule them all and all that...

Otherwise Linux is nice and functional but will never reach market penetration above 15%. Apple is nearly there right now, and the OS X install base is making huge inroads in the enterprise market as well. It's well suited as a consumer OS, a network OS and for server duties as well. It's polished, well documented and easy to use. Considering it's based on UNIX, I (as a dev) know it won't take much work in the *NIX community to achieve the same thing. Problem is, most *NIX devs I have met are arrogant pricks who think it's their way or the highway and would rather gnaw off a limb than collaborate on a distro intended for consumer use. It is due to this that I think Linux will always remain in the workstation and programming realms, with some server duty and the occasional home machine on the side. It's sad really, as after decades of development it's only seen horizontal growth, where it needs to see vertical growth.

I just don't see that happening. I will no longer run Linux on any of my machines. I am not willing to go down that road for the nth time. It's like a country song repeating "coulda, shoulda, woulda" over and over again instead of actually accomplishing anything.

NOTE: My position on Linux may or may not be in line with others. That's the wonderful thing about opinions.

 

II2II

Well-known member
I very much doubt that Linux's problem is choice. Indeed, I think that eliminating this choice would reduce confusion at the expense of flexibility. Flexibility is essential in the world of Linux because that world is essentially one of OS refugees.

As for the market penetration bit, I think that a lot of people have given up caring. In a world of half a billion computers, even a 1% market share is an incredible number of computers. Five million installations is nothing to sneeze at. Indeed, a lot of software vendors would be happy with a few thousand installations.

Then there is the arrogant prick bit. I have mixed feelings on this. I recently upgraded my video card and went on a quest for ATI's proprietary drivers. It took some hoop jumping just to get it working, mostly because the arrogant pricks are trying to make life hard for companies that release proprietary drivers.

On the other hand, you generally have two types of OSS developers out there: those who are paid, and those who aren't. The paid ones are arguably working on stuff that people want, things like Gnome and OpenOffice as well as the kernel and the various commercial distributions.

The ones who aren't paid are doing it on their own time and for their own motivations. Once they release something they have to deal with various types of users, ranging from the helpful to the parasitic. A parasite may be a well-intentioned person suggesting features and submitting bug reports, but do it for too long or too aggressively and you're now leeching the developer's time. Then there is the whole thing about many programmers being a bit anti-social to start with. I would imagine it is the same in the commercial world; it is just that the commercial world isolates programmers from the outside world with things like marketing departments.

Like Temetka, I have mostly dropped Linux. I periodically stick a distro on my main computer to see how things have progressed. They are doing well on the features, but the politics are still there. I wish that they would get it through their heads that there is nothing wrong with a binary blob on an OSS operating system, just like there is nothing wrong with an open source program on a commercial OS. I do have Linux on my XO. There is not much choice there, and I don't think that I'd dump Linux off of it anyway. Its distribution is sufficiently isolated from the mainstream that the politics don't really matter.

Unlike Temetka, I am still a big fan of open source. I use a tonne of it under Windows. Why would I spend money on Photoshop when the Gimp does what I need of it? Why would I pirate Photoshop against the wishes of Adobe, when I can get something completely legal that fits my needs? At this point I pretty much leave the parts where I do need consistency to Microsoft, because I don't feel like playing games with device drivers and APIs. OSS applications are okay, though. Indeed, my only criticism of OSS under Windows is that it's hard to figure out what to do with the source (in Linux, you have a more consistent development environment).

 

QuadSix50

Well-known member
Don't worry, Temetka, I won't do THAT again. ;)

With regards to binary blobs, there are distros out there that already include them as well as the proprietary media formats (I believe Linux Mint does this). For some, this is the answer for what they want. For others, it's probably something like gNewSense or Gobuntu which is strictly free software. Like anything, you have varying degrees of the base philosophy. So it is this way with GNU/Linux distributions, and thus a lot more choice to fall into something that you like.

I personally don't mind there being proprietary drivers for Xorg, but I don't use them. The open source ones work fine for me, and if not then I turn to the proprietary ones as a last resort (NVIDIA comes to mind here). It's nice that companies are making the drivers available for GNU/Linux, but it's at their discretion. As for the open source developers calling for open drivers, I think it has more to do with them wanting more cooperation from these companies in making the open source driver work properly, so that ALL ports of GNU/Linux can benefit. For now, sadly, relying only on the proprietary driver locks you into only one particular architecture, which is not good for GNU/Linux as a whole. If these companies would just help out the community with the open driver, then it would benefit the entire GNU/Linux community and even the open source BSDs (in the case of Xorg). No one is asking for them to give up the crown jewels...just to help make the devices run across all platforms supported by the FLOSS operating systems. The rest can be taken care of by the FLOSS developers themselves.

Consider that even without Broadcom's help, the GNU/Linux kernel developers were able to come up with a better driver than what Broadcom could come up with. And on top of it, the driver was GPLed so Broadcom can't take advantage of it without being required to provide any changes to the source. I'm sure if they had just worked with the community, things would be a whole lot different.

 

II2II

Well-known member
Well, my fun the other night involved an ATI Radeon HD 2600 (or something like that). I decided, for no particular reason, to try out the binary blobs for 3D acceleration. They ended up being about 12 times faster in a very informal test (glxgears). I couldn't do much more testing because the system froze up solid.

Open source developers have the tendency to blame this on binary blobs. I won't. Here's why.

I started off with a Linux 2.6.25.? kernel. Installed the blob, but it didn't work. It turned out that it failed while compiling the kernel module. So I did it via ATI's make scripts and found out that the program would NOT COMPILE because GCC DEVELOPERS deemed that it used an UNACCEPTABLE LICENSE AGREEMENT. Think about that. Open source developers are not only deciding what you can use on your computer, but how you can use the source code that you obtain or created. That is damned scary. It makes MICROSOFT LOOK LIKE AN ANGEL.

So I looked online, found out about the compiler readable license agreement line, and probably broke a few laws by changing it to GPL.
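For readers unfamiliar with the mechanism: Linux kernel modules declare a machine-readable license tag via the MODULE_LICENSE() macro, and symbols exported as GPL-only are refused to modules without a GPL-compatible tag. A sketch of the kind of edit being described, performed on a made-up stand-in file (not ATI's actual source):

```shell
# Create a stand-in module source with a proprietary license tag.
cat > fake_module.c <<'EOF'
#include <linux/module.h>
MODULE_LICENSE("Proprietary");
EOF

# The edit described in the post: rewrite the tag so the build and the
# module loader treat the module as GPL-compatible. Illustration only;
# fake_module.c is a placeholder, not a real driver source.
sed -i 's/MODULE_LICENSE("Proprietary")/MODULE_LICENSE("GPL")/' fake_module.c
grep MODULE_LICENSE fake_module.c
```

This is exactly the sort of change the post calls legally dubious, since the tag is a licensing declaration, not just a build flag.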

It still would not compile. This time it was because a symbol called init_mm was not found. Sigh. That happens. Sometimes old APIs are discarded, as much as I hate it because it reduces backwards compatibility. Particularly for binary blobs, which is part of a typical argument against binary blobs. But I did some looking around and it turns out that init_mm is STILL IN THE KERNEL. It simply is not exposed. Why is it not exposed? It's hidden because none of the kernel modules included IN THE KERNEL SOURCES use init_mm. They somehow figured that meant that no one else was using init_mm. Uh, wait a second. Both ATI and NVidia use init_mm in their binary blobs. A fact that is hard to miss, given that ATI and NVidia are two of the biggest video chip makers out there. How can you miss that? I'm left to conclude that it was a DELIBERATE ACT OF SABOTAGE.

At that point, I switched to using the Linux 2.6.26 sources (I forget why). The module went from compiling to not compiling. Apparently something had changed in the kernel headers. By then, I didn't much care any more. Was it an artefact of progress (in a point of a point release?) or sabotage? I don't know. I just want the damned thing to work. So I applied a patch. It compiled. It installed. It crashes my system when using OpenGL. I don't care at all any more. I give the fsck up. If open source developers want people to use their freaking software they can give up on their jihad and learn to co-exist with the corporate infidels. Until then, I'm really not sure that it's worth the effort.

 