- Free software.
- Linux is a unix clone cut down to run on a PC.
- Why compromise just to save a few bucks?
- Linux is neither warrantied nor supported.
- Vendors are reluctant to develop for a platform that requires them to release their source code.
- The unices are fragmenting into a plethora of incompatible versions.
- Linux is fragmenting.
- Linux does not conform to the X/Open standard.
- Linux has no direction.
- Linux is made up of a lot of little groups running in different directions.
- Linux is not a technology leader, it is just playing 'catch-up'.
- The kernel may be advanced, but the apps are old, 'second-hand' ports.
- Only NT may be a domain server.
- Linux is insecure.
- Linux is not Year-2000 Compliant.
But money issues are mostly about corporate use. If corporate staff are familiar with one OS and not Linux, setting up a server with Linux will probably cost more, as savings on the OS price get eaten up by time on the learning curve. But once installed and set up, Linux systems require little maintenance, and subsequent systems will take less time and thus save money. Beyond saving a few bucks, though, the license issue goes further. Many system administrators prefer the thin server approach, where services are spread across multiple low-cost machines rather than centred on big central boxes, with the load being split by service rather than by users. This approach is more easily scalable and limits downtime. It also makes it easier to upgrade and maintain individual system elements. But if you need per-user licenses on each box, it can become an economic non-starter. Linux not only makes the thin server model financially viable; the high efficiency of the OS means that desktop machines no longer considered good enough for the latest desktop OS may be recycled as non-critical servers and routers.
Since originally writing this, the SAMBA team (who make SMB-compatible network code freely available to UNIX systems) have been unravelling the spaghetti of an NT primary domain controller, and are starting to offer support for this in their software. Many network administrators are overjoyed by this unlocking of the NT domain, but many others are indifferent: they have no desire to implement such methods irrespective of who is supplying them, as the protocol is still proprietary and bloated. Many administrators point to open solutions, particularly Kerberos, and note that NT5 uses Kerberos for domain authentication. It seems increasingly likely that the use of proprietary protocols in domain controllers is destined to die out.
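As an illustration of what the SAMBA team's work enables, a Samba server of this era can be told to act as a logon (domain) server for Windows clients with a handful of `smb.conf` settings. The fragment below is a minimal, hypothetical sketch: the workgroup name and netlogon path are placeholders, and NT-style primary domain controller support in Samba was still experimental at the time of writing.

```ini
[global]
   workgroup = MYDOMAIN      ; placeholder NT domain name
   security = user
   domain logons = yes       ; accept NT-style domain logons
   domain master = yes       ; claim the domain master browser role
   local master = yes
   preferred master = yes
   os level = 65             ; outbid NT machines in browser elections

[netlogon]
   path = /home/netlogon     ; placeholder path for logon scripts
   read only = yes
```

Clients then log on against the Samba box rather than an NT server; the point is not this particular configuration but that the previously closed protocol can now be spoken by free software.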
Some argue that UNIX itself is vulnerable, but the basis for this argument is the high number of security alerts issued for UNIX network service software. It should be remembered, though, that some 70% of internet traffic is destined for UNIX-like servers, with the rest spread over a variety of proprietary systems. UNIX is also the predominant system for academic servers (where highly skilled would-be attackers abound). Most other systems sit behind firewalls or on closed networks, or are simply too unnoteworthy to warrant any attack. It is impossible to compare UNIX systems with other systems in terms of vulnerability, because no other system comes near it in terms of exposure or the range of services offered.
Certificates such as C2 certification are an aid in setting up a secure system, but they do not state that a system is secure, nor does the lack of a certificate (and Linux does not have one) make a system insecure. A C2 certificate is issued when a company submits a system for testing, and states not "This system is secure" but "In order to reach a C2 security level with this system, the following configuration was used" (the interpretation is the author's, not the actual certificate wording). The fact is that you cannot globally declare an OS secure, because security depends so heavily on what protocols are being offered and how the system is configured. By the same logic, you cannot say a particular OS is insecure.
Security is taken seriously by the Linux community. It must be: many Linux systems are in the front line and would not last two minutes if problems were not properly tackled. When security alerts are issued, fixes arrive very quickly (if not at the same time). Linux distributors, consultants and large sites where Linux is deployed have people dedicated to security, and as ever in the Linux world, these people collaborate. At the same time, the basic motto of Linux is flexibility and user choice. You can make (or buy) a secure system with Linux. But as security is inversely proportional to flexibility and ease of use, you may decide to forgo some security and enable lots of network 'gadgetry'. On a closed network or behind a firewall where the users are known (actually the case for most servers), there is a lot to be said in favour of using less trusted protocols. The choice is in the hands of the users, secure or flexible, and Linux offers the end user what is probably the largest range of options of any system.
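To make the secure-versus-flexible choice concrete: on most Linux systems of this period, much of the network 'gadgetry' is switched on or off simply by editing `/etc/inetd.conf` and commenting services in or out. The excerpt below is a hypothetical example; the exact daemons, flags and paths vary by distribution.

```
# /etc/inetd.conf (excerpt) -- a leading '#' disables a service
#telnet  stream  tcp  nowait  root    /usr/sbin/tcpd  in.telnetd
#ftp     stream  tcp  nowait  root    /usr/sbin/tcpd  in.ftpd -l -a
#finger  stream  tcp  nowait  nobody  /usr/sbin/tcpd  in.fingerd
```

After editing, signal inetd to reread its configuration (for example with `kill -HUP` on the inetd process). A hardened front-line server might disable nearly everything here; a box behind a firewall can afford to leave more enabled.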
Related topic sites:
The Cathedral and the Bazaar
What is FUD?