A Taste of Computer Security
© Amit Singh. All Rights Reserved. Written in August 2004
Unix vs. Microsoft Windows
Microsoft Windows is widely regarded as the epitome of insecure platforms — a platform fraught with innumerable security problems. Windows systems top the charts in perhaps every major vulnerability and incident list. In contrast, Unix systems are perceived to be considerably more secure, because __________ (your favorite reason here).
How Did Windows Become "So Insecure"?
I posed this question to a few people, both technology experts and non-technical users. I found that very few had actually given it any serious thought; they "just knew" that Windows is "the most insecure platform." Those willing to think on the fly ascribed their beliefs to gut feeling, to perpetual digital disasters crashing down upon Windows (experienced first-hand or reported by the media), to statistics (incident and vulnerability counts published by respectable organizations), to folklore, to an inherent (sometimes inexplicable) hatred of all things Microsoft, to an inherent affinity for Unix, and so on. Some conjectures took Microsoft's monopoly into account, concluding that Microsoft "doesn't really care" about security because it can afford not to and still be successful.
Many of these people are Windows users.
Nevertheless, Windows NT was designed to be a secure system, with provisions for even more security than initially implemented. It provides an impressive set of security mechanisms (Windows 2000 Security Services, Windows Server 2003 Security Services), with more being worked on (Windows XP Service Pack 2, Next-Generation Secure Computing Base).
Current Windows systems hold some of the highest security ratings among general-purpose operating systems. Note that this is factual information, regardless of how much sectarian laughter it induces.
However, the number of documented security issues and the real-life rampant insecurity of Windows are not speculations either! The problems are real, both for Microsoft, and for Windows users.
There is no single incontrovertible explanation of this paradox, and it is not our goal to conclude "which is better" or "which is more secure." Perhaps the best we could do is to attempt a brief objective (and dispassionate) discussion on the topic.
Microsoft has dabbled with numerous operating systems and environments, similarly to Apple (refer to A Technical History of Apple's Operating Systems). However, unlike Apple, whose trial-and-error process was rather excruciating, Microsoft has had considerable success in most cases.
Various incarnations of Microsoft Windows could be classified as follows:
- NT-based: Versions and specializations (such as client or server) of Windows NT, Windows 2000, Windows XP.
- Windows 95-based: Windows 95, Windows 98, Windows ME.
- Earlier: Windows 3.x and earlier.
The "Official" Security of Windows
As we saw in Defining Computer Security, NT-based Windows systems are rated at the C2 (Orange Book) or EAL 4+ (Common Criteria) levels, the highest among existing general-purpose operating systems. In fact, Windows even meets a few requirements of the next, more secure division (Orange Book B2), such as separate roles for separate administrative functions (trusted facilities management) and the Ctrl+Alt+Delete Secure Action Sequence (SAS). Thus, Windows officially meets critical security-related requirements of most businesses and government agencies.
The unfortunate part is that the abovementioned security ratings do not necessarily represent a system's security strength in real life. For example, an EAL level only indicates the level of confidence in how well a system meets its stated security requirements. Thus, if the required security feature set is minimal, or perhaps even empty, and the system satisfies it well, the system can still achieve a high EAL rating. Alternatively, consider the C2 rating of Windows NT. It is not the operating system itself that achieves such a certification: it is a specific configuration, which includes the hardware and a typically excruciatingly long checklist of what to enable, disable, add, remove, and so on. Changing even one aspect could invalidate the certification. The C2 rating of Windows NT 3.5 became a subject of great controversy because it applied only to a stand-alone machine, one unplugged from the network.
However, it wouldn't be entirely fair to single out Microsoft Windows when it comes to "unrealistic" (from a daily-life standpoint) requirements for effective security. The 100-page-long Mac OS X "Security Configuration Guide" from the NSA details several such steps. Its suggestions include physically disabling devices that may leak information, such as AirPort, Bluetooth, the microphone and other audio devices, and so on.
As we have seen in various sections earlier, the types of inflictions normally associated with Windows (such as worms, viruses, Trojan horses, and so on) existed before Windows did, and are not technically limited to Windows.
Points to Ponder
There are several points to consider — intertwined, and often subtly related — in our attempts to understand Windows' situation.
Consider some historical aspects of Microsoft's pre-NT platforms:
- They originated as single-user systems, with hardly any protection or access control mechanisms. Even file permissions made a rather late entry in Microsoft's platforms.
- They historically relied on filename extensions, particularly for executability.
- They historically did not have the concept of a super-user, wherein certain (potentially dangerous) operations require explicit authentication.
- They were the de-facto platforms available to the "general public" (including the "malicious public") for a long time, before Unix systems became widely accessible.
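The difference in executability models is worth making concrete. On Unix, whether a file can be executed is decided by its permission bits, not its name; on Microsoft's pre-NT platforms, a name like SOMETHING.EXE or SOMETHING.BAT was the cue. A minimal Python sketch of the Unix side (the `.exe` suffix here is purely illustrative):

```python
import os
import stat
import tempfile

def is_executable(path):
    """Unix-style check: consult the permission bits, not the file name."""
    return os.access(path, os.X_OK)

# Create a file whose *name* suggests executability (the Windows-style cue)...
with tempfile.NamedTemporaryFile(suffix=".exe", delete=False) as f:
    f.write(b"echo hello\n")
    path = f.name

# ...but the Unix kernel ignores the name: no execute bit, not executable.
assert not is_executable(path)

# Only setting an execute permission bit makes the file executable.
os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)
assert is_executable(path)

os.remove(path)
```

The practical consequence: on an extension-driven system, merely receiving and saving a file with the "right" name produces something the shell will happily run, whereas on Unix an explicit chmod step stands between a downloaded file and its execution.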
Now, Windows NT was based on a new design, focusing on numerous modern features, portability, reliability, and security. The resounding success of Windows 3.x was instrumental in Microsoft shifting its focus regarding many such goals, and putting the greatest emphasis on native backwards compatibility. Similarly, the graphical user-interface of Windows 95 was considered a critical feature to be passed on to NT-based systems.
It is one thing to come up with a modern, or great, design. However, design alone, and even its subsequent implementation, do not magically change real-life scenarios. What about the mind-set of the users? What about the philosophy associated with the platform? An extensive array of security functions in a system is quite ineffective if the system mostly operates in a context where these functions are not used properly, or perhaps even are bypassed entirely. While the presence of such mechanisms, and their correct functioning in an evaluative setting, wins security ratings, real life introduces numerous weakening factors: historical, contextual, and even imaginary.
Windows has more execution environments than typical Unix systems: for macro processing, email-attachment handling, and so on. A related point is that Windows tries to do things for the user implicitly. Consequently, there are more opportunities for code (such as code embedded in a document) to be executed, often without the user being asked first.
In many situations, security could be "improved" simply by "turning things off." This especially applies to network services. Many Unix systems, particularly recent ones, emphasize security by default: services are turned off out of the box. In many cases, the user does not require most of these services, so any vulnerabilities in the corresponding daemons are inapplicable to such a system.
Windows (again, possibly driven by the "less work for the end-user" tenet) has traditionally shipped with various services enabled by default. Such services increase the attack surface of Windows.
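The notion of attack surface can be illustrated with a toy Python experiment (the port number is arbitrary and assumed unused on the machine running it): a network port is simply unreachable until some service listens on it, which is why a disabled-by-default daemon cannot be exploited remotely no matter how buggy it is.

```python
import socket

def port_open(port, host="127.0.0.1"):
    """Return True if something is accepting TCP connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        return s.connect_ex((host, port)) == 0

PORT = 54321  # an arbitrary high port, assumed unused here

# No service listening: this port contributes nothing to the attack surface.
assert not port_open(PORT)

# Enabling a service (a bare listener standing in for a default-on daemon)
# immediately makes the port remotely reachable.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", PORT))
listener.listen(1)
assert port_open(PORT)

listener.close()
```

Every additional listener in the default configuration is another door an attacker can try; shipping with those doors closed shifts the burden of opening them onto a deliberate user action.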
Security and Ease of Use
I said earlier that a common, although not necessary, side-effect of enhancing a system's security is that the system becomes harder to program and harder to use. Security-related steps that end-users are required to perform must be easy to understand and easy to carry out. If not, users may bypass especially frustrating steps altogether.
Windows is supposed to be an easy-to-use platform, while Unix is supposed to be cryptic and hard-to-use. Historically, an average Unix user has been an academician, researcher, or somebody who is either proficient in, or is willing to spend time and energy figuring out details of a computer system. In contrast, an average Windows user wants things to "just work", and is not so much interested in the inner workings of the system. This is in conformance with the purported philosophies of the two systems. With time, Unix and Windows have both become less extreme, with an average Windows user being more aware of (and interested in) the system, while not all Unix users want to dissect their systems anymore.
Now, configuring and using security can be extremely difficult on Windows. This is not to say that security is easy on Unix. However, consider that the barrier to entry for using Unix is such that if somebody uses Unix primarily, chances are he can manage his system reasonably well (owing to his interest in the system itself, his willingness or ability to read and understand man pages and HOWTOs, and so on). Compare this with Windows. There are too many "knobs." The exposed interfaces are either too complicated, even with documentation, or too weak and limited. Security on Windows is hard to configure correctly (try setting up IPsec). As such, expecting an average Windows user to administer his machine competently is unfair. Thus, we have a detrimental interplay of the platform's philosophy and qualities with its representative user-base.
On a related note, attackers have a better chance of succeeding against an average Windows user. Who do you think is more likely to innocently open a malicious email attachment: the average Windows user, or the average Unix user (who might not even have an appropriate application to handle the attachment)?
Microsoft's market-share is perhaps the most obvious, and the most controversial point raised when discussing Windows vs. Unix, malware-wise. Windows has over 95% of the desktop market-share, though the server market is far less lopsided.
Microsoft's success, as reflected in their incredible market share, amplifies their security problems. Consider:
- The number of people using Microsoft software.
- The number of 3rd party developers creating commercial software for Microsoft platforms.
- The pressure on Microsoft's developers to write new software, add features, fix bugs, ensure legacy compatibility, and so on.
While we have emphasized that technical differences alone do not account for Windows' security situation, software quality (or its paucity) is a problem in Windows, although it is not as extreme as it is often portrayed to be (as in, "Windows is poorly written. Period.")
Consider some randomly chosen examples:
- Microsoft seems to have a rather large number of flaws in software that is critical (because of the software's role in the system, or because it is widely used). Examples include Internet Explorer and IIS.
- In a typical usage scenario of a Windows machine, the user has too much power, which, in turn, gives too much power to any and all applications running on behalf of the user.
- By some accounts, there are approximately 35,000 (perhaps more) Windows device drivers, many of them based on a driver model that is a decade old. While it is not common for drivers to have security holes per se, they are active contributors to system instability. In many cases, though, it would be unfair to blame Microsoft, or even Windows, for a buggy 3rd-party driver.
Late for (Net)work?
A well-publicized Microsoft oversight is their initial underestimation of the Internet. Once Microsoft realized that they had been late in climbing on the Internet bandwagon, they attached paramount importance to coming up with a web browser, a web server, and related technology. In comparison, Unix had networking much earlier.
Now, it is one thing to incorporate networking into a system. It is another to do so as quickly as possible, make existing applications benefit from networking support, and maintain high standards for software quality and security, all in a highly competitive arena that inexorably demands quick time-to-market (for example, Netscape was a major threat to Microsoft at one point). Perhaps the questionable implementation of some core network-related components in Windows, as indicated by the raw number of reported flaws in them, could be attributed to this apparent rush.
Is Popularity Really An Issue?
Regarding the relation between the success (popularity) of Windows and the amount of malware for it, a few points are frequently raised:
- If "abundance" is conducive to mischief, why aren't there viruses, worms, or other digital pestilence on Unix servers? After all, Unix fares much better in the server market. Moreover, a large part of the Internet's infrastructure (web servers, domain name servers, SMTP servers, etc.) runs on Unix, so attackers should have sufficient motivation.
- As Unix becomes more popular (and continues to gain market-share), would malware become commonplace on Unix? Those staunchly favoring Unix brush this off, claiming that Unix precludes such mischief (often cited reasons include "by design", "due to user-base", "due to several major systems and components being open source", etc.) Others point out that as Unix systems are trying to be "more like Windows" (see above), one can already see more mischief happening on Unix.
- Since it is easy to write a virus on Unix, and since there exist enough privilege-granting vulnerabilities on Unix ("root exploits"), why haven't we seen software epidemics like we see on Windows? Will we, in the near future?
Well, the issue is perhaps too subjective to address satisfactorily, but one must realize that even though Windows malware might use bleeding-edge flaws (which may be discovered on a daily basis), the apparent marriage of malware to Windows is not new, and it did not happen overnight. For close to two decades, PC (DOS and Windows) viruses have thrived: there is a long-standing viral infrastructure, both real and philosophical, in place.
End-users often play a critical role in spreading malware. As mobile users travel, so does the malware on their laptop computers. Moreover, an end-user rarely uses a server, if at all, the way he uses a client computer (for example, by downloading and running random software), and servers are better monitored and protected. Due to these and related factors, servers often have a higher natural immunity against malware. Exceptions happen when a vulnerability is discovered in a widely-deployed server (the Morris Worm, various IIS flaws). In such cases, servers can act as very effective attack portals.
Abundance and Homogeneity
Now, it is perfectly feasible, technically and otherwise, for malware to breed on Unix, say, if Unix becomes more popular. However, why does it have to happen, simply because it can? Perhaps it will, perhaps not. While Windows has the misfortune of having decades of malicious momentum, Unix might have the advantage of having decades of inactivity in this area: no rampant viral activity (even if technically feasible), no existing momentum, no traditionally tarnished image, elitism (and snobbery against Windows), and in general, inertia.
There are other factors in favor of Unix.
If you were to enumerate what constitutes "Windows" today, you would get a handful of systems providing essentially the same execution environment. "The" Windows environment is abundant and homogeneous.
Recall that we defined "Unix" to be a family of systems. If you were to enumerate what constitutes "Unix" today, you would get maddening diversity: in architectures, interfaces, flavors, distributions, and many more. Even apparently similar Unix systems, such as two Linux distributions, might be different enough to warrant considerable extra "work", if an attacker were to create (the easy part) and deploy (the hard part), say, a virus. Creating malware, as we have seen, is a technical problem, easily solved on any platform. Spreading malware involves operational and situational issues, which are apparently less of an obstacle on Windows than any other platform.