Nobody would argue with the fact that we live in an insecure world. Though security is discussed in the computer industry perhaps more than in any other field, some kind of job-related security is necessary in any endeavor. A landscape gardener, for example, has to be conscious of safety when handling his or her tools; improper use could cost someone a finger or even a limb. Landscapers have even been attacked and killed by alligators in southern Florida, though that would be considered an unlikely occurrence. IT professionals don't have to worry about losing limbs or getting eaten by alligators, but they do have to worry about losing data or having their machines attacked by crackers. Just as a landscaper would never work without putting on gloves, a helmet, goggles, and other protective gear, any real IT professional would never think of running a machine, especially on a network, without securing it against attacks from the outside and without a backup system in case of hardware failure on the inside.
That said, when we talk about making a machine secure, we mean as secure as possible; 100 percent security is, at this stage of the game, not a realistic goal. Most software contains some sort of flaw that can be exploited. In fact, the networking protocols that run the Internet were not designed with security in mind. Back then, the Internet was a relatively small and happy family of computers run by several universities and the U.S. Department of Defense. Security only became a major concern after the World Wide Web became a factor in our daily lives. Due to massive growth, that small, happy family has become a huge, extended, and at times dysfunctional one. Attempts at increasing security consist largely of patches applied to the software that provides Internet services. Though there are periodic major incidents, such as Code Red in July of 2001 or Slammer in January of 2003, we have luckily been spared more frequent outages; the patches seem to be holding up. We can also consider ourselves fortunate to have Unix and Unix-like systems such as Linux forming the backbone of the Internet. Most experts agree that they are more secure by design than present Microsoft Windows systems.
60 percent of servers on the WWW are running Apache on some version of Unix or on a Linux distribution.
Linux, which took its core design from Unix systems, separates user accounts and sets strict limits on what users can do on the system. This makes it very difficult for viruses to invade the whole system. This shouldn't give us a false sense of security, however. Programs have to be installed by the root user, and if proper precautions aren't taken, he or she could download and install a malicious program, thus compromising the system. And that is only one way to compromise a system. Flaws in programs can be exploited, so the administrator needs to keep the system up to date and install newer versions of major programs when those flaws are found and fixed. Unpatched programs can leave the system vulnerable to denial-of-service attacks or even become routes of entry. And even when you're careful about the source of your programs and religiously update the machine, you still need to keep your system locked up by using an effective firewall.
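To give that last point some shape: on Linux, a basic host firewall can be expressed as a handful of iptables rules. The script below is only a sketch, not a production ruleset; it assumes a machine whose sole public service is SSH on the standard port 22, and it must be run as root.

```shell
#!/bin/sh
# Sketch of a minimal host firewall with iptables (run as root).
# Assumption: the only service we expose to the outside is SSH.

# Flush existing rules, then set default policies: drop all inbound
# and forwarded traffic, allow all outbound traffic.
iptables -F
iptables -P INPUT DROP
iptables -P FORWARD DROP
iptables -P OUTPUT ACCEPT

# Always allow traffic on the loopback interface.
iptables -A INPUT -i lo -j ACCEPT

# Allow replies to connections this machine initiated.
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# Allow incoming SSH so the machine can still be administered remotely.
iptables -A INPUT -p tcp --dport 22 -j ACCEPT
```

A drop-by-default INPUT policy like this one means any service you forget about is unreachable rather than exposed, which is exactly the posture the paragraph above argues for; each additional service then has to be opened deliberately with its own rule.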