Document retrieved from a search engine cache. Original document URL: http://oit.cmc.msu.ru/lectures/FirewallPolicyGuide(NCSA).txt
Date modified: Mon Dec 20 15:52:25 1999
Date indexed: Mon Oct 1 22:17:34 2012

NCSA Firewall Policy Guide

1. INTRODUCTION

The Internet, the global network of computers that is the basis for
universal electronic mail, the World Wide Web, and numerous forms of
electronic commerce, has variously been described as bigger than the
personal computer, more significant than the printing press, and as
revolutionary as the discovery of fire. These days, the computer section of
every book store is crammed with Internet titles. Every new movie has a Web
site. Billboards and advertisements without URLs are becoming the exception.

Yet firewalls, which are designed to control the flow of information
between two networks, were being developed even before the world at large
had heard of "The Internet". Indeed, common sense says you should consider
using a firewall whenever you internetwork. This term refers to the process
of connecting two networks together. The result is referred to as an
"internet" without the capital 'I'. Typically, we reserve the term
"Internet" for the TCP/IP-based descendant of ARPAnet's marriage to CSnet in
1982, now serving tens of millions of users via hundreds of thousands of
host machines.

Internetworking For computers to successfully communicate with each
other, they have to follow standards and observe rules or protocols. TCP/IP
stands for Transmission Control Protocol/Internet Protocol, the fundamental
protocol of the Internet. Although initial development of TCP/IP occurred
within a defense and government environment, it is important to note that it
was designed to be reliable, not secure. The intent was to develop a
protocol that is good at getting information to its destination, even if
different parts of the information have to travel different paths. However,
because this development took place within an environment of trust, with a
relatively small number of participants, many of whom were known to each
other, security of the data in transit, or the internetwork connections
which it traversed, was not a major concern.

Now the Internet has become global, with tens of millions of users,
almost all of whom are completely unknown to you. So it is no longer wise to
trust other computers or users on the Internet. But the Internet is not the
only place you will find "untrusted" computers. Think about any network that
you do not manage or control. To what extent can you trust it? Do you really
want to connect it to your network without any way of controlling the
traffic between the two? These days, whenever you connect your trusted
network to someone else's untrusted network, it is wise to place a firewall
of some kind between the two. This helps you keep insiders in and outsiders
out. For example, firewalls would be appropriate at points C and D in Figure
1, but may not be needed at points A and B.

Figure 1: The placement of firewalls

The idea is not to cut off communication at these points, but to control
it. This means controlling which users may pass data between the networks on
either side of the firewall, as well as the types of data they are allowed to
exchange. These principles apply at all levels of internetworking, from
small offices to corporate offices, from a couple of interconnected LANs to
corporate WANs, from Web surfing machines to electronic commerce servers.

Internet Risks So what risks do you face when connecting networks to
each other or the Internet? A recent Ernst & Young survey found that four
out of five large organizations (those with more than 2,500 employees) are
running mission-critical applications on local area networks. Those LANs,
and the vital information they are processing, are increasingly threatened
by internetwork connections. For example, when NCSA studied a profile group
of 61 large organizations, those organizations reported 142 separate security
breaches and system-hacking encounters over the preceding three months. IP
spoofing, which can
be used to gain widespread access to an internal network, accounted for 49
of these encounters. Yet a recent Corporate Information Technology Policies
Survey conducted by the Chicago-based information technology law firm of
Gordon & Glickson revealed that less than half of respondents performed
routine security checks. Only 44% had the ability to track access to
sensitive data and only one third used any form of encryption.

At the same time 98% of these same companies provide access to the
Internet to some employees, 97% provide remote access to corporate networks,
61% host their own Web site, and 9 out of 10 permit some level of access to
commercial on-line services such as CompuServe. To this recipe for disaster
you can add another ingredient: the way that people in the Gordon & Glickson
survey dealt with access to the Internet. While 75% say they would like to
restrict access to some parts of the Internet, only 62% had policies
governing Internet access, 42% did not monitor employee Internet use and
only 30% actually applied access controls. Furthermore, fewer than two out
of five respondents said they imposed restrictions on downloading files from
third parties. No wonder that one out of six surveyed corporations reported
experiencing damage associated with Internet usage by employees (one in
eight reported legal claims arising from the use of information technology
by an employee).

The risks related to using the Internet range from public embarrassment,
when a Web site is defaced (as happened to the U.S. Department of Justice
and the Central Intelligence Agency in 1996) or internal correspondence is
revealed, to theft of trade or government secrets from a poorly protected
internal network. Risks include coordinated and systematic abuse of
computing resources, sometimes for mounting attacks on other sites
[Stall95a]. Consider the findings of the United States General Accounting
Office, which was asked by the Senate Committee on Governmental Affairs to
report on the current vulnerability of Department of Defense non-classified
computer systems. Here are some highlights:

"Unknown and unauthorized individuals are increasingly attacking and
gaining access to highly sensitive unclassified information on the
Department of Defense's computer systems... as many as 250,000 attacks last
year... successful 65 percent of the time.... At a minimum, these attacks
are a multi-million dollar nuisance to Defense. At worst, they are a serious
threat to national security. Attackers have seized control of entire Defense
systems... stolen, modified, and destroyed data and software... installed
unwanted files and "back doors" which circumvent normal system protection
and allow attackers unauthorized access in the future. They have shut down
and crashed entire systems and networks, denying service to users who depend
on automated systems to help meet critical missions. Numerous Defense
functions have been adversely affected, including weapons and supercomputer
research, logistics, finance, procurement, personnel management, military
health, and payroll."

Whether it is viruses, Trojan horses, or penetration of internal
networks, the most important factor affecting network security today is
clearly the Internet. If your network is connected to the Internet you have
a whole new set of problems, some of which make pre-existing problems worse.
If your network is not connected to the Internet, you are most likely facing
pressure to make that connection, even if it is merely a demand for
electronic mail. This pressure is so strong that some organizations find
that they are already connected to the Internet even though upper management
has not authorized any such connections.

Connecting to the Internet is a bit like opening the shades on the
office windows and letting in the full glare of the midday sun. Problems
with network security that were previously invisible are thrown into sharp
contrast. Unprotected guest accounts and obvious passwords might not have
been much of a problem when your network was only visible to insiders. But
if people manage to penetrate your network from the outside (something
experienced by at least one out of every six respondents in several recent
surveys) you can bet these weaknesses will be exploited. And news of such
vulnerabilities can spread through "the underground" at the speed of
electrons, leading to rapidly escalating attacks and system abuse.

Such incidents represent more than kids getting their kicks with modems.
Systematic and automated probing of new Internet connections is being
carried out by a shady cast of characters that includes hackers-for-hire,
information brokers, and foreign governments. One in five companies
responding to the annual Information Week/Ernst & Young Security Survey
admitted that intruders had broken into, or had tried to break into, their
corporate networks, via the Internet, during the preceding twelve months
[Info]. And most experts agree that the majority of break-ins go undetected.
For example, attacks by the Defense Information Systems Agency (DISA) on
38,000 Department of Defense computer systems had an 88% success rate but
were detected by less than one in twenty of the target organizations. Of
those organizations, only 5% actually reacted to the attack [Wash]. The
bottom line is that when you connect your network to another network, bad
things can happen.

Internetwork Protection Firewalls come into the picture when any of the
networks that you are internetworking are untrusted. The Internet is always
assumed to be untrusted, but experience tells us that we really shouldn't
trust any network, even ones within our own company, unless we have full
assurance of their security status. In other words, if you are responsible
for the company's sales and marketing network you shouldn't just assume that
the company's production and inventory network is trustworthy, at least not
without some fairly strong assurances. Besides, can you really trust, or do
you even know about, all of the other networks that are connected to the
production and inventory network? This might sound paranoid, but that
doesn't mean it is unreasonable. An analogy might be a floppy disk handed to
you by a colleague. Even though you are assured it is virus-free, prudence
dictates that you scan it for viruses anyway.

So firewalls should be considered whenever you connect trusted networks
to untrusted networks. This means they are sometimes appropriate within an
organization, for example to control access between segments of a wide area
network, but they are almost always appropriate when you connect a company
network to the Internet. In a moment we will discuss how firewalls work and
the role they play in internetwork security.

Firewall Limitations Information security professionals often find
themselves working against misconceptions and popular opinions formed from
incomplete data. Some of these opinions spring more from hope than fact,
such as the idea that internal network security problems can be solved
simply by deploying a firewall. It is true that firewalls deserve to be near
top of the agenda for organizations who have, or are thinking about
creating, a connection between their network and another network. However,
firewalls are not the whole answer.

For a start, firewalls are not the answer to attacks behind the
firewall. The nature of firewall protection is perimeter defense [Amor].
Firewalls are not general-purpose access control systems and they are not
designed to control insiders abusing authorized access behind the firewall.
Information security surveys consistently report that more than half of all
incidents are insider attacks (many seasoned security professionals refer to
the 80/20 rule to describe the relative probability that a problem was
caused by insiders as opposed to outsiders).

Firewalls are not a solution to the malicious code problem. There are
two parts to this problem, viruses, self-replicating code that can cause
considerable disruption on networks as well as individual workstations, and
Trojan horses, programs pretending to be something they are not, such as
"password sniffers." To put this problem in perspective, the 1997 NCSA Virus
Study reports that virtually all North American companies and large
organizations have experienced virus infections. Some 90% of organizations
with more than 500 PCs experience, on average, at least one virus incident
per month. The cost of incidents averages over $8,000 and can run as high as
$100,000, with survey results indicating that the problem is getting worse
rather than better. New types of viruses which use macro languages are
spreading through shared documents, not programs. They can travel over the
Internet or through the World Wide Web as e-mail attachments. The Web itself
is a source of virus programs, which can be downloaded from a number of
sites. An additional complication is that many naïve users allow their
e-mail program or their operating systems to load and interpret e-mail
attachments such as MS-Word documents or HTML files without scanning for
harmful code. The Web is also a potential path for Trojan code (e.g., Java
applets or ActiveX controls), which is a potentially serious problem for
distributed application technologies.

Some firewalls can be configured to check incoming code for signs of
viruses and Trojan horses; however, defenses, while helpful, are not
foolproof. As far as Trojan code is concerned, current defenses are
essentially limited to barring known programs, which leaves a big gap
through which new Trojan programs may slip. Furthermore, firewalls can only
be expected to address one aspect of the malicious-code problem. Many virus
infections still occur because people have introduced infected disks into
the network. A typical example is the traveling salesperson who returns with
an infected laptop which is then attached to the network and infects it.
Another classic is the maintenance engineer who uses an infected disk to
test machines. Proper anti-virus policies and procedures can reduce these
risks, but virus-scanning firewalls are only part of the answer.

Another fact lost in the hyperbole about the Internet is that many of
the hacking incidents reported by the media have very little to do with the
Internet itself. Indeed, one of the most widely used hacking techniques is
social engineering, which essentially means tricking someone, either in
person or over the telephone, into revealing something like their network
password. And even though many companies now have an Internet connection,
phone lines intended for data, such as remote maintenance lines and field
office access lines, are still popular as means of gaining access to
internal systems.

In other words, efforts to protect data from Internet threats should not
take place in a vacuum. It must be stressed that there is little point in
installing
a firewall if you haven't addressed the infosec basics, like classifying and
labeling data according to its sensitivity, password protecting
workstations, enforcing anti-virus policies, and tracking removable media.
One effect of the Internet phenomenon has been to hold up a mirror to
internal networks. What a lot of companies see is not pretty. The problem of
securing desktop PCs was not adequately addressed before we cabled, some
might say cobbled, them together to form local area networks (LANs). The
problem of securing LANs was not solved before they became wide area
networks (WANs). These facts come back to haunt us as we rush toward GANs or
global area networks [Eward].

Other Internet Problems Another oft-neglected security-related Internet
fact is that nobody owns the Internet. While the lack of ownership is
sometimes mentioned in articles about the Internet, the implications for
security, which are both positive as well as negative, are seldom
highlighted. The most obvious negative implication is that the Internet
includes some wild and lawless places. Traditionally a playground for
hackers, the Internet has no central authority. Despite recent rumblings
about "cyber-cops" from the U.S. Department of Justice, there is currently
no Internet police force and we are not likely to see one. The
trans-national nature of the Internet alone makes any such attempts at
policing problematic at best.

Ironically, this very lack of ownership has resulted in a growing
awareness of security. Because nobody owns the Internet, nobody is obliged
to minimize the risks associated with using it. Not so long ago, mainframe
makers assured users that their systems were safe and secure behind locked
doors. When personal computers first started appearing on corporate desktops
they were blasted as a security risk by some purveyors of big iron, but soon
these vendors were talking up their own PCs and talking less about security.
The trend continued as PCs came together as LANs.

With the exception of a few vendors selling security products, the lack
of talk about security persisted during the aggregation of LANs into WANs
and the enormous marketing push towards client/server solutions. But when
you start talking about the transition from WANs to Internet-based GANs, the
lack of security is well documented. We now hear major hardware and software
vendors talking publicly about Internet risks as they market their security
solutions, no longer obliged to overlook security issues. The potential
benefits of using the public Internet rather than dedicated private networks
are so financially compelling that few organizations feel they can afford to
turn their back on the Internet just because it is inherently insecure.

Unfortunately SATAN isn't as portable as we would like it to be, but it
still will run on a fairly large number of Un*x machines. One of the main
problems we had is that for it to do all of the tasks that we wanted and to
actually be able to release it within any reasonable time frame, we had to
both rely on many other publicly available tools and forgo much of our
usual testing methodologies. Still under development, and most often used by
us as a research and discovery tool, it will become more robust and portable
as we get feedback and are able to test it on more platforms ourselves.

Operating systems

Currently SATAN is known to work on the following Operating Systems:

SunOS 4.1.3_U1 SunOS 5.3 Irix 5.3

Hardware platforms

SATAN has been tested with the following machines:

SPARCstation 4/75 SPARCstation 5 Indigo 2

However, it should run on quite a few more; try typing make for a list
of the ones we think it'll work on (currently, this is AIX, BSD types,
IRIX5, HP-UX 9.x, SunOS 4 & 5, SYSV-R4, Ultrix 4.x, and maybe, just maybe,
with a bit of tweaking, Linux.)

Disk space

Approximately 20 megabytes of total space is needed to install all of
the supplementary packages and the SATAN program. The bulk of this is due to
the other software packages, chiefly Mosaic or Netscape (5.5 MB or 2.5 MB on
a Sun) and perl5 (10 MB); SATAN itself takes up about two megabytes of
space, including the documentation. If the supplementary programs are
already installed, it isn't necessary to reinstall them. If you use the
binaries only, it can be as small as 5 MB for a full installation.

Memory

Memory is another issue - this is very dependent on how many hosts
you're scanning or have in your database, but rest assured, SATAN is a
real pig when it comes to memory. From our experiences:

A scan of approximately 1500 hosts, with approximately 18000 facts in
the facts file, took about 14 megabytes of memory on a SPARC 4/75 running
SunOS 4.1.3.

A scan of approximately 4700 hosts, with about 150000 facts, took up
almost 35 megabytes of memory on an Indigo 2.

Needless to say, swapping is very painful if you don't have enough
memory.
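
The two measurements above suggest a roughly linear memory footprint per host. The sketch below interpolates between them; the slope and fixed overhead are our own back-of-envelope figures, not numbers from the SATAN documentation:

```python
# Rough SATAN memory estimator, fitted to the two measurements quoted
# above: ~14 MB for ~1500 hosts and ~35 MB for ~4700 hosts.
def estimate_memory_mb(hosts: int) -> float:
    """Linear fit through the two reported data points (an assumption)."""
    slope = (35.0 - 14.0) / (4700 - 1500)   # ~0.0066 MB per host
    base = 14.0 - slope * 1500              # ~4.2 MB fixed overhead
    return base + slope * hosts

for n in (1500, 4700, 10000):
    print(f"{n:>6} hosts -> ~{estimate_memory_mb(n):.0f} MB")
```

By this estimate a 10000-host scan would want on the order of 70 megabytes, which on mid-1990s hardware makes the warning about swapping very concrete.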

Other software tools required and where you can get them

We realize that you may not have all of the additional software required
to run SATAN already on your system. If you're not on the Internet, we're
sorry but we currently do not have the resources to help you get all of
these programs. Perhaps at some point a tape or CD distribution could be
made (probably by a 3rd party) if the demand is high enough.

Although all of it is widely and freely available on the Internet at a
large number of sites, here are some easy places to find Perl, Mosaic, and
Netscape:

Perl, version 5.001
Mosaic, version 2.5
Netscape, version 1.0

2. Defining Terms

So how do we define a firewall? Broadly speaking, it is a system or
group of systems that enforces an access control policy between two networks
[FAQ]. More specifically, a firewall is a collection of components or a
system that is placed between two networks and possesses the following
properties:

1. all traffic from inside to outside, and vice-versa, must pass through
it;

2. only authorized traffic, as defined by the local security policy, is
allowed to pass through it; and

3. the system itself is immune to penetration [Ches94].

As we said earlier, a firewall is a mechanism used to protect a trusted
network from an untrusted network; the two networks in question are
typically an organization's internal network (trusted) and the Internet
(untrusted). But there is nothing in the definition of a firewall that ties
the concept to the Internet (remember that we defined the Internet as the
global network of networks that communicates using TCP/IP and an internet as
any connected set of networks).
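
Property 2 of the definition above is, in essence, a default-deny rule: anything the local security policy does not expressly authorize is refused. A minimal sketch of that idea (the rule format is hypothetical, not any particular product's syntax):

```python
# Toy illustration of property 2 above: only traffic explicitly
# authorized by the local security policy passes; everything else is
# dropped (default deny). Rules and ports here are hypothetical.
RULES = [
    # (direction, protocol, destination port)
    ("inbound",  "tcp", 25),   # SMTP to the mail gateway
    ("outbound", "tcp", 80),   # HTTP from inside clients
]

def filter_packet(direction: str, protocol: str, port: int) -> str:
    for rule in RULES:
        if rule == (direction, protocol, port):
            return "allow"
    return "deny"              # anything not expressly permitted

print(filter_packet("inbound", "tcp", 25))   # allow
print(filter_packet("inbound", "tcp", 23))   # deny: telnet not authorized
```

The important design choice is the final `return "deny"`: the policy enumerates what is permitted, and silence means refusal, rather than the reverse.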

Internal Firewalls Consider a manufacturing company that has different
networks for sales, marketing, payroll, accounting, production, and product
development. Over time, these have been connected because some users have
made a case for having access to more than one network. But it is probably
unnecessary and undesirable for all users to have access to all of these
networks. Although application level security may be used to protect
sensitive data in a wide area network that offers any-to-any connectivity,
segregation of networks by means of firewalls greatly reduces many of the
risks involved; in particular, firewalls can notably reduce the threat of
hacking between networks by insiders (44% of respondents to a
recent Infosecurity News/Yankee Group survey reported security compromises
by insiders). Insider hacking encompasses unauthorized or inappropriate
access to data and processing resources by employees, including authorized
users. It should be noted that the importance of insider abuse consistently
outranks that of external hacking in information security surveys.

Although the phenomenal growth of Internet connections has
understandably focused attention on Internet firewalls, modern business
practices continue to underline the importance of internal firewalls.
Consider mergers, acquisitions, reorganizations, outsourcing, joint
ventures, and strategic partnerships. In all but the most technologically
challenged industries these increasingly common occurrences have significant
internet implications. Suddenly, someone outside the organization needs
access to internal information. Multiple networks designed by different
people, according to different rules, are suddenly asked to trust each
other. In these circumstances, firewalls have an important role to play as a
mechanism to enforce an access-control policy between networks and to
protect trusted networks from those that are untrusted.

Gateways Medieval towns were often surrounded by huge walls for
protection. Access to and from the town was possible only through a limited
number of large gates or gateways. As a digital version of this concept,
"gateway" is now an important term often used as synonymous, or in
conjunction, with firewall; that is, a point of control through which
network traffic must pass. Internet firewalls are often referred to as
secure Internet gateways [Wack].

More specifically, a gateway is a computer that provides relay services
between two networks. As you can see from Figure 2, a firewall may consist
of several different components, including filters or screens that block
transmission of certain classes of traffic. A gateway is a machine or set of
machines that provides relay services which complement the filters. Another
term illustrated in Figure 2 is "demilitarized zone" or "DMZ" [Ches94]. This
is an area or sub-network between the inside and outside networks that is
partially protected. One or more gateway machines may be located in the DMZ.
Exemplifying a traditional security concept, defense-in-depth, the outside
filter protects the gateway from attack, while the inside filter guards
against the consequences of a compromised gateway [Ches94].
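
The defense-in-depth arrangement of Figure 2 can be sketched as two independent checks; the gateway address and the rule logic below are hypothetical:

```python
# Two-layer (defense-in-depth) sketch of the Figure 2 arrangement: the
# outside filter only admits traffic addressed to the DMZ gateway, and
# the inside filter only trusts traffic coming from that gateway, so a
# compromised gateway still cannot reach arbitrary inside hosts directly.
GATEWAY = "192.0.2.10"         # hypothetical DMZ gateway address

def outside_filter(dst: str) -> bool:
    """Outside filter: external traffic may reach only the gateway."""
    return dst == GATEWAY

def inside_filter(src: str) -> bool:
    """Inside filter: the inside net accepts only the gateway as source."""
    return src == GATEWAY

# A packet from the Internet bound for an internal host must therefore
# be relayed: Internet -> gateway (passes the outside filter), then
# gateway -> inside host (passes the inside filter).
print(outside_filter("192.0.2.10"))   # True
print(outside_filter("10.1.1.5"))     # False: no direct path inside
```

Because the two filters are evaluated independently, an attacker must defeat both layers, which is the point of defense-in-depth.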

Figure 2: Firewall schematics

3. Policy as the Key

Although it is helpful to diagram various configurations of filters and
gateways, it is imperative that we not lose sight of the broad definition of
a firewall as an implementation of security policy, not the totality of
security. A firewall is an approach to security; it helps implement a larger
security policy that defines the services and access to be permitted [Wack,
which is the basis for the rest of this section]. In other words, a firewall
is both policy and the implementation of that policy in terms of network
configuration, one or more host systems and routers, and other security
measures such as advanced authentication in place of static passwords. There
are two levels of network policy that directly influence the design, the
installation and the use of a firewall system:

Network Service Access Policy: a higher-level, issue-specific policy
which defines those services that will be allowed or explicitly denied from
the restricted network, plus the way in which these services will be used,
and the conditions for exceptions to this policy.

Firewall Design Policy: a lower-level policy which describes how the
firewall will actually go about restricting the access and filtering the
services as defined in the network service access policy.
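
One way to picture the relationship between the two levels is that the network service access policy is a statement about services, which the firewall design policy compiles into concrete filter rules. A hypothetical sketch (the service names, port mapping, and rule format are ours, not from any standard):

```python
# Hypothetical sketch of the two policy levels: a high-level network
# service access policy (which services are allowed, in which direction)
# compiled into low-level firewall design rules (protocol and port).
SERVICE_ACCESS_POLICY = {
    ("smtp",   "inbound"):  "allow",
    ("http",   "outbound"): "allow",
    ("telnet", "inbound"):  "deny",
}

WELL_KNOWN_PORTS = {"smtp": 25, "http": 80, "telnet": 23}

def design_rules(policy):
    """Translate the service-level policy into port-level filter rules."""
    return [
        (direction, "tcp", WELL_KNOWN_PORTS[svc], action)
        for (svc, direction), action in policy.items()
    ]

for rule in design_rules(SERVICE_ACCESS_POLICY):
    print(rule)
```

Keeping the two levels separate means management can debate the service-access policy in business terms while the firewall administrator maintains the derived rules.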

Network Service Access Policy While focusing on the restriction and use
of internetwork services, the network service access policy should also
include all other outside network access such as dial-in and SLIP/PPP
connections. This is important because of the "belt-and-bulge" effect, where
restrictions on one network service access can lead users to try others. For
example, if restricting access to the Internet via a gateway prevents Web
browsing, users are likely to create dial-up PPP connections in order to
obtain this service. Since these are non-sanctioned, ad hoc connections,
they are likely to be improperly secured while at the same time opening the
network to attack.

Network service-access policy should be an extension of a strong
site-security policy and an overall policy regarding the protection of
information resources in the organization. This includes everything from
document shredders to virus scanners, remote access to floppy disk tracking.
At the highest level, the overall organizational policy might state a few
broad principles. For example, the fictitious Megabank, Inc. might use the
following as a starting point for its information security policy:

A. Information is vital to the economic well-being of Megabank.

B. Every cost-effective effort will be made to ensure the
confidentiality, control, integrity, authenticity, availability and utility
of Megabank information.

C. Protecting the confidentiality, control, integrity, authenticity,
availability and utility of Megabank's information resources is a priority
for all Megabank employees at all levels of the company.

D. All information processing facilities belonging to Megabank will be
used only for authorized Megabank purposes.

Below this statement of principles come site-specific policies covering
physical access to the property, general access to information systems, and
specific access to services on those systems. The firewall's network
service-access policy is formulated at this level.

For a firewall to be successful, the network service-access policy
should be drafted before the firewall is implemented. The policy must be
realistic and sound. A realistic policy is one that provides a balance
between protecting the network from known risks while still providing users
reasonable access to network resources. If a firewall system denies or
restricts services, it usually requires the strength of the network
service-access policy to prevent the firewall's access controls from being
modified or circumvented on an ad hoc basis. Only a sound, management-backed
pol