Do you use a HOSTS file?

Archived from groups: comp.security.firewalls,comp.os.linux.security

Hello all,

I wrote a page about the hosts file and how to use it.
http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
What do you think about it, and what else could I add to it?

Stickman answered me: "Unfortunately, using the hosts file to block
unwanted content is terribly inefficient."
Why is that? Do you think squid is more efficient? Or iptables?
What about Microsoft OSes too?

Thanks
 
Archived from groups: comp.security.firewalls,comp.os.linux.security,comp.security.misc

Programmershouse wrote:

> I wrote a page about Host File and how to use it.
> http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
> What do you think about it and what else more could I add to it ?

You cannot know the absolute location of the hosts file. For example, the
hosts file of the XP machine I'm writing this on is not located where you
state it is. Neither is my browser cache. You are writing in second person
when you mean first.

> Stickman answered me : "Unfortunately, using the hosts file to block
> unwanted content is terribly inefficient."
> Why is that ?

Because it blocks nothing - it works by breaking name lookups. It isn't even
on topic for comp.security.firewalls. Follow-ups set.

Thor

--
http://www.anta.net/
 

Archived from groups: comp.security.firewalls

On 23 Jul 2004 07:34:57 -0700, spamprogrammershouse@yahoo.fr (Programmershouse)
wrote:

>Stickman answered me : "Unfortunately, using the hosts file to block
>unwanted content is terribly inefficient."
>Why is that ?

Because it takes two entries for each domain you want to block:

127.0.0.1 example.com
127.0.0.1 www.example.com

And if they have several subdomains and/or top-level domains that you want to
block, then it requires two entries for each of those, also. If example.com had
ad1.example.com through ad100.example.com and you wanted to block them all, it
would take another 200 entries. That'd be 202 entries in that example, and the
other top-level domains haven't been touched...and if there were more
subdomains...all at one entry per line. And if they add new ones, you have to
add those manually, once you find out about them, that is.
That's why it is "terribly inefficient."

A simple, single entry of example.com in DNSKong would block ALL mentioned
above, and more (it'd block ALL subdomains). That's one entry versus 202
entries (so far). And it will block any new/future subdomains, whether you know
about them or not, as is...no changes/additions needed.

Now let's say that example.com also has the same unwanted setup for the
top-level domains NET, ORG, and BIZ. That's another 606 entries, bringing the
total up to 808 lines/entries for your HOSTS file...JUST for this one
badguy...how many more do you want/need to block?
That's why it is "terribly inefficient."

And in DNSKong? Another three lines for a total of four. And DNSKong isn't
the only application available to do this "reduction."
If you had a blocking client that used regular expressions, the entire example
could be reduced to one entry (You could use a single entry of "example" in
DNSKong to achieve the same, but it would whole-word match (block) any address
with example in it, as opposed to being able to limit/target/specify as is
possible with regular expressions).
That's why it is "terribly inefficient."
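
For anyone curious what that kind of reduction looks like, here is a
minimal sketch using Python's re module. The hostnames are the made-up
ones from the example above, and DNSKong itself does its own matching
rather than full regular expressions, so this is only an illustration:

# One expression instead of hundreds of HOSTS-file lines: it matches
# example.com and every subdomain of it, under .com/.net/.org/.biz,
# while leaving lookalikes such as notexample.com alone.
import re

pattern = re.compile(r'(^|\.)example\.(com|net|org|biz)$', re.IGNORECASE)

tests = ['example.com', 'www.example.com', 'ad37.example.org',
         'ad100.example.biz', 'notexample.com', 'example.com.evil.tld']

for host in tests:
    print(host, '->', 'blocked' if pattern.search(host) else 'allowed')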

--
dak
 
Archived from groups: comp.security.firewalls,comp.os.linux.security

"Programmershouse" <spamprogrammershouse@yahoo.fr>
wrote in news:eeba0ece.0407230634.3b624881@posting.google.com:
> Hello all,
>
> I wrote a page about Host File and how to use it.
> http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
> What do you think about it and what else more could I add to it ?
>
> Stickman answered me : "Unfortunately, using the hosts file to block
> unwanted content is terribly inefficient."
> Why is that ? Do you think squid is more efficient ? Or iptables ?
> What about Microsoft OS too ?
>
> Thanks

As an example, why bother tracking and updating 50-plus fully qualified
hosts in a hosts file, like the one at
http://www.mvps.org/winhelp2002/hosts.txt, for all the Doubleclick sites
rather than using just one regular expression in a URL filter in a
firewall, like ":\/\/.*\.doubleclick\..*\/" (which might be better
understood as "://*.doubleclick.*/")? You cannot use wildcarding in the
hosts file, and that is why there are lots of entries for the same entity
you want to block.
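
If it helps, here is a rough demonstration of that single expression
using Python's re module (the URLs are invented, and the exact escaping
a particular firewall wants may differ):

# The one expression from above, tried against a few sample URLs.
import re

doubleclick = re.compile(r':\/\/.*\.doubleclick\..*\/')

urls = ['http://ad.doubleclick.net/adj/somesite;sz=468x60',
        'http://m3.uk.doubleclick.net/banner/',
        'http://www.example.com/page.html']

for url in urls:
    print(url, '->', 'blocked' if doubleclick.search(url) else 'allowed')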

In the hosts file, all hostnames must be fully qualified host names.
URL filtering eliminates having to list, maintain, and update dozens and
dozens of sites associated to just one entity that you want to block.
Even if your firewall only supports simplistic URL rules, you can
probably block on ".doubleclick.com/" and ".doubleclick.net/" to block
almost all of Doubleclick. Also, when your firewall does the blocking,
it typically inserts a message for the blocked content to alert you that
the firewall blocked it. When using the hosts file, all you get is an
error page where it is not obvious what caused the block nor that
anything was actually blocked. Instead, the error page looks like a
problem with the connection, the DNS records, your local DNS cache, or
whatever, and it does not announce itself as the agent of the block.

Since anyone that is using a hosts file to block access to some sites
should obviously also be running a firewall, use URL filtering rules in
the firewall that you already have running.

--
__________________________________________________
*** Post replies to newsgroup. Share with others.
(E-mail: domain = ".com", add "=NEWS=" to Subject)
__________________________________________________
 
Archived from groups: comp.security.firewalls

"Jbob" <nobody@SpamCox.net>
wrote in news:05-dnT-B0ta1r5ncRVn-sQ@comcast.com:
> eDexter v1.35 is very similar and you can use wildcards.

Anyone using a hosts file to block access to sites should obviously also
be running a firewall. If they are running a firewall, they don't need
to use the hosts file. Instead they can use URL filtering in their
firewall to block all hostnames and/or subdomains for a blocked domain.
A couple of URL filters for ".doubleclick.com/" and ".doubleclick.net/" to
block all Doubleclick sites are far easier to manage than 50 or more
entries listed in a hosts file, like the one at
http://www.mvps.org/winhelp2002/hosts.txt. That example hosts file has
about 4900 lines, and a list that size becomes unmanageable.
Instead of YOU managing what sites YOU want to block, you delegate that
authority to someone else, so you really don't have any idea of what you
are blocking.

It would be quite interesting to see how much smaller this example hosts
file would become if all domains were replaced and grouped under a
single regular expression like "://.*\.<domain>\..*\/" (don't remember
if the colon needs to be escaped) for each domain. With 54 entries for
Doubleclick getting replaced by one regular expression, you have reduced
effort by over 50 times. It is likely that the same reduction will not
occur for other domains, and there may be some that have more than 50
entries listed in the hosts file for the same targeted entity to block.
I suspect the hosts file would reduce to one-tenth, or less, of its size
by using regular expressions to group similar entries. You would then
have a better chance of knowing which sites you were blocking, rather
than hoping someone else's list matches your preferences, and hoping
that when a "server not found" error page appears you realize it was
caused by an entry in the hosts file, something you could only verify by
looking in the hosts file.
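
As a very rough sketch of that grouping (the sample lines below are
invented, not taken from the actual mvps.org file, and real hosts files
and real TLD rules are messier than "last two labels"):

# Group HOSTS-file entries by their last two labels and show how many
# lines each group would collapse into a single expression.
from collections import Counter

hosts_lines = """\
127.0.0.1 ad.doubleclick.net
127.0.0.1 ad2.doubleclick.net
127.0.0.1 uk.doubleclick.net
127.0.0.1 www.fastclick.com
127.0.0.1 media.fastclick.com
"""

groups = Counter()
for line in hosts_lines.splitlines():
    line = line.split('#', 1)[0].strip()        # drop comments and blanks
    if not line:
        continue
    hostname = line.split()[1]                  # "127.0.0.1 hostname"
    domain = '.'.join(hostname.split('.')[-2:]) # crude: last two labels only
    groups[domain] += 1

for domain, count in groups.items():
    print('%-18s %2d entries -> 1 expression' % (domain, count))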

Yes, there are products which can run as a local web server to intercept
requests sent to 127.0.0.1 to see if the resolved hostname matches an
entry in your hosts file, like eDexter, but then you have to waste the
resources (memory and CPU cycles) to leave it running all the time. I
use Norton Internet Security and it will display text within the web
page noting it blocked that content or page so I know what caused the
block (as opposed to wondering if there was some connection, DNS, or
site problem). If Norton can do it then I figure other firewalls have a
similar feature to let you know immediately and overtly that they
blocked something.

What I dislike about Norton is that you don't get to use regular
expressions, so you end up with ".doubleclick.com/" and
".doubleclick.net/" (which still work well), but mostly I dislike that
you need to install and enable their Parental Control feature. This
consumes 115MB of system RAM to load the category table used for
blocking sites by category (e.g., porn, cracking, or whatever). My
NAT router has very limited memory, so I cannot define many URL filters
in its firewall. When my subscription runs out, the features that I
will look for in a different firewall will be URL filters, regular
expression support in them, and overt notification in the browser of
the blocked content.


--
__________________________________________________
*** Post replies to newsgroup. Share with others.
(E-mail: domain = ".com", add "=NEWS=" to Subject)
__________________________________________________
 
Archived from groups: comp.security.firewalls

>
>
> Programmershouse wrote:
>
>> I wrote a page about Host File and how to use it.
>> http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
>> What do you think about it and what else more could I add to it ?
>
> You cannot know the absolute location of the hosts file. For example,
> the hosts file of the XP machine I'm writing this on is not located
> where you state it is.

Based on the information in the link, the HOSTS file is at that location
on my Win 2k and XP Pro machines. Where else is the HOSTS file going to be
located on an NT-based O/S? On Win 9x and ME, it's in C:\windows, I think.

If the HOSTS file is in play on the O/S, then I have not seen a setting
that can direct the O/S to look elsewhere for the HOSTS file.
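
(The place to check would be the DataBasePath value under
HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters, which is where
NT-based Windows reads that directory from. A quick way to see where it
currently points, assuming Python is installed on the box:)

# Read the HOSTS directory location from the registry (Windows-only sketch).
import winreg

key = winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE,
    r"SYSTEM\CurrentControlSet\Services\Tcpip\Parameters")
path, _type = winreg.QueryValueEx(key, "DataBasePath")
winreg.CloseKey(key)

print("HOSTS directory:", path)  # usually %SystemRoot%\System32\drivers\etc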

> Neither is my browser cache. You are writing in
> second person when you mean first.

That's on my XP Pro machine.

C:\Documents and Settings\username\Local Settings\Temporary Internet
Files

>
>> Stickman answered me : "Unfortunately, using the hosts file to block
>> unwanted content is terribly inefficient."
>> Why is that ?
>
> Because it blocks nothing - it works by breaking name lookups. It
> isn't even on topic for comp.security.firewalls. Follow-ups set.

I think it's a good tool for blocking browser redirects to a dubious site
that could download something to the machine. If the HOSTS file is in
play and the domain name being redirected to is in the HOSTS file with
the loopback IP, then the redirect is going to be stopped.

I expect that for any program, Web application or not, that uses a URL to
access a site whose domain name is in the HOSTS file with 127.0.0.1, the
contact will be stopped by the machine.

However, if IE is using the proxy setting, then the HOST file is
bypassed.

I use the HOSTS file on my machines and have no problems in doing so. I
think it's a limited measure to protect the machine, IMHO. The HOSTS file
should be locked down, as it can be hacked.
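
One small step toward locking it down, sketched in Python here only
because it's handy; marking the file read-only with Explorer or attrib
does the same thing, and anything running as Administrator can undo it
anyway:

# Mark the HOSTS file read-only (the default NT-style path is assumed).
import os
import stat

hosts = r"C:\WINDOWS\system32\drivers\etc\hosts"
os.chmod(hosts, stat.S_IREAD)  # on Windows this just sets the read-only flag
print("writable?", os.access(hosts, os.W_OK))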

Duane :)
 
Archived from groups: comp.security.firewalls

On Mon, 26 Jul 2004 03:15:13 GMT, K2NNJ spoketh

>France!.....ewwwwwwwww
>
>Boycott France!
>

Why?

Lars M. Hansen
http://www.hansenonline.net
(replace 'badnews' with 'news' in e-mail address)
 
Archived from groups: comp.security.firewalls

On Sun, 25 Jul 2004 21:49:55 -0500, *Vanguard* spoketh
>
>Anyone using a hosts file to block access to sites should obviously also
>be running a firewall. If they are running a firewall, they don't need
>to use the hosts file.

Not all firewalls have URL blocking...

Lars M. Hansen
http://www.hansenonline.net
(replace 'badnews' with 'news' in e-mail address)
 
Archived from groups: comp.security.firewalls

"Lars M. Hansen" <badnews@hansenonline.net>
wrote in news:sor9g09osu8rhsg03btv6ti93itkd7pm1b@4ax.com:
> On Sun, 25 Jul 2004 21:49:55 -0500, *Vanguard* spoketh
>>
>> Anyone using a hosts file to block access to sites should obviously
>> also be running a firewall. If they are running a firewall, they
>> don't need to use the hosts file.
>
> Not all firewalls have URL blocking...
>
> Lars M. Hansen
> http://www.hansenonline.net
> (replace 'badnews' with 'news' in e-mail address)

I know. What URL blocking there is in Norton Internet Security is very
limited. No wildcards and no regular expressions. Just simple string
matching, like ".doubleclick.com/". But I couldn't stand the 115MB loss
of system memory for their huge categorization table that gets loaded
with their Parental Control feature, so that got uninstalled and I lost
the simplistic URL filtering. When my subscription nears its
expiration, that will be something that I will look for in a firewall so
I have a manageably sized block list rather than thousands of entries in
a hosts file.

Know which firewalls do provide URL blocking? And which of those allow
regular expressions for entries in that list?

--
__________________________________________________
*** Post replies to newsgroup. Share with others.
(E-mail: domain = ".com", add "=NEWS=" to Subject)
__________________________________________________
 
Archived from groups: comp.security.firewalls

*Vanguard* wrote:
> "Lars M. Hansen" <badnews@hansenonline.net>
> wrote in news:sor9g09osu8rhsg03btv6ti93itkd7pm1b@4ax.com:
>
>>On Sun, 25 Jul 2004 21:49:55 -0500, *Vanguard* spoketh
>>
>>>Anyone using a hosts file to block access to sites should obviously
>>>also be running a firewall. If they are running a firewall, they
>>>don't need to use the hosts file.
>>
>>Not all firewalls have URL blocking...
>>
>>Lars M. Hansen
>>http://www.hansenonline.net
>>(replace 'badnews' with 'news' in e-mail address)
>
>
> I know. What URL blocking there is in Norton Internet Security is very
> limited. No wildcards and no regular expressions. Just simple string
> matching, like ".doubleclick.com/". But I couldn't stand the 115MB loss
> of system memory for their huge categorization table that gets loaded
> with their Parental Control feature, so that got uninstalled and I lost
> the simplistic URL filtering. When my subscription nears its
> expiration, that will be something that I will look for in a firewall so
> I have a manageably sized block list rather than thousands of entries in
> a hosts file.
>
> Know which firewalls do provide URL blocking? And which of those allow
> regular expressions for entries in that list?
>
Outpost Pro is one that can be configured to block specific URLs.
Trendmicro Internet Security provides wildcard URL blocking.
 
Archived from groups: comp.security.firewalls

"optikl" <optikl@invalid.net>
wrote in news:NPhNc.194098$Oq2.29013@attbi_s52:
> Outpost Pro is one that can be configured to block specific URL's.
> Trendmicro Internet Security provides wildcard URL blocking.

I scanned the online manual for Outpost Pro 2.1 at
http://www.agnitum.com/download/Outpost_Pro_User_Guide_(ENG).pdf to see
what URL filtering it provides and if wildcarding or regular expressions
are allowed. Section 6.7 discusses Content Filtering and uses the phrase
"To list particular web sites you do not want to displayed ...", but it
is not just displaying them that causes problems. I don't want anything
linked to, submitted to, downloaded from, or called from that site, not
just the images or web pages they may try to proffer. It also only
looks for keywords (i.e., substrings) *anywhere* in the URL whereas, in
some cases, I want to restrict the blocking based only on the domain
portion of the URL, not by a possible match somewhere within the path
under it or in the parameters, and in other cases I do want to match
anywhere within the URL, including the parameters (in the case of a
redirect). Regular expressions are not supported. I might be able to
specify ".domain." but there would be no guarantee the match only
occurred in the domain portion of the URL. I could try ".domain.tld/"
but then a site with a ccTLD of ".domain.tld.cctld/", like
".someplace.com.au/" would not get caught. What I see for URL filtering
in Outpost is the same as what I have (or could have) in Norton Internet
Security (but probably without the bloat of the category table for
Parental Control).
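
A small sketch of the distinction I mean, using Python's urllib.parse to
anchor the match to the host portion rather than to a substring anywhere
in the URL (the domain names are made up):

# Host-anchored matching: block when the parsed hostname is the target
# domain or a subdomain of it, including a ccTLD form, while ignoring a
# match that only appears in the path or the query parameters.
from urllib.parse import urlparse

def host_blocked(url, bad_domains):
    host = (urlparse(url).hostname or '').lower()
    return any(host == d or host.endswith('.' + d) for d in bad_domains)

bad = ['someplace.com', 'someplace.com.au']

for url in ['http://www.someplace.com/page.html',
            'http://ads.someplace.com.au/banner.gif',
            'http://example.org/?next=someplace.com']:
    print(url, '->', 'blocked' if host_blocked(url, bad) else 'allowed')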

Another user mentioned the Blockpost plug-in but trying to find a link
on their web pages to these user/community developed plug-ins is
exascerbating; rather than include them under the Downloads or Support
left-frame menu, you have to go look at the product description and
click on "Download plug-ins" under the "Existing Users" section
(although you may not yet be an existing user). That only lets me list
sites by their domain name or IP address. That won't help if the domain
is specified in the parameters of the URL for a redirect. It also
requires that I enter an FQDN, like "hostname.domain.tld", but a site may
use a ccTLD to designate their country, as in
"hostname.domain.tld.cctld", or even use the ccTLD as a subdomain, like
Doubleclick does with "hostname.cctld.doubleclick.net". And a nasty
domain may rotate or change their hostname, so "www.domain.tld" may
become "humphrey.domain.tld" and then "edina.domain.tld", and some don't
need a hostname, like "grc.com", so I don't want to include a hostname,
nor do I want to include a subdomain.

I figure "^(https?|ftp):////.*/.domain/..*//?" would restrict the block
to looking only at http(s) or ftp sites specified only in the domain
portion of the URL whereas ".*:////.*//.*/.domain/.(com|net).*" would
catch only the .com or .net TLD for that domain using any protocol but
only if the domain was specified in the parameters for a possible
redirect. Since I don't have a firewall that supports regular
expressions, I really can't test if the regular expressions that I
mention here will work as expected. Having to run a gateway running
Linux to provide a proxy or get an expensive firewall appliance or
enterprise-level internet gateway that can understand regular
expressions and then forcing all clients to use that proxy is just too
big and too expensive a task for a home network. It shouldn't take all
that just to get support for regular expressions in a personal firewall.
Maybe the developers figure we users are too stupid to figure out how to
read the product's documentation on how to write their flavor of regular
expressions.
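
Since I can't try them in a firewall, here is at least a rough check of
the two expressions with Python's re module. A firewall's regex flavor
may differ, and note that ".*" also matches "/", so I've used "[^\/]*"
in the first pattern to actually pin the match to the host portion:

# Rough test of the two expressions above. "[^\/]*" replaces ".*" in the
# first pattern so the match stays within the host portion of the URL.
import re

host_rule  = re.compile(r'^(https?|ftp):\/\/[^\/]*\.domain\.[^\/]*\/?')
param_rule = re.compile(r'.*:\/\/.*\/.*\.domain\.(com|net).*')

urls = ['http://www.domain.com/index.html',
        'ftp://files.domain.net/pub/',
        'http://example.org/redirect?to=http://ads.domain.com/x']

for url in urls:
    print(url)
    print('  host rule :', bool(host_rule.search(url)))
    print('  param rule:', bool(param_rule.search(url)))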

I can't tell what TrendMicro Internet Security provides regarding URL
filtering and support for regular expressions since they don't provide
an online or downloadable manual but just their quick start guide and
readme file.

--
__________________________________________________
*** Post replies to newsgroup. Share with others.
(E-mail: domain = ".com", add "=NEWS=" to Subject)
__________________________________________________
 
Archived from groups: comp.security.firewalls,comp.os.linux.security

Programmershouse wrote:
> Hello all,
>
> I wrote a page about Host File and how to use it.
> http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
> What do you think about it and what else more could I add to it ?
>
> Stickman answered me : "Unfortunately, using the hosts file to block
> unwanted content is terribly inefficient."
> Why is that ? Do you think squid is more efficient ? Or iptables ?
> What about Microsoft OS too ?
>
> Thanks

Windows actually has a hosts file too! On XP it's in
C:\WINDOWS\system32\drivers\etc
and there are networks, protocols, and services files there too.

To block unwanted IPs I'd use tcpwrappers (the files hosts.deny and
hosts.allow in /etc), and for Windows OSes I'd do it at the firewall or
router level (for example, set up ACLs on Cisco routers).

Dave
 
Archived from groups: comp.security.firewalls

On Tue, 27 Jul 2004 04:40:13 GMT, K2NNJ spoketh

>http://www.nydailynews.com/front/story/209251p-180208c.html
>

Yeah, Bill O'Reilly has always been a source of "fair and balanced"
news.

Lars M. Hansen
http://www.hansenonline.net
(replace 'badnews' with 'news' in e-mail address)
 
Archived from groups: comp.security.firewalls,comp.os.linux.security

spamprogrammershouse@yahoo.fr (Programmershouse) writes:

>http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
>What do you think about it and what else more could I add to it ?

That it's an absolute worst case kludge when you have no other alternative.

>Stickman answered me : "Unfortunately, using the hosts file to block
>unwanted content is terribly inefficient."
>Why is that ?

A number of reasons:
* You have to list every host separately, an impossibly long task.
* If your local machine doesn't run a web server, the references to
127.0.0.1 will take ages to time out, quite possibly taking longer
than just loading the ad banners in the first place!
* If you get around this by pointing those hosts entries at the closest
web server instead, such as your ISP's web server, that's even kludgier,
and has to be changed whenever you change ISPs.
* If your local machine does run a web server, pages may not display
nicely with missing parts.

>Do you think squid is more efficient ? Or iptables ?

www.privoxy.org

>What about Microsoft OS too ?

There's a Windoze version of privoxy too.

--
Craig Macbride <craig@f8d.com> http://www.f8d.com
---------------------------------------------------------------------------
I don't want to achieve immortality through my work...
I want to achieve it through not dying. - Woody Allen
 
Archived from groups: comp.security.firewalls

Thanks for the tip.................Thor...(rolling eyes)


"Thor Kottelin" <thor@anta.net> wrote in message
news:41061C68.67A48CA8@anta.net...
>
>
> K2NNJ wrote:
> >
> > http://www.nydailynews.com/front/story/209251p-180208c.html
>
> This is an international newsgroup about firewalls. If you want to discuss
> unrelated topics, such as the war against Iraq, take it elsewhere.
>
> Thor
>
> --
> http://www.anta.net/
 
Archived from groups: comp.security.firewalls,comp.os.linux.security

Dave Yingling wrote:

> Windows actually has a hosts file too!! On XP it's in
> C:\WINDOWS\system32\drivers\etc
> there is a networks, protocols, and services too.

Once again, it may just as well be elsewhere.
<URL:news:410217E1.AA792B18@anta.net>

Thor

--
http://www.anta.net/
 
Archived from groups: comp.security.firewalls,comp.os.linux.security

On Tue, 27 Jul 2004 12:16:29 +0300, Thor Kottelin spoketh

>
>
>Dave Yingling wrote:
>
>> Windows actually has a hosts file too!! On XP it's in
>> C:\WINDOWS\system32\drivers\etc
>> there is a networks, protocols, and services too.
>
>Once again, it may just as well be elsewhere.
><URL:news:410217E1.AA792B18@anta.net>
>
>Thor

By default, the hosts file is in the locations described. I figure if
someone has gone through the trouble of editing the registry to move
the file, then they would know where it was, and wouldn't need anyone
else's help in telling them where it can be found.

Lars M. Hansen
http://www.hansenonline.net
(replace 'badnews' with 'news' in e-mail address)
 

Archived from groups: comp.security.firewalls

On 27 Jul 2004 08:03:09 -0600, craig@f8d.com (Craig Macbride) wrote:

>* If your local machine doesn't run a web server, the references to
>127.0.0.1 will take ages to time out, quite possibly taking longer
>than just loading the ad banners in the first place!
>* If you get around this by pointing those hosts entries at the closest
>web server instead, such as your ISP's web server, that's even kludgier,
>and has to be changed whenever you change ISPs.
>* If your local machine does run a web server, pages may not display
>nicely with missing parts.

That's what eDexter <http://www.pyrenean.com/> is for. It acts as a web server
so you don't get the long timeouts, and places a small image in the blocked area
so it does display correctly.
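
The idea is simple enough to sketch in a few lines of Python 3 (this is
not eDexter, just an illustration of the approach): a tiny server on
127.0.0.1 that answers every request with a 1x1 GIF.

# Minimal "pixel server": every GET gets a small GIF back immediately, so
# hosts-file-blocked requests neither time out nor leave broken images.
from http.server import BaseHTTPRequestHandler, HTTPServer

PIXEL = (b'GIF89a'                                  # a commonly used 43-byte
         b'\x01\x00\x01\x00\x80\x00\x00'            # 1x1 GIF
         b'\x00\x00\x00\xff\xff\xff'
         b'\x21\xf9\x04\x01\x00\x00\x00\x00'
         b'\x2c\x00\x00\x00\x00\x01\x00\x01\x00\x00'
         b'\x02\x02\x44\x01\x00'
         b'\x3b')

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-Type', 'image/gif')
        self.send_header('Content-Length', str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

    def log_message(self, *args):
        pass                                        # keep the console quiet

if __name__ == '__main__':
    # Binding to port 80 usually requires admin/root rights.
    HTTPServer(('127.0.0.1', 80), PixelHandler).serve_forever()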

--
dak
 
Archived from groups: comp.security.firewalls,comp.os.linux.security

"Lars M. Hansen" wrote:
>
> On Tue, 27 Jul 2004 12:16:29 +0300, Thor Kottelin spoketh

> >Dave Yingling wrote:
> >
> >> Windows actually has a hosts file too!! On XP it's in
> >> C:\WINDOWS\system32\drivers\etc
> >> there is a networks, protocols, and services too.
> >
> >Once again, it may just as well be elsewhere.
> ><URL:news:410217E1.AA792B18@anta.net>

> By default, the hosts file are in the locations described. I figure if
> someone have gone through the trouble of editing the registry to move
> the file, then they would know where it was, and wouldn't need anyone
> elses help in telling them where it can be found.

I haven't edited the registry (in that respect), but my hosts file is still
not in C:\WINDOWS\system32\drivers\etc.

Thor

--
http://www.anta.net/
 
Archived from groups: comp.security.firewalls,comp.os.linux.security

On Tue, 27 Jul 2004 21:21:12 +0300, Thor Kottelin spoketh

>
>
>"Lars M. Hansen" wrote:
>>
>> On Tue, 27 Jul 2004 12:16:29 +0300, Thor Kottelin spoketh
>
>> >Dave Yingling wrote:
>> >
>> >> Windows actually has a hosts file too!! On XP it's in
>> >> C:\WINDOWS\system32\drivers\etc
>> >> there is a networks, protocols, and services too.
>> >
>> >Once again, it may just as well be elsewhere.
>> ><URL:news:410217E1.AA792B18@anta.net>
>
>> By default, the hosts file are in the locations described. I figure if
>> someone have gone through the trouble of editing the registry to move
>> the file, then they would know where it was, and wouldn't need anyone
>> elses help in telling them where it can be found.
>
>I haven't edited the registry (in that respect), but my hosts file is still
>not in C:\WINDOWS\system32\drivers\etc.
>
>Thor

Well, there are a number of reasons why that might be the case...

Lars M. Hansen
http://www.hansenonline.net
(replace 'badnews' with 'news' in e-mail address)
 
Archived from groups: comp.security.firewalls,comp.os.linux.security

On 27 Jul 2004 08:03:09 -0600, Craig Macbride spoketh



>* If your local machine doesn't run a web server, the references to
>127.0.0.1 will take ages to time out, quite possibly taking longer
>than just loading the ad banners in the first place!

No, it won't. That will only happen if you are running a software
firewall on your desktop that for some reason is quietly dropping
connection attempts to localhost. Normally, connections from localhost
to localhost on a closed port will result in a quick RST, not a slow
timeout.
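
Easy enough to check; a connect to a closed port on 127.0.0.1 comes back
refused in a few milliseconds rather than hanging (this sketch assumes
nothing is actually listening on the chosen port):

# Time how quickly a connection to a (presumably) closed localhost port fails.
import socket
import time

start = time.time()
try:
    socket.create_connection(('127.0.0.1', 81), timeout=5).close()
    print('connected -- something is actually listening on port 81')
except OSError as err:
    print('refused after %.3f seconds: %s' % (time.time() - start, err))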

>* If you get around this by pointing those hosts entries at the closest
>web server instead, such as your ISP's web server, that's even kludgier,
>and has to be changed whenever you change ISPs.

See above.

>* If your local machine does run a web server, pages may not display
>nicely with missing parts.
>

Image tags and embedded objects should have the size of the image/object
specified, in which case it doesn't matter whether the object or image is
loaded. The browser will simply set aside an area of the proper size on
the page and not load the object...

Lars M. Hansen
www.hansenonline.net
Remove "bad" from my e-mail address to contact me.
"If you try to fail, and succeed, which have you done?"
 
Archived from groups: comp.security.firewalls,comp.os.linux.security

"Lars M. Hansen" wrote:
>
> On Tue, 27 Jul 2004 21:21:12 +0300, Thor Kottelin spoketh

> >"Lars M. Hansen" wrote:

> >> By default, the hosts file are in the locations described. I figure if
> >> someone have gone through the trouble of editing the registry to move
> >> the file, then they would know where it was, and wouldn't need anyone
> >> elses help in telling them where it can be found.
> >
> >I haven't edited the registry (in that respect), but my hosts file is still
> >not in C:\WINDOWS\system32\drivers\etc.

> Well, there are a number of reasons why that might be the case...

Exactly. Therefore we should not overgeneralize.

Thor

--
http://www.anta.net/