
Do you use a HOSTS file?

Anonymous
July 23, 2004 11:34:57 AM

Archived from groups: comp.security.firewalls,comp.os.linux.security

Hello all,

I wrote a page about the hosts file and how to use it.
http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
What do you think of it, and what else could I add to it?

Stickman answered me: "Unfortunately, using the hosts file to block
unwanted content is terribly inefficient."
Why is that? Do you think Squid is more efficient? Or iptables?
What about Microsoft operating systems?

Thanks


Anonymous
July 23, 2004 11:53:48 PM

Archived from groups: comp.security.firewalls,comp.os.linux.security,comp.security.misc

Programmershouse wrote:

> I wrote a page about Host File and how to use it.
> http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
> What do you think about it and what else more could I add to it ?

You cannot know the absolute location of the hosts file. For example, the
hosts file of the XP machine I'm writing this on is not located where you
state it is. Neither is my browser cache. You are writing in second person
when you mean first.

> Stickman answered me : "Unfortunately, using the hosts file to block
> unwanted content is terribly inefficient."
> Why is that ?

Because it blocks nothing - it works by breaking name lookups. It isn't even
on topic for comp.security.firewalls. Follow-ups set.

Thor

--
http://www.anta.net/
July 24, 2004 9:47:10 AM

Archived from groups: comp.security.firewalls

On 23 Jul 2004 07:34:57 -0700, spamprogrammershouse@yahoo.fr (Programmershouse)
wrote:

>Stickman answered me : "Unfortunately, using the hosts file to block
>unwanted content is terribly inefficient."
>Why is that ?

Because it takes two entries for each domain you want to block:

127.0.0.1 example.com
127.0.0.1 www.example.com

And if a site has several subdomains and/or top-level domains that you
want to block, it takes two entries for each of those as well. If
example.com had ad1.example.com through ad100.example.com and you wanted
to block them all, that would take another 200 entries. That'd be 202
entries in this example, without touching the other top-level
domains...and more still if there were more subdomains...all at one
entry per line. And if they added new ones, you'd have to add those
manually, once you found out about them, that is.
That's why it is "terribly inefficient."
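
The arithmetic above can be sketched in a few lines. This is an illustration only: example.com and the ad1..ad100 subdomains come from the post, and the assumption that every name also needs a "www." variant follows the "two entries each" counting used there.

```python
# Sketch of the arithmetic above. A hosts file has no wildcards, so every
# fully qualified name gets its own 127.0.0.1 line; following the post's
# counting, each name is assumed to need a bare and a "www." variant.
def hosts_lines(domain, subdomains):
    names = [domain] + [f"{sub}.{domain}" for sub in subdomains]
    lines = []
    for name in names:
        lines.append(f"127.0.0.1 {name}")
        lines.append(f"127.0.0.1 www.{name}")
    return lines

subs = [f"ad{i}" for i in range(1, 101)]                   # ad1 .. ad100
one_tld = hosts_lines("example.com", subs)
print(len(one_tld))  # 202 lines for the .com domain alone
all_tlds = sum(len(hosts_lines(f"example.{tld}", subs))
               for tld in ("com", "net", "org", "biz"))
print(all_tlds)      # 808 lines once .net, .org and .biz are added
```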

A simple, single entry of example.com in DNSKong would block ALL mentioned
above, and more (it'd block ALL subdomains). That's one entry versus 202
entries (so far). And it will block any new/future subdomains, whether you know
about them or not, as is...no changes/additions needed.

Now let's say that example.com also has the same unwanted setup for the
toplevel domains NET and ORG and BIZ. That's another 606 entries, bringing the
total up to 808 lines/entries for your HOSTS file...JUST for this one
badguy...how many more do you want/need to block?
That's why it is "terribly inefficient."

And in DNSKong? Another three lines for a total of four. And DNSKong isn't
the only application available to do this "reduction."
If you had a blocking client that used regular expressions, the entire
example could be reduced to one entry. (A single entry of "example" in
DNSKong would achieve the same, but it would whole-word match, and so
block, any address with "example" in it, as opposed to being able to
target the match precisely with regular expressions.)
That's why it is "terribly inefficient."
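
As a rough illustration of that difference (not DNSKong's actual matching engine), here is a Python sketch comparing a plain substring rule against a domain-anchored regular expression; the hostnames are invented:

```python
import re

# Substring rule vs. domain-anchored regular expression, per the trade-off
# described above. Not DNSKong's actual engine; hostnames are invented.
anchored = re.compile(r"(^|\.)example\.(com|net|org|biz)$", re.IGNORECASE)

def regex_blocked(host):
    """One pattern covers every subdomain across four TLDs."""
    return bool(anchored.search(host))

def substring_blocked(host):
    """A plain 'example' rule also hits unrelated names."""
    return "example" in host

print(regex_blocked("ad57.example.org"))        # True
print(regex_blocked("counterexample.net"))      # False: anchored at a label edge
print(substring_blocked("counterexample.net"))  # True: substring overblocks
```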

--
dak
Related resources
July 25, 2004 9:20:23 PM

Archived from groups: comp.security.firewalls

eDexter v1.35 is very similar and you can use wildcards.
Anonymous
July 26, 2004 1:28:51 AM

Archived from groups: comp.security.firewalls,comp.os.linux.security

"Programmershouse" <spamprogrammershouse@yahoo.fr>
wrote in news:eeba0ece.0407230634.3b624881@posting.google.com:
> Hello all,
>
> I wrote a page about Host File and how to use it.
> http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
> What do you think about it and what else more could I add to it ?
>
> Stickman answered me : "Unfortunately, using the hosts file to block
> unwanted content is terribly inefficient."
> Why is that ? Do you think squid is more efficient ? Or iptables ?
> What about Microsoft OS too ?
>
> Thanks

As an example, why bother tracking and updating 50-plus fully qualified
hosts in a hosts file, like the one at
http://www.mvps.org/winhelp2002/hosts.txt, for all the Doubleclick sites,
rather than use just one regular expression in a URL filter in a
firewall, like ":\/\/.*\.doubleclick\..*\/" (which might be better
understood as "://*.doubleclick.*/")? You cannot use wildcards in the
hosts file, and that is why there are lots of entries for the same entity
you want to block.
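
A runnable approximation of that kind of URL filter, sketched in Python's regex flavor. The pattern below is an assumed equivalent of the quoted rule, not Norton's or any firewall's actual syntax, and the URLs are illustrative:

```python
import re

# A sketch of the URL filter described above: one pattern instead of
# 50-plus hosts-file entries. Assumed equivalent, not a firewall's syntax.
doubleclick = re.compile(r"://([^/]*\.)?doubleclick\.[^/]+/", re.IGNORECASE)

def blocked(url):
    """True if the URL's host portion is any Doubleclick name."""
    return bool(doubleclick.search(url))

print(blocked("http://ad.fr.doubleclick.net/adj/banner"))  # True: any subdomain
print(blocked("https://www.doubleclick.com/"))             # True: any TLD
print(blocked("http://example.com/doubleclick"))           # False: path only
```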

In the hosts file, all hostnames must be fully qualified host names.
URL filtering eliminates having to list, maintain, and update dozens and
dozens of sites associated to just one entity that you want to block.
Even if your firewall only supports simplistic URL rules, you can
probably block on ".doubleclick.com/" and ".doubleclick.net/" to block
almost all of Doubleclick. Also, when your firewall does the blocking,
it typically inserts a message in place of the blocked content to alert
you that the firewall blocked it. With the hosts file, all you get is an
error page that makes it obvious neither what caused the block nor that
anything was blocked at all; the error looks like a problem with the
connection, the DNS records, your local DNS cache, or whatever, and
nothing announces itself as the agent of the block.

Since anyone that is using a hosts file to block access to some sites
should obviously also be running a firewall, use URL filtering rules in
the firewall that you already have running.

--
__________________________________________________
*** Post replies to newsgroup. Share with others.
(E-mail: domain = ".com", add "=NEWS=" to Subject)
__________________________________________________
Anonymous
July 26, 2004 1:49:55 AM

Archived from groups: comp.security.firewalls

"Jbob" <nobody@SpamCox.net>
wrote in news:05-dnT-B0ta1r5ncRVn-sQ@comcast.com:
> eDexter v1.35 is very similar and you can use wildcards.

Anyone using a hosts file to block access to sites should obviously also
be running a firewall. If they are running a firewall, they don't need
to use the hosts file. Instead they can use URL filtering in their
firewall to block all hostnames and/or subdomains for a blocked domain.
A couple URL filters for ".doubleclick.com/" and ".doubleclick.net/" to
block all Doubleclick sites is far easier to manage than 50, or more,
entries listed in a hosts file, like the one at
http://www.mvps.org/winhelp2002/hosts.txt. The example hosts file
mentioned has 4900 lines; a list that size becomes unmanageable.
Instead of YOU managing which sites YOU want to block, you delegate that
authority to someone else, so you really have no idea what you are
blocking.

It would be quite interesting to see how much smaller this example hosts
file would become if all domains were replaced and grouped under a
single regular expression like "://.*\.<domain>\..*\/" (I don't remember
if the colon needs to be escaped) for each domain. With 54 entries for
Doubleclick getting replaced by one regular expression, you have reduced
the effort more than fifty-fold. The same reduction likely won't occur
for every domain, and there may be some that have more than 50 entries
listed in the hosts file for the same targeted entity. I suspect the
hosts file would shrink to one-tenth of its size, or less, by using
regular expressions to group similar entries. You would then have a
better chance of knowing exactly which sites you were blocking, rather
than hoping someone else's list matches your preferences, and hoping
that when you get a "server not found" error page you realize it was
caused by an entry in the hosts file, which you could only verify by
looking in the hosts file.
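
That grouping idea can be sketched like this. The sample lines are invented, and taking the last two labels as the "registered domain" is a deliberate simplification (it would mishandle ccTLDs such as .com.au):

```python
# Sketch of the grouping idea above: collapse hosts-file lines into one
# rule per registered domain. Sample lines are invented, and taking the
# last two labels as the "domain" is a simplification (it would mishandle
# ccTLDs such as .com.au).
sample = """\
127.0.0.1 ad.doubleclick.net
127.0.0.1 ad.de.doubleclick.net
127.0.0.1 ad.fr.doubleclick.net
127.0.0.1 www.doubleclick.com
127.0.0.1 banners.example.com
"""

def grouped(hosts_text):
    domains = set()
    for line in hosts_text.splitlines():
        parts = line.split()
        if len(parts) == 2 and parts[0] == "127.0.0.1":
            labels = parts[1].split(".")
            domains.add(".".join(labels[-2:]))  # crude registered-domain guess
    return domains

rules = grouped(sample)
print(len(rules))  # 3: five hosts-file lines collapse to three domain rules
```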

Yes, there are products which can run as a local web server to intercept
requests sent to 127.0.0.1 to see if the resolved hostname matches an
entry in your hosts file, like eDexter, but then you have to waste the
resources (memory and CPU cycles) to leave it running all the time. I
use Norton Internet Security and it will display text within the web
page noting it blocked that content or page so I know what caused the
block (as opposed to wondering if there was some connection, DNS, or
site problem). If Norton can do it then I figure other firewalls have a
similar feature to let you know immediately and overtly that they
blocked something.
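
What an eDexter-style helper does can be sketched as a tiny local web server that answers every request with a 1x1 transparent GIF, so blocked slots render immediately. This illustrates the idea only, not eDexter's implementation; the port in the comment is arbitrary:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# A 1x1 transparent GIF (43 bytes) -- enough to fill a blocked ad slot.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff!"
         b"\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
         b"\x00\x02\x02D\x01\x00;")

class PixelHandler(BaseHTTPRequestHandler):
    """Answer every GET aimed at the loopback address with the pixel,
    so pages referencing null-routed ad hosts render immediately."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

    def log_message(self, *args):
        pass  # keep the console quiet

# To actually serve the address that hosts-file entries point to:
# HTTPServer(("127.0.0.1", 80), PixelHandler).serve_forever()  # port 80 may need privileges
```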

What I dislike about Norton is that you really don't get to use regular
expressions so you end up with ".doubleclick.com/" and
".doubleclick.net/" (which still work well) but mostly I dislike that
you need to install and enable their Parental Control feature. This
consumes 115MB of system RAM to load their category table used for
blocking sites by their category (e.g., porn, cracking, or whatever). My
NAT router has very limited memory so I cannot define many URL filters
in its firewall. When my subscription runs out, one of the features
that I will look for in a different firewall will be URL filters, using
regular expressions in them, and if I get notified overtly in the
browser of the blocked content.


--
__________________________________________________
*** Post replies to newsgroup. Share with others.
(E-mail: domain = ".com", add "=NEWS=" to Subject)
__________________________________________________
Anonymous
July 26, 2004 4:15:26 AM

Archived from groups: comp.security.firewalls

>
>
> Programmershouse wrote:
>
>> I wrote a page about Host File and how to use it.
>> http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
>> What do you think about it and what else more could I add to it ?
>
> You cannot know the absolute location of the hosts file. For example,
> the hosts file of the XP machine I'm writing this on is not located
> where you state it is.

Based on the information in the link, the HOSTS file is at that location
on my Win 2K and XP Pro machines. Where else is the HOSTS file going to
be located on an NT-based OS? On Win 9x and ME, it's in C:\windows, I
think.

If the HOSTS file is in play on the OS, then I have not seen a setting
that can direct the OS to look elsewhere for the hosts file.

> Neither is my browser cache. You are writing in
> second person when you mean first.

That's on my XP Pro machine.

C:\Documents and Settings\username\Local Settings\Temporary Internet Files

>
>> Stickman answered me : "Unfortunately, using the hosts file to block
>> unwanted content is terribly inefficient."
>> Why is that ?
>
> Because it blocks nothing - it works by breaking name lookups. It
> isn't even on topic for comp.security.firewalls. Follow-ups set.

I think it's a good tool for blocking browser redirects to a site where
a dubious site can download something to the machine. If the HOSTS file
is in play and the domain name being redirected to is listed in it with
the loopback IP, then the redirect is going to be stopped.

I expect that for any program, Web application or not, that uses a URL
to access a site, if the domain name is in the HOSTS file with
127.0.0.1, the contact will be stopped by the machine.
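
The lookup side of that mechanism can be sketched as a hosts-file parser; the resolver consults this table before DNS, so a name mapped to 127.0.0.1 never reaches the real site. The sample entries below are invented:

```python
# Sketch of the lookup side of the mechanism described above: the resolver
# consults the hosts file first, so a name mapped to 127.0.0.1 never
# reaches the real site. Sample entries are invented.
def hosts_map(hosts_text):
    mapping = {}
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line:
            ip, *names = line.split()
            for name in names:
                mapping[name.lower()] = ip
    return mapping

sample = """\
# blocked advertisers
127.0.0.1 ads.example.com tracker.example.net
"""
table = hosts_map(sample)
print(table.get("ads.example.com"))  # '127.0.0.1' -> the contact is stopped
print(table.get("www.example.com"))  # None -> resolved normally via DNS
```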

However, if IE is using the proxy setting, then the HOST file is
bypassed.

I use the HOSTS file on my machines and have no problems doing so. It's
a limited measure to protect the machine, IMHO. The HOSTS file should be
locked down, as it can be hacked.

Duane :) 
Anonymous
July 26, 2004 11:46:14 AM

Archived from groups: comp.security.firewalls

On Mon, 26 Jul 2004 03:15:13 GMT, K2NNJ spoketh

>France!.....ewwwwwwwww
>
>Boycott France!
>

Why?

Lars M. Hansen
http://www.hansenonline.net
(replace 'badnews' with 'news' in e-mail address)
Anonymous
July 26, 2004 11:47:31 AM

Archived from groups: comp.security.firewalls

On Sun, 25 Jul 2004 21:49:55 -0500, *Vanguard* spoketh
>
>Anyone using a hosts file to block access to sites should obviously also
>be running a firewall. If they are running a firewall, they don't need
>to use the hosts file.

Not all firewalls have URL blocking...

Lars M. Hansen
http://www.hansenonline.net
(replace 'badnews' with 'news' in e-mail address)
Anonymous
July 26, 2004 5:11:28 PM

Archived from groups: comp.security.firewalls

"Lars M. Hansen" <badnews@hansenonline.net>
wrote in news:sor9g09osu8rhsg03btv6ti93itkd7pm1b@4ax.com:
> On Sun, 25 Jul 2004 21:49:55 -0500, *Vanguard* spoketh
>>
>> Anyone using a hosts file to block access to sites should obviously
>> also be running a firewall. If they are running a firewall, they
>> don't need to use the hosts file.
>
> Not all firewalls have URL blocking...
>
> Lars M. Hansen
> http://www.hansenonline.net
> (replace 'badnews' with 'news' in e-mail address)

I know. What URL blocking there is in Norton Internet Security is very
limited. No wildcards and no regular expressions. Just simple string
matching, like ".doubleclick.com/". But I couldn't stand the 115MB loss
of system memory for their huge categorization table that gets loaded
with their Parental Control feature, so that got uninstalled and I lost
the simplistic URL filtering. When my subscription nears its
expiration, that will be something that I will look for in a firewall so
I have a manageably sized block list rather than thousands of entries in
a hosts file.

Do you know which firewalls provide URL blocking? And which of those
allow regular expressions for entries in that list?

--
__________________________________________________
*** Post replies to newsgroup. Share with others.
(E-mail: domain = ".com", add "=NEWS=" to Subject)
__________________________________________________
Anonymous
July 27, 2004 5:01:33 AM

Archived from groups: comp.security.firewalls

*Vanguard* wrote:
> "Lars M. Hansen" <badnews@hansenonline.net>
> wrote in news:sor9g09osu8rhsg03btv6ti93itkd7pm1b@4ax.com:
>
>>On Sun, 25 Jul 2004 21:49:55 -0500, *Vanguard* spoketh
>>
>>>Anyone using a hosts file to block access to sites should obviously
>>>also be running a firewall. If they are running a firewall, they
>>>don't need to use the hosts file.
>>
>>Not all firewalls have URL blocking...
>>
>>Lars M. Hansen
>>http://www.hansenonline.net
>>(replace 'badnews' with 'news' in e-mail address)
>
>
> I know. What URL blocking there is in Norton Internet Security is very
> limited. No wildcards and no regular expressions. Just simple string
> matching, like ".doubleclick.com/". But I couldn't stand the 115MB loss
> of system memory for their huge categorization table that gets loaded
> with their Parental Control feature, so that got uninstalled and I lost
> the simplistic URL filtering. When my subscription nears its
> expiration, that will be something that I will look for in a firewall so
> I have a manageably sized block list rather than thousands of entries in
> a hosts file.
>
> Know which firewalls do provide URL blocking? And which of those allow
> regular expressions for entries in that list?
>
Outpost Pro is one that can be configured to block specific URLs.
Trend Micro Internet Security provides wildcard URL blocking.
Anonymous
July 27, 2004 5:01:34 AM

Archived from groups: comp.security.firewalls

"optikl" <optikl@invalid.net>
wrote in news:NPhNc.194098$Oq2.29013@attbi_s52:
> Outpost Pro is one that can be configured to block specific URL's.
> Trendmicro Internet Security provides wildcard URL blocking.

I scanned the online manual for Outpost Pro 2.1 at
http://www.agnitum.com/download/Outpost_Pro_User_Guide_(ENG).pdf to see
what URL filtering it provides and if wildcarding or regular expressions
are allowed. Section 6.7 discusses Content Filtering. It says "To list
particular web sites you do not want to be displayed ...", but it is not
just displaying them that causes problems: I don't want anything linked
to, submitted to, downloaded from, or called from such a site besides any
images or web pages it may try to proffer. It also only looks for
keywords (i.e., substrings) *anywhere* in the URL, whereas in some cases
I want to restrict the blocking to the domain portion of the URL alone,
not a possible match somewhere within the path or in the parameters, and
in other cases I do want to match anywhere within the URL, including the
parameters (in the case of a redirect). Regular expressions are not
supported. I might be able to specify ".domain." but there would be no
guarantee the match occurred only in the domain portion of the URL. I
could try ".domain.tld/", but then a site with a ccTLD, like
".domain.tld.cctld/" (e.g. ".someplace.com.au/"), would not get caught.
What I see for URL filtering in Outpost is the same as what I have (or
could have) in Norton Internet Security (but probably without the bloat
of the Parental Control category table).

Another user mentioned the Blockpost plug-in, but trying to find a link
on their web pages to these user/community-developed plug-ins is
exasperating; rather than including them under the Downloads or Support
left-frame menu, you have to look at the product description and click
"Download plug-ins" under the "Existing Users" section (although you may
not yet be an existing user). Blockpost only lets me list sites by their
domain name or IP address. That won't help if the domain is specified in
the parameters of the URL for a redirect. It also requires that I enter
a FQDN, like "hostname.domain.tld", but a site may use a ccTLD to
designate its country, as in "hostname.domain.tld.cctld", or even use the
ccTLD as a subdomain, as Doubleclick does with
"hostname.cctld.doubleclick.net". And a nasty domain may rotate or
change its hostname, so "www.domain.tld" may become "humphrey.domain.tld"
and then "edina.domain.tld", and some sites don't need a hostname at all,
like "grc.com", so I don't want to have to include a hostname or a
subdomain.

I figure "^(https?|ftp):\/\/.*\.domain\..*\/?" would restrict the block
to http(s) or ftp URLs where the domain appears in the host portion,
whereas ".*:\/\/.*\/.*\.domain\.(com|net).*" would catch only the .com
or .net TLDs for that domain under any protocol, but only where the
domain is specified in the parameters, for a possible redirect. Since I
don't have a firewall that supports regular expressions, I can't test
whether the expressions I mention here will work as expected. Having to
run a Linux gateway to provide a proxy, or to get an expensive firewall
appliance or enterprise-level internet gateway that understands regular
expressions, and then forcing all clients to use that proxy, is just too
big and too expensive a task for a home network. It shouldn't take all
that just to get support for regular expressions in a personal firewall.
Maybe the developers figure we users are too stupid to read the
product's documentation on how to write their flavor of regular
expressions.
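
For what it's worth, the two intents can be approximated and checked in Python's regex flavor. Firewall dialects will differ, and "domain" is the placeholder name from the post, so treat these as assumptions rather than the exact rules:

```python
import re

# Python approximations of the two intents described above. Firewall regex
# dialects differ, and "domain" is the placeholder name from the post, so
# these are assumptions, not the poster's exact rules.
host_part = re.compile(r"^(https?|ftp)://[^/]*\.domain\.[^/]+(/|$)", re.I)
in_params = re.compile(r"^.*://.*/.*\.domain\.(com|net)", re.I)

print(bool(host_part.match("http://www.domain.com/page")))         # True
print(bool(host_part.match("http://evil.com/?u=www.domain.com")))  # False
print(bool(in_params.match("http://evil.com/?u=www.domain.com")))  # True
```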

I can't tell what TrendMicro Internet Security provides regarding URL
filtering and support for regular expressions since they don't provide
an online or downloadable manual but just their quick start guide and
readme file.

--
__________________________________________________
*** Post replies to newsgroup. Share with others.
(E-mail: domain = ".com", add "=NEWS=" to Subject)
__________________________________________________
Anonymous
July 27, 2004 6:42:53 AM

Archived from groups: comp.security.firewalls,comp.os.linux.security

Programmershouse wrote:
> Hello all,
>
> I wrote a page about Host File and how to use it.
> http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
> What do you think about it and what else more could I add to it ?
>
> Stickman answered me : "Unfortunately, using the hosts file to block
> unwanted content is terribly inefficient."
> Why is that ? Do you think squid is more efficient ? Or iptables ?
> What about Microsoft OS too ?
>
> Thanks

Windows actually has a hosts file too! On XP it's in
C:\WINDOWS\system32\drivers\etc
(there are networks, protocols, and services files there as well).

To block unwanted IPs I'd use TCP Wrappers (the files hosts.deny and
hosts.allow in /etc), and on Windows I'd do it at the firewall or
router level (for example, set up ACLs on Cisco routers).

Dave
Anonymous
July 27, 2004 8:40:13 AM

Archived from groups: comp.security.firewalls

http://www.nydailynews.com/front/story/209251p-180208c....


"Lars M. Hansen" <badnews@hansenonline.net> wrote in message
news:pnr9g05npr7m8b6pgsmv1m4ip0ttmjcgln@4ax.com...
> On Mon, 26 Jul 2004 03:15:13 GMT, K2NNJ spoketh
>
> >France!.....ewwwwwwwww
> >
> >Boycott France!
> >
>
> Why?
>
> Lars M. Hansen
> http://www.hansenonline.net
> (replace 'badnews' with 'news' in e-mail address)
Anonymous
July 27, 2004 12:03:09 PM

Archived from groups: comp.security.firewalls,comp.os.linux.security

spamprogrammershouse@yahoo.fr (Programmershouse) writes:

>http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
>What do you think about it and what else more could I add to it ?

That it's an absolute worst case kludge when you have no other alternative.

>Stickman answered me : "Unfortunately, using the hosts file to block
>unwanted content is terribly inefficient."
>Why is that ?

A number of reasons:
* You have to list every host separately, an impossibly long task.
* If your local machine doesn't run a web server, the references to
127.0.0.1 will take ages to time out, quite possibly taking longer
than just loading the ad banners in the first place!
* If you get around this by pointing those hosts entries at the closest
web server instead, such as your ISP's web server, that's even kludgier,
and has to be changed whenever you change ISPs.
* If your local machine does run a web server, pages may not display
nicely with missing parts.

>Do you think squid is more efficient ? Or iptables ?

www.privoxy.org

>What about Microsoft OS too ?

There's a Windoze version of privoxy too.

--
Craig Macbride <craig@f8d.com> http://www.f8d.com
---------------------------------------------------------------------------
I don't want to achieve immortality through my work...
I want to achieve it through not dying. - Woody Allen
Anonymous
July 27, 2004 4:12:09 PM

Archived from groups: comp.security.firewalls

Thanks for the tip.................Thor...(rolling eyes)


"Thor Kottelin" <thor@anta.net> wrote in message
news:41061C68.67A48CA8@anta.net...
>
>
> K2NNJ wrote:
> >
> > http://www.nydailynews.com/front/story/209251p-180208c....
>
> This is an international newsgroup about firewalls. If you want to discuss
> unrelated topics, such as the war against Iraq, take it elsewhere.
>
> Thor
>
> --
> http://www.anta.net/
Anonymous
July 27, 2004 4:16:29 PM

Archived from groups: comp.security.firewalls,comp.os.linux.security

Dave Yingling wrote:

> Windows actually has a hosts file too!! On XP it's in
> C:\WINDOWS\system32\drivers\etc
> there is a networks, protocols, and services too.

Once again, it may just as well be elsewhere.
<URL:news:410217E1.AA792B18@anta.net>

Thor

--
http://www.anta.net/
Anonymous
July 27, 2004 4:16:30 PM

Archived from groups: comp.security.firewalls,comp.os.linux.security

On Tue, 27 Jul 2004 12:16:29 +0300, Thor Kottelin spoketh

>
>
>Dave Yingling wrote:
>
>> Windows actually has a hosts file too!! On XP it's in
>> C:\WINDOWS\system32\drivers\etc
>> there is a networks, protocols, and services too.
>
>Once again, it may just as well be elsewhere.
><URL:news:410217E1.AA792B18@anta.net>
>
>Thor

By default, the hosts file is in the locations described. I figure if
someone has gone through the trouble of editing the registry to move
the file, then they would know where it is, and wouldn't need anyone
else's help finding it.

Lars M. Hansen
http://www.hansenonline.net
(replace 'badnews' with 'news' in e-mail address)
July 27, 2004 10:43:08 PM

Archived from groups: comp.security.firewalls

On 27 Jul 2004 08:03:09 -0600, craig@f8d.com (Craig Macbride) wrote:

>* If your local machine doesn't run a web server, the references to
>127.0.0.1 will take ages to time out, quite possibly taking longer
>than just loading the ad banners in the first place!
>* If you get around this by pointing those hosts entries at the closest
>web server instead, such as your ISP's web server, that's even kludgier,
>and has to be changed whenever you change ISPs.
>* If your local machine does run a web server, pages may not display
>nicely with missing parts.

That's what eDexter <http://www.pyrenean.com/> is for. It acts as a web server
so you don't get the long timeouts, and places a small image in the blocked area
so it does display correctly.

--
dak
Anonymous
July 28, 2004 1:21:12 AM

Archived from groups: comp.security.firewalls,comp.os.linux.security

"Lars M. Hansen" wrote:
>
> On Tue, 27 Jul 2004 12:16:29 +0300, Thor Kottelin spoketh

> >Dave Yingling wrote:
> >
> >> Windows actually has a hosts file too!! On XP it's in
> >> C:\WINDOWS\system32\drivers\etc
> >> there is a networks, protocols, and services too.
> >
> >Once again, it may just as well be elsewhere.
> ><URL:news:410217E1.AA792B18@anta.net>

> By default, the hosts file are in the locations described. I figure if
> someone have gone through the trouble of editing the registry to move
> the file, then they would know where it was, and wouldn't need anyone
> elses help in telling them where it can be found.

I haven't edited the registry (in that respect), but my hosts file is still
not in C:\WINDOWS\system32\drivers\etc.

Thor

--
http://www.anta.net/
Anonymous
July 28, 2004 1:21:13 AM

Archived from groups: comp.security.firewalls,comp.os.linux.security

On Tue, 27 Jul 2004 21:21:12 +0300, Thor Kottelin spoketh

>
>
>"Lars M. Hansen" wrote:
>>
>> On Tue, 27 Jul 2004 12:16:29 +0300, Thor Kottelin spoketh
>
>> >Dave Yingling wrote:
>> >
>> >> Windows actually has a hosts file too!! On XP it's in
>> >> C:\WINDOWS\system32\drivers\etc
>> >> there is a networks, protocols, and services too.
>> >
>> >Once again, it may just as well be elsewhere.
>> ><URL:news:410217E1.AA792B18@anta.net>
>
>> By default, the hosts file are in the locations described. I figure if
>> someone have gone through the trouble of editing the registry to move
>> the file, then they would know where it was, and wouldn't need anyone
>> elses help in telling them where it can be found.
>
>I haven't edited the registry (in that respect), but my hosts file is still
>not in C:\WINDOWS\system32\drivers\etc.
>
>Thor

Well, there are a number of reasons why that might be the case...

Lars M. Hansen
http://www.hansenonline.net
(replace 'badnews' with 'news' in e-mail address)
Anonymous
July 28, 2004 4:16:33 AM

Archived from groups: comp.security.firewalls,comp.os.linux.security

On 27 Jul 2004 08:03:09 -0600, Craig Macbride spoketh



>* If your local machine doesn't run a web server, the references to
>127.0.0.1 will take ages to time out, quite possibly taking longer
>than just loading the ad banners in the first place!

No, it won't. That will only happen if you are running a software
firewall on your desktop that, for some reason, is quietly dropping
connection attempts to localhost. Normally, a connection from localhost
to localhost on a closed port will result in a quick RST, not a slow
timeout.
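
That claim is easy to check from a script. Port 59999 is an assumption (pick any port with no listener); on a machine with no local packet filter dropping loopback traffic, the connect attempt settles almost instantly:

```python
import socket
import time

# Checking the claim above: connecting to a closed port on localhost is
# refused (RST) immediately rather than timing out. Port 59999 is assumed
# to have no listener on this machine.
start = time.monotonic()
try:
    socket.create_connection(("127.0.0.1", 59999), timeout=5).close()
except OSError:
    pass  # expected: the stack answers a closed port with an RST right away
elapsed = time.monotonic() - start
print(f"connect attempt settled in {elapsed:.3f}s")
```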

>* If you get around this by pointing those hosts entries at the closest
>web server instead, such as your ISP's web server, that's even kludgier,
>and has to be changed whenever you change ISPs.

See above.

>* If your local machine does run a web server, pages may not display
>nicely with missing parts.
>

Image tags and embedded objects should have the size of the image or
object specified, in which case it doesn't matter whether the object or
image is loaded. The browser will simply set aside an area of the proper
size on the page and not load the object...

Lars M. Hansen
www.hansenonline.net
Remove "bad" from my e-mail address to contact me.
"If you try to fail, and succeed, which have you done?"
Anonymous
July 28, 2004 1:16:19 PM

Archived from groups: comp.security.firewalls,comp.os.linux.security

"Lars M. Hansen" wrote:
>
> On Tue, 27 Jul 2004 21:21:12 +0300, Thor Kottelin spoketh

> >"Lars M. Hansen" wrote:

> >> By default, the hosts file are in the locations described. I figure if
> >> someone have gone through the trouble of editing the registry to move
> >> the file, then they would know where it was, and wouldn't need anyone
> >> elses help in telling them where it can be found.
> >
> >I haven't edited the registry (in that respect), but my hosts file is still
> >not in C:\WINDOWS\system32\drivers\etc.

> Well, there are a number of reasons why that might be the case...

Exactly. Therefore we should not overgeneralize.

Thor

--
http://www.anta.net/
Anonymous
July 28, 2004 1:18:04 PM

Archived from groups: comp.security.firewalls

dak wrote:
>
> On 27 Jul 2004 08:03:09 -0600, craig@f8d.com (Craig Macbride) wrote:
>
> >* If your local machine doesn't run a web server, the references to
> >127.0.0.1 will take ages to time out, quite possibly taking longer
> >than just loading the ad banners in the first place!
> >* If you get around this by pointing those hosts entries at the closest
> >web server instead, such as your ISP's web server, that's even kludgier,
> >and has to be changed whenever you change ISPs.
> >* If your local machine does run a web server, pages may not display
> >nicely with missing parts.
>
> That's what eDexter <http://www.pyrenean.com/> is for. It acts as a web server
> so you don't get the long timeouts, and places a small image in the blocked area
> so it does display correctly.

If you want to run a web server, Apache would be a less exotic choice.

Thor

--
http://www.anta.net/
July 28, 2004 1:18:05 PM

Archived from groups: comp.security.firewalls

On Wed, 28 Jul 2004 09:18:04 +0300, Thor Kottelin <thor@anta.net> wrote:

>> That's what eDexter <http://www.pyrenean.com/> is for. It acts as a web server
>> so you don't get the long timeouts, and places a small image in the blocked area
>> so it does display correctly.
>>
>If you want to run a web server, Apache would be a less exotic choice.

eDexter only acts as a limited personal web server. Why run a web server when
it is not wanted, needed nor necessary? And eDexter is a hell of a lot smaller
than Apache.
Why don't you just read about it for yourself? Trying to compare eDexter with
Apache is comparing apples and oranges. If you read the website blurb on
eDexter and then the first few paragraphs of its release notes, you'll see how
ridiculous that Apache statement actually seems....

--
dak
Anonymous
July 28, 2004 3:07:12 PM

Archived from groups: comp.security.firewalls

dak wrote:
>
> On Wed, 28 Jul 2004 09:18:04 +0300, Thor Kottelin <thor@anta.net> wrote:

> >If you want to run a web server, Apache would be a less exotic choice.
>
> eDexter only acts as a limited personal web server. Why run a web server when
> it is not wanted, needed nor necessary?

You can limit Apache as much as you want. I have been very happy with Apache
as a personal proxy. It is much more useful than a hosts file kludge.

Oh, and Apache is free.

Thor

--
http://www.anta.net/
July 29, 2004 12:11:38 AM

Archived from groups: comp.security.firewalls (More info?)

On Wed, 28 Jul 2004 11:07:12 +0300, Thor Kottelin <thor@anta.net> wrote:

>You can limit Apache as much as you want. I have been very happy with Apache
>as a personal proxy. It is much more useful than a hosts file kludge.

eDexter doesn't need to be limited. eDexter does its job, and does it well, as
is.
I have been very happy with eDexter together with both HOSTS files and DNSKong.
eDexter is not a "hosts file kludge," as you so inaccurately put it. It is
far more useful and convenient than downloading a full web server and then
having to limit it.

>Oh, and Apache is free.

Oh, and eDexter is free.
And it's smaller: a 137KB (zipped) download vs 5MB-6MB (exe/msi) for Apache,
depending on the version. My entire eDexter directory consists of just 13 objects, weighing in
at a whopping 262KB.

Why don't you just find out about eDexter for yourself, as I previously
suggested, so you'll actually know what you are talking about in your
"comparisons."

--
dak
July 29, 2004 12:00:18 PM

Archived from groups: comp.security.firewalls (More info?)

dak wrote:
>
> On Wed, 28 Jul 2004 11:07:12 +0300, Thor Kottelin <thor@anta.net> wrote:

> >Oh, and Apache is free.
>
> Oh, and eDexter is free.

That's an interesting definition.

> Why don't you just find out about eDexter for yourself, as I previously
> suggested, so you'll actually know what you are talking about in your
> "comparisons."

According to <URL:http://www.pyrenean.com/pricing.php>, eDexter costs $25,
although it may be used for free "in Standard Mode on home computers for
personal use".

Thor

--
http://www.anta.net/
July 29, 2004 12:00:19 PM

Archived from groups: comp.security.firewalls (More info?)

"Thor Kottelin" <thor@anta.net>
wrote in news:41088462.890F9010@anta.net:
> dak wrote:
>>
>> On Wed, 28 Jul 2004 11:07:12 +0300, Thor Kottelin <thor@anta.net>
>> wrote:
>
>>> Oh, and Apache is free.
>>
>> Oh, and eDexter is free.
>
> That's an interesting definition.
>
>> Why don't you just find out about eDexter for yourself, as I
>> previously suggested, so you'll actually know what you are talking
>> about in your "comparisons."
>
> According to <URL:http://www.pyrenean.com/pricing.php>, eDexter costs
> $25, although it may be used for free "in Standard Mode on home
> computers for personal use".
>
> Thor

http://www.pyrenean.com/edexter.php says, "eDexter is free to
individuals for non-commercial use on a single machine." So if you have
more than one "personal" machine then you can't use eDexter except on
one of them. Then they change that policy at
http://www.pyrenean.com/pricing.php by saying, "eDexter server is
restricted to run on a single IP and is also priced at $25 per client"
and "Individuals using DNSKong or eDexter in Standard Mode on home
computers for personal use may use these products for free." Apparently
they can't make up their minds as to what will be free. Of course, they
don't bother to define what *IS* standard mode, but then they don't
bother to provide online documentation for the product. Must be the
programmer writing the web site. Although they write the code that
creates the product, programmers are notoriously lazy or poor writers
when it comes to *user* documentation. You don't even see system
requirements listed for the software, but then I didn't find them for
Apache, either. Hell, it seems you could use a pared down install of
IIS to do the same thing.

--
__________________________________________________
*** Post replies to newsgroup. Share with others.
(E-mail: domain = ".com", add "=NEWS=" to Subject)
__________________________________________________
July 29, 2004 12:54:39 PM

Archived from groups: comp.security.firewalls,comp.os.linux.security,comp.security.misc (More info?)

Thor Kottelin <thor@anta.net> wrote in message news:<4101429C.C3F47A91@anta.net>...
> Programmershouse wrote:
>
> > I wrote a page about Host File and how to use it.
> > http://www.ifrance.com/programmershouse/HOSTS-EN.HTML
> > What do you think about it and what else more could I add to it ?
>
> You cannot know the absolute location of the hosts file. For example, the
> hosts file of the XP machine I'm writing this on is not located where you
> state it is. Neither is my browser cache. You are writing in second person
> when you mean first.

Thor, I was talking about the usual location. Don't complicate things,
please.

>
> > Stickman answered me : "Unfortunately, using the hosts file to block
> > unwanted content is terribly inefficient."
> > Why is that ?
>
> Because it blocks nothing - it works by breaking name lookups. It isn't even
> on topic for comp.security.firewalls. Follow-ups set.
>
> Thor


Thor, I also asked questions about firewalls and proxies, as you would
have seen if you had read my message completely.

What is your solution for blocking ads ?
July 29, 2004 12:55:04 PM

Archived from groups: comp.security.firewalls (More info?)

dak <comp-security-firewalls@spamtrap.cjb.net> wrote in message

-SNIP-

> A simple, single entry of example.com in DNSKong would block ALL mentioned
> above, and more (it'd block ALL subdomains). That's one entry versus 202
> entries (so far). And it will block any new/future subdomains, whether you know
> about them or not, as is...no changes/additions needed.
>
> Now let's say that example.com also has the same unwanted setup for the
> toplevel domains NET and ORG and BIZ. That's another 606 entries, bringing the
> total up to 808 lines/entries for your HOSTS file...JUST for this one
> badguy...how many more do you want/need to block?
> That's why it is "terribly inefficient."
>
> And in DNSKong? Another three lines for a total of four. And DNSKong isn't
> the only application available to do this "reduction."
> If you had a blocking client that used regular expressions, the entire example
> could be reduced to one entry (You could use a single entry of "example" in
> DNSKong to achieve the same, but it would whole-word match (block) any address
> with example in it, as opposed to being able to limit/target/specify as is
> possible with regular expressions).
> That's why it is "terribly inefficient."
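The scaling difference described above can be sketched in a few lines. This is illustrative Python only, not DNSKong's or any other tool's actual code: a HOSTS file can only match exact names, while a single suffix rule covers a domain and every subdomain, present or future.

```python
# Illustrative sketch only - not any real tool's code.
# A HOSTS file matches exact names, so every host and subdomain needs
# its own entry; a suffix rule (DNSKong-style) needs just one.

hosts_blocklist = {"example.com", "www.example.com", "ads.example.com"}

def blocked_by_hosts(name):
    # Exact match only: new subdomains slip through until added by hand.
    return name in hosts_blocklist

def blocked_by_suffix(name, pattern="example.com"):
    # One rule covers the domain and every present or future subdomain.
    return name == pattern or name.endswith("." + pattern)

print(blocked_by_hosts("new.example.com"))   # False - needs a new entry
print(blocked_by_suffix("new.example.com"))  # True - already covered
print(blocked_by_suffix("notexample.com"))   # False - no partial matches
```

The same idea is why one DNSKong line replaces hundreds of HOSTS lines in the example above.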

Hi Dak,

OK, DNSKong looks like a better solution than a huge hosts file.
I have some questions:
- How do you translate hosts file entries into Named.txt entries?
- If I put blablo as one entry in Named.txt, will DNSK block all web
sites containing blablo in their address, like www.blablo.com or
ads.blablo.thissite.net? And what if I only want to block this.blablo.org?
- I don't understand why you use eDexter and DNSK at the same time: why
also have a pacdata.txt file with entries? Is it the same as
Named.txt, and why write the same entries in 2 different files?
- What is the difference between eDexter and DNSK?
- What is the simplest and easiest software to use?
- What other free and efficient ad-blocking software is there? I have
heard about Proxomitron, but it seems quite complicated. Which one
should I use?
July 30, 2004 10:16:45 AM

Archived from groups: comp.security.firewalls,comp.os.linux.security,comp.security.misc (More info?)

-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1

On Thursday, 29 July 2004, 17:54 Programmershouse tried to express an opinion:

> What is your solution for blocking ads ?

I use Privoxy on Linux.
It's a proxy server like WebWasher, only with more features.

www.privoxy.org
<quote>
"Privoxy is a web proxy with advanced filtering capabilities
for protecting privacy, modifying web page content, managing cookies,
controlling access, and removing ads, banners, pop-ups
and other obnoxious Internet junk.
Privoxy has a very flexible configuration and can be customized
to suit individual needs and tastes.
Privoxy has application for both stand-alone systems and multi-user networks."
</quote>

- --
Solbu - http://www.solbu.net
Remove 'ugyldig' for email
PGP key ID: 0xFA687324
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.2.2 (GNU/Linux)

iD8DBQFBCcuxT1rWTfpocyQRAi8mAKCBy9v/X+QDIsPV0N4EpVEO21vj2ACeMaRk
GykVnSPJLwhJ2jeJU11z3Rg=
=4umE
-----END PGP SIGNATURE-----
August 4, 2004 12:31:38 AM

Archived from groups: comp.security.firewalls (More info?)

On 29 Jul 2004 08:55:04 -0700, spamprogrammershouse@yahoo.fr (Programmershouse)
wrote:

>Ok DNSKong looks like a better solution than a huge hosts file.
>I have questions :
>-how do you translate hosts file entries into Named.txt entries?

I don't know of an easy way and I never looked for one as I did it manually,
mainly because I wanted to comb through the file and see exactly what was there
- a direct hands-on approach, so to speak - so I could completely control how I
wanted to set it up. I used UltraEdit to do this.
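For what it's worth, the bulk of such a translation could probably be scripted. This is a hypothetical sketch (dak did it by hand, and DNSKong's real format may differ), assuming a standard 127.0.0.1-style HOSTS file and a named.txt that is simply one name per line:

```python
# Hypothetical helper - assumes named.txt is a plain one-name-per-line
# list. Collapse a HOSTS blocklist into bare names, dropping the
# 127.0.0.1/0.0.0.0 prefixes, comments, blanks and duplicates.

def hosts_to_named(hosts_text):
    names = []
    for line in hosts_text.splitlines():
        line = line.split("#", 1)[0].strip()   # strip comments and blanks
        if not line:
            continue
        parts = line.split()
        if len(parts) >= 2 and parts[0] in ("127.0.0.1", "0.0.0.0"):
            for name in parts[1:]:
                if name != "localhost" and name not in names:
                    names.append(name)
    return names

sample = """127.0.0.1 localhost
127.0.0.1 example.com
127.0.0.1 www.example.com  # banner farm
"""
print(hosts_to_named(sample))   # ['example.com', 'www.example.com']
```

You would still want to comb through the result by hand to collapse related entries into broader patterns, which is exactly the part a script can't do for you.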

>-if I put blablo as one entry in Named.txt, will DNSK block all web
>sites containing blablo in their www address? like www.blablo.com or
>ads.blablo.thissite.net? and if I only want to block this.blablo.org?

Yes, an entry of blablo will block any address containing the whole word blablo
in it, like your two examples. I mention "whole word" because it would not
block www.newblablo.com or www.blablo2.com.
To block only this.blablo.org you would enter only this.blablo.org. You enter
as much or as little as you wish, depending on how specific or broad you want
the blocking.
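That whole-word behavior can be sketched as matching complete dot-separated labels. Again, this is illustrative Python only, not DNSKong's actual matching code:

```python
# Sketch of the whole-word matching described above - not DNSKong's code.
# An entry matches only as a complete dot-separated label of the name.

def whole_word_blocked(name, word="blablo"):
    return word in name.split(".")

for host in ("www.blablo.com", "ads.blablo.thissite.net",
             "www.newblablo.com", "www.blablo2.com"):
    print(host, whole_word_blocked(host))
# The first two match; newblablo and blablo2 are different words, so
# they do not.
```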

>-I don't understand why using eDexter and DNSK at the same time: why
>also having a pacdata.txt file with entries? Is it the same as
>Named.txt and why writing the same entries in 2 different files?

I use them both because they perform different functions and work extremely
well together. I use DNSK for blocking and eDexter mainly to provide a
placeholder for the blocked ads and to prevent timeouts, but its PAC file will
allow IP address entries which neither DNSK nor the HOSTS file allow.
When DNSK (or HOSTS) redirects an entry to localhost it can cause delays as it
searches your computer looking for the "website" or "image." eDexter gives it
something (a small image) to find almost immediately to prevent waiting for the
search to timeout. If the redirect is for an image the webpage can be
(horribly) "mis-aligned" by the missing image(s), so in this case eDexter acts a
placeholder of sorts by substituting its images for the blocked ones, and the
page maintains its intended design/alignment.
I do not duplicate entries in DNSK's named.txt and eDexter's pacdata.txt files.
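The placeholder idea itself is simple enough to sketch: a tiny HTTP listener on 127.0.0.1 that answers every request with a 1x1 GIF. This is illustrative Python, not eDexter; the port number is an arbitrary assumption for the sketch.

```python
# Illustrative sketch of the placeholder idea - this is NOT eDexter.
# A tiny HTTP server on localhost answers every request with a 1x1 GIF,
# so blocked page elements load instantly instead of timing out or
# showing as broken images.
import http.server

# A minimal 43-byte 1x1 GIF, byte for byte.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\xff\xff\xff\x00\x00\x00"
         b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
         b"\x00\x00\x02\x02D\x01\x00;")

class PixelHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL)))
        self.end_headers()
        self.wfile.write(PIXEL)

    def log_message(self, *args):  # keep the console quiet
        pass

def run(port=8080):
    # Port 8080 is an arbitrary choice for the sketch.
    http.server.HTTPServer(("127.0.0.1", port), PixelHandler).serve_forever()
```

With the hosts entries (or DNSK) pointing blocked names at 127.0.0.1, every "ad" request lands here and gets an instant pixel instead of a long connection timeout.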

>-What is the difference between eDexter and DNSK?

Other than what I wrote above, you can use wildcards and partial word matches,
and you can block specific directories on websites, or allow only certain
directories on blocked websites in eDexter's PAC file.
Just as DNSKong can reduce and refine a HOSTS file, eDexter can refine some
entries in special ways that DNSKong can't.
I haven't really explored the PAC file, but it is on my "to-do" list. :o )
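As a rough illustration of that kind of wildcard filtering (this is not eDexter's actual PAC syntax, just a Python sketch using shell-style patterns):

```python
# Rough illustration of wildcard URL filtering - NOT eDexter's PAC syntax.
# fnmatch gives shell-style wildcards: block a single directory on a
# site, or partial-word match anywhere in a URL.
from fnmatch import fnmatch

RULES = [
    "*.example.com/ads/*",  # only the /ads/ directory on that domain
    "*banner*",             # partial match anywhere in the URL
]

def url_blocked(url):
    return any(fnmatch(url, pattern) for pattern in RULES)

print(url_blocked("www.example.com/ads/1.gif"))    # True
print(url_blocked("www.example.com/index.html"))   # False
print(url_blocked("cdn.site.net/banner_top.jpg"))  # True
```

This is the kind of refinement a plain HOSTS file cannot express at all, since it only knows hostnames, never paths.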

>-What is the simplest software and easy to use?

I think the DNSKong/eDexter combination is the easiest to set up and use.
With this combination you only need one line (or two if you use eDexter's PAC
file) in your HOSTS file, period, and then you can forget about it if you want
to. DNSK's preset.txt will do what the HOSTS file was originally intended to
do.
Between DNSK and eDexter, I think DNSK's named.txt is easier to maintain and
work with than eDexter's PAC file, but it's not quite as flexible in its
filtering. Either one beats HOSTS file blocking hands down! :o )

>-What are other free and efficient ads blocking softwares? I have
>heard about Proxomitron, but it seems quite complicated. Which one to
>use?

I used Proxomitron a long time ago, and it worked fine, but I think it had a
longer learning curve than DNSK/eDexter. It's been so long I don't really
remember why I quit using it. I think it's easier to get into "trouble" with it
because of all the settings and options. I may not have known/understood enough
about it to have fully appreciated what all it could do.
I prefer, and recommend, DNSK/eDexter because they are small, efficient, simple
and do the job they were designed to do.

--
dak