Andres Kroonmaa wrote:
> But with a recursion limit, you immediately have to ask: what is
> a good depth limit? 2, 4, 16? Or the number of #include directives?
An include depth of 30 should be fine, I think. If worried, then add a
directive setting the allowed depth.
As the only purpose of this limit is to ensure loops are aborted, any
limit set should be considerably larger than one can practically think
anyone (no matter how stupid) would use. I don't expect a sane person
would use a depth much larger than 5, which makes 30 a suitable limit
for loop detection.
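As a rough illustration only (this is not Squid's actual parser; the
parse_config_file() helper, the "include FILE" syntax and the
MAX_INCLUDE_DEPTH constant are made up for the example), passing a depth
counter down the recursion is all it takes to abort a loop with a clear
message instead of crashing:

/*
 * Minimal sketch of depth-limited include parsing.
 */
#include <stdio.h>
#include <string.h>

#define MAX_INCLUDE_DEPTH 30    /* generous: sane configs rarely nest past 5 */

static int
parse_config_file(const char *path, int depth)
{
    FILE *fp;
    char line[4096];

    if (depth > MAX_INCLUDE_DEPTH) {
        fprintf(stderr, "FATAL: include depth %d exceeded at '%s' "
                "(probable include loop)\n", MAX_INCLUDE_DEPTH, path);
        return -1;
    }

    if ((fp = fopen(path, "r")) == NULL) {
        fprintf(stderr, "FATAL: cannot open configuration file '%s'\n", path);
        return -1;
    }

    while (fgets(line, sizeof(line), fp)) {
        if (strncmp(line, "include ", 8) == 0) {
            char *file = line + 8;
            file[strcspn(file, "\r\n")] = '\0';  /* strip trailing newline */
            if (parse_config_file(file, depth + 1) < 0) {
                fclose(fp);
                return -1;      /* propagate the error up to the caller */
            }
        }
        /* ... handle ordinary configuration directives here ... */
    }

    fclose(fp);
    return 0;
}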
> A limit would not detect multiple includes of the same file, which
> might cause headaches for some fancy people. Not our problem
> strictly, but depending on the case this might cause situations that
> are very difficult to diagnose and may lead to blaming bugs in Squid.
I am not at all worried about this, as long as a somewhat meaningful
error message is given to the user, not simply a "FATAL: Segmentation
fault" or other meaningless error.
> Also, another issue arises when adding the notion of #includes: what
> is the correct action when an #include file is not accessible? Some
> includes are critical for correct operation, some are not; should
> we bail out or continue? Perhaps create #include and #required?
> I tend to wish that we could ignore #include errors; this would
> make it possible to include configs from files on swap disks if
> they are mounted, thinking of hotswap disks or netmounts.
Ideally the situation would be like this:
1. Squid won't start unless the configuration file is fully parseable.
2. -k reconfigure won't do anything unless the configuration file is
fully parseable.
As the above may show, I am not particularly fond of software that
accepts errors in its configuration with only a small note in the log.
Yes, I know Squid falls into this category.
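To sketch points 1 and 2 in code: the stub parse_config_file() below is
only a stand-in (it merely checks that the file opens), and the names
and exit-code convention are illustrative, not Squid internals. The idea
is simply that startup refuses to continue on any parse or include
error, and -k reconfigure would run the same check and keep the running
configuration if it fails.

#include <stdio.h>
#include <stdlib.h>

/* Stand-in for the real parser: pretend the file parsed cleanly iff it
 * can be opened at all.  Returns 0 on success, -1 on any error. */
static int
parse_config_file(const char *path, int depth)
{
    FILE *fp = fopen(path, "r");
    (void)depth;
    if (fp == NULL)
        return -1;
    fclose(fp);
    return 0;
}

int
main(int argc, char **argv)
{
    const char *cfg = (argc > 1) ? argv[1] : "squid.conf";

    /* Point 1: refuse to start on any parse or include error. */
    if (parse_config_file(cfg, 0) != 0) {
        fprintf(stderr, "FATAL: '%s' is not fully parseable, not starting\n",
                cfg);
        return EXIT_FAILURE;
    }

    /* Point 2 would be analogous: on -k reconfigure, re-run the same
     * check first and keep the currently running configuration if it
     * fails, rather than half-applying a broken one. */
    printf("configuration OK, starting up\n");
    return EXIT_SUCCESS;
}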
Regards
Henrik