Linda Walsh writes:
> If this was a reactor control program, that's one thing, but in
> deciding what solution to implement to save some small lookup time or
> throw it away, a 90% solution is probably fine. It's called a
> heuristic. AI machines use them. Thinking people use them. Why
> should bash be different?
You don't use heuristics when you have access to the underlying
mechanisms; that is an extremely poor way to program, and an extremely
good way to accrue technical debt.

> Fixing it isn't about 0/100% fixed, but a combination of actual cost
> (measurable impact), user perception, and programmer cost to implement
> something that works for most.
>
> As you drive up the 'perfection rate' or 'uptime' to another 9 (i.e.
> 90% to 99%, or 99% to 99.9%) the costs usually go up exponentially.
> If it costs 1 day to implement a 90% algorithm, a 99% algorithm could
> easily be a 1-3 month project depending on how you measure.
> 99.9% could involve a year or more.

If this has been your experience, then you have been working in
environments which I would suggest are not representative of typical
software development. The time cost involved in maintaining a large
system is not directly related to its suitability for use -- quite the
opposite.

> If you have a machine that can't do a path lookup in <.1 seconds,
> then walking a PATH env var to do multiple path lookups is gonna hurt
> that many times more. If your system is so slow that everything is
> bad, then having hashing turned on at all seems a rather unimportant
> issue.

Proposing solutions based entirely upon speculation about the
environments of the user base of a program that is shipped with almost
all consumer and server Linux distributions is myopic. To claim that
such a situation must also mean the system is generally unsuited to
hashing (as opposed to, say, merely being under unusually high load at
one point in time) is to extrapolate far beyond reasonable bounds of
assumption.
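
For what it's worth, the mechanism under discussion is already exposed,
which is rather the point: bash ships both the heuristic and the exact
check. Below is a minimal sketch of the failure mode and of the
checkhash option that avoids it. The /tmp directories and the `greet`
script are invented for illustration, and the transcript shows what the
bash versions I have at hand print; details may vary across versions.

    $ mkdir -p /tmp/a /tmp/b
    $ printf '#!/bin/sh\necho hello\n' > /tmp/a/greet
    $ chmod +x /tmp/a/greet
    $ PATH=/tmp/a:/tmp/b:$PATH

    $ greet                  # first use: full PATH walk, result cached
    hello
    $ hash -t greet          # -t prints the cached pathname
    /tmp/a/greet

    $ mv /tmp/a/greet /tmp/b/greet    # the cache entry is now stale

    $ greet                  # the heuristic trusts the stale entry
    bash: /tmp/a/greet: No such file or directory

    $ shopt -s checkhash     # verify each hashed entry before executing
    $ greet                  # stale entry rejected, PATH re-searched
    hello

The price of checkhash is one existence check per hashed command per
invocation -- exactly the "small lookup time" being weighed against the
failure above, and hardly an exponential cost.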