Hi,

On Tue, Oct 25, 2016 at 08:41:35AM +0800, Paul Wise wrote:
> On Tue, 2016-10-25 at 01:54 +0900, Osamu Aoki wrote:
>
> > If we do not do this, we need to loop over scanning many pages... Not a
> > good idea. Can you think of non-invasive change?
...
> It isn't much of a complication at all really:
>
> On error, if we scanned a directory, go back and scan the next
> directory. Possibly with a configurable limit of scanned dirs.
I was thinking of bunching up all possible URL results by scanning every
directory from the lowest version to the highest.  But you have a point:
scan from the highest version and pick the page which has a matching URL.
This makes sense, and the situation is not as bad as I thought.  Just push
down all the directories and scan from the latest one.

> > FTP
>
> FTP has nothing to do with this issue, why do you mention it?

Yes.  I meant an HTTP site which looks like an old FTP site in terms of
its directory and page structure.

Osamu
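P.S.  A minimal sketch of the high-to-low scan with a configurable limit, as
I understand Paul's suggestion.  All the names here (`parse_version`,
`find_download_page`, the injected `fetch`, `max_dirs`) are hypothetical
illustrations, not anything in the actual codebase:

```python
import re

def parse_version(name):
    """Turn a directory name like "2.10.1" into a sortable tuple."""
    return tuple(int(p) for p in re.findall(r"\d+", name))

def find_download_page(dirs, fetch, url_pattern, max_dirs=5):
    """Scan version directories from newest to oldest; return the first
    directory whose page contains a URL matching url_pattern.
    Stop after max_dirs directories (the configurable limit)."""
    for d in sorted(dirs, key=parse_version, reverse=True)[:max_dirs]:
        page = fetch(d)  # fetch() is an injected page getter
        if re.search(url_pattern, page):
            return d
    return None

# Usage with an in-memory stand-in for HTTP fetches:
pages = {
    "2.0": "<a href='notes.html'>notes</a>",          # no tarball link
    "1.9": "<a href='foo-1.9.tar.gz'>download</a>",   # has a tarball
}
fetch = lambda d: pages.get(d, "")
print(find_download_page(list(pages), fetch, r"\.tar\.gz"))  # -> 1.9
```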