On Thu, Mar 26, 2020 at 01:10:25AM +0300, Anton Shepelev wrote:
> Daniel Shahaf:
> > however, I don't think the lack of these distinctions is
> > necessarily a blocker. It just means we need to be more careful
> > about not writing automation that will help some cases and
> > backfire in others.
>
> Certainly not. I still hope that my proposal can be made safe.
There is a way to find out, but it requires some work: Write a patch
that implements your proposal, and then ensure that all tests in
Subversion's regression test suite keep passing with that patch applied
and with the URL argument removed from every merge command that runs a
sync-style merge throughout the entire test suite.

I'm confident that the regression test suite is comprehensive enough to
catch any problems and, if needed, to inspire further discussion of
those problems in detail. It is hard to evaluate your idea thoroughly
without knowing which of the test cases will break, and why.

I've used the test suite many times to try out random ideas of my own.
This approach works really well and is often quite enlightening!

In any case, what you're asking for implies that, at a minimum, either
you or someone else would have to invest the time to actually do the
above work in order to verify your idea and make it happen, regardless
of which potential problems are being discussed now.

Cheers,
Stefan
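
P.S. To make the kind of edit concrete (a rough sketch only; the
^/trunk source and the branch working copy are just placeholders): a
sync-style merge in a test currently runs something like

  svn merge ^/trunk

from inside the branch working copy, and with your change in place the
same test would have to keep passing with the URL dropped:

  svn merge

The merge-related scripts under subversion/tests/cmdline/
(merge_tests.py and friends) would be a reasonable place to start
before running the full 'make check'.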