On Wed, Jun 12, 2019 at 1:57 AM Martin Sebor <mse...@gmail.com> wrote:
>
> On 6/11/19 3:02 PM, Aldy Hernandez wrote:
> >
> >
> > On 6/11/19 12:52 PM, Martin Sebor wrote:
> >> On 6/11/19 10:26 AM, Aldy Hernandez wrote:
> >>>
> >>>
> >>> On 6/11/19 12:17 PM, Martin Sebor wrote:
> >>>> On 6/11/19 9:05 AM, Aldy Hernandez wrote:
> >>>>>
> >>>>>
> >>>>> On 6/11/19 9:45 AM, Richard Biener wrote:
> >>>>>> On Tue, Jun 11, 2019 at 12:40 PM Aldy Hernandez <al...@redhat.com>
> >>>>>> wrote:
> >>>>>>>
> >>>>>>> This patch cleans up the various contains, may_contain, and
> >>>>>>> value_inside_range variants we have throughout, in favor of one--
> >>>>>>> contains_p.  There should be no changes in functionality.
> >>>>>>>
> >>>>>>> I have added a note to range_includes_zero_p, perhaps more as a
> >>>>>>> personal question than anything else.  This function was/is
> >>>>>>> returning true for UNDEFINED.  From a semantic sense, that doesn't
> >>>>>>> make sense.  UNDEFINED is really the empty set.  Is the
> >>>>>>> functionality wrong, or should we call this function something
> >>>>>>> else?  Either way, I'm fine removing the comment but I'm genuinely
> >>>>>>> curious.
> >>>>>>
> >>>>>> So this affects for example this hunk:
> >>>>>>
> >>>>>> -  if (vr && !range_includes_p (vr, 1))
> >>>>>> +  if (vr && (!vr->contains_p (build_one_cst (TREE_TYPE (name)))
> >>>>>> +             && !vr->undefined_p ()))
> >>>>>>      {
> >>>>>>
> >>>>>> I think it's arbitrary how we compute X in UNDEFINED and I'm fine
> >>>>>> with changing the affected predicates to return false.  This means
> >>>>>> not testing for !vr->undefined_p here.
> >>>>>
> >>>>> Excellent.
> >>>>>
> >>>>>>
> >>>>>> Note I very much dislike the build_one_cst () call here so please
> >>>>>> provide an overload hiding this.
> >>>>>
> >>>>> Good idea.  I love it.
> >>>>>
> >>>>>>
> >>>>>> Not sure why you keep range_includes_zero_p.
> >>>>>
> >>>>> I wasn't sure if there was some subtle reason why we were including
> >>>>> undefined.
> >>>>>
> >>>>> OK pending tests?
> >>>>
> >>>> Should the second overload:
> >>>>
> >>>> +  bool contains_p (tree) const;
> >>>> +  bool contains_p (int) const;
> >>>>
> >>>> take something like HOST_WIDE_INT or even one of those poly_ints
> >>>> like build_int_cst does?  (With the former, contains_p (0) will
> >>>> likely be ambiguous since 0 is int and HOST_WIDE_INT is long).
> >>>
> >>> We have a type, so there should be no confusion:
> >>>
> >>> +  return contains_p (build_int_cst (type (), val));
> >>>
> >>> (UNDEFINED and VARYING don't have a type, so they are special cased
> >>> prior).
> >>
> >> I didn't mean the overloads are confusing, just that the one that
> >> takes an int doesn't make it possible to test whether a value outside
> >> the range of an int is in the range.  For example, in the call
> >>
> >>    contains_p (SIZE_MAX)
> >>
> >> the argument will get sliced (and trigger a -Woverflow).  One will
> >> need to go back to the more verbose way of calling it.
> >
> > The int version is not really meant to pass anything but simple
> > constants.  For anything fancy, one should really be using the tree
> > version.  But I can certainly change the argument to HOST_WIDE_INT if
> > preferred.
> >
> >>
> >> Changing the argument type to HOST_WIDE_INT would avoid the slicing
> >> and warning but then make contains_p (0) ambiguous because 0 isn't
> >> a perfect match for either void* or long (so it requires a conversion).
> >
> > Just a plain 0 will match the int version, instead of the tree version,
> > right?
> > Nobody should be passing NULL to the tree version, so that seems
> > like a non-issue.
>
> Right, NULL isn't a problem, but I would expect any integer to work
> (I thought that's what Richard was asking for).  So my suggestion was
> to have contains_p() take a poly_int64 and provide just as robust an
> API as build_int_cst.  The argument ends up converted to the poly_int64
> anyway when it's passed to the latter.  I.e., why not define contains_p
> simply like this:
>
>    bool
>    value_range_base::contains_p (poly_int64 val) const
>    {
>      if (varying_p ())
>        return true;
>      if (undefined_p ())
>        return false;
>
>      return contains_p (build_int_cst (type (), val));
>    }
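[A minimal, self-contained sketch of the overload-resolution point under
discussion.  The types below are hypothetical stand-ins, not GCC's
value_range_base/tree API: the aim is only to show why a literal 0 is an
unambiguous exact match for an int overload sitting next to a pointer
overload, but becomes ambiguous once the integer overload is widened to
long (the usual shape of HOST_WIDE_INT on LP64), since 0 then needs a
conversion either way.]

    // Hypothetical stand-in types; not GCC code.
    struct node;                  // stand-in for the 'tree' pointer type
    typedef node *tree_like;

    struct range_with_int
    {
      bool contains_p (tree_like) const { return false; }
      bool contains_p (int) const { return true; }   // exact match for 0
    };

    struct range_with_long
    {
      bool contains_p (tree_like) const { return false; }
      bool contains_p (long) const { return true; }  // 0 needs int -> long
    };

    int
    main ()
    {
      range_with_int a;
      a.contains_p (0);      // OK: picks the int overload (exact match)

      range_with_long b;
      // b.contains_p (0);   // error: ambiguous (0 -> long vs 0 -> node *)
      b.contains_p (0L);     // fine once the literal is spelled as long
      return 0;
    }

[The same resolution rules are what make Aldy's observation above hold for
the int overload, while Martin's ambiguity concern applies to a
HOST_WIDE_INT one.]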
I agree that plain 'int' is bad.  Given we currently store INTEGER_CSTs
only (and not POLY_INT_CSTs), a widest_int argument should be fine.  Note
widest because, when interpreted signed, all unsigned values fit.

The existing contains_p check is also a bit fishy for cases where
TREE_TYPE of the value has a value range not covered by the value range's
type...  I guess it could do

  if (!int_fits_type_p (val, this->type ()))
    return false;

but that changes existing behavior, which happily throws constants of
different sign into tree_int_cst_lt, for example...

Do we want to allow non-INTEGER_CST val arguments at all?

That is, you make contains_p return a bool and say "yes" when we couldn't
really decide.  This is why it was previously named may_contain_p (and we
didn't have must_contain_p).  I think making the result explicitly clear
is a must, since contains_p reads as if !contains_p means it does _not_
contain.  So either change back the name (and provide the must variant)
or make it return the tri-state from value_inside_range.

Richard.

> Martin
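[A sketch of the tri-state interface shape Richard asks for, using a
hypothetical toy range over plain ints rather than the real
value_range/tree machinery.  The point is the API, not the arithmetic: a
single bool contains_p has to fold "could not decide" into one of the two
answers, whereas an explicit three-valued result, or a
may_contain_p/must_contain_p pair, keeps that distinction visible to the
caller.]

    // Hypothetical toy types; not GCC's API.
    enum class contains_result { no, yes, unknown };

    struct toy_range
    {
      bool known;     // false models a range we cannot reason about
      int lo, hi;     // closed interval [lo, hi] when known

      contains_result
      contains_p (int val) const
      {
        if (!known)
          return contains_result::unknown;
        return (lo <= val && val <= hi)
               ? contains_result::yes : contains_result::no;
      }

      // Two-predicate alternative: may_contain_p errs toward "yes" when
      // undecided, must_contain_p errs toward "no".
      bool may_contain_p (int val) const
      { return contains_p (val) != contains_result::no; }

      bool must_contain_p (int val) const
      { return contains_p (val) == contains_result::yes; }
    };

[A caller that only wants to act when the value is provably present (or
provably absent) then has to say which predicate it means, instead of
negating a single bool whose "false" might just mean "unknown".]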