On 05/09/2023 10:33, Jan Beulich wrote:
On 05.09.2023 10:20, Nicola Vetrini wrote:
On 05/09/2023 09:46, Jan Beulich wrote:
On 05.09.2023 09:31, Nicola Vetrini wrote:
Given its use in the declaration
'DECLARE_BITMAP(features, IOMMU_FEAT_count)' the argument
'bits' has essential type 'enum iommu_feature', which is not
allowed by the Rule as an operand to the addition operator.
Given that its value can be represented by a signed integer,
the explicit cast resolves the violation.
Wait - why would this lead to a change to BITS_TO_LONGS()? And if that
was to be changed, why plain int? I don't think negative input makes
sense there, and in principle I'd expect values beyond 4 billion to
also be permissible (even if likely no such use will ever appear in a
DECLARE_BITMAP(), but elsewhere it may make sense). Even going to
"unsigned long" may be too limiting ...
You have a point. I can think of doing it like this:

DECLARE_BITMAP(features, (int)IOMMU_FEAT_count)

on the grounds that the enum constant is representable in an int, and
it does not seem likely to get much bigger.
Having an unsigned cast requires making the whole expression
essentially unsigned, otherwise Rule 10.4 is violated because
BITS_PER_LONG is essentially signed. This can be done, but it depends
on how BITS_TO_LONGS is (or will be) used.
It'll need looking closely, yes, but I expect that actually wants to
be an unsigned constant. I wouldn't be surprised if some use of
DECLARE_BITMAP() appeared (or already existed) where the 2nd argument
involves sizeof() in some way.
I think there's one with ARRAY_SIZE. In my opinion this can be
resolved as follows:

#define BYTES_PER_LONG (1U << LONG_BYTEORDER) // essential type goes from signed to unsigned

#define BITS_TO_LONGS(bits) \
    (((unsigned long long)(bits) + BITS_PER_LONG - 1U) / BITS_PER_LONG) // same here
--- a/xen/include/xen/types.h
+++ b/xen/include/xen/types.h
@@ -23,7 +23,7 @@ typedef signed long ssize_t;
typedef __PTRDIFF_TYPE__ ptrdiff_t;
#define BITS_TO_LONGS(bits) \
- (((bits)+BITS_PER_LONG-1)/BITS_PER_LONG)
+ (((int)(bits)+BITS_PER_LONG-1)/BITS_PER_LONG)
#define DECLARE_BITMAP(name,bits) \
unsigned long name[BITS_TO_LONGS(bits)]
Furthermore, as always - if this was to be touched, please take care
of style violations (numerous missing blanks) at this occasion.
Then the whole file needs a cleanup.
Perhaps, but going as we touch things is generally deemed better than
doing a single cleanup-only patch. First and foremost to not unduly
affect the usefulness of "git blame" and alike.
Ok
--
Nicola Vetrini, BSc
Software Engineer, BUGSENG srl (https://bugseng.com)