> On Jan 23, 2019, at 7:15 PM, Warren D Smith <warren....@gmail.com> wrote:
> 
> x = x^x;
> 
> The purpose of the above is to load "x" with zero.
> For very wide types, say 256 bits wide, explicitly loading 0
> is discouraged by Intel because it takes too much memory.
> XORing x with itself always yields 0 and is allegedly
> a better thing to do.
> 
> But the problem is, gcc complains:
> variable 'x' is uninitialized when used here [-Wuninitialized]
> note: initialize the variable 'x' to silence this warning
> 
> Well, the thing is, it DOES NOT MATTER that x is not initialized,
> or initialized with wrong data.  No matter what was in x, it becomes 0.
> 
> So, how do I get GCC to shut up and quit whining about this?
> I do not want to actually load 0.

The way to tell gcc that you don't want to hear about x being uninitialized is 
to write the declaration as an initialization to itself:

        int x = x;
        /* then you can do what you wanted: */
        x = x ^ x;

That said, it seems GCC should be able to recognize expressions whose result 
does not depend on x, and suppress the uninitialized-variable complaint in 
those cases.

But furthermore: the "too much memory" argument makes no sense.  You should 
write what you mean in the plainest terms, in other words: "x = 0".  It is then 
up to the optimizer to generate the best code for it.  If xor is the better 
code, that's what should come out.  Does it already?  If so, Intel's advice is 
simply wrong; ignore it and write the simple code.  If the compiler actually 
generates a transfer from a (potentially big) zero constant, that might be a 
missed-optimization bug.  In that case, Intel's advice may be useful as a 
temporary workaround for that bug.  But only if the small savings is so 
important that obfuscating your source code is justified.

   paul

