After some casual analysis, I noticed that in libssp (gets-chk.c) we were doing:

  if (slen >= (size_t) INT_MAX)
    return gets (s);

  if (slen <= 8192)
    buf = alloca (slen + 1);
  else
    buf = malloc (slen + 1);
  if (buf == NULL)
    return gets (s);

What's the reason behind casting INT_MAX to size_t here? It looks like a design error: there is no guarantee that INT_MAX is representable as a size_t, so the cast could silently change the value. If we want the limit expressed as a size_t, we'd instead want a macro that takes the minimum of INT_MAX and SIZE_MAX, which would make the cast unnecessary; as it stands, the cast seems badly thought out.
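To make the idea concrete, here is a minimal sketch. The macro name GETS_CHK_MAX is something I made up purely for illustration, not a proposed name; the point is that whichever constant the conditional selects already fits in the type the comparison is done in, so no cast of INT_MAX is needed at all:

  #include <limits.h>
  #include <stddef.h>
  #include <stdint.h>
  #include <stdio.h>

  /* Hypothetical name, for illustration only: the smaller of INT_MAX
     and SIZE_MAX.  The selected constant always fits in the common
     type of the comparison, so no cast is required.  */
  #define GETS_CHK_MAX (INT_MAX < SIZE_MAX ? INT_MAX : SIZE_MAX)

  int
  main (void)
  {
    size_t slen = 10000;

    /* The existing check from __gets_chk, rewritten against the macro.  */
    if (slen >= GETS_CHK_MAX)
      puts ("would fall back to plain gets");
    else
      puts ("would use the bounded buffer path");
    return 0;
  }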

Please get back to me at this address so that we may discuss future changes.

Thanks!
