On systems where char is signed, ascii would incorrectly handle bytes with
the highest bit set.

Before:

    $ ascii $(printf '\xff')
    ASCII 268435455/15 is decimal 4294967295, hex ffffffff, octal 37777777777, bits 11111111: meta-^_

After:

    $ ascii $(printf '\xff')
    ASCII 15/15 is decimal 255, hex ff, octal 377, bits 11111111: meta-^?
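
The root cause is plain C sign extension: on a platform where char is
signed, the byte '\xff' stored in a char is -1, and casting it directly
to unsigned int sign-extends it to 0xffffffff. A minimal standalone
sketch of the behaviour (illustrative only, not part of the patch):

    #include <stdio.h>

    int main(void)
    {
        char str[] = "\xff";  /* byte with the highest bit set */

        /* With signed char, str[0] is -1; casting straight to
         * unsigned int sign-extends it to 0xffffffff. */
        printf("%u\n", (unsigned int)str[0]);                 /* 4294967295 */

        /* Casting through unsigned char first keeps the byte value. */
        printf("%u\n", (unsigned int)(unsigned char)str[0]);  /* 255 */
        return 0;
    }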

Fixes: https://bugs.debian.org/437945
---
 ascii.c | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/ascii.c b/ascii.c
index 4ad7d5b..8a6ff4d 100644
--- a/ascii.c
+++ b/ascii.c
@@ -224,7 +224,7 @@ static void ascii(char *str)
 
     /* interpret single characters as themselves */
     if (len == 1) { 
-       speak((unsigned int)str[0]); 
+       speak((unsigned char)str[0]);
        /* also interpret single digits as numeric */
        if(!line && strchr("0123456789ABCDEFabcdef",str[0])) {
            int hval;
-- 
2.17.1
