On 2021/10/31 20:48, Scott Cheloha wrote:
> In uniq(1), if we use getline(3) instead of fgets(3) we can support
> arbitrarily long lines.
It works for me, and getting rid of the length restriction is nice.
I don't know how much of a concern it is, but it's about twice as slow:

    $ wc -l /tmp/z
    10000000 /tmp/z
    $ time (cat /tmp/z | uniq > /dev/null)
        0m01.65s real     0m01.62s user     0m00.22s system
    $ time (cat /tmp/z | obj/uniq > /dev/null)
        0m03.60s real     0m03.52s user     0m00.20s system
    $ time (cat /tmp/z | guniq > /dev/null)
        0m01.33s real     0m01.28s user     0m00.20s system

Though in practice, with large files I'd be more likely to sort | uniq -c,
and uniq's time is dwarfed by sort's.