Package: perl-base
Version: 5.8.4-8
Severity: normal

Text::ParseWords::parse_line() segfaults when splitting a line that
contains at least one very long word. However, the exact number of
characters needed to trigger the segfault varies from system to system.

Code to reproduce the problem:

---snip---
#!/usr/bin/perl -T

$| = 1;

use strict;
use warnings;

use Text::ParseWords;

# Grow the word one character at a time until parse_line() segfaults.
for (my $i = $ARGV[0]; ; $i++) {
    print "Testing $i characters...";

    parse_line('\s+', 0, q{x} x $i);

    print "done.\n";
}
---snap---

The sarge system I am reporting this problem from has a limit of just
5222 characters:

| $ ./parse_line-testbed.pl 5222
| Testing 5222 characters...done.
| Testing 5223 characters...Segmentation fault
| $

My local woody system has a limit of 30832 characters, while some other
boxes running woody all have a limit of 30828 characters. I have no idea
what this limit depends on.

As far as I can tell, the failure occurs while $line is matched against
the complex regular expression at the very beginning of the while()
loop in parse_line().

-- System Information:
Debian Release: 3.1
Architecture: i386 (i686)
Kernel: Linux 2.4.18-1-k7
Locale: LANG=C, LC_CTYPE=C (charmap=ANSI_X3.4-1968)

Versions of packages perl-base depends on:
ii  libc6                       2.3.2.ds1-22 GNU C Library: Shared libraries an

-- no debconf information
