Thanks to all who responded to my earlier message. I used gdb
and found the problem.
It turns out that the guy who wrote the code had a line
like this:

    j = (nClusters*rand())/(RAND_MAX+1);

where "j" is an integer. He expected j to end up between 0 and
nClusters - 1. However, he had hard-coded RAND_MAX to 32767, which
is what it is under Solaris but NOT under Linux. As a result, j was
being set to a number far larger than nClusters - 1, and when j
was then used as an array index... **BOOM**, Seg. Fault.
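Just to make it concrete (the numbers below are made up, not taken
from the actual program): glibc's RAND_MAX is 2147483647, so rand()
routinely returns values far above 32767 and the hard-coded divisor
no longer scales them back into range:

    /* Illustration only: the old Solaris-style scaling under glibc. */
    #include <stdio.h>

    int main(void)
    {
        int nClusters = 10;      /* pretend the array has 10 slots */
        int r = 100000000;       /* a perfectly normal glibc rand() value */
        int j = (nClusters * r) / (32767 + 1);  /* hard-coded "RAND_MAX" */

        /* prints j = 30517, but the only valid indices are 0..9 */
        printf("j = %d\n", j);
        return 0;
    }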
So I simply deleted the #define where he set RAND_MAX and instead
added:

    #include <stdlib.h>

since RAND_MAX is defined there, which gives me the correct value
by default no matter what platform I'm building on.
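For anyone who wants the full picture, here is roughly what the
corrected scaling looks like (names like nClusters and clusters[]
are just stand-ins for the real code, which I can't post). I also
moved the arithmetic into double, since with glibc's RAND_MAX equal
to INT_MAX both RAND_MAX+1 and nClusters*rand() can overflow a
plain int:

    #include <stdio.h>
    #include <stdlib.h>   /* the platform's own RAND_MAX */
    #include <time.h>

    int main(void)
    {
        int nClusters = 10;      /* stand-in for the real cluster count */
        int clusters[10] = {0};  /* stand-in for the real array */
        int j;

        srand((unsigned)time(NULL));

        /* rand()/(RAND_MAX+1.0) is in [0,1), so j lands in
         * 0..nClusters-1.  Doing it in double avoids integer overflow
         * now that RAND_MAX is 2147483647 under glibc. */
        j = (int)((double)rand() / ((double)RAND_MAX + 1.0) * nClusters);

        clusters[j]++;           /* always a legal index now */
        printf("picked cluster %d of %d\n", j, nClusters);
        return 0;
    }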
Thanks again for all the help.
--
+--------------+ Brian Kirkland phone: (512) 425-5235
| _____ _____ | fax : (512) 425-5201
|_| | | |_| Senior Design Engineer email: [EMAIL PROTECTED]
____ | | _ _ CYPRESS SEMICONDUCTOR
/__\ | | |\ /| Austin Design Center
/ \ | | | \/ |
___| |___ CLASS OF '92 Red Hat Linux 5.0 User
|________| GIG'EM AGS!