Hello. I have a friend's quad PPro box temporarily sitting in my garage
that I've been using to play with 4.0-CURRENT and vinum. Since the last
series of bug fixes a few weeks ago, everything works as advertised.

Just out of curiosity, I've been running some simple (and probably
questionable) tests, a find | cpio copy of the source tree and then a
buildworld -j16, on various vinum volumes. I thought I'd forward the
results to -current, for whatever they're worth.
The biggest (non)surprise is how big a difference softupdates makes.
Nice!
-- Parag Patel
"Idiocy: Never underestimate the power of stupid people in large groups"
-- Despair.COM
# 4xPPro (256KB on-chip cache) 200MHz SMP, 512MB RAM, 5 x 2GB (older) SCSI disks
# the / /var /usr filesystems are on a newer IDE drive (the BIOS doesn't grok SCSI)
# softupdates is on for all filesystems here, except for /
# /tmp is a 32MB MFS filesystem mounted async
# /etc/make.conf defaults to running gcc -O -pipe
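(For reference, the non-vinum bits of that setup boil down to something like
the following; the ad0 partition names are placeholders, not the actual ones
on this box:)

    # enable softupdates on an existing filesystem (run while it's unmounted)
    tunefs -n enable /dev/ad0s1f

    # /etc/fstab entry for a 32MB async MFS /tmp (-s is in 512-byte sectors)
    /dev/ad0s1b   /tmp   mfs   rw,async,-s=65536   0   0

    # /etc/make.conf
    CFLAGS= -O -pipe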
# vinum.conf
drive d0 device /dev/da0s1e
drive d1 device /dev/da1s1e
drive d2 device /dev/da2s1e
drive d3 device /dev/da3s1e
drive d4 device /dev/da4s1e

# RAID-5 across all five SCSI disks, 256k stripe
volume raid5 setupstate
  plex org raid5 256k
    sd length 768m drive d0
    sd length 768m drive d1
    sd length 768m drive d2
    sd length 768m drive d3
    sd length 768m drive d4

# RAID-10: a mirror of two two-disk stripes, 256k stripe
volume raid10 setupstate
  plex org striped 256k
    sd length 0 drive d0
    sd length 0 drive d1
  plex org striped 256k
    sd length 0 drive d2
    sd length 0 drive d3

# single disk, no redundancy, for comparison
volume noraid setupstate
  plex org concat
    sd length 0 drive d4
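(Roughly how the volumes get brought into service, for anyone who hasn't
played with vinum yet. Take this as a sketch rather than a transcript; the
config file name and mount points are just the names used in the results:)

    vinum create /etc/vinum.conf       # feed vinum the config above
    newfs -v /dev/vinum/raid5          # -v: vinum volumes have no partition table
    newfs -v /dev/vinum/raid10
    newfs -v /dev/vinum/noraid
    tunefs -n enable /dev/vinum/raid5  # softupdates toggled per run
    mount /dev/vinum/raid5 /raid5
    mount /dev/vinum/raid10 /raid10
    mount /dev/vinum/noraid /noraid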
find /usr/src | cpio -pdum /target (~600k blocks):

    /raid5 (5 drives, 256k, softupdates):
         949.21s real      6.28s user    142.21s system
    /raid5 (5 drives, 256k):
        2500.62s real      6.16s user    165.65s system
    /raid10 (4 drives, 256k stripe, softupdates):
         786.46s real      5.86s user    141.50s system
    /raid10 (4 drives, 256k stripe):
        1553.88s real      6.37s user    150.64s system
    /noraid (1 drive, softupdates):
         901.70s real      6.32s user    135.73s system
    /noraid (1 drive):
        1614.45s real      6.09s user    142.46s system
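(The copy test is just that pipeline wrapped in a timer; something along
these lines, with /raid5 standing in for whichever volume was the target of
that run:)

    /usr/bin/time sh -c 'find /usr/src | cpio -pdum /raid5'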
build kernel -j16:

    IDE disk (softupdates):
         152.97s real    430.57s user     99.85s system
    /raid5 (5 drives, 256k, softupdates):
         162.40s real    434.40s user     99.75s system
    /raid5 (5 drives, 256k):
         159.79s real    432.53s user     99.35s system
    /raid10 (4 drives, 256k stripe, softupdates):
         141.81s real    428.63s user    101.05s system
    /raid10 (4 drives, 256k stripe):
         143.49s real    433.42s user     99.80s system
    /noraid (1 drive, softupdates):
         145.35s real    431.35s user     96.50s system
    /noraid (1 drive):
         148.13s real    433.23s user     95.01s system
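(A kernel build of this vintage would typically go the config(8) route,
roughly as below; MYKERNEL is a placeholder config name, and only the final
make was run with -j16 and timed:)

    cd /usr/src/sys/i386/conf
    config MYKERNEL
    cd ../../compile/MYKERNEL
    make depend
    /usr/bin/time make -j16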
buildworld -j16:

    src & obj on IDE disk (softupdates):
        5676.29 real   7701.09 user   6133.60 sys
    src & obj on /raid5 volume (5 drives, 256k, softupdates):
        6066.43 real   7929.20 user   7659.18 sys
    src & obj on /raid5 volume (5 drives, 256k):
        7098.73 real   7979.00 user   7843.46 sys
    src & obj on /raid10 volume (4 drives, 256k stripe, softupdates):
        5932.15 real   7918.44 user   7645.84 sys
    src & obj on /raid10 volume (4 drives, 256k stripe):
        6479.33 real   7964.00 user   7565.06 sys
    src & obj on /noraid volume (1 drive, softupdates):
        6053.86 real   7969.94 user   7601.81 sys
    src & obj on /noraid volume (1 drive):
        6607.68 real   7965.14 user   7377.00 sys
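(And the world builds are just the stock target, with /usr/src and /usr/obj
living on the filesystem under test; timed roughly like this:)

    cd /usr/src
    /usr/bin/time make -j16 buildworld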