Hi Kirill,
I didn't have a chance to run the benchmark with
Substance LF as I got sidetracked by some bugs and
JavaOne stuff.
[EMAIL PROTECTED] wrote:
>> There still seems to be a fair number of
>> unaccelerated calls, like MaskFills (fills of
>> antialiased shapes), MaskBlits (AA shapes filled
>> with gradients), and BI-to-BI blits.
> Hi, Dmitri
> Thanks for looking into this. The advice about not setting the AA mode before
> operations that don't care about it (such as filling a rectangle, shape, or
> gradient) is very valuable. Is this mentioned anywhere in the tutorials /
> javadoc? Is this an implementation detail of the Sun VM? Could this be handled
> in the core by ignoring the AA mode for operations that produce exactly the
> same results whether or not AA is turned on?
I think this should be left to the developer. It's
hard for us to predict what exactly the user wants
to do.
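In code, the pattern boils down to something like this (a minimal sketch of the idea, not our internal code): enable AA only around the primitives that actually benefit from it, and restore the previous hint afterwards, so plain rectangle fills stay on the fast non-AA loops instead of falling back to MaskFill.

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class AaHintScope {
    // Renders into an offscreen image, turning AA on only around the
    // primitive that benefits from it.
    static BufferedImage render() {
        BufferedImage img = new BufferedImage(200, 200, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = img.createGraphics();

        // An axis-aligned fillRect produces identical pixels with or
        // without AA, so leave the default (non-AA) mode alone here and
        // let it use the plain fill loops rather than MaskFill.
        g.fillRect(0, 0, 200, 200);

        // Enable AA only for the shape that needs it...
        Object old = g.getRenderingHint(RenderingHints.KEY_ANTIALIASING);
        g.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                           RenderingHints.VALUE_ANTIALIAS_ON);
        g.drawOval(10, 10, 180, 180);

        // ...and restore the previous mode before any further fills/blits.
        g.setRenderingHint(RenderingHints.KEY_ANTIALIASING, old);
        g.fillRect(50, 50, 10, 10);

        g.dispose();
        return img;
    }

    public static void main(String[] args) {
        render();
        System.out.println("done");
    }
}
```

Whether saving and restoring the hint is worth it in a particular renderer is exactly the kind of call that's best left to the developer.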
> In addition, I wonder what your thoughts are on Nimbus performance on my
> specific card (the one with acceleration). While in pure software Nimbus is
> twice as fast as Substance, on that card the use of volatile images and
> accelerated loops seems to hurt Nimbus rather badly (instead of boosting it
> by 30-80% as your internal benchmark suggests). What does your internal
> benchmark do? Does it just run a sequence of Java2D operations, or is it a
> real app being tested? I would suggest a heavyweight app such as NetBeans or
> IntelliJ IDEA for a "real" test of performance gains on the D3D pipeline.
Without knowing what your benchmark does, it's hard to tell.
There are some areas in the new pipeline that still need
improvement, especially on smaller primitives.
The benchmark we have is a mock-up of an application that
uses a bunch of Swing components, driven by injecting events
as fast as possible. The score is the time it takes
to complete the scripted run. Typically it takes 10-15 seconds
per run (we use multiple runs).
It does a bunch of things like scrolling tables and lists,
adding/removing elements, and such. Note that it's not
a pure graphics benchmark - it tests the whole stack
(Swing + Java2D). For pure graphics testing we have
J2DBench (available in OpenJDK's jdk repository).
This Swing benchmark is run with different pipelines, text
anti-aliasing options (no AA, grayscale, LCD) and
look-and-feels (Ocean/Native/Nimbus).
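A stripped-down sketch of that kind of timed run (assumed names, nothing from the internal harness or J2DBench itself) would be: do a warm-up pass, then time a fixed batch of Java2D operations and report elapsed time per run.

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class MiniBench {
    // Times one run of a simple fill workload and returns the elapsed
    // milliseconds; `aa` toggles antialiasing, `ops` is the batch size.
    static long timeRun(Graphics2D g, boolean aa, int ops) {
        g.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                aa ? RenderingHints.VALUE_ANTIALIAS_ON
                   : RenderingHints.VALUE_ANTIALIAS_OFF);
        long start = System.nanoTime();
        for (int i = 0; i < ops; i++) {
            g.fillOval(i % 100, i % 100, 40, 40);
        }
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        BufferedImage img = new BufferedImage(256, 256, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        timeRun(g, false, 1000); // warm-up run, discard the score
        long plain = timeRun(g, false, 1000);
        long aa    = timeRun(g, true, 1000);
        g.dispose();
        System.out.println("non-AA: " + plain + " ms, AA: " + aa + " ms");
    }
}
```

The real benchmark differs in that it drives actual Swing components through injected events rather than raw Java2D calls, so it exercises the whole stack.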
As for "real world" testing - I use NetBeans running on
the latest build as my development platform (although I found
out that for some reason NetBeans 6.1 disables the use of the
hardware-accelerated pipeline by default - I'll need to get to the
bottom of this). It seems to work fine on my system - which is, I
guess, what most users want unless they have specific performance
issues with a particular application.
It would be nice to have an automated test suite driving
applications like JEdit or Netbeans.
> The last question - how can I look at the output of sun.java2d.trace=count and
> understand how it maps back to the Java2D APIs on Graphics and Graphics2D?
> How do I read something like
> "sun.java2d.loops.Blit::Blit(IntRgb, SrcNoEa, IntArgbPre)" or
> "sun.java2d.loops.MaskBlit::MaskBlit(IntArgbPre, SrcOver, IntArgbPre)"?
You can probably figure this out yourself by
inserting printouts in your rendering methods
("doing fill with gradient", "issuing drawImage", etc.)
and specifying the -Dsun.java2d.trace=log option, which
traces rendering primitives on the fly (as opposed
to =count, which counts them and prints a
summary at the end).
Then you'll see exactly which tracing lines correspond
to your commands.
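For example, something along these lines (the marker strings are just placeholders) will interleave your own printouts with the trace output when the program is started with -Dsun.java2d.trace=log:

```java
import java.awt.Color;
import java.awt.GradientPaint;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class TraceMarkers {
    // Run with: java -Dsun.java2d.trace=log TraceMarkers
    // Each marker is printed just before its primitive, so the loop
    // names (Blit, MaskBlit, MaskFill, ...) logged right after it are
    // the ones that primitive triggered.
    static BufferedImage render() {
        BufferedImage dst = new BufferedImage(100, 100, BufferedImage.TYPE_INT_ARGB);
        BufferedImage src = new BufferedImage(50, 50, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();

        System.out.println("doing fill with gradient");
        g.setPaint(new GradientPaint(0, 0, Color.RED, 100, 100, Color.BLUE));
        g.fillRect(0, 0, 100, 100);

        System.out.println("issuing drawImage (BI-to-BI blit)");
        g.drawImage(src, 25, 25, null);

        g.dispose();
        return dst;
    }

    public static void main(String[] args) {
        render();
    }
}
```

Without the flag the program just prints the markers; with it, each marker is followed by the loop names that primitive dispatched to.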
But in general, there's some information here:
http://java.sun.com/javase/6/webnotes/trouble/TSG-Desktop/html/gcrua.html#gcrus
This Java Client troubleshooting guide is quite
useful, btw.
Thanks,
Dmitri
===========================================================================
To unsubscribe, send email to [EMAIL PROTECTED] and include in the body
of the message "signoff JAVA2D-INTEREST". For general help, send email to
[EMAIL PROTECTED] and include in the body of the message "help".