Re: [PATCH RESEND 0/1] RFC: P1689R5 support
On 10/18/22 14:22, Ben Boeckel wrote: > On Thu, Oct 13, 2022 at 13:08:46 -0400, David Malcolm wrote: >> On Mon, 2022-10-10 at 16:21 -0400, Jason Merrill wrote: >>> David Malcolm would probably know best about JSON wrangling. >> >> Unfortunately our JSON output doesn't make any guarantees about the >> ordering of keys within an object, so the precise textual output >> changes from run to run. I've coped with that in my test cases by >> limiting myself to simple regexes of fragments of the JSON output. >> >> Martin Liska [CCed] went much further in >> 4e275dccfc2467b3fe39012a3dd2a80bac257dd0 by adding a run-gcov-pytest >> DejaGnu directive, allowing for test cases for gcov to be written in >> Python, which can thus test much more interesting assertions about the >> generated JSON. > > Ok, if Python is acceptable, I'll use its stdlib to do "fancy" things. > Part of this is because I want to assert that unnecessary fields don't > exist and that sounds…unlikely to be possible in any maintainable way > (assuming it is possible) with regexen. `jq` could help immensely, but > that is probably a bridge too far :) . Yes, please use Python if you have a more complicated output verification. Examples I introduced: ./gcc/testsuite/g++.dg/gcov/test-pr98273.py ./gcc/testsuite/g++.dg/gcov/test-gcov-17.py Martin > > Thanks, > > --Ben
Re: Announcement: Porting the Docs to Sphinx - 9. November 2022
On 10/17/22 16:16, Paul Iannetta wrote: > Hi Martin, > > Thank you very much for porting the documentation to Sphinx, it is > very convenient to use, especially the menu on the left and the > search bar. Thanks, I also like it! > > However, I also regularly browse and search the documentation through > info, especially when I want to use regexps to search or need to > include a special character (e.g. +, -, _, or '('; this can happen when I > search for '(define', for example) in the search string. > > Does the port to Sphinx mean the end of texinfo? Or will both be > available, as is the case now with the official texinfo and your > unofficial splichal.eu pages? It will still be available, same as now: manual pages and info pages are built if you compile GCC from sources. We haven't been publishing manual pages and info pages on our web pages; people typically get these from their distribution packages. Does it help? Or do you expect any change regarding what we publish at: https://gcc.gnu.org/onlinedocs/ ? Cheers, Martin > > Paul > > On Mon, Oct 17, 2022 at 03:28:34PM +0200, Martin Liška wrote: >> Hello. >> >> Based on the very positive feedback I was given at the Cauldron Sphinx >> Documentation BoF, >> I'm planning migrating the documentation on 9th November. There are still >> some minor comments >> from Sandra when it comes to the PDF output, but we can address that once >> the conversion is done. >> >> The reason I'm sending the email now is that I was waiting for latest Sphinx >> release (5.3.0) that >> simplifies reference format for options and results in much simpler Option >> summary section ([1]) >> >> The current GCC master (using Sphinx 5.3.0) converted docs can be seen here: >> https://splichal.eu/scripts/sphinx/ >> >> If you see any issues with the converted documentation, or have a feedback >> about it, >> please reply to this email. >> >> Cheers, >> Martin >> >> [1] https://github.com/sphinx-doc/sphinx/pull/10840 >> [1] >> https://splichal.eu/scripts/sphinx/gcc/_build/html/gcc-command-options/option-summary.html
Re: Redundant constants in coremark crc8 for RISCV/aarch64 (no-if-conversion)
On Wed, Oct 19, 2022 at 5:44 AM Jeff Law via Gcc wrote: > > > On 10/18/22 20:09, Vineet Gupta wrote: > > > > On 10/18/22 16:36, Jeff Law wrote: > There isn't a great place in GCC to handle this right now. If the > constraints were relaxed in PRE, then we'd have a chance, but > getting the cost model right is going to be tough. > >>> > >>> It would have been better (for this specific case) if loop unrolling > >>> was not being done so early. The tree pass cunroll is flattening it > >>> out and leaving for rest of the all tree/rtl passes to pick up the > >>> pieces and remove any redundancies, if at all. It obviously needs to > >>> be early if we are injecting 7x more instructions, but seems like a > >>> lot to unravel. > >> > >> Yup. If that loop gets unrolled, it's going to be a mess. It will > >> almost certainly make this problem worse as each iteration is going > >> to have a pair of constants loaded and no good way to remove them. > > > > That's the original problem that I started this thread with. I'd > > snipped the disassembly as it would have been too much text but > > basically on RV, Coremark crc8 loop of const 8 iterations gets > > unrolled including extraneous 8 insn pairs to load the same constant > > - which is preposterous. Other arches side-step by using if-conversion > > / cond moves, latter currently WIP in RV International. x86 w/o > > if-convert seems OK since the const can be encoded in the xor insn. > > > > OTOH given that gimple/tree-pass cunroll is doing the culprit loop > > unrolling and introducing redundant const 8 times, can it be addressed > > there somehow. > > tree_estimate_loop_size() seems to identify constant expression, not > > just an operand. Can it be taught to identify a "non-trivial const" > > and hoist/code-move the expression. Sorry just rambling here, most > > likely non-sense. On GIMPLE all constants are "simple". > Oh, cunroll. There might be a distinct flag for complete unrolling. At -O3 we peel completely, there's no flag to disable that. > I really expect something like Click's work is the way forward. > Essentially when you VN the function you'll identify those constants and > collapse them all down to a single instance. Then the GCM phase will > kick in and find a place to put the evaluation so that you have one and > only one. I'd say postreload gcse would be a place to do that. At least when there's no available hardreg CSEing likely isn't going to be a win. > Some of Bodik's work might catch it as well, though implementing his > ideas is likely a lot more work. > > > Jeff
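[For readers following the thread, a self-contained sketch in the spirit of the Coremark crc8 inner loop being discussed; this is not the benchmark source, and the function name and constants are illustrative. The relevant property is that the XOR'd polynomial constant does not fit RISC-V's 12-bit signed immediates, so once cunroll flattens the 8 fixed iterations each copy must rematerialize it with its own li, whereas x86 folds the constant into the xor instruction.]

    #include <stdint.h>

    /* A CRC step in the spirit of the loop under discussion: a fixed
       8-iteration loop that conditionally XORs a 16-bit polynomial
       constant into the running CRC.  */
    uint16_t
    crc_byte (uint8_t data, uint16_t crc)
    {
      for (int i = 0; i < 8; i++)
        {
          unsigned x16 = (data ^ crc) & 1;
          data >>= 1;
          if (x16)
            {
              crc ^= 0x4002;   /* 16386 is outside the 12-bit immediate range,
                                  so RISC-V needs a separate li before the xor;
                                  x86 encodes the constant in the xor itself.  */
              crc >>= 1;
              crc |= 0x8000;   /* Likewise needs its own constant
                                  materialization instruction.  */
            }
          else
            crc >>= 1;
        }
      return crc;
    }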
Re: Announcement: Porting the Docs to Sphinx - 9. November 2022
On Wed, Oct 19, 2022 at 09:24:06AM +0200, Martin Liška wrote: > On 10/17/22 16:16, Paul Iannetta wrote: > > Hi Martin, > > > > Thank you very much for porting the documentation to Sphinx, it is > > very convenient to use, especially the menu on the left and the > > search bar. > > Thanks, I also like it! > > > > > However, I also regularly browse and search the documentation through > > info, especially when I want to use regexps to search or need to > > include a special character (eg.,+,-,_,(; this can happen when I > > search for '(define' ) for example) in the search string. > > > > Does the port to Sphinx means the end of texinfo? Or, will both be > > available as it is the case now with the official texinfo and your > > unofficial splichal.eu pages. > > It will be still available same as now where manual pages and info pages > are built if you compile GCC from sources. We haven't been publishing manual > pages and info pages on our web pages, people typically get these from > their distribution packages. As long as it is possible to build the info manual with "make info", even through something like rst2texinfo, I would be happy. Would it be possible to see the rst source of the port so as to try rst2texinfo on it? > > Does it help? Or do you expect any change regarding what we publish at: > https://gcc.gnu.org/onlinedocs/ > ? Currently, there is a tarball with texinfo sources for all the manuals for each version. Thanks, Paul > > Cheers, > Martin > > > > > Paul > > > > On Mon, Oct 17, 2022 at 03:28:34PM +0200, Martin Liška wrote: > >> Hello. > >> > >> Based on the very positive feedback I was given at the Cauldron Sphinx > >> Documentation BoF, > >> I'm planning migrating the documentation on 9th November. There are still > >> some minor comments > >> from Sandra when it comes to the PDF output, but we can address that once > >> the conversion is done. > >> > >> The reason I'm sending the email now is that I was waiting for latest > >> Sphinx release (5.3.0) that > >> simplifies reference format for options and results in much simpler Option > >> summary section ([1]) > >> > >> The current GCC master (using Sphinx 5.3.0) converted docs can be seen > >> here: > >> https://splichal.eu/scripts/sphinx/ > >> > >> If you see any issues with the converted documentation, or have a feedback > >> about it, > >> please reply to this email. > >> > >> Cheers, > >> Martin > >> > >> [1] https://github.com/sphinx-doc/sphinx/pull/10840 > >> [1] > >> https://splichal.eu/scripts/sphinx/gcc/_build/html/gcc-command-options/option-summary.html > >> > >> > >> > >> > > > > > > > > > > > > >
[RFC] c++, libstdc++: Default make check vs. tests for newest C++ standard
Hi! The screw-up on my side with libstdc++ testing (tested normally rather than in C++23 mode) makes me wonder if we couldn't tweak the default testing. Dunno what libstdc++ testing normally does (just C++17?); make check-g++ tests by default { 98, 14, 17, 20 } (and I regularly use GXX_TESTSUITE_STDS=98,11,14,17,20,2b in the environment, but that doesn't cover libstdc++ I guess).

When adding tests for an upcoming C++ version, one always has a dilemma: whether to use an explicit // { dg-options "-std=c++2b" } or -std=gnu++2b and similar, in which case the test works in all modes but it might be forgotten later on to be converted into a // { dg-do whatever { target c++23 } } test so that when 23 is tested by default and say 26 or 29 appears too, we test it also in those modes; or just go with // { dg-do whatever { target c++23 } }, which has the disadvantage that it is skipped when testing by default and one only tests it when explicitly asking for the newer version.

I wonder if we couldn't, for the default testing (when one doesn't specify GXX_TESTSUITE_STDS or uses make check-c++-all and similar), improve things a little bit by automatically treating those // { dg-do whatever { target c++23 } } tests as // { dg-options "-std=c++2b" }.

g++-dg.exp has:

    # If the testcase specifies a standard, use that one.
    # If not, run it under several standards, allowing GNU extensions
    # if there's a dg-options line.
    if ![search_for $test "-std=*++"] {
        if [search_for $test "dg-options"] {
            set std_prefix "-std=gnu++"
        } else {
            set std_prefix "-std=c++"
        }

        # See g++.exp for the initial value of this list.
        global gpp_std_list
        if { [llength $gpp_std_list] > 0 } {
            set std_list $gpp_std_list
        } else {
            set std_list { 98 14 17 20 }
        }
        set option_list { }
        foreach x $std_list {
            # Handle "concepts" as C++17 plus Concepts TS.
            if { $x eq "concepts" } then { set x "17 -fconcepts"
            } elseif { $x eq "impcx" } then { set x "23 -fimplicit-constexpr" }
            lappend option_list "${std_prefix}$x"
        }
    } else {
        set option_list { "" }
    }

    set nshort [file tail [file dirname $test]]/[file tail $test]

    foreach flags_t $option_list {
        verbose "Testing $nshort, $flags $flags_t" 1
        dg-test $test "$flags $flags_t" ${default-extra-flags}
    }

so I wonder if, in the set std_list { 98 14 17 20 } spot, we couldn't do something like a special search_for for "{ dg-do * { target c++23 } }" and, if so, set std_list { 2b } instead of set std_list { 98 14 17 20 }? It wouldn't handle more complex cases like // { dg-do compile { target { c++23 && { aarch64*-*-* powerpc64le*-*-linux* riscv*-*-* s390*-*-* sparc*-*-linux* } } } } but at least for the majority of tests for the new language version it would run them even in default testing where they'd be otherwise skipped (we'd cycle over 98 14 17 20 only to see it doesn't satisfy any of them). If we wanted to go even further, we could handle similarly say c++11_only tests.

What do you think?

Jakub
Re: [RFC] c++, libstdc++: Default make check vs. tests for newest C++ standard
On Wed, 19 Oct 2022 at 09:40, Jakub Jelinek wrote: > > Hi! > > The screw-up on my side with libstdc++ testing (tested normally rather > than in C++23 mode) makes me wonder if we couldn't tweak the default > testing. > Dunno what libstdc++ testing normally does (just C++17?), That's the default unless a test has something else in -std=gnu++17 but I do my local testing with: set target_list { "unix{,-D_GLIBCXX_USE_CXX11_ABI=0,-std=gnu++2b,-std=gnu++11}" } and then push to the compile farm and test with: set target_list { "unix{,-std=c++98,-std=gnu++11,-std=gnu++20,-D_GLIBCXX_USE_CXX11_ABI=0/-D_GLIBCXX_DEBUG,-D_GLIBCXX_DEBUG,-std=gnu++23}" } That's far too slow to force on everybody though. > make check-g++ > tests by default { 98, 14, 17, 20 } (and I regularly use > GXX_TESTSUITE_STDS=98,11,14,17,20,2b in environment but that doesn't > cover libstdc++ I guess). It doesn't, correct. It's been on my TODO list for a couple of years. > When adding tests for upcoming C++ version, one always has a dilemma > whether to use explicit // { dg-options "-std=c++2b" } > or -std=gnu++2b and similar, then the test works in all modes, but it might > be forgotten later on to be converted into // { dg-do whatever { target c++23 > } } > test so that when 23 is tested by default and say 26 or 29 appears too, > we test it also in those modes, or just go with > // { dg-do whatever { target c++23 } } > which has the disadvantage that it is skipped when testing by default and > one only tests it if he asks for the newer version. The convention is: // { dg-options "-std=gnu++23" } // { dg-do whatever { target c++23 } } When that becomes the default, we'll remove the first line, so that it runs for all later versions. See r12-678 to r12-686 which removed the dg-options "-std=gnu++17" after that became the default for g++. I should have noticed you were missing that from some of the new tests, sorry. I saw it in a few and didn't check them all. > I wonder if we couldn't for the default testing (when one doesn't > specify GXX_TESTSUITE_STDS or uses make check-c++-all and similar) > improve things a little bit by automatically treat those > // { dg-do whatever { target c++23 } } > tests as // { dg-options "-std=c++2b" }. > > g++-dg.exp has: > # If the testcase specifies a standard, use that one. > # If not, run it under several standards, allowing GNU extensions > # if there's a dg-options line. > if ![search_for $test "-std=*++"] { > if [search_for $test "dg-options"] { > set std_prefix "-std=gnu++" > } else { > set std_prefix "-std=c++" > } > > # See g++.exp for the initial value of this list. > global gpp_std_list > if { [llength $gpp_std_list] > 0 } { > set std_list $gpp_std_list > } else { > set std_list { 98 14 17 20 } > } > set option_list { } > foreach x $std_list { > # Handle "concepts" as C++17 plus Concepts TS. > if { $x eq "concepts" } then { set x "17 -fconcepts" > } elseif { $x eq "impcx" } then { set x "23 > -fimplicit-constexpr" } > lappend option_list "${std_prefix}$x" > } > } else { > set option_list { "" } > } > > set nshort [file tail [file dirname $test]]/[file tail $test] > > foreach flags_t $option_list { > verbose "Testing $nshort, $flags $flags_t" 1 > dg-test $test "$flags $flags_t" ${default-extra-flags} > } > so I wonder if in the set std_list { 98 14 17 20 } spot we couldn't do > something like special search_for for "{ dg-do * { target c++23 } }" > and if so, set std_list { 2b } instead of set std_list { 98 14 17 20 }? 
> It wouldn't handle more complex cases like > // { dg-do compile { target { c++23 && { aarch64*-*-* powerpc64le*-*-linux* > riscv*-*-* s390*-*-* sparc*-*-linux* } } } } > but at least for the majority of tests for the new language version > it would run them even in default testing where they'd be otherwise > skipped (we'd cycle over 98 14 17 20 only to see it doesn't satisfy any of > them). > If we wanted to go even further, we could handle similarly say c++11_only > tests. > > What do you think? But libstdc++ doesn't use g++.exp so we need to start using that (or something like it) in libstdc++ before any such changes would help.
Re: Announcement: Porting the Docs to Sphinx - 9. November 2022
On 10/19/22 10:13, Paul Iannetta wrote: > On Wed, Oct 19, 2022 at 09:24:06AM +0200, Martin Liška wrote: >> On 10/17/22 16:16, Paul Iannetta wrote: >>> Hi Martin, >>> >>> Thank you very much for porting the documentation to Sphinx, it is >>> very convenient to use, especially the menu on the left and the >>> search bar. >> >> Thanks, I also like it! >> >>> >>> However, I also regularly browse and search the documentation through >>> info, especially when I want to use regexps to search or need to >>> include a special character (eg.,+,-,_,(; this can happen when I >>> search for '(define' ) for example) in the search string. >>> >>> Does the port to Sphinx means the end of texinfo? Or, will both be >>> available as it is the case now with the official texinfo and your >>> unofficial splichal.eu pages. >> >> It will be still available same as now where manual pages and info pages >> are built if you compile GCC from sources. We haven't been publishing manual >> pages and info pages on our web pages, people typically get these from >> their distribution packages. > > As long as it is possible to build the info manual with "make info", even > through > something like rst2texinfo, I would be happy. Would it be possible > to see the rst source of the port so as to try rst2texinfo on it? Well, .rst source files can be seen right now here: https://github.com/marxin/texi2rst-generated And 'texinfo' is created with the standard Sphinx builder: https://www.sphinx-doc.org/en/master/man/sphinx-build.html#cmdoption-sphinx-build-b > >> >> Does it help? Or do you expect any change regarding what we publish at: >> https://gcc.gnu.org/onlinedocs/ >> ? > Currently, there is a tarball with texinfo sources for all the manuals > for each version. Well, then equivalent would be packaging all .rst files together with the corresponding conf.py, logo.* and other files. But I don't see it much useful. Thanks, Martin > > Thanks, > Paul > >> >> Cheers, >> Martin >> >>> >>> Paul >>> >>> On Mon, Oct 17, 2022 at 03:28:34PM +0200, Martin Liška wrote: Hello. Based on the very positive feedback I was given at the Cauldron Sphinx Documentation BoF, I'm planning migrating the documentation on 9th November. There are still some minor comments from Sandra when it comes to the PDF output, but we can address that once the conversion is done. The reason I'm sending the email now is that I was waiting for latest Sphinx release (5.3.0) that simplifies reference format for options and results in much simpler Option summary section ([1]) The current GCC master (using Sphinx 5.3.0) converted docs can be seen here: https://splichal.eu/scripts/sphinx/ If you see any issues with the converted documentation, or have a feedback about it, please reply to this email. Cheers, Martin [1] https://github.com/sphinx-doc/sphinx/pull/10840 [1] https://splichal.eu/scripts/sphinx/gcc/_build/html/gcc-command-options/option-summary.html >>> >>> >>> >>> >> >> >> >> >> > > > >
Re: Announcement: Porting the Docs to Sphinx - 9. November 2022
On 10/18/22 00:26, Sandra Loosemore wrote: > On 10/17/22 07:28, Martin Liška wrote: >> Hello. >> >> Based on the very positive feedback I was given at the Cauldron Sphinx >> Documentation BoF, >> I'm planning migrating the documentation on 9th November. There are still >> some minor comments >> from Sandra when it comes to the PDF output, but we can address that once >> the conversion is done. > > My main complaint about the PDF is that the blue color used for link text is > so light it interferes with readability. Few people are going to print the > document on paper any more, but I did try printing a sample page on a > grayscale printer and the blue link text came out so faint that it was barely > visible at all. Sure, I've just added support for monochromatic PDF output where one needs to use MONOCHROMATIC=1 make latexpdf ... and I linked the file here: https://splichal.eu/scripts/sphinx/gcc/_build/latexmonochromatic/gcc.pdf Right now I build only one PDF in this mode and it's mentioned here: https://splichal.eu/scripts/sphinx/ What do you think about it now? > An E-ink reader device would probably have similar problems. There, ePUB would likely be a better output format. What do you think? Martin > > I'm generally not a fan of the other colors being used for formatting, > either. To me it seems like they all interfere with readability, plus in > code samples it seems like random things get highlighted in random colors, > instead of focusing on the thing the example is trying to demonstrate. > > I've been preferring to use the PDF form of the GNU manuals because it is > easier to search the whole document that way. The search feature in the new > web version doesn't quite cut it; it gives you a list of web pages and > then you have to do a second browser search within each page to find the > reference. So I hope we can continue to support the PDF as a canonical > format and better tune it for easy readability, instead of assuming that most > people will only care about the online web version. > > -Sandra
Re: Announcement: Porting the Docs to Sphinx - 9. November 2022
On 10/19/22 13:09, Martin Liška wrote: > There, ePUB would likely be a better output format. What do you think? I've just included ePUB books: https://splichal.eu/scripts/sphinx/#epub Martin
[RISCV] RISC-V GNU Toolchain Biweekly Sync-up call (Oct 20, 2022)
Hi all,

Here is the agenda for tomorrow's RISC-V GNU toolchain meeting. If you have any topics you want to discuss or share, please let me know and I will add them to the agenda, thanks.

Agenda:
- RISC-V profile development plan
- Patchwork for patch initial review
- RISC-V sub-extension support progress
  - RVV gcc support progress
  - Zc* extension support progress
- Open discussion

Wei Wu - PLCT Lab is inviting you to a scheduled Zoom meeting.

Topic: RISC-V GNU Toolchain Biweekly Sync-up
Time: Oct 20, 2022 11:00 PM Singapore

Please download and import the following iCalendar (.ics) files to your calendar system.
Weekly: https://calendar.google.com/calendar/ical/lm5bddk2krcmtv5iputjgqvoio%40group.calendar.google.com/public/basic.ics

Join Zoom Meeting
https://zoom.us/j/89393600951?pwd=ZFpWMkZ6Tm1TbUFXT1hZZjZZMHhRQT09

Meeting ID: 893 9360 0951
Passcode: 899662

Local times:
- BEIJING, China: 11:00p Thu, Oct 20 2022 to 12:00a Fri, Oct 20 2022
- PST/PDT, Pacific Standard Time (US): 8:00a Thu, Oct 20 2022 to 9:00a Thu, Oct 20 2022
- PHILADELPHIA, United States, Pennsylvania: 11:00a Thu, Oct 20 2022 to 12:00a Thu, Oct 20 2022
- Paris, France: 17:00p Thu, Oct 20 2022 to 18:00p Thu, Oct 20 2022
Re: [RFC] c++, libstdc++: Default make check vs. tests for newest C++ standard
On 10/19/22 04:40, Jakub Jelinek wrote: Hi! The screw-up on my side with libstdc++ testing (tested normally rather than in C++23 mode) makes me wonder if we couldn't tweak the default testing. Dunno what libstdc++ testing normally does (just C++17?), make check-g++ tests by default { 98, 14, 17, 20 } (and I regularly use GXX_TESTSUITE_STDS=98,11,14,17,20,2b in environment but that doesn't cover libstdc++ I guess). When adding tests for upcoming C++ version, one always has a dilemma whether to use explicit // { dg-options "-std=c++2b" } or -std=gnu++2b and similar, then the test works in all modes, but it might be forgotten later on to be converted into // { dg-do whatever { target c++23 } } test so that when 23 is tested by default and say 26 or 29 appears too, we test it also in those modes, or just go with // { dg-do whatever { target c++23 } } which has the disadvantage that it is skipped when testing by default and one only tests it if he asks for the newer version. I wonder if we couldn't for the default testing (when one doesn't specify GXX_TESTSUITE_STDS or uses make check-c++-all and similar) improve things a little bit by automatically treat those // { dg-do whatever { target c++23 } } tests as // { dg-options "-std=c++2b" }. That would be great. g++-dg.exp has: # If the testcase specifies a standard, use that one. # If not, run it under several standards, allowing GNU extensions # if there's a dg-options line. if ![search_for $test "-std=*++"] { if [search_for $test "dg-options"] { set std_prefix "-std=gnu++" } else { set std_prefix "-std=c++" } # See g++.exp for the initial value of this list. global gpp_std_list if { [llength $gpp_std_list] > 0 } { set std_list $gpp_std_list } else { set std_list { 98 14 17 20 } } set option_list { } foreach x $std_list { # Handle "concepts" as C++17 plus Concepts TS. if { $x eq "concepts" } then { set x "17 -fconcepts" } elseif { $x eq "impcx" } then { set x "23 -fimplicit-constexpr" } lappend option_list "${std_prefix}$x" } } else { set option_list { "" } } set nshort [file tail [file dirname $test]]/[file tail $test] foreach flags_t $option_list { verbose "Testing $nshort, $flags $flags_t" 1 dg-test $test "$flags $flags_t" ${default-extra-flags} } so I wonder if in the set std_list { 98 14 17 20 } spot we couldn't do something like special search_for for "{ dg-do * { target c++23 } }" and if so, set std_list { 2b } instead of set std_list { 98 14 17 20 }? It wouldn't handle more complex cases like // { dg-do compile { target { c++23 && { aarch64*-*-* powerpc64le*-*-linux* riscv*-*-* s390*-*-* sparc*-*-linux* } } } } but at least for the majority of tests for the new language version it would run them even in default testing where they'd be otherwise skipped (we'd cycle over 98 14 17 20 only to see it doesn't satisfy any of them). If we wanted to go even further, we could handle similarly say c++11_only tests. What do you think? Jakub
GCC 10.4.1 Status Report (2022-10-19)
Status
======

The GCC 10 branch is in regression and documentation fixing mode. Apparently I haven't sent a status report after 10.4 got released, so sending one now. GCC 10.5 is still many months away, maybe spring next year.

Quality Data
============

Priority          #   Change from last report
--------        ---   -----------------------
P1                0
P2              441   + 32
P3               45   -  6
P4              206   +  2
P5               24
--------        ---   -----------------------
Total P1-P3     486   + 26
Total           716   + 28

Previous Report
===============

https://gcc.gnu.org/pipermail/gcc/2022-June/238946.html
Re: Redundant constants in coremark crc8 for RISCV/aarch64 (no-if-conversion)
On 10/19/22 01:46, Richard Biener wrote: On Wed, Oct 19, 2022 at 5:44 AM Jeff Law via Gcc wrote: On 10/18/22 20:09, Vineet Gupta wrote: On 10/18/22 16:36, Jeff Law wrote: There isn't a great place in GCC to handle this right now. If the constraints were relaxed in PRE, then we'd have a chance, but getting the cost model right is going to be tough. It would have been better (for this specific case) if loop unrolling was not being done so early. The tree pass cunroll is flattening it out and leaving for rest of the all tree/rtl passes to pick up the pieces and remove any redundancies, if at all. It obviously needs to be early if we are injecting 7x more instructions, but seems like a lot to unravel. Yup. If that loop gets unrolled, it's going to be a mess. It will almost certainly make this problem worse as each iteration is going to have a pair of constants loaded and no good way to remove them. Thats the original problem that I started this thread with. I'd snipped the disassembly as it would have been too much text but basically on RV, Coremark crc8 loop of const 8 iterations gets unrolled including extraneous 8 insns pairs to load the same constant - which is preposterous. Other arches side-step by using if-conversion / cond moves, latter currently WIP in RV International. x86 w/o if-convert seems OK since the const can be encoded in the xor insn. OTOH given that gimple/tree-pass cunroll is doing the culprit loop unrolling and introducing redundant const 8 times, can it ne addressed there somehow. tree_estimate_loop_size() seems to identify constant expression, not just an operand. Can it be taught to identify a "non-trivial const" and hoist/code-move the expression. Sorry just rambling here, most likely non-sense. On GIMPLE all constants are "simple". Oh, cunroll. There might be a distinct flag for complete unrolling. At -O3 we peel completely, there's no flag to disable that. I really expect something like Click's work is the way forward. Essentially when you VN the function you'll identify those constants and collapse them all down to a single instance. Then the GCM phase will kick in and find a place to put the evaluation so that you have one and only one. I'd say postreload gcse would be a place to do that. At least when there's no available hardreg CSEing likely isn't going to be a win. That's an interesting idea. Do it aggressively post-reload when we know there's a register available. Vineet, that seems like it's worth investigation. jeff
Re: Announcement: Porting the Docs to Sphinx - 9. November 2022
On 10/19/22 05:09, Martin Liška wrote: On 10/18/22 00:26, Sandra Loosemore wrote: On 10/17/22 07:28, Martin Liška wrote: Hello. Based on the very positive feedback I was given at the Cauldron Sphinx Documentation BoF, I'm planning migrating the documentation on 9th November. There are still some minor comments from Sandra when it comes to the PDF output, but we can address that once the conversion is done. My main complaint about the PDF is that the blue color used for link text is so light it interferes with readability. Few people are going to print the document on paper any more, but I did try printing a sample page on a grayscale printer and the blue link text came out so faint that it was barely visible at all. Sure, I've just added support for monochromatic PDF output where one needs to use MONOCHROMATIC=1 make latexpdf ... and I linked the file here: https://splichal.eu/scripts/sphinx/gcc/_build/latexmonochromatic/gcc.pdf right now I build only one PDF in this mode and it's mentioned here: https://splichal.eu/scripts/sphinx/ What do you think about it now? Hmmm, removing *all* visual cues that something is a link does not seem so great either, especially since the new format has changed the link text for @xref to remove the page and section information. E.g. we used to get "See Section 3.4 [Options Controlling C Dialect], page 44." and now it just reads "See Options Controlling C Dialect." I realize there is a can of worms here involving philosophical issues about whether the PDF manual is intended to be formatted for reading as a book or is just a handy way to repackage the hyperlinked web presentation for offline reference. Also there is another can of worms involving making the documentation accessible to people who have visual disabilities, specifically color blindness issues. Just speaking for myself, I'd be happy if the PDF just used a darker blue color for links that is both distinguishing and higher contrast with the background than the current light blue, but I think it is one of the principles of accessible design that color really shouldn't be the *only* indication of something that initiates an action. Maybe underlining, or a little link glyph, or restoring the section/page info to the link text? An E-ink reader device would probably have similar problems. There ePUB would be likely better output format. What do you think? Ooof, a lot of problems there. I looked at your new generated .epub in both the "ebook-viewer" utility on my laptop and on my Kobo Forma. The Kobo uses the default proportionally-spaced font for everything; even the code examples fail to come out in a fixed-width font. ebook-viewer shows fixed-width fonts for code examples and inline references to e.g. command line options, but the names of options in the option tables sections are in the proportional body font. Also in both viewers I see hyperlinks to https://splicha.eu/... in place of internal links in some references to command-line options and the like, and the formatting of the option summary tables really sucks, with lines breaking at hyphens in the middle of option names. I suggest we try to focus our efforts on the currently-supported formats before adding EPUB as a new format. -Sandra
Re: Announcement: Porting the Docs to Sphinx - 9. November 2022
On Wed, 19 Oct 2022, Martin Liška wrote: > > Currently, there is a tarball with texinfo sources for all the manuals > > for each version. > > Well, then equivalent would be packaging all .rst files together with the > corresponding > conf.py, logo.* and other files. But I don't see it much useful. I think we should have such a source tarball when the sources are .rst, as the successor to the Texinfo source tarball. (Unfortunately when I added that source tarball - https://gcc.gnu.org/legacy-ml/gcc-patches/2004-01/msg00140.html - I didn't give specific references to any of the individual requests that resulted in adding it.) -- Joseph S. Myers jos...@codesourcery.com
No-named-argument variadic functions
C2x allows variable-argument functions declared with (...) as parameters - no named arguments - as in C++. It *also* allows such functions to access their parameters, unlike C++, by relaxing the requirements on va_start so it no longer needs to be passed the name of the last named parameter. My assumption is that such functions should thus use the ABI for variable-argument functions, to the extent that's different from that for other functions. The main implementation issue I see is that GCC's internal representation for function types can't actually distinguish the (...) type from an unprototyped function - C++ functions with (...) arguments are treated by the middle end and back ends as unprototyped. (This probably works sufficiently well in ABI terms when the function can't actually use its arguments. Back ends may well call what they think are unprototyped functions in a way compatible with variadic callees anyway, for compatibility with pre-standard C code that calls e.g. printf without a prototype, even though standard C has never allowed calling variable-argument functions without a prototype.) So there are a few questions here for implementing this C2x feature: 1. How should (...) be represented differently from unprototyped functions so that stdarg_p and prototype_p handle it properly? Should I add a new language-independent type flag (there are plenty spare) to use for this? 2. Does anyone see any likely ABI or back end issues from allowing single-argument calls to __builtin_va_start to access the arguments to such a function? (I'd propose to redefine va_start in stdarg.h to use a single-argument call, discarding any subsequent arguments, only for C2x.) 3. Should the C++ front end be changed to mark (...) functions in whatever way is chosen for question 1 above, so that they start using the appropriate ABI (and, in particular, calls between C and C++, where a C implementation of such a function might use the arguments, work properly)? Or would there be problems with compatibility with existing callers or callees assuming the unprototyped function ABI? -- Joseph S. Myers jos...@codesourcery.com
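[For concreteness, a minimal sketch of the kind of function this enables; it is not from the C2x draft or any GCC patch, the function name and the convention of passing the argument count first are purely illustrative, and it assumes a compiler and <stdarg.h> that already implement the C2x single-argument va_start.]

    #include <stdarg.h>
    #include <stdio.h>

    /* C2x: no named parameters, yet the variable arguments are accessible.
       In C2x, va_start may be invoked with only the va_list argument.  */
    int
    sum (...)
    {
      va_list ap;
      va_start (ap);                    /* No "last named parameter" to pass.  */
      /* The signature gives no argument count, so this sketch passes the
         count as the first variable argument by convention.  */
      int n = va_arg (ap, int);
      int total = 0;
      for (int i = 0; i < n; i++)
        total += va_arg (ap, int);
      va_end (ap);
      return total;
    }

    int
    main (void)
    {
      printf ("%d\n", sum (3, 1, 2, 3));   /* prints 6 */
      return 0;
    }

[Whether such a C definition and a (...) declaration seen by a C++ caller end up agreeing on the variable-argument ABI is exactly what question 3 above is asking.]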
Re: Announcement: Porting the Docs to Sphinx - 9. November 2022
On Mon, 2022-10-17 at 15:28 +0200, Martin Liška wrote: > Hello. > > Based on the very positive feedback I was given at the Cauldron Sphinx > Documentation BoF, > I'm planning migrating the documentation on 9th November. There are still > some minor comments > from Sandra when it comes to the PDF output, but we can address that once the > conversion is done. > > The reason I'm sending the email now is that I was waiting for latest Sphinx > release (5.3.0) that > simplifies reference format for options and results in much simpler Option > summary section ([1]) > > The current GCC master (using Sphinx 5.3.0) converted docs can be seen here: > https://splichal.eu/scripts/sphinx/ > > If you see any issues with the converted documentation, or have a feedback > about it, > please reply to this email. Ouch. This will be very painful for Linux From Scratch. We'll need to add 23 Python modules to build the documentation, while we only have 88 packages in total currently... And we don't want to omit GCC documentation in our system. Could generated man and info pages be provided as a tarball on gcc.gnu.org or ftp.gnu.org?
Re: No-named-argument variadic functions
On Thu, Oct 20, 2022 at 1:54 AM Joseph Myers wrote: > > C2x allows variable-argument functions declared with (...) as parameters - > no named arguments - as in C++. It *also* allows such functions to access > their parameters, unlike C++, by relaxing the requirements on va_start so > it no longer needs to be passed the name of the last named parameter. > > My assumption is that such functions should thus use the ABI for > variable-argument functions, to the extent that's different from that for > other functions. The main implementation issue I see is that GCC's > internal representation for function types can't actually distinguish the > (...) type from an unprototyped function - C++ functions with (...) > arguments are treated by the middle end and back ends as unprototyped. > (This probably works sufficiently well in ABI terms when the function > can't actually use its arguments. Back ends may well call what they think > are unprototyped functions in a way compatible with variadic callees > anyway, for compatibility with pre-standard C code that calls e.g. printf > without a prototype, even though standard C has never allowed calling > variable-argument functions without a prototype.) > > So there are a few questions here for implementing this C2x feature: > > 1. How should (...) be represented differently from unprototyped functions > so that stdarg_p and prototype_p handle it properly? Should I add a new > language-independent type flag (there are plenty spare) to use for this? I'd say unprototyped should stay with a NULL TYPE_ARG_TYPES but a varargs function might change to have a TREE_LIST with a NULL type as the trailing element? Not sure if we want to change this also for varargs functions with actual arguments. If we want to go down the route with a flag on the function type then I'd rather flag the unprototyped case and leave varargs without any actual arguments as NULL TYPE_ARG_TYPES? > 2. Does anyone see any likely ABI or back end issues from allowing > single-argument calls to __builtin_va_start to access the arguments to > such a function? (I'd propose to redefine va_start in stdarg.h to use a > single-argument call, discarding any subsequent arguments, only for C2x.) > > 3. Should the C++ front end be changed to mark (...) functions in whatever > way is chosen for question 1 above, so that they start using the > appropriate ABI (and, in particular, calls between C and C++, where a C > implementation of such a function might use the arguments, work properly)? > Or would there be problems with compatibility with existing callers or > callees assuming the unprototyped function ABI? > > -- > Joseph S. Myers > jos...@codesourcery.com
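[As a source-level illustration of the three declaration forms the representation has to keep apart; this is a sketch only, and the comments restate the current situation as described in this thread rather than any settled design.]

    /* Unprototyped (valid through C17): argument types unknown.
       Today this is the NULL TYPE_ARG_TYPES case.  */
    int f ();

    /* Prototyped variadic function with a named parameter: the
       variable-argument ABI applies, va_start (ap, first) works as usual.  */
    int g (int first, ...);

    /* C2x: variadic with no named parameters.  Per the discussion above it
       should also use the variable-argument ABI, but it is currently
       represented the same way as the unprototyped f.  */
    int h (...);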