On Thu, Feb 20, 2025 at 8:38 PM James K. Lowden
<jklow...@schemamania.org> wrote:
>
> On Thu, 20 Feb 2025 11:38:58 +0100
> Richard Biener <richard.guent...@gmail.com> wrote:
>
> > Can you clarify on the future development model for Cobol after it has
> > been merged?  Is the cobolworx gitlab still going to be the primary
> > development location and changes should be made there and then merged
> > to the GCC side?
>
> I would like the future development model for Cobol to be convenient
> for all involved.  If we don't change anything then, yes, we'll keep
> using our cobolworx gitlab server.  But we don't insist on that.  From
> our point of view, one git server is as good as another.  That's what
> peer-to-peer is all about, right?  We can use gcc's git as the primary,
> and mirror that ourselves, if that's what you're suggesting.
>
> Branches in git don't have independent permissions.  If we use
> gcc.gnu.org git, are we granted commit rights with the proviso that we
> color inside the lines, and commit only to our own branches?

My expectation is that by contributing the COBOL frontend you are
volunteering to be maintainers for it, which grants you permissions
(such as reviewing your own and other people's patches) in that area.
As part of the contribution you should also get commit access to GCC's
git repository (see https://gcc.gnu.org/gitwrite.html).

> > The most important part for GCC 15 will be documentation to set
> > user expectations right.
>
> Absolutely, that's in everyone's interest.
>
> > Having a minimal harness in GCC's testsuite is critical - I'd expect a
> > gcc/testsuite/gcobol.dg/dg.exp supporting execution tests.  I assume
> > Cobol has a way to exit OK or fatally and this should be
> > distinguished as testsuite PASS or FAIL.
>
> Yes, a COBOL program exits with a return status.  And we rigged up NIST
> to do that.  What that means requires a long explanation, sorry.
>
> NIST is highly configurable within its envelope.  It generates most of
> its inputs, and a configuration process controls the names of the data
> files.  (There is also a substitution capability to adapt the COBOL
> source to the compiler.)  The "configuration process" is itself a COBOL
> program called EXEC85.  It reads the source archive and produces
> modified COBOL programs for compilation and execution.
>
> NIST as we use it comprises 12 modules covering different aspects
> of COBOL. Each module is designated by a 2-letter prefix, NC for "NIST
> core", IX for "Indexed I/O", etc.  We create one directory per module.
> Each module has ~100 programs, each with many tests.  Some programs
> must be run in a fixed order because they produce and process files in
> series.
>
> Each program simply runs and reports its tests' results to a file (which
> could be stdout, but isn't in our setup).  The program always exits
> normally unless it crashes, of course.  Test failures are indicated by
> "FAIL*" in the report.
>
> As it's set up now in our CI/CD, NIST lives in its own subdirectory,
> gcc/cobol/nist.   There we have two critical files: Makefile (900
> lines) and report.awk (56 lines).  In each module's directory we also
> maintain any configuration inputs to EXEC85.  In nist/NC, for example,
> we have NC109M.conf  and NC204M.conf.
>
> The Makefile fetches the NIST archive from our website.  (We originally
> got it from NIST, but their site was reorganized last year.  The file
> went missing, as apparently did my email to the webmaster.
> Technology!)  The file might have 100 targets to run various bits.  For
> gcc's purpose, only one matters: "make report".
>
> That target:
>
> 1. fetches the archive
> 2. extracts & patches EXEC85.cbl
> 3. compiles to produce EXEC85
> 4. runs EXEC85 against the archive to extract modified COBOL
> test programs.
> 5. compiles the programs
> 6. runs each program, producing a .rpt file for each one
> 7. trundles over each .rpt file with report.awk searching for failures
>
> Because the process is run under make(1), steps 2-6 run in parallel, N
> jobs for -j N.  If there are no failures, report.awk returns 0 to make,
> else 1. Start to end, it takes just a few minutes on a fast machine.
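
If I follow, that final check boils down to something like this (a
sketch with invented names; scan_reports is my stand-in for your
56-line report.awk, and only the "FAIL*" marker comes from your
description):

```shell
# Sketch only: a report is clean unless it contains the "FAIL*"
# marker; return 0 for clean, 1 for failures, which is what make sees.
scan_reports() {
  if grep -l 'FAIL\*' "$@" >/dev/null 2>&1; then
    return 1
  fi
  return 0
}

# Tiny demonstration with fabricated report files:
printf 'TEST 001 PASS\nTEST 002 FAIL* DELETE\n' > /tmp/bad.rpt
printf 'TEST 001 PASS\n' > /tmp/good.rpt
scan_reports /tmp/good.rpt && echo "clean"            # prints "clean"
scan_reports /tmp/bad.rpt  || echo "failures found"   # prints "failures found"
```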
>
> Now you know what I know, and I need to know what you know.
>
> > I'm not sure if /* { dg-... } */ directive support is easy or
> > desirable (well, desirable for sure).  I'd be happy with a setup like
> > the old gcc.c-torture/{execute,compile}.  Possibly tcl/shell wrapping
> > around the NIST tests (even if not included in the GCC tree, running
> > that by unpacking it in gcc/testsuite/gcobol/nist would be nice).
>
> I don't understand most of that.  I think you would like to use
> DejaGnu, and I think we can get there.  One hurdle is that I've never
> used that software.
>
> I suggest a 2-phase process, one expedient and the other long term.
>
> 1.  For now, keep the above intact, and put it where it belongs, wired
> up with the least possible glue (to mix metaphors).  That will work,
> and more than meet the "minimal" threshold.

I'll point you to the ACATS testsuite for the Ada frontend, which
"integrates" into the testing framework with a shell script
(see testsuite/ada/acats/) - I can't quickly figure out how it gets
invoked though ;)  I suggest following a similar scheme when
integrating an externally maintained testsuite.
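
To make the shape concrete, a per-test wrapper in that spirit might
look roughly like this (hypothetical throughout: the driver name
"gcobol" and the PASS/FAIL lines are my invention, not the actual
ACATS script):

```shell
# Hypothetical driver in the style of testsuite/ada/acats/: compile
# each extracted test program, run it, and map the exit status to a
# PASS/FAIL line the harness can count.
GCOBOL=${GCOBOL:-gcobol}    # compiler under test; override to experiment

run_one() {
  src=$1                    # e.g. NC109M.cbl, in the current directory
  base=${src%.cbl}
  if "$GCOBOL" -o "$base" "$src" && "./$base" > "$base.rpt" 2>&1; then
    echo "PASS: $src"
  else
    echo "FAIL: $src"
  fi
}
```

A DejaGnu dg.exp could later take over exactly this
compile-run-compare loop; the shell form is just the expedient
phase 1.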

> 2.  Modify the above to run each test file under DejaGnu.  It's my
> understanding DG uses Tcl and expect(1).  (The last time I used that
> program was with tip(1).  There was a Hayes Smartmodem involved.)  We
> know what good output looks like.  Be advised, it is voluminous.
> We can use DG instead of awk to compare results.
>
> There's also a Step 0.  We need to agree on what to do about the
> documentation and the NIST source code archive.  Do they go in the
> repository or are they hosted externally and, if so, where?

It depends on the license terms (and size, obviously).  For now I'd
suggest putting the "plumbing" into the GCC repository and expecting
the user to download and unpack the sources in a specific place, so
that the testsuite is invoked when they are present.
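
Concretely, I imagine the plumbing as little more than a probe along
these lines (path and messages hypothetical, nothing settled):

```shell
# Run the NIST harness only when the user has unpacked the sources in
# the agreed-upon place; otherwise skip rather than fail.
maybe_run_nist() {
  nist_dir=$1
  if [ -f "$nist_dir/Makefile" ]; then
    echo "NIST sources present: would run 'make -C $nist_dir report'"
  else
    echo "no NIST sources under $nist_dir: skipping"
  fi
}

maybe_run_nist gcc/testsuite/gcobol/nist
```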

> I sincerely believe gcc users are best served when documentation is
> included with any source code they use.  As I mentioned previously, the
> documentation does bear a copyright, but it was also sponsored by NIST,
> and the US government has not so far this week begun claiming copyright
> on US publications. Beyond that I can only say that's why the FSF has
> lawyers.
>
> For the purpose of the merge, I suggest for simplicity we leave things
> as they are, with archive and documentation *not* in the repository.

Agreed.

> By the time Phase 2 is executed (supposing we agree on my proposal)
> I hope the question will be resolved in favor of keeping them in
> the repository.
>
> Questions:
>
> 1.  Where should NIST live in the gcc repository?
> 2.  Where should the NIST archive and documentation be hosted?

I guess gcc/testsuite/cobol/nist or something like that.

> 3.  Is the plan ok?

2. may not be necessary, but it should be a priority to have a
reasonably simple way for a contributor to properly test a patch to
the GCC COBOL frontend.  We have ./contrib/download_prerequisites,
for example, so I envision a ./contrib/download_cobol_nist that would
download the relevant parts and install them in the appropriate places
in the source directory, so that a later make check-gcc-cobol runs the
testsuite.
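
To sketch what I mean (everything here is a placeholder - the URL,
the destination, even the script name are not settled):

```shell
#!/bin/sh
# Hypothetical contrib/download_cobol_nist, modeled on
# contrib/download_prerequisites: fetch the NIST CCVS/85 archive and
# unpack it where make check-gcc-cobol expects to find it.
set -e

NIST_URL=${NIST_URL:-https://example.org/newcob.val.Z}   # placeholder URL
DEST=${DEST:-gcc/testsuite/gcobol/nist}                  # placeholder path

mkdir -p "$DEST"
echo "fetching NIST CCVS/85 archive into $DEST"
# wget -q -P "$DEST" "$NIST_URL"    # actual download elided in this sketch
echo "done; a subsequent make check-gcc-cobol would pick it up"
```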

> 4.  If there's a new directory, does that involve a patch similar to
> the one that created the libgcobol directory?

No, all of gcc/testsuite has just a single ChangeLog file.

> 5.  Can the Phase 1 approach be simply to have DG respond to the output
> of "make report"?  If not, what is the alternative?

Yes, see my comment about Ada ACATS above.

> 6.  Shall I begin by sending a patch, say, next week?
>
> I'm happy to adapt NIST to work within gcc's testing framework.  I
> don't think it will be that difficult once we know what to do.
> Depending on timing and your needs, we might be able to skip directly
> to phase 2.  With my usual optimism, never having used the software, I
> think March 15 looks feasible.

Given I'm not a lawyer and lawyers tend to be slow, I'd go with 'phase 1'.
But yes, I think there should be a way to validate the GCC COBOL build
by the usual GCC means as part of a make check (even if that involves
a prior ./contrib/download_cobol_nist invocation).

Richard.

>
> --jkl
