* Björn Kettunen <[email protected]> [2026-03-25 01:22]:
> Ihor Radchenko <[email protected]> writes:
> 
> >> At some point I'd like to tackle Prolog, which is also abysmal. And
> >> ob-haskell doesn't play well with latest haskel-mode; I have to use a 10 yo
> >> version. Your thoughts on all the ob-<your obscure language here>?
> >
> > If you know bugs in ob-haskell (which *is* a part of Org mode), feel
> > free to report them.
> >
> > Whether you can send LLM-generated patches, it depends.
> >
> > You definitely cannot send complex patches. LLM-produced code is likely
> > public domain, and adding non-trivial amount of public domain code may
> > have implications on GPL licensing. Until GNU clarifies on these
> > implications with lawyers, we are putting large LLM contributions on
> > hold. The best I can suggest here is turning patches into third-party
> > packages - those are definitely not restricted in terms of what you can
> > use or not (all the legal burden will be on you, the author).
> 
> Assuming for a second the second the source of the code isn't by itself
> the issue. How can it be ok to submit code generated from closed source
> SAS LLVM's? That goes against what free software is from anyone's point
> of view in my opinion.

Dear Björn,

Greetings.

You wrote "LLVM's" -- probably you mean LLMs (large language
models)? LLVM is a compiler toolchain, so I do not know what you
would mean by LLVM's here.

If by SAS you are referring to software as a service, as discussed in
the link below, then I can understand you:

Who Does That Server Really Serve? - GNU Project - Free Software Foundation:
https://www.gnu.org/philosophy/who-does-that-server-really-serve.html

The GNU philosophy piece "Who Does That Server Really Serve?" warns
against exactly the kind of dependency you're describing—but it also
assumes a world where users are forced to interact with software as a
service. That assumption is increasingly outdated. Today, I can run
Qwen, Llama, DeepSeek, or any number of open‑weight models entirely
locally on my own hardware. Hugging Face, Allen AI, IBM, Apertus, and
others are making this the norm. When I generate code, it's on my
machine, with models that are publicly available, often under
permissive or free software licenses. The "proprietary service"
framing doesn't apply when the user controls the tool end‑to‑end.

Right now, while we are speaking, I am running this model in the
background; through the opencode software it is improving my Elisp
projects:

(rcd-llm-get-current-running-model) ⇒ "Qwen3-Coder-Next-UD-Q3_K_XL.gguf"

It has "open" weights, meaning the weights are released under the
Apache 2.0 license.
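For context, the Elisp helper above is my own; the same information
can be fetched directly from llama.cpp's OpenAI-compatible HTTP API.
A minimal sketch, assuming llama-server is running locally on its
default port 8080:

```shell
# Query a local llama.cpp server for the currently loaded model.
# Assumptions: llama-server listens on localhost:8080 (its default)
# and exposes the OpenAI-compatible /v1/models endpoint.
curl -s http://localhost:8080/v1/models | jq -r '.data[0].id'
```

The point being: the whole loop runs on my own machine, with no
remote service involved.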

Here are the related dataset links:

- nvidia/Llama-Nemotron-Post-Training-Dataset: 
https://huggingface.co/datasets/nvidia/Llama-Nemotron-Post-Training-Dataset
- SWE-smith: https://github.com/SWE-bench/SWE-smith
- SWE-Flow: https://github.com/Hambaobao/SWE-Flow
- Multi-SWE-RL: https://huggingface.co/datasets/Multi-SWE-RL/Multi-SWE-RL

This list may not be complete, and not all of it may be free-software
licensed, but guess what? I don't care. People can read information
without needing a license to read, so software can read it on their
behalf and create weights. I don't want to burden my life with it.
The point is that users of huggingface.co and many other websites
providing LLMs already enjoy a great deal of freely generated free
software, and there is almost zero chance that those users who
generate software would be legally attacked for it. Maybe the
corporations creating such models could face legal attacks, but the
chance for users is basically zero.

That is practical reality.

We now have a dilemma:

To create more free software, faster and more efficiently, while
enjoying our lives — breakfast, the swimming pool — while the computer
works on it in a continuous loop;

Or to pester, annoy, scold, and protest because the new technology is
faster than anything before, and the true "artist" — the programmer —
is now left behind.

So if the process is free (local, "open" models), and the output is
code contributed under GPL with a human submitting it, what exactly is
the ethical problem?

But beyond that, I want to challenge the deeper premise: that we
should even be treating copyright provenance as the gold standard for
free software contributions.

LLM-generated code, by lacking a clear human author and thus defying
traditional copyright attribution, is a feature rather than a bug.

They actively undermine the copyright regime that free software has
had to work within for decades. Rather than seeing this as a problem,
I view it as liberation—a way to bypass the very system that free
software has always had to negotiate.

In this framing:

- Copyright was always a burden on software freedom

- Free software fought within that system (GPL, copyright assignment, etc.)

- LLMs now let us produce code without the usual authorship constraints

- This effectively destroys copyright's grip on software production

That's not a bug. That's the point.

The free software movement spent decades playing by copyright's rules
because there was no alternative. Now there is. If I can generate
high‑quality GPL‑compatible code using locally‑run open models,
without assigning copyright to the FSF or worrying about contributor
agreements, I'm not weakening free software—I'm accelerating it. I'm
bypassing the friction that copyright intentionally creates.

So when projects like Org mode say "we're holding large LLM
contributions until GNU clarifies licensing implications," I hear:
"we're waiting for the old system to give us permission to use tools
that make that system obsolete."

The legal caution is understandable. But ethically? The fear that
LLM‑generated code somehow taints free software rests on an assumption
that the process must be constrained by copyright thinking even when
the output is clearly free. That's cargo‑culting free software
formalism over actual freedom.

Let's stop pretending that copyright assignment and human‑only
authorship are essential to freedom. They were tactics, not
principles. If we can now produce more free software with less legal
overhead, using tools we control, that's a win—not a threat.

Jean Louis
