+1 to what Josh said
> On Jun 25, 2025, at 1:18 PM, Josh McKenzie wrote:
Did some more digging. Apparently the way a lot of headline-grabbers have been
making models reproduce code verbatim is to prompt them with dozens of verbatim
tokens of copyrighted code as input, at which point the completion is very
heavily weighted toward regurgitating the original implementation. Which makes
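To make that mechanism concrete, here is a toy sketch (assumptions: the "model" is a simple longest-suffix-match completer standing in for a network that has memorized a snippet, and the `quicksort` string stands in for the copyrighted code; this is not how any real LLM works internally). Once the training text is memorized, a long verbatim prefix leaves the memorized continuation as the overwhelmingly likely completion:

```python
# Toy illustration of the "verbatim prefix" trick: a completer that has
# perfectly memorized one training snippet. Prompting it with a verbatim
# prefix of that snippet makes it walk straight down the original.

training_code = (
    "def quicksort(arr): if len(arr) <= 1: return arr "
    "pivot = arr[0] less = [x for x in arr[1:] if x < pivot]"
)
tokens = training_code.split()

def next_token(tokens, context):
    # Longest-context match: find the longest suffix of `context` that
    # occurs in the training stream, and return the token that followed it.
    for k in range(len(context), 0, -1):
        suffix = context[-k:]
        for i in range(len(tokens) - k):
            if tokens[i:i + k] == suffix:
                return tokens[i + k]
    return None

def complete(prompt, tokens, n_steps):
    out = list(prompt)
    for _ in range(n_steps):
        nxt = next_token(tokens, out)
        if nxt is None:
            break
        out.append(nxt)
    return out

# Prompt with eight verbatim tokens of the "copyrighted" snippet; the
# completion reproduces the next ten tokens of the original verbatim.
completion = complete(tokens[:8], tokens, 10)
print(" ".join(completion))
```

The point of the sketch is only that the prompt itself does most of the work: the more verbatim context you supply, the more a memorizing model's output is pinned to the original.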
Hi,
OK, so exclude-instead-of-allow is an option 5, and I assume this would be
combined with requiring people to identify when they used generative AI and
what they used? Seems like there is broad support for that, and it is the ASF
recommendation.
Would the starting point for an exclusion list
> 2. Models that do not do output filtering to restrict the reproduction of
> training data unless the tool can ensure the output is license compatible?
>
> 2 would basically prohibit locally run models.
I am not for this, for the reasons listed above. There isn't a difference
between this and