Re: ANN: Dogelog Runtime, Prolog to the Moon (2021)

2021-09-17 Thread Mostowski Collapse

Thanks for your response, will have a look.
Ok, dis() is all that is needed to disassemble.

Very cool!

A long-term goal could indeed be to have
a Prolog interpreter produce 20 MLips, like
SWI-Prolog, but tightly integrated into
Python, so that it directly makes use of
the Python objects and the Python garbage
collection, like the Dogelog Runtime.

Although the Dogelog Runtime has its own
garbage collection, it's only used to help
the native Python garbage collection.

The result is that you can enjoy bi-directionally
calling Python. For example the Prolog
addition of two numbers is realized as:

###
# +(A, B, C): [ISO 9.1.7]
# The predicate succeeds in C with the sum of A and B.
##
def eval_add(alpha, beta):
    check_number(alpha)
    check_number(beta)
    try:
        return alpha + beta
    except OverflowError:
        raise make_error(Compound("evaluation_error", ["float_overflow"]))

And then register it:

add("+", 3, make_dispatch(eval_add, MASK_MACH_FUNC))

The exception could also be mapped to a Prolog term later.
That's not so much an issue for speed. The sunshine
case is straightforward.

But I might try dis() on eval_add(). Are exception
blocks in Python cheap or expensive? Are they like
in Java, some code annotation, or do they, like in the
Go programming language, push some panic handler?
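One way to answer this empirically is a micro-benchmark of the no-exception path (a sketch, not from the original post; timings vary by CPython version: 3.11+ has "zero-cost" exceptions, older versions emit a cheap SETUP_FINALLY rather than registering a Go-style panic handler at runtime):

```python
# Time the "sunshine" path with and without a try/except that never
# fires. Python ints cannot actually overflow, so the except branch
# below is purely illustrative.
import timeit

def add_plain(a, b):
    return a + b

def add_guarded(a, b):
    try:
        return a + b
    except OverflowError:
        return None

t_plain = timeit.timeit(lambda: add_plain(1, 2), number=20_000)
t_guarded = timeit.timeit(lambda: add_guarded(1, 2), number=20_000)
print(f"plain {t_plain:.5f}s, guarded {t_guarded:.5f}s")
```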

Greg Ewing schrieb:

On 16/09/21 4:23 am, Mostowski Collapse wrote:

I really wonder why my Python implementation
is a factor 40 slower than my JavaScript implementation.


There are Javascript implementations around nowadays that are
blazingly fast. Partly that's because a lot of effort has been
put into them, but it's also because Javascript is a different
language. There are many dynamic aspects to Python that make
fast implementations difficult.


I use in Python:

   temp = [NotImplemented] * code[pos]
   pos += 1

is the idiom [_] * _ slow?


No, on the contrary it's probably the fastest way to do it
in Python. You could improve it a bit by precomputing
[NotImplemented]:

# once at the module level
NotImplementedList = [NotImplemented]

# whenever you want a new list
temp = NotImplementedList * code[pos]

That's probably at least as fast as a built-in function for
creating lists would be.
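The difference can be measured directly (a sketch; the precomputed variant saves one LOAD_GLOBAL and one BUILD_LIST per call):

```python
import timeit

NotImplementedList = [NotImplemented]  # precomputed once at module level

def make_inline(n):
    # builds the one-element list fresh on every call
    return [NotImplemented] * n

def make_precomputed(n):
    # reuses the module-level one-element list
    return NotImplementedList * n

print(timeit.timeit(lambda: make_inline(100), number=20_000))
print(timeit.timeit(lambda: make_precomputed(100), number=20_000))
```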


does it really first create an
array of size 1 and then enlarge it?


It does:

 >>> def f(code, pos):
...  return [NotImplemented] * code[pos]
...
 >>> from dis import dis
 >>> dis(f)
   2   0 LOAD_GLOBAL  0 (NotImplemented)
   2 BUILD_LIST   1
   4 LOAD_FAST    0 (code)
   6 LOAD_FAST    1 (pos)
   8 BINARY_SUBSCR
  10 BINARY_MULTIPLY
  12 RETURN_VALUE

BTW, the Python terminology is "list", not "array".
(There *is* something in the stdlib called an array, but
it's rarely used or needed.)
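For illustration, the stdlib array type stores C-level values of a single type code, so it could not hold NotImplemented anyway; a list is the right structure here:

```python
from array import array

a = array("i", [0] * 4)       # typed array of C ints, compact storage
lst = [NotImplemented] * 4    # list: references to arbitrary objects
print(type(a).__name__, type(lst).__name__)   # array list
```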



--
https://mail.python.org/mailman/listinfo/python-list


Re: ANN: Dogelog Runtime, Prolog to the Moon (2021)

2021-09-17 Thread Greg Ewing

On 16/09/21 6:56 am, Mostowski Collapse wrote:

What could be slow is repeatedly requesting the "args"
field. Maybe I should do:

help = term.args
i = 0
while i < len(help) - 1:
    mark_term(help[i])
    i += 1
term = help[i]


Yes, that will certainly help.

But you're still evaluating len(help) - 1 every time around
the loop, so this is even better:

help = term.args
n = len(help) - 1
i = 0
while i < n:
    mark_term(help[i])
    i += 1
term = help[i]
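A further variant (a sketch; mark_term and term.args are the names from the discussion above): a for loop over a slice moves the index bookkeeping into C, at the price of one extra list copy, so whether it wins depends on the typical argument count.

```python
def mark_args(args, mark_term):
    # visit every argument except the last, as in the while-loop version
    for arg in args[:-1]:
        mark_term(arg)
    return args[-1]   # the tail the while-loop version continues with

# tiny demonstration with a stand-in mark function
seen = []
tail = mark_args([10, 20, 30], seen.append)
```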

Some general principles to be aware of:

* Almost nothing is free -- Python very literally does what
you tell it to do, even if it looks dumb.

* Access to attributes and global variables is expensive
(requires at least one dict lookup, often more in the case
of attributes).

* Access to *local* variables, on the other hand, is very
cheap (essentially an array lookup).

* Function calls are expensive -- both to look up the name, if
it's global, which it usually is, and the machinery of the
call itself.

* Creating objects is expensive. Creating instances of
user-defined objects is more expensive than built-in ones.
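A quick illustration of the locals-vs-globals principle (a sketch): binding a global or builtin to a local name, here via default arguments, turns dict lookups into array lookups inside the hot loop.

```python
import timeit

def use_global(n):
    total = 0
    for i in range(n):
        total += len(str(i))   # len and str looked up globally each pass
    return total

def use_local(n, _len=len, _str=str):   # bound once, now locals
    total = 0
    for i in range(n):
        total += _len(_str(i))
    return total

print(timeit.timeit(lambda: use_global(1000), number=200))
print(timeit.timeit(lambda: use_local(1000), number=200))
```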

--
Greg


Re: ANN: Dogelog Runtime, Prolog to the Moon (2021)

2021-09-17 Thread Mostowski Collapse

No, it's cooperative. Usually objects do get
garbage collected by the native garbage collector
of the host language in the Dogelog runtime.

The Prolog garbage collection is only to help
the host language garbage collector when you have
a deep recursion in Prolog.

You can then reclaim intermediate variables.
A simple example to test the slightly
idiosyncratic Prolog garbage collection is:

fibo(0, 1) :- !.
fibo(1, 1) :- !.
fibo(N, X) :-
   M is N-1, fibo(M, Y),
   L is M-1, fibo(L, Z),
   X is Y+Z.
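For comparison, the same doubly-recursive definition in plain Python (a sketch; the clauses above are what Dogelog actually runs):

```python
def fibo(n):
    # fibo(0) = fibo(1) = 1, mirroring the two Prolog base clauses
    if n < 2:
        return 1
    return fibo(n - 1) + fibo(n - 2)

# fibo(30) makes over a million recursive calls, which is why it is a
# good stress test for reclaiming intermediate variables.
```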

When running fibo(30,X) SWI-Prolog does around
800 garbage collections to keep the environment
small. But SWI-Prolog allocates its objects on
the heap only very seldom; it uses its own stack.
The Dogelog runtime, on the other hand, creates
everything on the heap and completely relies on
the host language garbage collection. It only
helps the host language garbage collection in
that it performs from time to time this movement:

Before:

-->[ A ]-->[ B ]-->[ C ]-->

After:

-->[ A ]-->[ C ]-->

A, B, C are objects of type Variable. The above
movement only happens for objects of type Variable,
from time to time. For objects of type Compound
no modifications are done during Prolog garbage
collection. The Prolog garbage collection aggressively
nulls the Variable object B, and the host language
later will garbage collect what the Variable object B
was pointing to. But the Variable object B might
nevertheless have pointed to something shared with
some other Variable object, a local or a global
Python variable, or a Compound. It is then all up
to the host language to decide reachability.
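The movement can be sketched with a minimal Variable class (illustrative names, not Dogelog's actual code): unlink B from the chain, null its link, and let the host GC decide what is still reachable.

```python
class Variable:
    def __init__(self, name):
        self.name = name
        self.next = None   # next variable on the trail

a, b, c = Variable("A"), Variable("B"), Variable("C")
a.next, b.next = b, c   # chain: A -> B -> C

a.next = b.next         # chain: A -> C
b.next = None           # aggressively null B's link
# b is now reclaimable as soon as no other reference keeps it alive
```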

Chris Angelico schrieb:

On Fri, Sep 17, 2021 at 7:17 AM Mostowski Collapse  wrote:


About Exceptions: That's just building ISO core
standard Prolog error terms.

About Garbage Collection: That's just Prolog
garbage collection, which does shrink some
singly linked lists, which an ordinary
programming language GC cannot do,



Okay, so you're building your own garbage collection on top of
Python's, and you're wondering why it's slow?

Change your code to not try to implement one language inside another,
and you'll see a massive performance improvement.

ChrisA





Re: ANN: Dogelog Runtime, Prolog to the Moon (2021)

2021-09-17 Thread Mostowski Collapse

The Prolog garbage collection that does
the movement on the variable trail is only
a very small fraction of the runtime.

The garbage collection time is measured.
Some measurements with version 0.9.5
took the following values:

%%%
% Standard Python Version, Warm Run
% ?- time(fibo(23,X)).
% % Wall 3865 ms, gc 94 ms, 71991 lips
% X = 46368.

%%%
% GraalVM Python Version, Warm Warm Run
% ?- time(fibo(23,X)).
% % Wall 695 ms, gc 14 ms, 400356 lips
% X = 46368.

The "gc" timing measures Prolog garbage
collection. So you get the following percentage
of time spent in Prolog garbage collection:

Standard Python: 94 / 3865 = 2.4%

GraalVM Python: 14 / 695 = 2.0%
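The two percentages, reproduced as a check:

```python
for gc_ms, wall_ms in [(94, 3865), (14, 695)]:
    pct = 100 * gc_ms / wall_ms
    print(f"{gc_ms} / {wall_ms} = {pct:.1f}%")   # 2.4% and 2.0%
```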

I consider this a good result. The Prolog
garbage collection is not utterly expensive.
Detecting and performing the variable movement
on the trail doesn't take much time. Currently
the garbage collection in the Dogelog runtime
is configured towards synchronization with
60 FPS; it does around 60-120 garbage
collections per second. This is less than
what SWI-Prolog does.

SWI-Prolog has a much higher GC rate.

But I did not yet measure the new version 0.9.6.

Mostowski Collapse schrieb:

[...]



Re: ANN: Dogelog Runtime, Prolog to the Moon (2021)

2021-09-17 Thread Mostowski Collapse
Concerning garbage collection, I did a long-term
measurement for the first time. I measured
LIPS for Fibonacci numbers, i.e. time(fibo(23,X)).

Doing the same thing 16 times, how long does it take?
Here is a depiction of how the LIPS differ relatively between runs:
https://gist.github.com/jburse/c85297e97091caf22d306dd8c8be12fe#gistcomment-3896343

I can also calculate the mean and standard deviation.
From this we see that Python has a 5% deviation, whereas
GraalVM has a 1% deviation. So does the GraalVM garbage
collector work more evenly? Disclaimer: I measured
different time spans, and GraalVM is now 7x faster
than Standard Python, so this is inconclusive.
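Such a relative deviation can be computed like this (a sketch with made-up LIPS values; the real 16-run data is in the gist above):

```python
from statistics import mean, stdev

runs_lips = [71991, 70500, 73200, 69800]   # illustrative values only
m = mean(runs_lips)
rel_dev = stdev(runs_lips) / m             # coefficient of variation
print(f"mean {m:.0f} lips, relative deviation {100 * rel_dev:.1f}%")
```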

Mostowski Collapse schrieb am Freitag, 17. September 2021 um 10:58:57 UTC+2:
> [...]


How to support annotations for a custom type in a C extension?

2021-09-17 Thread Marco Sulla
I created a custom dict in a C extension. Name it `promethea`. How can
I implement `promethea[str, str]`? Now I get:

TypeError: 'type' object is not subscriptable
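The error means the class has no __class_getitem__ (PEP 585). A pure-Python sketch of what the C extension needs to provide, using an illustrative stand-in class:

```python
from types import GenericAlias

class Promethea(dict):
    # the Python-level equivalent of registering Py_GenericAlias
    # in a C method table
    __class_getitem__ = classmethod(GenericAlias)

alias = Promethea[str, str]   # no longer raises TypeError
```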


Re: How to support annotations for a custom type in a C extension?

2021-09-17 Thread MRAB

On 2021-09-17 21:03, Marco Sulla wrote:

I created a custom dict in a C extension. Name it `promethea`. How can
I implement `promethea[str, str]`? Now I get:

TypeError: 'type' object is not subscriptable

Somewhere you'll have a table of the class's methods. It needs an entry 
like this:



static PyMethodDef customdict_methods[] = {
    ...
    {"__class_getitem__", (PyCFunction)Py_GenericAlias,
     METH_CLASS | METH_O | METH_COEXIST,
     PyDoc_STR("See PEP 585")},
    ...
};


Note the flags: METH_CLASS says that it's a class method and 
METH_COEXIST says that it should use this method instead of the slot.
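Once that entry is in place, the type should behave like the built-in dict does under PEP 585 (demonstrated here with dict itself, which registers Py_GenericAlias similarly in CPython 3.9+):

```python
alias = dict[str, str]
origin, args = alias.__origin__, alias.__args__
instance = alias()   # the alias stays callable and builds a plain dict
print(origin, args)
```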



Re: How to support annotations for a custom type in a C extension?

2021-09-17 Thread Marco Sulla
Ooook. I have a question. Why is this code not present in
dictobject.c? Where are the dict annotations implemented?

On Sat, 18 Sept 2021 at 03:00, MRAB  wrote:
>
> On 2021-09-17 21:03, Marco Sulla wrote:
> > I created a custom dict in a C extension. Name it `promethea`. How can
> > I implement `promethea[str, str]`? Now I get:
> >
> > TypeError: 'type' object is not subscriptable
> >
> Somewhere you'll have a table of the class's methods. It needs an entry
> like this:
>
>
> static PyMethodDef customdict_methods[] = {
> ...
>  {"__class_getitem__", (PyCFunction)Py_GenericAlias, METH_CLASS |
> METH_O | METH_COEXIST, PyDoc_STR("See PEP 585")},
> ...
> };
>
>
> Note the flags: METH_CLASS says that it's a class method and
> METH_COEXIST says that it should use this method instead of the slot.