Re: [Python-Dev] A more flexible task creation

2018-06-15 Thread Steve Holden
On Thu, Jun 14, 2018 at 8:14 PM, Chris Barker via Python-Dev <
python-dev@python.org> wrote:

> Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to
> understand the problem here.
>


> So why do queries fail with 10,000 tasks? Or with ANY number? If the async DB
> access code is written right, a given query should not "await" unless it is
> in a safe state to do so.
>
> So what am I missing here???
>
> because threads ARE concurrent, and there is no advantage to having more
>> threads than can actually run at once, and having many more does cause
>> thread-switching performance issues.
>>
>
> To me, tasks are (somewhat) logically analogous to threads.
>>
>
> kinda -- in the sense that they are run (and completed) in arbitrary
> order. But they are different, and that difference is key to this issue.
>
> As Yury expressed interest in this idea, there must be something I'm
> missing.
>
> What is it?
>

All tasks need resources, and bookkeeping for such tasks is likely to slow
things down. More importantly, with an uncontrolled number of tasks you can
require an uncontrolled use of resources, decreasing efficiency to levels
well below those attainable with sensible conservation of resources.
Imagine, if you will, a task that starts by allocating 1GB of memory. Would
you want 10,000 of those?
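
For illustration, a minimal sketch (not from the thread) of capping how many
such tasks can be doing their expensive work at once, using an
asyncio.Semaphore; max_running is an illustrative parameter name:

    import asyncio

    async def bounded(sem, coro):
        # Only max_running coroutines get past this point at a time, so at
        # most max_running of those 1GB allocations are live simultaneously.
        async with sem:
            return await coro

    async def run_all(coros, max_running=10):
        sem = asyncio.Semaphore(max_running)
        return await asyncio.gather(*(bounded(sem, c) for c in coros))

(The coroutine objects themselves are still all created up front; only the
work they perform is throttled.)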


Re: [Python-Dev] A more flexible task creation

2018-06-15 Thread Michel Desmoulin

> 
> The strict API compatibility requirements of core Python stdlib, coupled
> with the very long feature release life-cycles of Python, make me think
> this sort of thing perhaps is better built as a utility library on top
> of asyncio, rather than inside asyncio itself?  18 months is a long long
> time to iterate on these features.  I can't wait for Python 3.8...
>  

A lot of my recent requests come from my attempt to group some of that in
a lib: https://github.com/Tygs/ayo

Most of it works (although I got rid of context() recently), but the
lazy task part really fails.


Indeed, the API allows you to do:

async with ayo.scope() as run:
    task_list = run.all(foo(), foo(), foo())
    run.asap(bar())
    await task_list.gather()
    run.asap(baz())



scope() returns a nursery-like object, and this works perfectly, with the
usual guarantees of Trio's nursery, but working in asyncio right now.

However, I tried to add to the mix:

async with ayo.scope(max_concurrency=2) as run:
    task_list = run.all(foo(), foo(), foo())
    run.asap(bar())
    await task_list.gather()
    run.asap(baz())

And I can't get it to work. task_list will right now contain a mix of
tasks and None, because some tasks are not scheduled immediately. That's
why I wanted lazy tasks. I tried to create my own lazy tasks, but it
never really worked. I'm going to try to go down the road of wrapping
the unscheduled coro in a future-like object, as suggested by Yury. But
having that built-in seems logical, elegant, and just good design in
general: __init__ should not have side effects.
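
A minimal sketch of the kind of future-like wrapper I have in mind
(hypothetical names, not the ayo API), where the real task is only created
on first await or on an explicit start():

    import asyncio

    class LazyTask:
        def __init__(self, coro):
            self._coro = coro
            self._task = None

        def start(self):
            # Schedule the underlying asyncio.Task on first use only.
            if self._task is None:
                self._task = asyncio.ensure_future(self._coro)
            return self._task

        def cancel(self):
            if self._task is not None:
                self._task.cancel()
            else:
                # Never scheduled: just close the bare coroutine.
                self._coro.close()

        def __await__(self):
            return self.start().__await__()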


Re: [Python-Dev] A more flexible task creation

2018-06-15 Thread Michel Desmoulin


On 14/06/2018 at 04:09, Nathaniel Smith wrote:
> How about:
> 
> async def wait_to_run(async_fn, *args):
>     await wait_for_something()
>     return await async_fn(*args)
> 
> task = loop.create_task(wait_to_run(myfunc, ...))
> 

It's quite elegant, although figuring out the wait_for_something() is
going to be tricky.


> -
> 
> Whatever strategy you use, you should also think about what semantics
> you want if one of these delayed tasks is cancelled before it starts.
> 
> For regular, non-delayed tasks, Trio makes sure that even if it gets
> cancelled before it starts, then it still gets scheduled and runs until
> the first cancellation point. This is necessary for correct resource
> hand-off between tasks:
> 
> async def some_task(handle):
>     with handle:
>         await ...
> 
> If we skipped running this task entirely, then the handle wouldn't be
> closed properly; scheduling it once allows the with block to run, and
> then get cleaned up by the cancellation exception. I'm not sure but I
> think asyncio handles pre-cancellation in a similar way. (Yury, do you
> know?)
> 
> Now, in the delayed task case, there's a similar issue. If you want to keep
> the same solution, then you might want to instead write:
> 
> # asyncio
> async def wait_to_run(async_fn, *args):
>     try:
>         await wait_for_something()
>     except asyncio.CancelledError:
>         # have to create a subtask to make it cancellable
>         subtask = loop.create_task(async_fn(*args))
>         # then cancel it immediately
>         subtask.cancel()
>         # and wait for the cancellation to be processed
>         return await subtask
>     else:
>         return await async_fn(*args)
> 
> In trio, this could be simplified to
> 
> # trio
> async def wait_to_run(async_fn, *args):
>     try:
>         await wait_for_something()
>     except trio.Cancelled:
>         pass
>     return await async_fn(*args)
> 
> (This works because of trio's "stateful cancellation" – if the whole
> thing is cancelled, then as soon as async_fn hits a cancellation point
> the exception will be re-delivered.)

Thanks for the tip. Trio schedules the task in all cases, but I don't know
what asyncio does with it, so I'll add a unit test for that.


Re: [Python-Dev] Some data points for the "annual release cadence" concept

2018-06-15 Thread Nick Coghlan
On 14 June 2018 at 06:30, Ronald Oussoren  wrote:

> On 13 Jun 2018, at 15:42, Nick Coghlan  wrote:
>
> Yeah, pretty much - once we can get to the point where it's routine for
> folks to be building "abiX" or "abiXY" wheels (with the latter not actually
> being a defined compatibility tag yet, but having the meaning of "targets
> the stable ABI as first defined in CPython X.Y"), rather than feature
> release specific "cpXYm" ones, then a *lot* of the extension module
> maintenance pain otherwise arising from more frequent CPython releases
> should be avoided.
>
> There'd still be a lot of other details to work out to turn the proposed
> release cadence change into a practical reality, but this is the key piece
> that I think is a primarily technical hurdle: simplifying the current
> "wheel-per-python-version-per-target-platform" community project build
> matrices to instead be "wheel-per-target-platform".
>
>
> This requires getting people to mostly stop using the non-stable ABI, and
> that could be a lot of work for projects that have existing C extensions
> that don’t use the stable ABI or cython/cffi/…
>
> That said, the CPython API tends to be fairly stable over releases and
> even without using the stable ABI supporting faster CPython feature
> releases shouldn’t be too onerous, especially for projects with some kind
> of automation for creating release artefacts (such as a CI system).
>

Right, there would still be a non-zero impact on projects that ship binary
artifacts.

Having a viable stable ABI as a target just allows third party projects to
make the trade-off between the upfront cost of migrating to the stable ABI
(but then only needing to rebuild binaries when their own code changes),
and the ongoing cost of maintaining an extra few sets of binary wheel
archives. I think asking folks to make that trade-off on a case by case
basis is reasonable, whereas back in the previous discussion I considered
*only* offering the second option to be unreasonable.
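
For anyone wanting to experiment with that first option, a minimal sketch of
opting in to the stable ABI with setuptools plus the wheel project (module
name and version hex are illustrative only):

    # setup.py
    from setuptools import setup, Extension

    ext = Extension(
        "mymodule",                     # illustrative extension name
        sources=["mymodule.c"],
        # Restrict the C code to the stable ABI as first defined in 3.4
        define_macros=[("Py_LIMITED_API", "0x03040000")],
        py_limited_api=True,            # request an abi3-tagged build
    )

    setup(name="mymodule", version="0.1", ext_modules=[ext])

Building with something like "python setup.py bdist_wheel
--py-limited-api=cp34" should then produce a cp34-abi3 wheel that later
feature releases can keep installing without a rebuild.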

Cheers,
Nick.

-- 
Nick Coghlan   |   ncogh...@gmail.com   |   Brisbane, Australia


Re: [Python-Dev] A more flexible task creation

2018-06-15 Thread Gustavo Carneiro
On Fri, 15 Jun 2018 at 09:18, Michel Desmoulin 
wrote:

>
> >
> > The strict API compatibility requirements of core Python stdlib, coupled
> > with the very long feature release life-cycles of Python, make me think
> > this sort of thing perhaps is better built as a utility library on top
> > of asyncio, rather than inside asyncio itself?  18 months is a long long
> > time to iterate on these features.  I can't wait for Python 3.8...
> >
>
> A lot of my recent requests come from my attempt to group some of that in
> a lib: https://github.com/Tygs/ayo
>

Ah, good idea.


> Most of it works (although I got rid of context() recently), but the
> lazy task part really fails.
>
>
> Indeed, the API allows you to do:
>
> async with ayo.scope() as run:
>     task_list = run.all(foo(), foo(), foo())
>     run.asap(bar())
>     await task_list.gather()
>     run.asap(baz())
>
>
>
> scope() returns a nursery-like object, and this works perfectly, with the
> usual guarantees of Trio's nursery, but working in asyncio right now.
>

To be honest, I see "async with" being abused everywhere in asyncio,
lately.  I like to have objects with start() and stop() methods, but
everywhere I see async context managers.

Fine, add nursery or whatever, but please also have a simple start() /
stop() public API.

"async with" is only good for functional programming.  If you want to go
more of an object-oriented style, you tend to have start() and stop()
methods in your classes, which will call start() & stop() (or close())
methods recursively on nested resources.  Some of the libraries (aiopg, I'm
looking at you) don't support start/stop or open/close well.
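
A rough sketch of what I mean, where the async context manager is just sugar
over an explicit start()/stop() pair (illustrative class, not from any
particular library):

    import asyncio

    class Service:
        def __init__(self):
            self._task = None

        async def start(self):
            # Acquire resources / spawn background work here.
            self._task = asyncio.ensure_future(self._run())

        async def stop(self):
            # Usable from plain object-oriented code, not only "async with".
            if self._task is not None:
                self._task.cancel()
                try:
                    await self._task
                except asyncio.CancelledError:
                    pass

        async def _run(self):
            while True:
                await asyncio.sleep(1)

        async def __aenter__(self):
            await self.start()
            return self

        async def __aexit__(self, exc_type, exc, tb):
            await self.stop()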


> However, I tried to add to the mix:
>
> async with ayo.scope(max_concurrency=2) as run:
>     task_list = run.all(foo(), foo(), foo())
>     run.asap(bar())
>     await task_list.gather()
>     run.asap(baz())
>
> And I can't get it to work. task_list will right now contain a mix of
> tasks and None, because some tasks are not scheduled immediately. That's
> why I wanted lazy tasks. I tried to create my own lazy tasks, but it
> never really worked. I'm going to try to go down the road of wrapping
> the unscheduled coro in a future-like object, as suggested by Yury. But
> having that built-in seems logical, elegant, and just good design in
> general: __init__ should not have side effects.
>

I tend to slightly agree, but OTOH if asyncio had been designed to not
schedule tasks automatically on __init__ I bet there would have been other
users complaining that "why didn't task XX run?", or "why do tasks need a
start() method, that is clunky!".  You can't please everyone...

Also, in
 task_list = run.all(foo(), foo(), foo())

As soon as you call foo(), you are instantiating a coroutine, which
consumes memory, while the task may not even be scheduled for a long time
(if you have 5000 potential tasks but only execute 10 at a time, for
example).

But if you do as Yury suggested and instead accept a function reference,
foo, then the function object is a singleton: you can hold many references
to it cheaply, and coroutine objects are only created when the task is
actually about to be scheduled, so it's more efficient in terms of memory.
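
A rough sketch of that approach (illustrative names only): queue
(function, args) pairs and create each coroutine object only right before it
runs, with a fixed number of workers:

    import asyncio

    async def run_limited(calls, max_concurrency):
        # calls is an iterable of (async_fn, args) pairs.
        queue = asyncio.Queue()
        for item in calls:
            queue.put_nowait(item)

        results = []

        async def worker():
            while not queue.empty():
                fn, args = queue.get_nowait()
                # The coroutine object for this call is created only now,
                # so at most max_concurrency of them exist at any time.
                results.append(await fn(*args))

        await asyncio.gather(*(worker() for _ in range(max_concurrency)))
        return results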

-- 
Gustavo J. A. M. Carneiro
Gambit Research
"The universe is always one step beyond logic." -- Frank Herbert


[Python-Dev] Summary of Python tracker Issues

2018-06-15 Thread Python tracker

ACTIVITY SUMMARY (2018-06-08 - 2018-06-15)
Python tracker at https://bugs.python.org/

To view or respond to any of the issues listed below, click on the issue.
Do NOT respond to this message.

Issues counts and deltas:
  open    6691 ( -1)
  closed 38930 (+62)
  total  45621 (+61)

Open issues with patches: 2644 


Issues opened (42)
==================

#33656: IDLE: Turn on DPI awareness on Windows
https://bugs.python.org/issue33656  reopened by terry.reedy

#33811: asyncio accepting connection limit
https://bugs.python.org/issue33811  opened by lguo

#33813: Update overdue 'Deprecated ... removed in 3.x' messages
https://bugs.python.org/issue33813  opened by terry.reedy

#33816: New metaclass example for Data Model topic
https://bugs.python.org/issue33816  opened by adelfino

#33817: PyString_FromFormatV() fails to build empty strings
https://bugs.python.org/issue33817  opened by Tey

#33820: IDLE subsection of What's New 3.6
https://bugs.python.org/issue33820  opened by terry.reedy

#33821: IDLE subsection of What's New 3.7
https://bugs.python.org/issue33821  opened by terry.reedy

#33822: IDLE subsection of What's New 3.8
https://bugs.python.org/issue33822  opened by terry.reedy

#33823: A BUG in concurrent/asyncio
https://bugs.python.org/issue33823  opened by Python++

#33824: Settign LANG=C modifies the --version behavior
https://bugs.python.org/issue33824  opened by hroncok

#33826: enable discovery of class source code in IPython interactively
https://bugs.python.org/issue33826  opened by t-vi

#33829: C API: provide new object protocol helper
https://bugs.python.org/issue33829  opened by Bartosz Gołaszewski

#33830: Error in the output of one example in the httplib docs
https://bugs.python.org/issue33830  opened by Aifu LIU

#33832: Make "magic methods" a little more discoverable in the docs
https://bugs.python.org/issue33832  opened by adelfino

#33833: ProactorEventLoop raises AssertionError
https://bugs.python.org/issue33833  opened by twisteroid ambassador

#33834: Test for ProactorEventLoop logs InvalidStateError
https://bugs.python.org/issue33834  opened by twisteroid ambassador

#33836: [Good first-time issue] Recommend keyword-only param for memoi
https://bugs.python.org/issue33836  opened by zach.ware

#33837: Closing asyncio.Server on asyncio.ProactorEventLoop causes all
https://bugs.python.org/issue33837  opened by mliska

#33838: Very slow upload with http.client on Windows when setting time
https://bugs.python.org/issue33838  opened by ivknv

#33839: IDLE tooltips.py: refactor and add docstrings and tests
https://bugs.python.org/issue33839  opened by terry.reedy

#33840: connection limit on listening socket in asyncio
https://bugs.python.org/issue33840  opened by lguo

#33841: lock not released in threading.Condition
https://bugs.python.org/issue33841  opened by lev.maximov

#33842: Remove tarfile.filemode
https://bugs.python.org/issue33842  opened by inada.naoki

#33843: Remove deprecated stuff in cgi module
https://bugs.python.org/issue33843  opened by inada.naoki

#33846: Misleading error message in urllib.parse.unquote
https://bugs.python.org/issue33846  opened by thet

#33847: doc: Add '@' operator entry to index
https://bugs.python.org/issue33847  opened by adelfino

#33851: 3.7 regression: ast.get_docstring() for a node that lacks a do
https://bugs.python.org/issue33851  opened by mgedmin

#33852: doc Remove parentheses from sequence subscription description
https://bugs.python.org/issue33852  opened by adelfino

#33854: doc Add PEP title in seealso of Built-in Types
https://bugs.python.org/issue33854  opened by adelfino

#33855: IDLE: Minimally test every non-startup module.
https://bugs.python.org/issue33855  opened by terry.reedy

#33856: Type "help" is not present on win32
https://bugs.python.org/issue33856  opened by matrixise

#33857: python exception on Solaris : code for hash blake2b was not fo
https://bugs.python.org/issue33857  opened by goron

#33858: A typo in multiprocessing documentation
https://bugs.python.org/issue33858  opened by aruseni

#33859: Spelling mistakes found using aspell
https://bugs.python.org/issue33859  opened by xtreak

#33861: Minor improvements of tests for os.path.
https://bugs.python.org/issue33861  opened by serhiy.storchaka

#33864: collections.abc.ByteString does not register memoryview
https://bugs.python.org/issue33864  opened by fried

#33865: [EASY] Missing code page aliases: "unknown encoding: 874"
https://bugs.python.org/issue33865  opened by winvinc

#33866: Stop using OrderedDict in enum
https://bugs.python.org/issue33866  opened by inada.naoki

#33867: Module dicts are wiped on module garbage collection
https://bugs.python.org/issue33867  opened by natedogith1

#33868: test__xxsubinterpreters: test_subinterpreter() fails randomly 
https://bugs.python.org/issue33868  opened by vstinner

#33869: doc Add set, frozen set, and tuple entries to Glossary
https://bugs.python.org/issue33869  opened by adelfino

#33870: pdb continue + b