> To be honest, I see "async with" being abused everywhere in asyncio,
> lately. I like to have objects with start() and stop() methods, but
> everywhere I see async context managers.
> Fine, add nursery or whatever, but please also have a simple start() /
> stop() public API.
>
> "async with"
On Fri, 15 Jun 2018 at 09:18, Michel Desmoulin wrote:
>
> >
> > The strict API compatibility requirements of core Python stdlib, coupled
> > with the very long feature release life-cycles of Python, make me think
> > this sort of thing is perhaps better built in a utility library on top
> > of asyncio …
On 14/06/2018 at 04:09, Nathaniel Smith wrote:
> How about:
>
> async def wait_to_run(async_fn, *args):
>     await wait_for_something()
>     return await async_fn(*args)
>
> task = loop.create_task(wait_to_run(myfunc, ...))
>
It's quite elegant, although figuring out the wait_for_something() …
>
> The strict API compatibility requirements of core Python stdlib, coupled
> with the very long feature release life-cycles of Python, make me think
> this sort of thing is perhaps better built in a utility library on top
> of asyncio, rather than inside asyncio itself? 18 months is a long, long time …
On Thu, Jun 14, 2018 at 8:14 PM, Chris Barker via Python-Dev <python-dev@python.org> wrote:
> Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to
> understand the problem here.
>
> So why do queries fail with 1,000 tasks? Or ANY number? If the async DB
> access code is written …
On Thu, Jun 14, 2018 at 3:31 PM, Tin Tvrtković wrote:
> * my gut feeling is spawning a thousand tasks and having them all fighting
> over the same semaphore and scheduling is going to be much less efficient
> than a small number of tasks draining a queue.
Fundamentally, a Semaphore is a queue: h…
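Neither pattern is shown in full in these previews. Here is a minimal sketch of the "small number of tasks draining a queue" approach Tin describes, where handle() is a hypothetical stand-in for the per-job work:

import asyncio

NUM_WORKERS = 4  # illustrative pool size

async def handle(job):
    await asyncio.sleep(0.1)  # hypothetical stand-in for a real query

async def worker(queue):
    # Each worker loops, pulling jobs until it sees the None sentinel.
    while True:
        job = await queue.get()
        if job is None:
            break
        await handle(job)

async def main(jobs):
    queue = asyncio.Queue()
    loop = asyncio.get_event_loop()
    workers = [loop.create_task(worker(queue)) for _ in range(NUM_WORKERS)]
    for job in jobs:
        await queue.put(job)
    for _ in range(NUM_WORKERS):
        await queue.put(None)  # one sentinel per worker so they all exit
    await asyncio.gather(*workers)

Only NUM_WORKERS coroutines ever contend for the scheduler, which is the efficiency argument being made above.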
On Thu, Jun 14, 2018 at 10:03 PM Steve Dower wrote:
> I often use semaphores for this when I need it, and it looks like
> asyncio.Semaphore() is sufficient for this:
>
> import asyncio
>
> task_limiter = asyncio.Semaphore(4)
>
> async def my_task():
>     await task_limiter.acquire()
>     try:
>         await do_work()  # hypothetical body; the archived preview truncates here
>     finally:
>         task_limiter.release()
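A hedged usage note for the snippet above: all the task objects can be created eagerly, and the semaphore caps how many reach the work at once (do_work is the hypothetical stand-in for the truncated body):

async def main():
    # 100 task objects exist immediately, but at most 4 run do_work() at a time.
    await asyncio.gather(*(my_task() for _ in range(100)))

asyncio.get_event_loop().run_until_complete(main())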
Other folks have already chimed in, so I'll get to the point. Try writing a
simple asyncio web scraper (using maybe the aiohttp library) and create
5000 tasks for scraping different sites. My prediction is a whole lot of
them will time out due to various reasons.
Other responses inline.
On Thu, Ju…
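The scraper itself never appears in the thread. A minimal sketch of the proposed experiment, assuming aiohttp 3.3+ and placeholder URLs; with no concurrency limit, many of the 5000 requests can indeed be expected to fail:

import asyncio
import aiohttp

URLS = ["https://example.com/%d" % i for i in range(5000)]  # placeholder sites

async def fetch(session, url):
    try:
        async with session.get(url) as resp:
            return await resp.text()
    except Exception as exc:  # timeouts, DNS failures, connection resets...
        return exc

async def main():
    timeout = aiohttp.ClientTimeout(total=10)
    async with aiohttp.ClientSession(timeout=timeout) as session:
        results = await asyncio.gather(*(fetch(session, u) for u in URLS))
    failures = sum(isinstance(r, Exception) for r in results)
    print("%d of %d requests failed" % (failures, len(URLS)))

asyncio.get_event_loop().run_until_complete(main())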
On 14Jun2018 1214, Chris Barker via Python-Dev wrote:
Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying
to understand the problem here.
But if I have this right:
I've been using asyncio a lot lately and have encountered this
problem several times. Imagine you want …
On Thu, Jun 14, 2018 at 9:17 PM Chris Barker via Python-Dev <python-dev@python.org> wrote:
> Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to
> understand the problem here.
>
Vocabulary-wise, 'queue depth' might be a suitable mental aid for what
people actually want to limit …
Excuse my ignorance (or maybe it's a vocabulary thing), but I'm trying to
understand the problem here.
But if I have this right:
> I've been using asyncio a lot lately and have encountered this problem
> several times. Imagine you want to do a lot of queries against a database,
> spawning 1,000 tasks …
On Thu, 14 Jun 2018 at 17:40, Tin Tvrtković wrote:
> Hi,
>
> I've been using asyncio a lot lately and have encountered this problem
> several times. Imagine you want to do a lot of queries against a database,
> spawning 1,000 tasks in parallel will probably cause a lot of them to fail.
> What you …
On Thu, Jun 14, 2018 at 12:40 PM Tin Tvrtković wrote:
>
> Hi,
>
> I've been using asyncio a lot lately and have encountered this problem
> several times. Imagine you want to do a lot of queries against a database,
> spawning 1,000 tasks in parallel will probably cause a lot of them to fail.
> What …
Hi,
I've been using asyncio a lot lately and have encountered this problem
several times. Imagine you want to do a lot of queries against a database,
spawning 1,000 tasks in parallel will probably cause a lot of them to fail.
What you need is a task pool of sorts, to limit concurrency and do only …
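The preview cuts off here, but the shape of the limiter being asked for is clear from the rest of the thread. A minimal sketch, where run_query is a hypothetical database coroutine:

import asyncio

sem = asyncio.Semaphore(10)  # at most 10 queries in flight at once

async def limited_query(q):
    async with sem:  # Semaphore doubles as an async context manager
        return await run_query(q)  # hypothetical database coroutine

async def main(queries):
    # All 1,000 tasks are created up front; the semaphore gates the real work.
    return await asyncio.gather(*(limited_query(q) for q in queries))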
How about:
async def wait_to_run(async_fn, *args):
    await wait_for_something()
    return await async_fn(*args)

task = loop.create_task(wait_to_run(myfunc, ...))
Whatever strategy you use, you should also think about what semantics you
want if one of these delayed tasks is cancelled before …
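To illustrate that cancellation question, a hedged sketch in which an asyncio.Event stands in for wait_for_something(): cancelling the wrapper task while it is still gated raises CancelledError inside the wait, so async_fn never starts at all.

import asyncio

async def wait_to_run(gate, async_fn, *args):
    await gate.wait()  # stand-in for wait_for_something()
    return await async_fn(*args)

async def myfunc():
    print("started")  # never printed if cancelled while still gated

async def main():
    gate = asyncio.Event()
    loop = asyncio.get_event_loop()
    task = loop.create_task(wait_to_run(gate, myfunc))
    await asyncio.sleep(0)  # let the task start and block on gate.wait()
    task.cancel()  # the CancelledError lands inside gate.wait()
    try:
        await task
    except asyncio.CancelledError:
        print("cancelled before myfunc ever ran")

asyncio.get_event_loop().run_until_complete(main())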
On Wed, Jun 13, 2018 at 4:47 PM Michel Desmoulin wrote:
>
> I was working on a concurrency limiting code for asyncio, so the user
> may submit as many tasks as one wants, but only a max number of tasks
> will be submitted to the event loop at the same time.
What does that "concurrency limiting code" …
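Michel's code isn't shown. One hedged sketch of the behaviour he describes, where TaskLimiter and its names are illustrative: callers may submit any number of coroutines, but at most max_running of them are ever scheduled on the loop at once.

import asyncio

class TaskLimiter:
    # Illustrative: unlimited submissions, bounded number of live loop tasks.
    def __init__(self, max_running):
        self._sem = asyncio.Semaphore(max_running)
        self._loop = asyncio.get_event_loop()

    async def _run(self, coro):
        try:
            return await coro
        finally:
            self._sem.release()  # free the slot for the next submission

    async def submit(self, coro):
        await self._sem.acquire()  # block until a slot is free...
        return self._loop.create_task(self._run(coro))  # ...then schedule it

Unlike a plain semaphore gate, a submitted coroutine is not even wrapped in a Task until a slot frees, so the loop never holds more than max_running pending tasks.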