To get around the fact that activating a virtualenv requires setting
environment variables in the current shell poetry has a shell command that
simply spawns a new shell with the appropriate environment variables:
https://python-poetry.org/docs/cli/#shell
So a cross-platform `activate` command
It's kind of weird that people seem to be missing the point about this. Python
already has comprehensions for all the iterable builtins except strings. The
proposed syntax doesn't introduce any new concept and would simply make strings
more consistent with the rest of the builtins. The argument
> the ONLY predicate that can be expressed about a single character is it being
> a member of a subset of all Unicode characters
You seem to be assuming that the comprehension would be purposefully restricted
to iterating over strings. The original author already provided examples with
predicat
> c"f(c) for c in some_string if g(c)"
Even this example would allow the interpreter to skip building the generator
object and having to feed the result of every f(c) back into the iterator
protocol. This is similar to f-strings vs str.format. You could say that
f-strings are redundant because
> The builtin interables bytearray, bytes, enumerate, filter frozenset, map,
> memoryview, range, reversed, tuple and zip suggest differently.
enumerate, filter, map, range, reversed and zip don't apply because they're not
collections, you wouldn't be able to store the result of the computation
> you talked about builtin *iterables*
My mistake, I reused the terminology used by the original author to make it
easier to follow.
> The point of iterators like map, zip and filter is to *avoid* performing the
> computation until it is required.
Of course. Maybe I wasn't clear enough. I don'
> But that was not the primary motivator for adding them to the language.
I don't think the original author thinks that way either about string
comprehensions. I was asked about the kind of speed benefits that string
comprehensions would have over using a generator with "".join() and I used
f-s
Recently there's been some discussion around string comprehensions, and I
wanted to look at a specific variant of the proposal in a bit more detail.
Original thread:
https://mail.python.org/archives/list/[email protected]/thread/MVQGP4GGTIWQRJTSY5S6SDYES6JVOOGK/
Let's say i have a matrix o
I didn't even realize f'{n for n in row}' was already valid syntax. Since
generator expressions can usually only appear within parentheses, I assumed the
syntax wouldn't conflict with anything because you would need to parenthesize
the generator to make it work. Anyway, now I see that the better
There was a discussion about this a couple months ago but instead of adding a
new keyword the idea was that the walrus operator could be upgraded from simply
being a binding operator to matching patterns supported by the match statement.
if ["example", *files] := variable:
print(files)
If t
On the other hand I think extending the walrus operator would make the change
less intrusive and the syntax more easily discoverable:
if match := re.match(r"...", some_string):
print(match[1], match[2])
if [_, a, b] := re.match(r"...", some_string):
print(a, b) # Assuming match objects
Now this is a really interesting proposal. Something wasn't right in the other
discussion, I didn't think making variable decorators inconsistent with the
current class and function decorators by providing the variable name was
particularly good. I've always felt like that something like
__deco
> (len(collection) == 0) is True
bool((len(collection) == 0) is True) == True
___
Python-ideas mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mail.python.org/mailman3/lists/python-ideas.pytho
I find that when I run into a similar scenario the reason why I need the
iterable to be non-empty is because I'm trying to find something in it, and for
this the `else` clause works pretty well:
for item in get_items():
if check(item):
do_thing(item)
break
else:
raise Val
You can do that with a custom protocol
https://docs.python.org/3/library/typing.html#typing.Protocol
___
Python-ideas mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mail.python.org/mailman3/l
+1
I think this is a very sensible proposal and I encountered the use-cases you
mentioned several times.
___
Python-ideas mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mail.python.org/mailm
Yes. This is desperately needed. Usually I'm not a big fan of adding new
standard library modules but in this case since toml is becoming such a
critical part of packaging it seems like a no-brainer.
___
Python-ideas mailing list -- python-ideas@python.
Pattern-matching is great. I think PEP 634 is on the right track, but it
would be a waste to only use pattern-matching for choosing a branch in a
match statement.
Let’s look at Rust:
if let [x, y] = my_array {
...
}
Rust "if let" constructs are an alternative to full-blown mat
> If you look in PEP 622 you'll see that there was a rejected idea `if
> match ...` that's pretty similar. We nixed it because it just made the
> PEP larger. For 3.11 we can consider something like this.
Of course I understand the PEP is already pretty big. But if it goes
through and we start thin
> Why should a failed match return None? That's not helpful if it matches
> but the value itself is None.
The only pattern that would match `None` is this one:
print(None := get_value()) # Always None
Here, the walrus operator would always return `None`. Either because the
function return
I think that instead of dict unpacking specifically, what we need is to come up
with a way to use the pattern-matching proposed in PEP 634 outside of match
statements. This would make it possible to unpack any pattern.
My opinion is that the walrus operator is practically waiting to support
pat
I'm in favor of keeping the PEP as it currently is. Mappings are naturally
structural subtypes of one another, therefore mapping patterns should be
consistent with class patterns.
car = Car(...)
match car:
case Vehicle():
pass
case Car(): # will never match
I'm in favor of keeping the PEP as it currently is. Mappings are naturally
structural subtypes of one another, therefore mapping patterns should be
consistent with class patterns.
car = Car(...)
match car:
case Vehicle():
pass
case Car(): # will never match
This thread is a mess. Move semantics is nothing more than creating a shallow
copy that steals the inner state of a previous instance. It's an optimization,
and moving out of a variable never makes the previous instance unusable:
void f1(std::vector&& vec);
void f2() {
std::vect
The C++ example specifically shows that if you're talking about ownership and
lifetimes, you're not talking about move semantics. As you pointed out, the
example wouldn't work in Rust specifically because Rust has a borrow checker,
and not just move semantics.
A compiler with a borrow checker w
Single-letter variables are common. If your use-case is inserting breakpoints
into arbitrary library code there's no way to guarantee that the builtin won't
be shadowed by some arguments or other local variables, making your alias
extremely unreliable.
def add(a, b):
"""Arbitrary li
I think this is the kind of feature that can very easily be abused. Whenever I
want to break out of a nested loop I take this as an opportunity to extract the
loop into its own function and use the return statement to break out of the
loop. IMO this is a lot better than having named or indexed l
Hi. Bundling this into the standard library doesn't seem to provide any real
advantage over defining it as a free-standing utility function in your own
code.
And if it's in your own code you can easily tweak it if you need to :)
___
Python-ideas maili
Hi,
I've been thinking that it would be nice if regex match objects could be
deconstructed with pattern matching. For example, a simple .obj parser could
use it like this:
match re.match(r"(v|f) (\d+) (\d+) (\d+)", line):
case ["v", x, y, z]:
print("Handle vertex")
I see. I guess the ambiguity would stem from trying to force match objects into
the sequence protocol even though the custom __getitem__() means that they're
essentially a mixed mapping:
Mapping[int | str, str | None]
If we avoid any sort of "smart" length derived from only mo.groups() or
A while ago there was a discussion about allowing "match" patterns for the
walrus operator. This would cover iterable unpacking as you suggested along
with all the patterns allowed in match statements.
if [x, y, z] := re.match(r"...").groups():
print(x, y, z)
The walrus expression w
> What if it does match, though?
The walrus operator returns the value on the right side so this wouldn't
change. In your example the original dict would get printed.
some_dict = {"x": 1, "y": 2}
print({"x": x} := some_dict) # {"x": 1, "y": 2}
The only pattern where it's not possible to know
> Yes, but what if you're testing for something that could *potentially* match
> one of these empty objects?
The right side can absolutely be falsy but to be able to conflate the falsy
return value with the None emitted when the pattern doesn't match, the left
side has to be one of the dubious
Yeah you can technically craft such pathological edge cases but this is already
heavily discouraged. Libraries that change the usual semantics of python's
object model are rare.
The only exception I can think of would be numpy which disallows truthiness
checks because of the ambiguity of arrays
I really like this. One problem though is that it's not immediately obvious
what happens with binding patterns and "as" clauses. In the code inside the
case statement, should these identifiers refer to the last value matched, or
should they accumulate all the matches in a list?
_
I'm -1 on this. You can easily make a helper that achieves the desired syntax.
Presenting "human readable data" isn't just about collapsing spaces, and having
your own helper means that you can adjust the formatting to your specific use
case if needed (for example with a different separator).
> add multi-line commenting to the Python programming language
Python uses docstrings for multi-line comments.
> add capability to optimize for database access and usage
Do you have an example of a concrete "capability to optimize for database
access and usage"? What does that mean? How would it
Do you mean something like C# nameof()?
https://learn.microsoft.com/en-us/dotnet/csharp/language-reference/operators/nameof
___
Python-ideas mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mai
38 matches
Mail list logo