Getting rid of virtual environments with a better dependency system
Hello all,

I don't know if this suggestion is missing some point, or if it's part of something already proposed.

In a professional environment, we've come to a point where most people use virtual environments or conda environments to avoid "polluting a global environment". However, I think that's a problem with the default behaviour of module management in Python. A nicer default behaviour would be to search for a requirements.txt file in the same directory as __file__, and use the newest version of every module that matches the constraints. If no requirements were given, the newest version already installed could be used. That would require allowing multiple versions of the same module to be downloaded.

I already anticipate some problems: increased disk usage for people who are not using virtual environments; the possibility of breaking changes for scripts with no constraints on a module (so that if a different module downloads a newer version, both would use it); and of course the hassle of a completely new default behaviour that would require a transition in many codebases. There are other solutions to the problem, such as forcing the use of semantic versioning, but that's a bit utopian.

Still, I believe it would pay off in terms of time saved installing and switching environments. It's also a good step on the path to integrating pip as something closer to the Python core.

What's your opinion? Is the effort required too big for the returns? Do you think other problems may arise?
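To make the idea concrete, here is a rough sketch of the constraint check that behaviour implies (not the multi-version install itself). It assumes the third-party "packaging" library and importlib.metadata from Python 3.8+; the function name and warning text are invented for illustration:

    # Sketch: warn when installed versions don't satisfy the
    # requirements.txt found next to the running script.
    # Assumes `pip install packaging`.
    import warnings
    from pathlib import Path
    from importlib.metadata import version, PackageNotFoundError
    from packaging.requirements import Requirement

    def check_requirements(script_path):
        req_file = Path(script_path).resolve().parent / "requirements.txt"
        if not req_file.exists():
            return  # proposal: fall back to the newest installed versions
        for line in req_file.read_text().splitlines():
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            req = Requirement(line)
            try:
                installed = version(req.name)
            except PackageNotFoundError:
                warnings.warn(f"{req.name} is not installed")
                continue
            if not req.specifier.contains(installed, prereleases=True):
                warnings.warn(
                    f"{req.name} {installed} does not satisfy '{req}'"
                )

    if __name__ == "__main__":
        check_requirements(__file__)

This only detects mismatches; actually importing a different version per script would still need the multi-version install layout described above.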
Re: Getting rid of virtual environments with a better dependency system
On Wednesday, 11 November 2020 at 12:22:24 UTC+1, Chris Angelico wrote:
> On Wed, Nov 11, 2020 at 10:06 PM j c wrote:
> >
> > Hello all,
> >
> > I don't know if this suggestion is missing some point, or it's part of
> > something already proposed.
> >
> > In a professional environment, we've come to a point in which most people
> > use virtual environments or conda environments to avoid "polluting a global
> > environment".
> >
> > However, I think that's a problem with the default behaviour of the module
> > management in Python. A nice default behaviour would be to search for a
> > requirements.txt file in the same directory as __file__, and use the newest
> > version of every module that matches the constraints. If no requirements
> > were given, the newest version already installed could be used. That would
> > require allowing multiple versions of the same module to be downloaded.
>
> This would stop venvs from providing the isolation that they are
> supposed to, and instead would just create yet another way to invoke
> dependency hell. No thank you.
>
> A virtual environment isn't just a way to install different versions
> of modules. It's way WAY more than that, and if you need to have too
> many different versions around, you have bigger problems to deal with.
>
> (As a simple thought experiment to prove the problem with your
> proposal: what happens with your dependencies' dependencies, and what
> if they conflict? At what point would that be detected?)
>
> ChrisA

How can this behaviour turn into dependency hell? Every dependency uses the specified version if any, otherwise the most recent one; the same applies to second-order dependencies. In case of conflict, the first version would be imported, which is currently the default behaviour. The main difference is that this approach would be able to generate a warning before running the script, with no need for pipdeptree.
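To make that pre-run warning concrete, here is a rough sketch of one way such a check could work: walk the declared requirements of every installed distribution and flag any that the installed versions violate. It assumes the third-party "packaging" library and importlib.metadata (Python 3.8+); the function name is invented, and passing {"extra": ""} when evaluating markers is how extras are conventionally neutralised:

    # Sketch: flag installed packages whose versions violate any
    # requirement declared on them by other installed packages.
    from importlib.metadata import distributions, version, PackageNotFoundError
    from packaging.requirements import Requirement

    def find_conflicts():
        conflicts = []
        for dist in distributions():
            parent = dist.metadata["Name"]
            for raw in dist.requires or []:
                req = Requirement(raw)
                # Skip requirements gated behind extras or other markers
                # that don't apply in this environment.
                if req.marker and not req.marker.evaluate({"extra": ""}):
                    continue
                try:
                    installed = version(req.name)
                except PackageNotFoundError:
                    continue  # optional or unresolved dependency
                if not req.specifier.contains(installed, prereleases=True):
                    conflicts.append((parent, req, installed))
        return conflicts

    for parent, req, installed in find_conflicts():
        print(f"{parent} requires {req}, but {req.name} {installed} is installed")

This answers "at what point would that be detected?" only for packages already installed; conflicts among versions yet to be downloaded would still need a resolver.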
Re: How to implement multiple constructors
[EMAIL PROTECTED] wrote:
> I am a C++ developer with only a little experience using Python. I
> want to create a Python class whereby I can construct an instance from
> that class based on one of two different object types.

The approaches I've seen used are to use a new class method as an alternate ctor with a special name, and to use the types module for type comparison within such a ctor.

--
J C Lawrence                    They said, "You have a blue guitar,
-(*)                            You do not play things as they are."
[EMAIL PROTECTED]               The man replied, "Things as they are
http://www.kanga.nu/~claw/      Are changed upon the blue guitar."
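For example, a minimal sketch of that approach; the Point class and method names are invented for illustration, and modern code would use isinstance() for the type comparison where older code reached for the types module:

    # Sketch: named classmethods as alternate ctors, with a dispatching
    # ctor that picks one based on the argument's type.
    class Point:
        def __init__(self, x, y):
            self.x = x
            self.y = y

        @classmethod
        def from_tuple(cls, pair):
            return cls(pair[0], pair[1])

        @classmethod
        def from_complex(cls, z):
            return cls(z.real, z.imag)

        @classmethod
        def from_any(cls, obj):
            # Type comparison selects the right alternate ctor.
            if isinstance(obj, tuple):
                return cls.from_tuple(obj)
            if isinstance(obj, complex):
                return cls.from_complex(obj)
            raise TypeError(f"cannot construct Point from {type(obj).__name__}")

    p1 = Point.from_any((1.0, 2.0))   # from a tuple
    p2 = Point.from_any(3 + 4j)       # from a complex number

The named classmethods keep each construction path explicit at the call site, which is the usual Python substitute for C++-style overloaded constructors.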
