FW: pip Error
From: Sidharth S Nair
Sent: 11 December 2021 01:00 PM
To: [email protected]
Subject: pip Error

Hi,

I am NINJAGAMING107, an active Python user. Lately I have been trying to
build my own personal AI assistant, but I can't, because I am unable to
import Speech Recognition and a few other packages. The error says the
module is not found, even though the folder where I installed Python and
the packages contains a speech recognition folder and the rest of the
things I installed. I have tried both PyCharm and VS Code and it still
doesn't work. If there is any way to fix this, or if I am doing the
installation wrong, please tell me. I hope you will reply ASAP. Thank you
in advance.

NINJAGAMING107
Fwd: I/O-bound threads get no chance to run with small CPU-bound threads under the new GIL
*Resending this message after subscribing to the python-list mailing list*
---------- Forwarded message ----------
From: Souvik Ghosh
Date: Sat, Dec 11, 2021 at 5:10 PM
Subject: I/O-bound threads get no chance to run with small CPU-bound
threads under the new GIL
To:
Hello PSF,
I'm Souvik Ghosh from India. I've been coding in Python for almost 5
years now, and I love Python and care about it very much.
The issue is described below.

In his PyCon 2010 talk in Atlanta, Georgia, David Beazley demonstrated
the behaviour of the new GIL by running a CPU-bound thread and an
I/O-bound thread together. He said in the talk that a thread which is
forced out at the 5 ms switch timeout (the CPU-bound one) effectively
gets lower priority, while a thread that gives up the GIL within 5 ms
(the I/O-bound one) effectively gets higher priority.
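That 5 ms figure is the interpreter's thread switch interval; a minimal
check of the default value and of how it can be changed:

import sys

# The forced-switch timeout discussed above; 0.005 s (5 ms) by default.
print(sys.getswitchinterval())   # 0.005

# It can be tuned at runtime, e.g. to hand the GIL off more often.
sys.setswitchinterval(0.001)
print(sys.getswitchinterval())   # 0.001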
What happens in the code below is this: with a large countdown argument
(a 1 followed by seven zeros), the I/O-bound thread gets to run and
return its value while the CPU-bound thread is still executing. But if I
decrease that argument to a much smaller count, the I/O-bound thread gets
no chance to reacquire the GIL in the meantime, even though
sys.getswitchinterval() is equal to 5 ms by default. When the I/O-bound
thread never reacquires the GIL, running only the CPU-bound thread takes
0.426035414 seconds, which amounts to roughly 0.426035414 / 0.005 = 85
switch intervals during which the priority between the two threads could
have been re-evaluated. If the I/O-bound thread really had higher
priority within that time, it should have returned its value within
those intervals, but it did not.
import threading
from queue import Queue
from timeit import default_timer as timer
import urllib.request

q = Queue()  # queue used to pass results between the threads while they run


def decrement(numbers):  # CPU bound
    while numbers > 0:
        numbers -= 1
        if not q.empty():
            # I added this check here because this thread runs most of the
            # time, since it is mostly CPU bound.
            print(numbers)
            print(q.get(block=False))
            # Tells exactly when the I/O-bound thread returned its value
            # after both threads started to run.
            print(timer() - start)


def get_data():  # I/O bound
    with urllib.request.urlopen("https://www.google.com") as dt:
        q.put(dt.read(), block=False)


if __name__ == "__main__":
    start = timer()
    t1 = threading.Thread(target=get_data)
    # t2 = threading.Thread(target=decrement, args=(1000,))  # for this, I/O responds and returns
    t2 = threading.Thread(target=decrement, args=(10,))  # I/O doesn't respond at all
    t1.start()
    t2.start()
    t1.join()
    t2.join()
    print(timer() - start)
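For reference, the 0.426-second figure quoted above can be reproduced by
timing the countdown loop on its own; a minimal sketch, where the loop
count of 10000000 (a 1 followed by seven zeros, as described above) is an
assumption:

from timeit import default_timer as timer

def decrement(numbers):
    # The same CPU-bound countdown, with no other thread competing for the GIL.
    while numbers > 0:
        numbers -= 1

start = timer()
decrement(10000000)  # assumed count: a 1 followed by seven zeros
elapsed = timer() - start
print(elapsed)            # e.g. ~0.4 s, machine dependent
print(elapsed / 0.005)    # how many 5 ms switch intervals fit in that time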
Now look at the second version of the code:
import threading
from queue import Queue
from timeit import default_timer as timer
import urllib.request
import sys

q = Queue()  # queue used to pass results between the threads while they run


def decrement(numbers):  # CPU bound
    while numbers > 0:
        numbers -= 1
        if not q.empty():
            # I added this check here because this thread runs most of the
            # time, since it is mostly CPU bound.
            print(numbers)
            print(q.get(block=False))
            # Tells exactly when the I/O-bound thread returned its value
            # after both threads started to run.
            print(timer() - start)


def get_data():  # I/O bound
    with urllib.request.urlopen("https://www.google.com") as dt:
        q.put(dt.read(), block=False)


if __name__ == "__main__":
    sys.setswitchinterval(0.0001)
    start = timer()
    t1 = threading.Thread(target=get_data)
    # t2 = threading.Thread(target=decrement, args=(100,))  # I/O responds with this
    t2 = threading.Thread(target=decrement, args=(1,))  # I/O doesn't respond at all,
                                                        # even with a 0.0001 s switch interval
    t1.start()
    t2.start()
    t1.join()
    t2.join()
    print(timer() - start)
Can't we have a better version of the GIL that gives I/O-bound threads
higher overall priority without degrading the CPU-bound ones, and with
better responsiveness to their callbacks? Or should we try to remove the
GIL?
The issue I submitted is here: https://bugs.python.org/issue46046
Thank you so much, and a great future to Python!
Re: Sad news: Fredrik Lundh ("Effbot") has passed away
On Saturday, December 11, 2021 at 1:37:07 AM UTC, Roel Schroeven wrote:
> Message from Guido van Rossum
> (https://mail.python.org/archives/list/[email protected]/thread/36Q5QBILL3QIFIA3KHNGFBNJQKXKN7SD/):
>
> > A former core dev who works at Google just passed on the news that
> > Fredrik Lundh (also known as Effbot) has died.
> >
> > Fredrik was an early Python contributor (e.g. Elementtree and the 're'
> > module) and his enthusiasm for the language and community were
> > inspiring for all who encountered him or his work. He spent countless
> > hours on comp.lang.python answering questions from newbies and
> > advanced users alike.
> >
> > He also co-founded an early Python startup, Secret Labs AB, which
> > among other software released an IDE named PythonWorks. Fredrik also
> > created the Python Imaging Library (PIL), which is still THE way to
> > interact with images in Python, now most often through its Pillow
> > fork. His effbot.org site was a valuable resource for generations of
> > Python users, especially its Tkinter documentation.
> >
> > Fredrik's private Facebook page contains the following message from
> > November 25 by Ylva Larensson (translated from Swedish):
> >
> > """
> > It is with such sorrow and pain that I write this. Fredrik has left us
> > suddenly.
> > """
> >
> > A core dev wrote: "I felt privileged to be able to study Fredrik's
> > code and to read his writing. He was a huge asset to our community in
> > the early days. I enjoyed his sense of humor as well. I'm sad that he
> > passed away."
> >
> > We will miss him.
>
> --
> "Your scientists were so preoccupied with whether they could, they didn't
> stop to think if they should"
> -- Dr. Ian Malcolm

Thanks for passing on this sad news. Effbot was one of the luminaries when
I started writing Python (v1.4/v1.5.2 times). I still have his 'standard
library' book, as well as being a user of PIL/Pillow.

Jon N
Re: FW: pip Error
On 12/11/21 00:35, Sidharth S Nair wrote:
> Hi, I am NINJAGAMING107, an active Python user. Lately I have been
> trying to build my own personal AI assistant, but I can't, because I am
> unable to import Speech Recognition and a few other packages. The error
> says the module is not found, even though the folder where I installed
> Python and the packages contains a speech recognition folder and the
> rest of the things I installed. I have tried both PyCharm and VS Code
> and it still doesn't work. If there is any way to fix this, or if I am
> doing the installation wrong, please tell me. I hope you will reply
> ASAP.

Rule #1 on "module not found" errors: it's always a path problem.

Unfortunately, on today's systems there are often many places things can
go, since you may have several Pythons, several virtualenvs, etc. The
basic rule is to install with the Python you're intending to use; that
way, the paths it is going to look in are the same paths your packages
got installed into.

In the case of VS Code, you can open a terminal (inside the IDE) and run
your pip commands there, and that will happen in the environment that is
defined for your project.

In the case of PyCharm, you can click on the Python version in the lower
right and click Interpreter Settings; through there you can search for
and install packages from PyPI.

Either one will give you the option to set up a fresh virtualenv for the
project, which you may want to do since you're unsure of the current
state.
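A quick way to confirm which interpreter is running your script, and to
install into exactly that interpreter, is python -m pip. A minimal
sketch, assuming the package in question is the SpeechRecognition
distribution from PyPI, which is imported as speech_recognition:

import sys

# Show which interpreter is running this script; packages must be installed
# with this exact interpreter so they land on the same import path.
print(sys.executable)

# From a terminal inside VS Code or PyCharm, install with that interpreter:
#     python -m pip install SpeechRecognition

try:
    import speech_recognition  # import name of the SpeechRecognition package
    print("speech_recognition found at", speech_recognition.__file__)
except ModuleNotFoundError:
    print("speech_recognition is not installed for", sys.executable)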
Set git config credential.helper cache and timeout via python
Hey All,

I have a set of bash and Python scripts that all interact with a remote
git repository. While running the scripts, whenever a repo is cloned
using HTTP and a user/password combination, that information is stored
in the .git/config file on disk, including the password.

I'm able to issue the following commands from the Linux shell to enable
caching of credentials in memory for a predefined amount of time:

git config credential.helper 'cache'
git config credential.helper 'cache --timeout=300'

However, what is the best way to do so via Python? I've read through the
following to try to make sense of how I could do it:

https://gitpython.readthedocs.io/en/stable/reference.html?highlight=cache#git.index.fun.read_cache
https://pypi.org/project/git-credential-helpers/

But neither appears to be able to do what I'm interested in. I would
like to prevent the user and password from being stored on disk in any
shape or form, including eliminating them from the .gitconfig /
.git/config files.

I keep searching Google for possible solutions with "python git
credential helper" and "git prevent storing passwords in git config
files", but Google keeps returning results on how to store passwords IN
.git/config files, which is the exact opposite of what I want. :)

An alternative to the above is to use SSH keys, but I would like to know
a way that avoids them.

Wondering if folks here have done something similar using Python and
could provide a bit of insight into how I could go about it?

--
Thx,
TK.
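One way to drive this from Python without editing any config file by
hand is simply to call git itself through the standard library; a
minimal sketch using subprocess, where the repository path and timeout
value are placeholders:

import subprocess

def enable_credential_cache(repo_path, timeout_seconds=300):
    """Point git's credential.helper at the in-memory cache for one repo.

    Only the helper name is written to .git/config; the password itself is
    held by git's credential-cache daemon in memory for timeout_seconds.
    """
    subprocess.run(
        ["git", "config", "credential.helper",
         f"cache --timeout={timeout_seconds}"],
        cwd=repo_path,
        check=True,
    )

# Example usage (placeholder path):
# enable_credential_cache("/path/to/repo", timeout_seconds=300)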
Re: Set git config credential.helper cache and timeout via python
> Hey All,
>
> I have a set of bash and python scripts that all interact with a remote
> git repository.

This does not exactly answer your question, but whenever I have wanted to
interact with (popular) software via Python, I have checked to see if
someone has already written that code for me.

https://pypi.org/search/?q=git
https://gitpython.readthedocs.io/en/stable/tutorial.html
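If the GitPython library from that tutorial is an option, the same
credential.helper setting can likely be written through its config API;
a minimal sketch, assuming GitPython's Repo.config_writer() behaves as
documented and using a placeholder repository path:

from git import Repo  # pip install GitPython

repo = Repo("/path/to/repo")  # placeholder path

# Equivalent to: git config credential.helper 'cache --timeout=300'
# Only the helper name goes into .git/config, not the credentials.
with repo.config_writer() as config:
    config.set_value("credential", "helper", "cache --timeout=300")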
Re: Set git config credential.helper cache and timeout via python
> Hey All,
>
> I have a set of bash and python scripts that all interact with a remote
> git repository.
>
> https://gitpython.readthedocs.io/en/stable/reference.html?highlight=cache#git.index.fun.read_cache
> https://pypi.org/project/git-credential-helpers/
>
> But neither means appears to have or be able to do what I'm interested
> in.

I guess I should have read more carefully :( Ignore my second link. Maybe
the first (https://pypi.org/search/?q=git) will be helpful?
