httpd: Syntax error on line 54
I installed Apache 2.2.4 and mod_python 3.3.1 on FC6. When starting the Apache server, the following error occurs:
httpd: Syntax error on line 54 of /usr/local/apache2/conf/httpd.conf: Cannot load /usr/local/apache2/modules/mod_python.so into server: /usr/local/apache2/modules/mod_python.so: cannot restore segment prot after reloc: Permission denied
Any help is appreciated.
--
http://mail.python.org/mailman/listinfo/python-list
Available candidate for Oracle DBA
Hi, hope you are doing well.
Skill Set: Oracle DBA
Current location: Tennessee
Relocation: Yes
Availability: Immediately
As per your requirement for an Oracle DBA position, I'm forwarding one of my consultant's resumes. Please find the attachment and revert back to me. Let me know who the client is and whether they are your direct client. Have a great day ahead.
--
http://mail.python.org/mailman/listinfo/python-list
python installation help
Please help me with installing Python.
--
https://mail.python.org/mailman/listinfo/python-list
Is using threads in Python safe?
Hi All,
I want to use threads in my application. Going through the docs, I read about the GIL, and now I am confused about whether using threads in Python is safe or not. One thing I know is that if I am accessing global variables from two or more threads, I need to synchronize them using locking or a similar mechanism so that only one thread accesses them at a time. Below are a few questions for which I am looking for answers.
1. In order to support multi-threaded Python programs, there's a global lock that must be held by the current thread before it can safely access Python objects. Does this lock need to be held explicitly by the Python application script before accessing any Python object, or does the interpreter take care of it?
2. Does a multithreaded Python script need to hold the lock before calling any blocking I/O call? Or should one not worry about the GIL while using Python threads, as long as the job processed by a thread does not touch any global variables or thread-unsafe Python/C extensions?
3. I want to use a thread-pool mechanism in Python. Would it not help in a multiprocessor environment if the job to be processed by a worker thread is I/O bound?
Sorry, there are many questions here, but they are mostly related and stem from my confusion about the GIL.
--
Thanx & Regards,
Deepak Rokade
Do what u Enjoy &
Enjoy what u Do...
--
http://mail.python.org/mailman/listinfo/python-list
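The post above mentions guarding shared globals with a lock. A minimal sketch of that pattern (the counter, loop count, and number of threads are illustrative, not from the original post):

import threading

counter = 0                       # shared global, updated by several threads
counter_lock = threading.Lock()

def worker():
    global counter
    for _ in range(100000):
        counter_lock.acquire()    # only one thread updates the counter at a time
        try:
            counter += 1
        finally:
            counter_lock.release()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print counter                     # 400000 with the lock; possibly less without it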
Re: Is using threads in Python safe?
Thanks, all, for clearing up the confusion about the GIL. One more question: if the jobs to be processed by threads are I/O bound, would multithreading help improve the speed of the application? I ask because I read that "multithreading is not a good strategy to improve the speed of a Python application."

On Mon, Mar 17, 2008 at 7:00 AM, Benjamin <[EMAIL PROTECTED]> wrote:
> On Mar 16, 3:40 pm, "Gabriel Genellina" <[EMAIL PROTECTED]> wrote:
> > En Sat, 15 Mar 2008 11:57:44 -0200, Deepak Rokade <[EMAIL PROTECTED]> escribió:
> >
> > > I want to use threads in my application. Going through the docs, I read about the GIL.
> > > Now I am confused whether using threads in Python is safe or not.
> > >
> > > One thing I know is that if I am accessing global variables in two or more threads I need to
> > > synchronize them using locking or such a mechanism so that only one thread accesses them at a time.
> >
> > Yes, although some operations are known to be atomic, so you don't need a lock in those cases.
> > I think there is a list in the wiki somewhere, http://wiki.python.org/moin, or perhaps at the
> > effbot's site, http://www.effbot.org
> Even for atomic operations, you should lock, though. That is not
> consistent over different Python implementations and is not always
> going to be the same in CPython.
> >
> > > 1. In order to support multi-threaded Python programs, there's a global lock that must be held
> > > by the current thread before it can safely access Python objects.
> > > Does this lock need to be held explicitly by the Python application script before accessing any
> > > Python object, or does the interpreter take care of it?
> >
> > No, the interpreter takes care of it. The GIL is a concern for those
> > writing extensions using the Python API.
> >
> > > 2. Does a multithreaded Python script need to hold the lock before calling any blocking I/O call?
> > > Or should one not worry about the GIL while using Python threads if the job to be processed by a
> > > thread does not touch any global variables or thread-unsafe Python/C extensions?
> >
> > Python code should not worry about the GIL. The problem would be a
> > callback written in Python for a not-thread-aware extension that had
> > released the GIL.
> >
> > --
> > Gabriel Genellina
>
> --
> http://mail.python.org/mailman/listinfo/python-list
--
Thanx & Regards,
Deepak Rokade
Do what u Enjoy &
Enjoy what u Do...
--
http://mail.python.org/mailman/listinfo/python-list
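The replies above note that the interpreter manages the GIL itself and that the GIL is released during blocking I/O, which is why threads can still speed up I/O-bound work. A minimal sketch of that situation (the URLs are illustrative, and Queue is used only as a convenient thread-safe container):

import threading
import urllib2   # Python 2.x standard library, matching the versions in this thread
import Queue     # thread-safe queue (named queue in Python 3)

results = Queue.Queue()

def fetch(url):
    # urlopen() blocks on the network; the GIL is released while it waits,
    # so the other fetch threads can make progress at the same time.
    data = urllib2.urlopen(url).read()
    results.put((url, len(data)))

urls = ["http://www.python.org", "http://www.effbot.org"]
threads = [threading.Thread(target=fetch, args=(u,)) for u in urls]
for t in threads:
    t.start()
for t in threads:
    t.join()

while not results.empty():
    print results.get()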
Embedding numpy works once, but not twice??
I have a program in which I have successfully embedded Python. Now, I want
to include NumPy as well (and other modules). I am able to import numpy
once. Then I close the python console in my program and then re-open it.
When I try to import numpy for a second time, the program crashes. Below is
a simple version of the problem.
The main() function imports numpy twice. The first import works fine -- it prints "no
errors." -- but the second call to load() crashes the program. What is going on here?
By the way, this is all on Windows (XP and Vista have the same problem) using
python25.dll (since numpy does not work with python26.dll). I am using the MinGW
compiler.
#include <Python.h>
#include <stdio.h>

int load(char * code)
{
    PyObject *errobj, *errdata, *errtraceback;
    Py_Initialize();
    PyObject *main = PyImport_AddModule("__main__");
    PyObject *main_dict = PyModule_GetDict(main);
    PyObject *rstring = PyRun_String(code, Py_file_input, main_dict, main_dict); //the second main_dict was my_program_dict originally
    PyErr_Fetch(&errobj, &errdata, &errtraceback);
    if (errdata != NULL)
    {
        PyObject *s = PyObject_Str(errdata);
        char *c = PyString_AS_STRING(s);
        printf("%s\n", c); //print any errors
        Py_DECREF(s);
    }
    else
    {
        printf("no errors.\n");
    }
    Py_XDECREF(errobj);
    Py_XDECREF(errdata);
    Py_XDECREF(errtraceback);
    Py_Finalize();
    return 0;
}

int main()
{
    load("import numpy\n");
    load("import numpy\n");
}
//output is:
// no errors
//
--
http://mail.python.org/mailman/listinfo/python-list
Re: Embedding numpy works once, but not twice??
I sort of guessed that was the issue -- doing Py_Initialize/Py_Finalize more than once. Thanks for the help.

On Sun, Feb 1, 2009 at 12:43 AM, Gabriel Genellina wrote:
> En Sun, 01 Feb 2009 03:47:27 -0200, Deepak Chandran <[email protected]> escribió:
>
>> I have a program in which I have successfully embedded Python. Now, I want
>> to include NumPy as well (and other modules). I am able to import numpy
>> once. Then I close the python console in my program and then re-open it.
>> When I try to import numpy for a second time, the program crashes. Below is
>> a simple version of the problem.
>
> The problem is not with NumPy. You can't run Py_Initialize/Py_Finalize more
> than once. Python doesn't have any mechanism to un-initialize loaded
> modules, and any static data that NumPy initialized the first time it was
> imported becomes invalid the second time.
> Call Py_Initialize at the start of your program, and Py_Finalize at the
> end, never more than once.
>
> --
> Gabriel Genellina
>
> --
> http://mail.python.org/mailman/listinfo/python-list
--
http://mail.python.org/mailman/listinfo/python-list
Embedded python output capture
I have a program with embedded Python. If the Python code has print statements, how do I retrieve those values (which normally go to stdout)? The return value from PyRun_String is just NULL. Thanks.
--
http://mail.python.org/mailman/listinfo/python-list
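One common way to capture print output from embedded code is to replace sys.stdout on the Python side before running the user's code, and read the buffer back afterwards. A minimal sketch of that snippet (assuming the host application executes it in the embedded interpreter, for example via PyRun_SimpleString, and later fetches the buffer out of __main__'s dictionary):

import sys
import StringIO               # Python 2.x spelling; io.StringIO on Python 3

_capture = StringIO.StringIO()
sys.stdout = _capture         # from now on, print statements write into _capture

print "hello from embedded code"

sys.stdout = sys.__stdout__   # restore the real stdout
captured = _capture.getvalue()   # the host can read this string back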
embedding Python in a shared library
I have embedded Python in a shared library. This works fine on Windows (DLL), but I get the following error in Ubuntu when I try to load modules:
/usr/lib/python2.5/lib-dynload/time.so: error: symbol lookup error: undefined symbol: PyExc_ValueError
I found many postings on this issue on the internet, but I was not able to find a solution that worked. I tried to load libpython2.5.so.1 into my program using dlopen, but that crashed the program for some reason. I tried building my library against libpython2.5.a, but the same error was there. I am sure someone has a solution to this, since it seems like a general issue.
--
http://mail.python.org/mailman/listinfo/python-list
listing files by modification time
Hi,
I am using python 2.5 on sun solaris.
I want to limit the number of files returned by os.listdir() to some number
(say 1000); how can I do it?
Also, I want to list files only if they are older than some x days; how
can I do that?
I can do this through a shell script using the command:
find ${DIR_PATH} -mtime +`expr ${PERIOD} - 1` -type f -exec ls -l {} \;
I want similar functionality through Python. I don't want to list all the
files in the directory and then go through each one checking its
modification time.
I am thankful for any pointers.
--
Thanx & Regards,
Deepak Rokade
Do what u Enjoy &
Enjoy what u Do...
--
http://mail.python.org/mailman/listinfo/python-list
Re: listing files by modification time
Yes, I can do that, but for that I will have to go through the entire list of
files, and I will also have to first get the whole list of files present in the
directory.
In my application this list can be huge, so I want to list only the
files that suit my criteria,
similar to the Unix find command I sent earlier, or like the one below:
find / -mtime +5 -type f -exec ls -l {} \;
Otherwise, can I limit the number of files in the list returned by os.listdir()?
On Tue, Feb 17, 2009 at 6:06 PM, Chris Rebert wrote:
> On Tue, Feb 17, 2009 at 4:31 AM, Deepak Rokade
> wrote:
> > Hi,
> >
> > I am using python 2.5 on sun solaris.
> >
> > I want to limit the number of files returned by os.listdir() to some
> number
> > (say 1000), how can I do it ?
> >
> > Also, I want to list the files only if they are older than some x days; how
> > can I do it?
>
> You can filter the returned list of files by checking the results of
> os.stat() [http://docs.python.org/library/os.html#os.stat] on the
> files. The `stat` module [http://docs.python.org/library/stat.html]
> can help with the interpretation of the data returned from os.stat().
>
> Cheers,
> Chris
>
> --
> Follow the path of the Iguana...
> http://rebertia.com
>
--
Thanx & Regards,
Deepak Rokade
Do what u Enjoy &
Enjoy what u Do...
--
http://mail.python.org/mailman/listinfo/python-list
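A minimal sketch of the os.listdir()/os.stat() approach described above, filtering by modification time and stopping after a fixed number of matches (the directory path, age, and limit are illustrative):

import os
import stat
import time

def old_files(path, days=5, limit=1000):
    # List at most `limit` regular files under `path` older than `days` days,
    # using os.stat() as suggested in the reply above.
    cutoff = time.time() - days * 24 * 3600
    matches = []
    for name in os.listdir(path):
        full = os.path.join(path, name)
        st = os.stat(full)
        if stat.S_ISREG(st.st_mode) and st.st_mtime < cutoff:
            matches.append(full)
            if len(matches) >= limit:
                break      # stop early instead of collecting everything
    return matches

print old_files("/tmp")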
Unable to compile pyprocessing module on SUN solaris
I am trying to build the package "pyprocessing" for Python 2.5.
I am using a Sun machine with Solaris 5.8:
drok...@himalaya:~/modules_python/processing-0.52
(Deepak:)uname -a
SunOS himalaya 5.8 Generic_117350-35 sun4u sparc SUNW,Sun-Fire
While building the package I get the warnings below.
(Deepak:)python setup.py build
Macros:
HAVE_FD_TRANSFER = 1
HAVE_SEM_OPEN = 1
HAVE_SEM_TIMEDWAIT = 1
Libraries:
['rt']
running build
running build_py
running build_ext
building 'processing._processing' extension
gcc -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC -DHAVE_SEM_OPEN=1 -DHAVE_FD_TRANSFER=1 -DHAVE_SEM_TIMEDWAIT=1 -I/home/drokade/ess_temp/56_release/ess/3rdparty/python/solaris/include/python2.5 -c src/processing.c -o build/temp.solaris-2.8-sun4u-2.5/src/processing.o
src/processing.c: In function `processing_sendfd':
src/processing.c:158: warning: implicit declaration of function `CMSG_SPACE'
src/processing.c:175: warning: implicit declaration of function `CMSG_LEN'
gcc -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC -DHAVE_SEM_OPEN=1 -DHAVE_FD_TRANSFER=1 -DHAVE_SEM_TIMEDWAIT=1 -I/home/drokade/ess_temp/56_release/ess/3rdparty/python/solaris/include/python2.5 -c src/socket_connection.c -o build/temp.solaris-2.8-sun4u-2.5/src/socket_connection.o
gcc -fno-strict-aliasing -DNDEBUG -g -O3 -Wall -Wstrict-prototypes -fPIC -DHAVE_SEM_OPEN=1 -DHAVE_FD_TRANSFER=1 -DHAVE_SEM_TIMEDWAIT=1 -I/home/drokade/ess_temp/56_release/ess/3rdparty/python/solaris/include/python2.5 -c src/semaphore.c -o build/temp.solaris-2.8-sun4u-2.5/src/semaphore.o
src/semaphore.c: In function `SemLock_acquire':
src/semaphore.c:296: warning: implicit declaration of function `sem_timedwait'
src/semaphore.c: In function `SemLock_new':
src/semaphore.c:416: warning: int format, pid_t arg (arg 4)
gcc -shared build/temp.solaris-2.8-sun4u-2.5/src/processing.o build/temp.solaris-2.8-sun4u-2.5/src/socket_connection.o build/temp.solaris-2.8-sun4u-2.5/src/semaphore.o -lrt -o build/lib.solaris-2.8-sun4u-2.5/processing/_processing.so
Though the shared libraries were created, after I installed the package into my Python (2.5.1) I get the message below while importing the module 'processing':
drok...@himalaya:~/modules_python/processing-0.52
(Deepak:)python
Python 2.5.1 (r251:54863, Nov 27 2007, 18:27:50)
[GCC 3.4.6] on sunos5
Type "help", "copyright", "credits" or "license" for more information.
>>> import processing
Traceback (most recent call last):
File "", line 1, in
File "/home/drokade/ess_temp/56_release/ess/3rdparty/python/solaris/lib/python2.5/site-packages/processing/__init__.py", line 62, in
import _processing
ImportError: ld.so.1: python2.5: fatal: relocation error: file /home/drokade/ess_temp/56_release/ess/3rdparty/python/solaris/lib/python2.5/site-packages/processing/_processing.so: symbol CMSG_SPACE: referenced symbol not found
>>> ^D
How can I get rid of this? Is this package not available for Sun Solaris?
--
Thanx & Regards,
Deepak Rokade
--
http://mail.python.org/mailman/listinfo/python-list
Re: Unable to compile pyprocessing module on SUN solaris
This did not work; I continued to get those warnings and the ImportError.
I went through the documentation of multiprocessing and it looks almost similar
to the processing module. Are there any advantages to the multiprocessing module?

On Fri, Mar 20, 2009 at 6:53 PM, Christian Heimes wrote:
> Deepak Rokade wrote:
> > How can I get rid of this?
> > Is this package not available for Sun Solaris?
>
> Apparently Solaris doesn't support sem_timedwait(). You have to disable
> the feature in setup.py::
>
> HAVE_SEM_TIMEDWAIT=0
>
> Why are you using pyprocessing instead of multiprocessing?
>
> Christian
>
> --
> http://mail.python.org/mailman/listinfo/python-list
--
Thanx & Regards,
Deepak Rokade
Do what u Enjoy &
Enjoy what u Do...
--
http://mail.python.org/mailman/listinfo/python-list
Re: Unable to compile pyprocessing module on SUN solaris
Great! It worked. I set HAVE_FD_TRANSFER = 0 and now it is working. I guess
this feature is for distributing tasks to remote machines... I do not require it
as of now, but any idea when this will be supported in multiprocessing? Is this
code not intended to support the Sun Solaris environment?

On Fri, Mar 20, 2009 at 7:28 PM, Christian Heimes wrote:
> Deepak Rokade wrote:
> > This did not work; I continued to get those warnings and the ImportError.
> >
> > I went through the documentation of multiprocessing and it looks almost similar
> > to the processing module. Are there any advantages to the multiprocessing module?
>
> You may have to disable more features and recompile everything until it
> works. Have you removed the entire build directory before you recompiled
> the package again?
>
> rm -r build && python2.5 setup.py build
>
> The CMSG_* functions are used by the fd transfer feature. Maybe Solaris
> doesn't support it, too? Please set HAVE_FD_TRANSFER=0 and try it again.
>
> pyprocessing was added to Python 2.6 and 3.0 under the new name
> multiprocessing. The multiprocessing package contains several fixes and
> minor API changes. I'm maintaining a backport to Python 2.4 and 2.5.
>
> multiprocessing is a back port of the Python 2.6/3.0
> multiprocessing package. The multiprocessing package itself
> is a renamed and updated version of R Oudkerk's pyprocessing
> package.
>
> Christian
>
> --
> http://mail.python.org/mailman/listinfo/python-list
--
Thanx & Regards,
Deepak Rokade
Do what u Enjoy &
Enjoy what u Do...
--
http://mail.python.org/mailman/listinfo/python-list
Re: Unable to compile pyprocessing module on SUN solaris
Hi,
I tried to compile Python 2.6 on Solaris 5.8 with gcc version 3.3.2. Initially the
multiprocessing package did not compile successfully; the errors are below:

Failed to find the necessary bits to build these modules:
_bsddb _hashlib _sqlite3 _ssl _tkinter bsddb185 gdbm linuxaudiodev ossaudiodev readline
To find the necessary bits, look in setup.py in detect_modules() for the module's name.
Failed to build these modules:
_curses _curses_panel _multiprocessing
running build_scripts
creating build/scripts-2.6
copying and adjusting /home/drokade/packages_less/PYTHON_2.6/Python-2.6.1/Tools/scripts/pydoc -> build/scripts-2.6
copying and adjusting /home/drokade/packages_less/PYTHON_2.6/Python-2.6.1/Tools/scripts/idle -> build/scripts-2.6
copying and adjusting /home/drokade/packages_less/PYTHON_2.6/Python-2.6.1/Tools/scripts/2to3 -> build/scripts-2.6
copying and adjusting /home/drokade/packages_less/PYTHON_2.6/Python-2.6.1/Lib/smtpd.py -> build/scripts-2.6
changing mode of build/scripts-2.6/pydoc from 664 to 775
changing mode of build/scripts-2.6/idle from 664 to 775
changing mode of build/scripts-2.6/2to3 from 664 to 775
changing mode of build/scripts-2.6/smtpd.py from 664 to 775

drok...@himalaya:~/packages_less/PYTHON_2.6/Python-2.6.1
(Deepak:)gcc -v
Reading specs from /usr/local/lib/gcc-lib/sparc-sun-solaris2.8/3.3.2/specs
Configured with: ../configure --with-as=/usr/ccs/bin/as --with-ld=/usr/ccs/bin/ld --disable-nls
Thread model: posix
gcc version 3.3.2

Then later I modified the "setup.py" script as:

else: # Linux and other unices
    macros = dict(
        HAVE_SEM_OPEN=0,
        HAVE_SEM_TIMEDWAIT=0,
        HAVE_FD_TRANSFER=0
        )

and then I could compile the multiprocessing module after "make clean". However, I got
an error while importing the module:

bash-2.03$ python2.6
Python 2.6.1 (r261:67515, Mar 26 2009, 11:44:45)
[GCC 3.3.2] on sunos5
Type "help", "copyright", "credits" or "license" for more information.
>>> import multiprocessing
Traceback (most recent call last):
File "", line 1, in
File "/home/drokade/packages_less/PYTHON_2.6/Python-2.6.1_Compiled//lib/python2.6/multiprocessing/__init__.py", line 63, in
from multiprocessing.process import Process, current_process, active_children
File "/home/drokade/packages_less/PYTHON_2.6/Python-2.6.1_Compiled//lib/python2.6/multiprocessing/process.py", line 18, in
import itertools
ImportError: No module named itertools
>>> ^D

I think we should have these flags separate for "SunOS" currently.

bash-2.03$ python2.6 -c "import sys; print sys.platform"
sunos5
bash-2.03$

On Sat, Mar 21, 2009 at 12:57 AM, Christian Heimes wrote:
> Deepak Rokade wrote:
> > I am trying to build the package "pyprocessing" for Python 2.5.
> >
> > I am using a Sun machine with Solaris 5.8:
> >
> > drok...@himalaya:~/modules_python/processing-0.52
> > (Deepak:)uname -a
> > SunOS himalaya 5.8 Generic_117350-35 sun4u sparc SUNW,Sun-Fire
> >
> > While building the package I get the warnings below.
> >
> > (Deepak:)python setup.py build
> > Macros:
> > HAVE_FD_TRANSFER = 1
> > HAVE_SEM_OPEN = 1
> > HAVE_SEM_TIMEDWAIT = 1
> >
> > Libraries:
> > ['rt']
>
> Can you do me a favor and compile Python 2.6.1 on your machine? The
> download link is http://www.python.org/download/releases/2.6.1/
>
> I'd like to know if the multiprocessing module in 2.6.1 builds correctly
> on Solaris. I also need the version number of GCC (gcc -v) and the value
> of sys.platform (python -c "import sys; print sys.platform").
>
> Thanks!
>
> I'm including Jesse, the maintainer of Python 2.6's multiprocessing
> module, in our discussion. It seems like Solaris needs
> HAVE_FD_TRANSFER=0, HAVE_SEM_TIMEDWAIT=0 in order to compile
> multiprocessing.
>
> Christian
--
Thanx & Regards,
Deepak Rokade
Do what u Enjoy &
Enjoy what u Do...
--
http://mail.python.org/mailman/listinfo/python-list
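The suggestion at the end of the post above, keeping these flags separate for SunOS, could look roughly like the following in setup.py. This is only a sketch; the SunOS values mirror the all-disabled dict the poster used above, while other platforms keep the defaults:

import sys

if sys.platform.startswith('sunos'):
    # values that compiled for the poster on Solaris 5.8
    macros = dict(
        HAVE_SEM_OPEN=0,
        HAVE_SEM_TIMEDWAIT=0,
        HAVE_FD_TRANSFER=0,
        )
else:
    # Linux and other unices
    macros = dict(
        HAVE_SEM_OPEN=1,
        HAVE_SEM_TIMEDWAIT=1,
        HAVE_FD_TRANSFER=1,
        )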
Multiprocessing module
Hi All,
I have decided to use the multiprocessing module in my application. In brief, my application fetches files from multiple remote directories and distributes the received files to one or more remote directories using SFTP. Since this application is going to be a commercial one, I want to know at this stage whether there are any known serious bugs (not limitations) in the multiprocessing module. Also, are there any examples where this module has been used in commercial applications? Where can I find information about companies or products using this particular module?
--
Thanx & Regards,
Deepak Rokade
--
http://mail.python.org/mailman/listinfo/python-list
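As an illustration of the kind of design described above (not from the original post: the transfer function, the job list, and the pool size are hypothetical placeholders), a process pool fanning file transfers out to workers might look like this:

import multiprocessing

def fetch_and_forward(job):
    source, destination = job
    # real code would open an SFTP session here (for example with a
    # third-party SSH library) and copy `source` to `destination`;
    # this sketch just reports the pair
    return "copied %s -> %s" % (source, destination)

if __name__ == "__main__":
    jobs = [
        ("siteA:/outgoing/a.txt", "siteB:/incoming/a.txt"),
        ("siteA:/outgoing/b.txt", "siteC:/incoming/b.txt"),
    ]
    pool = multiprocessing.Pool(processes=4)
    for result in pool.map(fetch_and_forward, jobs):
        print result
    pool.close()
    pool.join()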
configuring python with disable-thread in Ubuntu
Hello,
I am embedding Python inside a C++ program. For some reason (I think libxml2), I am getting a segmentation fault at PyThread_release_lock. The solution I found online was to configure Python with --disable-thread. I used "apt-get install python-dev" to install Python. How do I re-configure it using the --disable-thread option?
Thanks.
--
http://mail.python.org/mailman/listinfo/python-list
multiprocessing and threads
Are there any special guidelines for using the multiprocessing package together with threads
in a Python program? I am designing an application that works with a database and files. In
my program I was spawning some threads and then some process pools. Only my main process uses
the threads; my child processes do not use any of the threads that were copied due to the fork
call. I observed that while creating the process pools some fork calls were hanging. I changed
the sequence so that the process pools were spawned first and then the threads. This gave me
better results and none of the forks hung after that. Is there any limitation based on the
order in which threads and processes are created? Any specific issues?
Thanks
Deepak
--
http://mail.python.org/mailman/listinfo/python-list
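A minimal sketch of the ordering the post above reports working better: start the process pool before any threads exist, so that fork() never runs while the parent already holds thread-related locks. The worker functions here are placeholders, not from the original post:

import multiprocessing
import threading

def cpu_job(n):
    return n * n

def io_job(name):
    print "thread %s doing I/O work" % name

if __name__ == "__main__":
    # 1. Fork the worker processes while the parent is still single-threaded.
    pool = multiprocessing.Pool(processes=4)

    # 2. Only now start threads in the main process.
    threads = [threading.Thread(target=io_job, args=("t%d" % i,)) for i in range(2)]
    for t in threads:
        t.start()

    print pool.map(cpu_job, range(10))

    for t in threads:
        t.join()
    pool.close()
    pool.join()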
[no subject]
I am trying to install numpy along with Python 3.5, but it gives an error every time.
Sent from Mail for Windows 10
--
https://mail.python.org/mailman/listinfo/python-list
