swig/python memory leak question


I am just getting into SWIG and trying to wrap my C++ library in Python. I have a newbie question: if my C++ object has a method that allocates memory and returns a void * to it, Python complains about ...
Posted on: Sunday 11th of November 2012 11:49:24 PM · Total views: 460

RELATED TOPICS IN PYTHON PROGRAMMING




32-bit python memory limits?

I'm running a Python job on OS X 10.5.3, with the Python 2.5.2 that's available as a binary download at python.org for OS X. I ran a python program tonight that ended up using much more memory than anticipated. It just kept on using more and more memory. Instead of killing it, I just watched it, using Activity Monitor. I assumed that when it had 2GB allocated it would blow up, because I thought 32-bit python could only address 2GB. But Activity Monitor reported that it had allocated 3.99GB of virtual memory before it finally blew up with malloc errors. Was my understanding of a 2GB limit wrong? I guess so! But I'm pretty sure I saw it max out at 2GB on linux... Anybody have an explanation, or is it just that my understanding of a 2GB limit was wrong? Or was it perhaps right for earlier versions, or on linux?
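For what it's worth, a build's theoretical ceiling can be checked directly from Python; a minimal sketch (the 4 GB figure is the full 32-bit address space, part of which the OS reserves for itself):

```python
import struct

# Pointer size in bytes: 4 on a 32-bit build, 8 on a 64-bit build.
pointer_size = struct.calcsize("P")

# Theoretical address space for this build, in GiB. A 32-bit build
# tops out at 4 GiB in principle; the usable portion is less.
address_gib = 2.0 ** (pointer_size * 8 - 30)

print(pointer_size, address_gib)
```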
Views: 156 · Posted on: Wednesday 7th November 2012

Re: Passing a memory address (pointer) to an extension?

Philip Semanchuk wrote:
> I'm writing a Python extension in C that wraps a function which takes a
> void * as a parameter. (The function is shmat(), which attaches a chunk
> of shared memory to the process at the address supplied by the caller.)
> I would like to expose this function to Python, but I don't know how to
> define the interface.
>
> Specifically, when calling PyArg_ParseTuple(), what letter should I use
> to represent the pointer in the format string? The best idea I can come
> up with is to use a long and then cast it to a void *, but assuming that
> a long is big enough to store a void * is a shaky assumption. I could
> use a long long (technically still risky, but practically probably OK)
> but I'm not sure how widespread long longs are.

I recommend not giving the user access to that argument. Just use NULL and let shmat() pick a starting address. I don't think it's really safe to let the user pick, even in C. Perhaps if you're doing *really* low-level stuff. Of course, being that low level seems to be the point of posix_ipc, so maybe I should let you get on with it.

Anyway, the format "n" in Python >= 2.5 will correspond to a Py_ssize_t integer, which will always be the size of a pointer. You can return the void * that you get from shmat() with a PyCPointer object, or make a new, small type that encapsulates a pointer attached via shmat(). The benefit of a real type is that you can type-check the input to shmdt() for safety. I strongly recommend either of those approaches over returning an integer.

-- Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
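Robert's claim about the "n" format can be sanity-checked from pure Python with ctypes, which exposes both sizes (a sketch; c_ssize_t is ctypes' model of Py_ssize_t):

```python
import ctypes

# c_void_p models a C void*; c_ssize_t models Py_ssize_t.
ptr_size = ctypes.sizeof(ctypes.c_void_p)
ssize_size = ctypes.sizeof(ctypes.c_ssize_t)

# On mainstream platforms the two widths agree, which is why "n"
# is safe where casting through a C long is not.
print(ptr_size, ssize_size)
```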
Views: 305 · Posted on: Wednesday 7th November 2012

Re: Passing a memory address (pointer) to an extension?

On Oct 22, 2008, at 8:33 PM, Robert Kern wrote:
> Philip Semanchuk wrote:
>> I'm writing a Python extension in C that wraps a function which
>> takes a void * as a parameter. (The function is shmat(), which
>> attaches a chunk of shared memory to the process at the address
>> supplied by the caller.) I would like to expose this function to
>> Python, but I don't know how to define the interface.
>> Specifically, when calling PyArg_ParseTuple(), what letter should I
>> use to represent the pointer in the format string? The best idea I
>> can come up with is to use a long and then cast it to a void *, but
>> assuming that a long is big enough to store a void * is a shaky
>> assumption. I could use a long long (technically still risky, but
>> practically probably OK) but I'm not sure how widespread long longs
>> are.
>
> I recommend not giving the user access to that argument. Just use
> NULL and let shmat() pick a starting address. I don't think it's
> really safe to let the user pick, even in C. Perhaps if you're doing
> *really* low-level stuff. Of course, being that low level seems to
> be the point of posix_ipc, so maybe I should let you get on with it.

I agree. To deny users of this module access to this param of shmat() would be design arrogance on my part. To do so would be to claim that I know better than anyone who might want to use my module. Using values other than NULL might be unwise (the man page I'm looking at states that clearly), but isn't one of the core design philosophies of Python "we're all consenting adults"?

> Anyway, the format "n" in Python >= 2.5 will correspond to a
> Py_ssize_t integer, which will always be the size of a pointer.
Views: 173 · Posted on: Wednesday 7th November 2012

multiprocessing eats memory

On Sep 25, 8:40 am, "Max Ivanov" wrote:
> At any time in the main process there shouldn't be more than two copies of the data
> (one original and one result).

From the looks of it, you are storing a lot of references to various copies of your data via the async set.
Views: 134 · Posted on: Wednesday 7th November 2012

how to dump a program which is running in memory

hi, I have written a service running in the background to do something on linux. Unfortunately, I deleted the source code by mistake, but I can still see the process running using "ps aux":

username 13820 0.0 0.0 60368 2964 S Aug20 0:33 python ./UpdateJobStatus.py

I wonder if there is some way to dump the program "UpdateJobStatus.py" from memory and get the source code back.
Views: 179 · Posted on: Wednesday 7th November 2012

Re: Swap memory in Python ? - three questions

Robert LaMarca wrote:
> I am using numpy and wish to create very large arrays. My system is AMD 64 x 2, Ubuntu 8.04. Ubuntu should be 64-bit. I have 3 GB RAM and a 15 GB swap drive.
>
> The command I have been trying to use is:
> g = numpy.ones([1000, 1000, 1000], numpy.int32)
>
> This returns a memory error.
> A smaller array ([500, 500, 500]) worked fine.
> Two smaller arrays again crashed the system.
>
> So... I did the math. A 1000x1000x1000 array at 32 bits should be around 4 GB RAM... obviously larger than RAM, but much smaller than the swap drive.
>
> 1. So... does numpy have a really large overhead? Or is my system just not getting to make use of the 15 GB swap area?
> 2. Is there a way I can access the swap area, or direct numpy to do so? Or do I have to write my own numpy cache system?
> 3. How difficult is it to use data compression internally on numpy arrays?

I do not know what numpy does, but constant arrays only need to store the dimensions and the constant value, plus a getitem method that returns that constant value for any valid index. This is at most a few hundred bytes regardless of the dimensions.
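The poster's arithmetic checks out: the array alone needs just under 4 GiB before any interpreter or allocator overhead.

```python
# Memory needed for numpy.ones([1000, 1000, 1000], numpy.int32):
elements = 1000 * 1000 * 1000
bytes_needed = elements * 4                # int32 is 4 bytes
gib = bytes_needed / float(1024 ** 3)
print(round(gib, 2))
```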
Views: 155 · Posted on: Wednesday 7th November 2012

How to view how much memory some process use in Windows?

How can I monitor, with a Python script, how much memory a process uses in Windows? I want to gather statistics about the memory consumption of processes like Firefox, antivirus software, etc. When I search Google I only find information about how to monitor this on linux, or how to reduce a Python program's memory usage.
Views: 252 · Posted on: Wednesday 7th November 2012

Pylons and memory use?

Hi. I was thinking about signing up with a web host that supports Pylons (among many other things), and one of the differences between the various plans is application memory for long-running processes. The plan I'd like to sign up for has 80 MB. Does anyone know if this is enough for basic Pylons applications? Just in general, how can I calculate how much memory a Pylons application (or any other type of application, for that matter) will require? Is there some general range I might be able to rely on? Does 80 MB seem like enough for just playing around and hobbyist work?
Views: 253 · Posted on: Wednesday 7th November 2012

Re: How do you debug memory usage?

On Tue, May 6, 2008 at 4:21 PM, Banibrata Dutta wrote:
> It may not be the most intuitive and elegant solution (I'm just a Python
> newbie), but if your Python code is constrained to the usage of Python 2.2
> language features, you could use Jython, and then (I'm hoping, since I've
> not tried this myself) use the Java memory usage profiling/debugging tools.
Views: 190 · Posted on: Wednesday 7th November 2012

Re: python has memory leak?

On Tue, 22 Apr 2008 14:54:37 -0700 (PDT), yzghan@gmail.com wrote:
> Hi all,
>
> I feel that my python script is leaking memory. And this is a test I
> have:
> [snip]

The test doesn't demonstrate any leaks. It does demonstrate that memory usage can remain at or near peak memory usage even after the objects for which that memory was allocated are no longer live in the process. This is only a leak if peak memory goes up again each time you create any new objects. Try repeated allocations of a large dictionary and observe how memory usage rises and falls. Python 2.5 does a somewhat better job of releasing memory when actual use falls below peak, but this is a difficult thing to do perfectly.

Jean-Paul
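The allocate-and-discard experiment Jean-Paul suggests can be sketched like this (sizes scaled down, and the memory measurement itself omitted since it is platform-specific):

```python
import gc

def churn():
    # Build and discard a large dict; afterwards the interpreter may
    # keep the freed memory pooled at or near peak without this
    # being a true leak.
    d = {}
    for i in range(100000):
        d.setdefault(i, []).append(i)
    return len(d)

size = churn()
gc.collect()  # give the collector a chance to release what it can
print(size)
```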
Views: 150 · Posted on: Wednesday 7th November 2012

python has memory leak?

Hi all, I feel that my python script is leaking memory. And this is a test I have:

log.write("[" + timestamp() + "] " + "test() ... memory usage: " + " ".join(repr(i/(1024*1024)) for i in getMemInfo()) + "\n")
m = {}
i = 1000*1000
while i > 0:
    i = i - 1
    m.setdefault(i, []).append(i)
log.write("[" + timestamp() + "] " + "test() ... memory usage: " + " ".join(repr(i/(1024*1024)) for i in getMemInfo()) + "\n")
m = {}
log.write("[" + timestamp() + "] " + "test() done. memory usage: " + " ".join(repr(i/(1024*1024)) for i in getMemInfo()) + "\n")

From which I got:

[17:44:50] test() ... memory usage: 55L 55L
[17:44:53] test() ... memory usage: 143L 143L
[17:44:53] test() done. memory usage: 125L 143L

In the above code, getMemInfo is my function to return current and peak memory usage in bytes. Can some expert explain how python manages memory? The version of my python is:

Python 2.4.4 Stackless 3.1b3 060516 (#71, Oct 31 2007, 14:22:28) [MSC v.1310 32 bit (Intel)] on win32

Many thanks.
Views: 140 · Posted on: Wednesday 7th November 2012

How to get memory size/usage of python object

Is there a way to check the REAL size in memory of a python object? Something like:

> print sizeof(mylist)

or

> print sizeof(myclass_object)

or something like that ...
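Since Python 2.6 there has been sys.getsizeof(), which answers the shallow version of this question; a sketch (it does not follow references, so container contents must be added by hand):

```python
import sys

mylist = [1, 2, 3]

# Shallow size: the list object itself, not what it points to.
shallow = sys.getsizeof(mylist)

# A fuller estimate adds the sizes of the contained objects.
total = shallow + sum(sys.getsizeof(item) for item in mylist)

print(shallow, total)
```

For the "REAL" recursive size, nested and shared objects need a proper traversal; sys.getsizeof only gives the building blocks.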
Views: 165 · Posted on: Wednesday 7th November 2012

Which uses less memory?

I'm not sure if this is as easy a question as I'd like it to be, but here goes... I'm working on an application that is very memory intensive, so we're trying to reduce the memory footprint of classes wherever possible. I need a class with a type identifier that can be examined at run-time to determine whether that class (called Domain) contains data I care about or not. I've thought of two ways to implement this:

1. Add a type attribute and set it to a descriptive string.
2. Create marker classes and use multiple inheritance to "attach" these markers to specific Domains.

Here's the kicker: I need to serialize these Domains and send them to/from Java code as well as work on them using Python. We're looking to use Hessian and pyactivemq to handle the Java/Python interfaces. Which, I guess, leads to the following questions:

1. Which method has the smaller footprint within the Python engine?
2. Do these protocols (Hessian and Stomp) preserve the class information when the class is serialized?

Any input would be most welcome.
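A rough way to compare the per-instance cost of the two options (class names below are illustrative, not the poster's): option 1 pays for one instance-dict entry per object, while option 2's marker lives once on the class and costs nothing per instance.

```python
# Option 1: a per-instance type attribute.
class DomainWithAttr(object):
    def __init__(self):
        self.kind = "interesting"   # stored in every instance's __dict__

# Option 2: a marker base class, stored once in the class hierarchy.
class Marker(object):
    pass

class DomainWithMarker(Marker):
    pass

a = DomainWithAttr()
b = DomainWithMarker()

# The marker instance carries no per-instance state; the type check
# is an isinstance() call against the class, not a string compare.
print(len(a.__dict__), len(b.__dict__), isinstance(b, Marker))
```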
Views: 138 · Posted on: Wednesday 7th November 2012

Optimizing memory usage w/ HyperText package

I've been using the HyperText module for a while now (http://dustman.net/andy/python/HyperText/), and I really like it. I've run into a situation where I have code to construct a table, and while it is normally fine, there are times where the table can get quite huge (e.g. 1000 columns, 100000 rows ... yes, the question of "how on earth would someone render this table?" comes up, but that's not the point here), and the code I have generating this starts choking and dying from excessive RAM usage. I'm curious whether people see a better way of going about this task and/or believe that an alternative method of HTML generation would be better. A (possibly somewhat pseudocode, as I'm doing this by hand) small example of what I'm doing:

inputs = [A, List, Of, Values, To, Go, Into, A, Table]
numcolumns = howManyColumnsIWant
out = ht.TABLE()
column = 0
for input in inputs:
    if (column == 0):
        tr = ht.TR()
    tr.append(ht.TD(input))
    column += 1
    if (column == numcolumns):
        out.append(tr)
        column = 0

As I said, this works fine for normal cases, but I've run into some situations where I need this to scale not just into the hundreds of thousands but well into the millions, and that's just not happening. Is there a better way to do this (which involves direct HTML generation in Python), or am I SOL here?
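One hedged alternative to the object-tree approach: generate the HTML rows directly with a generator, so only one row lives in memory at a time instead of the whole HyperText tree (a sketch; in real use each yielded row would be written straight to the output stream):

```python
def table_rows(inputs, numcolumns):
    """Yield one <tr>...</tr> string at a time instead of building
    the entire table object in memory."""
    row = []
    for value in inputs:
        row.append("<td>%s</td>" % value)
        if len(row) == numcolumns:
            yield "<tr>%s</tr>" % "".join(row)
            row = []
    if row:  # flush a trailing partial row
        yield "<tr>%s</tr>" % "".join(row)

rows = list(table_rows(["a", "b", "c", "d", "e"], 2))
print(rows)
```

Memory use then stays proportional to one row rather than the whole table, which is what makes the millions-of-cells case feasible.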
Views: 120 · Posted on: Wednesday 7th November 2012

memory allocation for Python list

I have a python list of unknown length that grows sequentially, one element at a time. Each element has the same size in memory (a numpy.array of shape 1 x N, where N is known from the very beginning). As I ...
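For what it's worth, CPython's list.append is amortized O(1) because the list over-allocates geometrically; a quick way to watch that growth pattern via the shallow size:

```python
import sys

sizes = []
lst = []
for i in range(50):
    lst.append(None)
    sizes.append(sys.getsizeof(lst))

# The allocated capacity grows in steps, not on every append, so
# far fewer distinct sizes appear than appends performed.
distinct = sorted(set(sizes))
print(len(sizes), len(distinct))
```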
Views: 131 · Posted on: Saturday 10th November 2012

how to measure memory usage on Mac OSX ?

I want to measure the memory usage of my python process on Mac OSX. I tried the resource module, but it doesn't work on OSX. How can I get it? thnx. -- masayuki takagi...
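In many setups the resource module does report a peak figure on both platforms; a sketch, with a units caveat worth knowing:

```python
import resource

usage = resource.getrusage(resource.RUSAGE_SELF)

# ru_maxrss is the peak resident set size of this process. Note the
# units differ by platform: Linux reports kilobytes, Mac OS X bytes.
print(usage.ru_maxrss)
```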
Views: 153 · Posted on: Saturday 10th November 2012

FW: Using Python to shared memory resources between Linux and Windows

Diez, what you have said is extremely concerning. I am now using VMware, with Linux as the host and Windows as the guest operating system. I was wondering if you have ever had to develop a shared memory resource between ...
Views: 185 · Posted on: Sunday 11th November 2012

In-place memory manager, mmap (was: Fastest way to store ints and floats on disk)

I've got an "in-place" memory manager that uses a disk-backed memory-mapped buffer. Among its possibilities: storing variable-length strings and structures for persistence and interprocess communication with mmap. It allocates segments of a generic buffer by length and ...
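A minimal sketch of the disk-backed, memory-mapped buffer idea being described (file handling simplified; the poster's segment-allocation bookkeeping is omitted):

```python
import mmap
import os
import tempfile

# Create a small disk-backed buffer and map it into memory.
fd, path = tempfile.mkstemp()
try:
    os.ftruncate(fd, 4096)           # fix the buffer size on disk
    buf = mmap.mmap(fd, 4096)

    # "Allocate" a segment at offset 0 and store a variable-length
    # string in it; another process mapping the same file would see it.
    data = b"hello, shared world"
    buf[0:len(data)] = data
    buf.flush()                      # push the change to the backing file

    readback = bytes(buf[0:len(data)])
    buf.close()
finally:
    os.close(fd)
    os.remove(path)

print(readback)
```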
Views: 180 · Posted on: Sunday 11th November 2012

Re: Static memory allocation in Python

On Jun 17, 2008, at 2:34 PM, Eduardo Henrique Tessarioli wrote:
> I am running a very simple python application and I noted that the
> memory allocation is something like 4.5 MB.
> This is a problem ...
Views: 181 · Posted on: Sunday 11th November 2012

Re: How do you debug memory usage?

> I'll check a few of those results and post to the list if I find something good.

It looks like Heapy, part of the Guppy project, can do this: http://guppy-pe.sourceforge.net/#Heapy

David.
Views: 196 · Posted on: Sunday 11th November 2012