Restart interpreter with function or command
Hi, instead of having to close the whole Pythonista app, can the user restart the Python interpreter with a soft key or a function?
Sometimes I need to restart the app due to problems with imported libraries from site-packages.
Can the Python interpreter be reinitialized before each script execution, like Python on a PC? Maybe optionally, via a setting or a trick?
Hi, a simple solution to simulate (maybe) a reset of Pythonista could be to perform the following tasks:
1. close Pythonista
2. open Pythonista (to be sure the interpreter is restarted)
3. save the list of default imported modules, which you can see with:
import sys
print(sys.modules.keys())
4. import something
5. compare the default imported modules saved in 3. with the new list after 4.
6. delete only the new modules found in 5. that are not present in 3. (the default) with:
del sys.modules[XXX]  # XXX = each of the new modules
Could this be a working solution to clear the RAM of the Python interpreter in Pythonista? Or should we perform some other cleanup somewhere else?
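The steps above can be sketched in plain Python (a minimal sketch; the snapshot would be taken right after a fresh start, and json stands in for whatever gets imported afterwards):

```python
import sys

# step 3: snapshot the modules loaded right after a fresh start
default_modules = set(sys.modules.keys())

# step 4: import something (json is just an example)
import json

# steps 5-6: find modules that were not in the snapshot and delete them
new_modules = set(sys.modules.keys()) - default_modules
for name in new_modules:
    del sys.modules[name]
```

Note that this only removes the sys.modules entries; any variables that still reference the modules keep them alive.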
@Matteo, I am totally guessing here. I am not sure how garbage collection works, full stop, and I don't know whether Pythonista does anything with the gc module between runs or when loading modules, beyond the normal garbage collection. This may be a totally stupid comment, as maybe gc can't help in certain states, but it comes to mind to investigate the gc module if you are trying to clean everything up.
@Phuket2 Thank you for the suggestion! Unfortunately I don't know anything about garbage collection, I don't know what it is; I will try to read something about it on the Internet.
My approach is very simple:
1. determine the available RAM in Pythonista immediately after an app restart.
2. perform some library imports (built-in or not) by executing some scripts.
3. determine the available RAM in Pythonista after the imports.
4. try resetting the interpreter by deleting all imported libraries listed in sys.modules.keys() with del sys.modules[XXX].
5. try again to execute the scripts that need the libraries from 2.: if I can't execute them, it means the interpreter no longer recognizes them, so OK.
6. determine the available RAM in Pythonista after 5.: if the available RAM is similar to the one measured in 1., it means that
del sys.modules[XXX]  # XXX = each of the new modules
can also free the RAM used by the deleted libs, so OK.
If 5. and 6. are OK, then I will have shown myself that the method works to reinitialize the interpreter.
Am I simplifying too much?
Hi, I've done some simple tests on RAM in Pythonista using resource.getrusage(resource.RUSAGE_SELF).ru_maxrss, but after deleting the imported libs, I see an increase in used RAM.
I don't know if resource.getrusage is a good way to measure RAM in a Python environment. For now, based on the tests done, I can say that del sys.modules[xxx] can't free RAM in Pythonista.
Does any of you have a suggestion?
Hi, the following script restores the default 'sys.modules' list after a lot of library imports (sometimes it could be useful, or not?).
I can't see any way to directly replace the list of imported libs with a single command.
import sys

list_default = ['json.decoder', '__future__', 'copy_reg', 'sre_compile', '_sre',
                'encodings', 'site', '__builtin__', 'sysconfig', 'importcompletion',
                '__main__', 'encodings.encodings', 'json.struct', 'abc', 'posixpath',
                '_weakrefset', 'pykit_io', 'errno', '_json', 'encodings.codecs',
                'sre_constants', 're', 'json', '_abcoll', 'types', '_codecs',
                'encodings.__builtin__', '_struct', '_warnings', 'json._json',
                'genericpath', 'stat', 'zipimport', '_sysconfigdata', 'warnings',
                'UserDict', 'encodings.ascii', 'json.json', 'encodings.utf_8',
                'json.sys', 'sys', 'json.scanner', 'imp', 'codecs', 'json.encoder',
                'pythonista_startup', 'os.path', 'struct', 'json.re', '_locale',
                'signal', 'traceback', 'linecache', 'posix', 'encodings.aliases',
                'exceptions', 'sre_parse', 'os', '_weakref', '_debugger_ui']

list_new = sys.modules.keys()
size_default = len(list_default)
list_of_imported_modules = []
i = 0
while i < len(list_new):
    flag = False
    j = 0
    while (j < size_default) & (flag == False):
        if list_new[i] == list_default[j]:
            i = i + 1
            flag = True
        else:
            j = j + 1
    if flag == False:
        list_of_imported_modules.append(list_new[i])
        del list_new[i]
for i in range(len(list_of_imported_modules)):
    del sys.modules[list_of_imported_modules[i]]
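For what it's worth, the same restore-to-default logic can be written much more compactly with sets. Here is a minimal sketch as a function (restore_default is a made-up name; the dict argument stands in for sys.modules):

```python
def restore_default(modules, default):
    """Remove every entry of 'modules' whose name is not in 'default' (sketch)."""
    for name in set(modules) - set(default):
        del modules[name]
    return modules
```

Calling restore_default(sys.modules, list_default) should have the same effect as the nested-loop version above.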
'list_default' is the list of imported libs (after 1-2 minutes of inactivity) following a full restart of the Pythonista app. Maybe 'list_default' should be adapted for different Python environments (I'm thinking of a pythonista_startup.py that imports some libs immediately after a full restart of the app).
I saved the script as a wrench action and I will use it to test if I can solve some issues related to imported or to be imported libs without closing and restarting Pythonista.
I will perform some RAM tests on memory release when sys.modules returns to the default list.
Can someone suggest a library (or a link in this forum) for RAM checking in Pythonista? I've tried psutil but it doesn't work in Pythonista.
Does anyone know whether 'resource.getrusage(resource.RUSAGE_SELF).ru_maxrss' is a valid alternative?
I'm thinking of a way to delete from sys.modules only a user-chosen library with all its dependencies (sub-imports), maybe with a function like:
deimport('mypackage') ## to del mypackage and all its new imported libs from sys.modules
Please kindly tell me if I'm doing something useless that can be done in a very different way. Target: to create a script for Pythonista that simulates a full Pythonista restart as closely as possible.
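A deimport() along those lines could look like this minimal sketch (it only drops entries from sys.modules; it cannot free objects that are still referenced elsewhere, and the name deimport is made up):

```python
import sys

def deimport(name):
    """Remove 'name' and its submodules (name.*) from sys.modules (sketch)."""
    doomed = [m for m in list(sys.modules)
              if m == name or m.startswith(name + '.')]
    for m in doomed:
        del sys.modules[m]
    return doomed  # report what was removed
```

For example, deimport('json') would remove 'json' together with 'json.decoder', 'json.scanner', and so on.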
You might try also running gc.collect() a few times in a row to ensure everything that can be collected gets collected.
It is not clear to me whether resource.getrusage is giving you what you think it is. I believe ru_maxrss may be the maximum size (since the start of the process?), so it never decreases. Also, for example, if you clear stuff out and then allocate a big variable, you will see it doesn't increase.
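That matches the documented behaviour: ru_maxrss is the peak resident set size, a high-water mark, so it can only grow (note the units differ by platform: kilobytes on Linux, bytes on macOS/iOS). A quick sketch to see it:

```python
import resource

before = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
big = bytearray(64 * 1024 * 1024)  # allocate ~64 MB to push the peak up
peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
del big                            # release the memory again
after = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
# 'after' stays at 'peak' even though the memory was released
```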
The _get_task_info method (which maybe doesn't work on 64-bit) does produce reliable results, and repeated calls show different phases of gc at work, or perhaps iOS memory management at work.
Certainly, deleting modules which hold large Python variables can reduce memory. But modules that load C libraries, or allocate things on the C side, probably would not. Also, when mixing C and Python modules, you might end up fragmenting the heap in an unrecoverable way, so it might be good to import all built-in modules at the start of your session.
By the way, here is a method that worked for me when developing some matplotlib backend work. The problem was that matplotlib.use is only allowed once, but I needed to clear things during development:
import sys
import os

def clear_backend(backend):
    clearlist = ('matplotlib', 'pylab', 'backend', 'overlay')
    modules = []
    for module in sys.modules:
        if module.startswith(clearlist):
            modules.append(module)
    for module in modules:
        try:
            sys.modules.pop(module)
        except KeyError:
            pass
    import matplotlib
    matplotlib.use(backend)
    import matplotlib.pyplot as plt

# for development: clear out matplotlib
if 'matplotlib.pyplot' in sys.modules:
    import matplotlib
    try:
        if False \
           or matplotlib.mtime < os.path.getmtime('backend_pythonista.py') \
           or matplotlib.mtime < os.path.getmtime('overlay.py'):
            print('Please wait, clearing out backend')
            clear_backend('agg')  # just to make sure we eliminate any backend references in mpl
            clear_backend('module://backend_pythonista')
    except AttributeError:
        pass
else:
    import matplotlib
    matplotlib.use('module://backend_pythonista')
Here, I only clear the modules that need to be cleared (anything starting with matplotlib., pylab., and my own files), and I only do that when I detect that my two files are newer than the import time, thus avoiding unnecessary reimports (matplotlib is slow to import).
@JonB Thank you very much for sharing your code and for your interest! I will try it in the next few days (Merry Christmas!).
In general, with a Python distribution (computer or Pythonista), does
del sys.modules['some_lib']
free the Python interpreter's RAM after execution?
Or does it simply delete the item from the dict, with the only effect of removing the reference that the Python interpreter uses when looking up modules?
del sys.modules[something] behaves exactly like any other del - it removes the module object from the sys.modules dict, but that does not guarantee that any RAM will be freed. If there are no other references to the module, the module itself will probably be garbage-collected, but that doesn't mean that the module's contents will all be removed. For example, in this code:
import sys
import mymodule
c = mymodule.MyClass
del mymodule
del sys.modules["mymodule"]
the mymodule object will probably be garbage-collected, but MyClass will stay alive, because we stored it in the c variable. This is a very simple example of how this might happen; in reality the reasons why an object is kept alive are often more complicated.
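That effect can be reproduced without a real module on disk, using a dynamically created module as a stand-in for the hypothetical 'mymodule' and 'MyClass' from the example:

```python
import sys
import types

# build a throwaway module object standing in for 'mymodule'
mod = types.ModuleType('mymodule')
exec('class MyClass(object): pass', mod.__dict__)
sys.modules['mymodule'] = mod

c = mod.MyClass              # keep a reference to the class
del mod
del sys.modules['mymodule']

# the module is gone from sys.modules, but the class is still fully usable
obj = c()
```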
@dgelessus, I don't understand what I am talking about, but it surprises me that the gc module just does not have a method like purge_all - of course, use at your own risk. Maybe some caveats, or parameters to give some control over what is purged, would be required, maybe like gc.purge(gc.UNREACHABLE) or something like that. But as I say, I don't really understand it; it's just what I perceive should happen.
Modules that just contain Python variables probably get gc'd, though you may need to run gc a few times until things settle out (gc has multiple phases - each time an object is gc'd, it might expose other objects that can be gc'd). Though it is possible to create variables that don't get gc'd. But there is nothing that forces C allocations to get freed. Also, I believe that anything that loads an objc bundle or dylib (everything on iOS is probably static) probably never gets unloaded. I also feel that it is very easy to create memory leaks using objc_util (IIRC the method cache creates a reference cycle).
That said, I think in most cases you just want to clear out modules that retain state, so you can start fresh - I have only ever run out of memory doing heavy image/video processing.
@Phuket2 The garbage collector already does its best to remove all unreachable objects. If you want to see which objects the garbage collector cannot remove even though they are unreachable, look at the gc.garbage list. However, in recent Python versions there are basically no cases where that happens anymore (see the gc module docs for details).
It's impossible to make the garbage collector safely free any more objects than it currently does. Objects can only be safely freed if they are unreachable. If a reachable object would be freed, it would still be accessible from Python somewhere, and if Python code tried to access it, bad things would happen. Most likely Python would crash - or worse, behave incorrectly without crashing.
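To make "unreachable" concrete: two objects that reference each other but are no longer reachable from any name form a cycle that reference counting alone cannot free; that is exactly what the cycle collector exists for. A minimal sketch (Node is a made-up class for illustration):

```python
import gc

gc.disable()     # stop automatic collection so we control when it runs
gc.collect()     # start from a clean slate

class Node(object):
    pass

a = Node()
b = Node()
a.partner = b
b.partner = a    # reference cycle: a <-> b
del a, b         # unreachable now, but the refcounts are still nonzero

found = gc.collect()   # only the cycle collector can free these objects
gc.enable()
```

found reports the number of unreachable objects the collector discovered, which here includes the two Node instances.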
NSBundle instances have an unload method, which probably ends up calling dlclose on the underlying library handle. That will unload the library if it is not referenced from elsewhere. However, unloading C libraries is REALLY unsafe. Since C has no garbage collector or anything, there's no way to tell if there's a pointer somewhere that points into the library's memory, which would become invalid when the library is unloaded. If you know for certain that no code other than your own uses the library, you can of course do your own cleanup first to ensure that you aren't referencing the library anymore. But if there's other code that uses the library, that's basically impossible to do.
@dgelessus, thanks for the reply. Just out of interest I will look a little deeper. The disconnect for me is understanding what an unreachable object is - e.g. scope, lost references, etc. I guess this is one big concept you have to grasp to understand memory management, along with the reference counting system. I understand there is a reference counting system, but I am sure that to really understand it you have to delve deeper and understand weak references, and have the confidence/knowledge, when writing code, to know whether you are about to create a reference that you need to keep track of.
If I wrote Python for a living, there would be no doubt I would study and understand these concepts.
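Weak references are actually easy to see in action: a weakref does not keep its target alive, so as soon as the last strong reference disappears, CPython's reference counting frees the object and the weak reference starts returning None. A small sketch (Thing is a made-up class; behaviour shown is CPython's immediate refcount-based deallocation):

```python
import weakref

class Thing(object):
    pass

obj = Thing()
ref = weakref.ref(obj)   # weak reference: does not count towards the refcount
alive = ref() is obj     # True while obj still exists

del obj                  # last strong reference gone; CPython frees it at once
gone = ref() is None     # the weak reference now reports the object is dead
```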
@dgelessus Thank you for your explanation with the example code. From it, I've understood that even if I delete a module from sys.modules, something related to that module can still exist elsewhere in the interpreter's memory (RAM), for example if I save something related to the deleted module in a global or local variable.
But you explained to me that if I long-press the Clear key at the top right of the Python console window, it resets the environment (it deletes all global variables of the interpreter).
So, does a Pythonista function/command exist to reset the environment programmatically, without long-pressing the Clear key? This function, if it exists, could be added to the sys.modules reset routine to also delete the variables.
About the routine I posted: instead of resetting the entire sys.modules list to the default one, it would be smarter to have a function that, when executed by the user, deletes the imported modules (and all their sub-modules) that the user wants to remove from memory, plus deletes all global/local variables that only contain objects related to those modules (the effect of long-pressing the Clear key).
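I don't know of a public Pythonista API for the Clear action either, but the "delete the globals" half can be sketched as a plain function over a namespace dict (clear_namespace is a made-up name; in practice you would pass globals()):

```python
def clear_namespace(ns):
    """Delete every non-dunder name from the namespace dict 'ns' (sketch)."""
    for name in [n for n in ns if not n.startswith('__')]:
        del ns[name]
```

clear_namespace(globals()) would wipe user-defined globals while keeping __name__, __builtins__, and the other dunder entries intact.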
@Phuket2 Hi, unfortunately I don't have any deep knowledge or experience in computer science, I have not done specific studies about it and I can't understand a lot of things. But I can assure you that, compared to years of Fortran 90 programming for my environmental-control studies, I am understanding more things now with a programming tool like Pythonista, posting my questions in this nice forum followed by knowledgeable and reasonable people who do not hesitate to offer some lines of code.
@Matteo, that's great. I also really enjoy the conversations here; they are so enlightening. Even if you are wrong, people are present and helpful. It's nice to be able to ask questions without being intimidated or made to feel stupid.
Sort of related, but not really: I was listening to a podcast today about web server optimisation. They mentioned that some companies actually turn off the gc except for the reference counting part and periodically just restart the service; apparently this can speed things up. But this is in a system made up of microservices, where a service can be elegantly restarted and other services are in place to handle its requests while it is down. Sorry, it's a bit of a tangent, but I found it interesting. Here is the link to the podcast if you are interested.
@Phuket2 Thank you for the suggestion about the podcast. However, I'm not an expert in programming and computer science, and in general, due to little free time, I limit myself to digging into only the things I need at a given moment.
Anyway thank you!
@dgelessus or someone else, hi, sorry, could you kindly tell me if a function exists to simulate the long press of the Clear key in the Pythonista console (which resets the environment and deletes all global variables), so that it can be used programmatically in scripts?
Thank you so much
@Matteo I don't know for sure what the "Reset Environment" option does internally. Most likely it uses the pykit_preflight.py script, which Pythonista automatically runs when you press the "play" button on a script (before the script itself runs). You can open pykit_preflight.py with this code:
import editor
import os
editor.open_file(os.path.join(os.path.dirname(os.path.dirname(os.__file__)), "pykit_preflight.py"), new_tab=True)
@dgelessus Hi, thank you for your help. I will try some tests on resetting global variables programmatically, following your suggestion.