Welcome!
This is the community forum for my apps Pythonista and Editorial.
For individual support questions, you can also send an email. If you have a very short question or just want to say hello — I'm @olemoritz on Twitter.
Please HELP! I have done the impossible. I cannot open Pythonista.
-
Hmmm. OMG
I read in some data using JSON and made a list out of all the records I read in. Then I used json.dump to write them to a file, name.txt.
Then I tapped the file in the Pythonista browser. Whatever I wrote out with JSON, Pythonista does not like it: it just sits there trying to read the file. I have tried force-quitting Pythonista, and I have also powered the device off and on, but it still tries to open that file on launch regardless. I looked through Pythonista's settings and found nothing that helps. I am on v1.5.
Any help appreciated; I am basically locked out of Pythonista now.
-
Try entering
pythonista://
in Safari; this should open the app without trying to restore the file you had open. -
@omz, thank you. It worked perfectly. I was having a panic attack :)
-
I was basically trying to concatenate a lot of files I had dumped with JSON into a single file. Even reading the resulting file programmatically gave errors. So now I read them in using JSON, building a list of dicts, then use pickle to write the list of dicts to disk. All works. I can't view the pickle file in Pythonista, but that's fine; reading the pickled file back into memory gives the correct data. I guess I should never have saved JSON files in the first place.
Again, thanks for the help
-
JSON is not a format where you can just concatenate a few files and have it work. If you want to put multiple JSON structures into one file, put them in a JSON array (a Python list) and dump that to JSON. -
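A minimal sketch of that approach (the function name, directory layout, and the assumption that each file holds a single JSON object are mine, not from the thread): read every fragment, collect the parsed values into a list, then dump the list once so the output is a single well-formed JSON array.

```python
import json
import os

def merge_json_files(src_dir, out_path):
    """Combine many single-object JSON files into one JSON array file."""
    records = []
    for name in sorted(os.listdir(src_dir)):
        path = os.path.join(src_dir, name)
        if os.path.isfile(path):
            with open(path) as fh:
                records.append(json.load(fh))  # each file holds one JSON value
    with open(out_path, 'w') as fh:
        json.dump(records, fh)  # one valid JSON array, not concatenated fragments
    return len(records)
```

The key point is that json.dump is called exactly once, on the list, so the result can be read back with a single json.load.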
@dgelessus, I was doing that. Below is the code I was using. The output here is with pickle, but it was the same API call with JSON.
It was meant as a quick and dirty way to put all my JSON data from multiple files into one file. About 1200 files, which is nothing (just IMDb text records).
Anyway, it works with pickle. I assume reading a pickle file back into memory is faster than a JSON file. Hmmm, that bad word again, assume!
def build_new_datafile(out_filename):
    import os, os.path, json, pickle
    # read each file in the dir, they have prev
    # been saved with json.dump.
    # append each read dict into a list.
    # then just write out the whole list.
    lst = []
    data_dir = settings.__EXT_INFO_DIR__
    for f in os.listdir(data_dir):
        fspec = data_dir + f
        if os.path.isfile(fspec):
            fh = open(fspec)
            lst.append(json.load(fh))
            fh.close()
    # write out the new pickled file
    out_dir = settings.__HOME_DIR__
    fspec = out_dir + out_filename
    fh = open(fspec, 'wb')
    fh.truncate()
    pickle.dump(lst, fh)
    fh.close()
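For completeness, reading the pickled list back in is a one-liner; a small sketch (the helper name and file path are made up, not from the thread):

```python
import pickle

def load_records(path):
    """Read back the list of dicts written earlier with pickle.dump."""
    with open(path, 'rb') as fh:
        return pickle.load(fh)
```

Since the file was written in binary mode ('wb'), it must be opened with 'rb' here; opening a pickle file in text mode is a common source of errors.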