omz:forum

    new file/project couldn't be created

    Pythonista
    • pavlinb

      Code at line 49:

      ml_model = MLModel.modelWithContentsOfURL_error_(c_model_url, None)
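      A minimal sketch of how that call is typically set up with objc_util; the names c_model_url and ml_model mirror the snippet above, while the NSURL handling and the explicit compile step are assumptions about the usual pattern, not the original script:

      import os
      from objc_util import ObjCClass

      NSURL = ObjCClass('NSURL')
      MLModel = ObjCClass('MLModel')

      # Core ML first compiles the .mlmodel into a temporary .mlmodelc directory...
      model_url = NSURL.fileURLWithPath_(os.path.abspath('OCR.mlmodel'))
      c_model_url = MLModel.compileModelAtURL_error_(model_url, None)

      # ...and modelWithContentsOfURL_error_ expects that compiled URL.
      ml_model = MLModel.modelWithContentsOfURL_error_(c_model_url, None)
      print(ml_model)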
    • pavlinb

      Before the crash, Documents & Data in iPhone storage was about 150 MB.
      After the crash it was 3 GB.
    • cvp

      Did you test with a very long text?

      Do you use the beta? Because in my (beta) version, line 772 of objc_util relates to restype, and I don't know if it is the same in your version.

      Does the used space increase by some GB with each test?
    • pavlinb @cvp

      @cvp said:

      Did you test with a very long text?
      What do you mean?

      Do you use the beta? Because in my (beta) version, line 772 of objc_util relates to restype, and I don't know if it is the same in your version.
      No, I use the official version from the App Store. I was on the beta when this happened for the first time.

      Does the used space increase by some GB with each test?
      No, space was around 150 MB after each test.
    • cvp @pavlinb

      @pavlinb Could you tell me what line 772 of objc_util is in the App Store version?

      So you don't have a crash every time.

      Do you use another mlmodel than OCR.mlmodel?

      Does the image contain text with a lot of characters?
    • pavlinb @cvp

      @cvp said:

      @pavlinb Could you tell me what line 772 of objc_util is in the App Store version?
      res = objc_msgSend(cls, sel(self.sel_name), *args)

      So you don't have a crash every time.
      Only with the script that reads/writes to memory.

      Do you use another mlmodel than OCR.mlmodel?
      Yes, I used other models; it doesn't depend on the model.

      Does the image contain text with a lot of characters?
      It looks like the script thought the image had a lot of characters.
    • cvp @pavlinb

      @pavlinb You said that memory was 150 MB after a test; was that test not trying to write to memory?
      I don't understand the difference between the two tests of the mlmodel, one writing and the other one not.
    • pavlinb

      Memory was about 150 MB when the test finished without a crash - I checked the memory regularly.

      When Pythonista crashed, I checked the memory again - it was 3 GB.

      I tried to OCR an image in which the algorithm recognizes a lot of characters. Maybe this is the main reason.

      Do you want a copy of that image to try?

      Now I can't edit scripts and can't create new ones in Pythonista.

      Some scripts still work.

      But those that use mlmodels crash when the script tries to load the model.
    • cvp @pavlinb

      @pavlinb I don't want to take the risk of needing to reinstall, sorry.
      If you remove the Pythonista app from the active apps list and wait some time, does the memory vary?
    • pavlinb @cvp

      @cvp said:

      @pavlinb I don't want to take the risk of needing to reinstall, sorry.

      Sure, no problem.

      If you remove the Pythonista app from the active apps list and wait some time, does the memory vary?

      After 1 hour the memory is the same - 3 GB.
    • cvp @pavlinb

      @pavlinb I think it is now time to ask for help from our big gurus...
    • JonB

      I wonder, can you check the Pythonista temp directory?

      Are you using a local file when you load your model? Or an internet URL?
    • JonB

      Also... what is your model_url? We need to check that it is actually valid.
    • pavlinb @JonB

      @JonB said:

      I wonder, can you check the Pythonista temp directory?

      Sure, what do you need?

      Are you using a local file when you load your model? Or an internet URL?

      I use local files.
    • JonB

      I mean check it for excessively large temp files.

      What is your exact model_url?

      Can you get a small model to work?

      Have you already pre-compiled the model?
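      A quick way to run that check, as a rough sketch (the 50 MB threshold is just an illustrative cut-off):

      import os
      import tempfile

      # Walk Pythonista's temp directory and report anything suspiciously large.
      tmp = tempfile.gettempdir()
      for root, dirs, files in os.walk(tmp):
          for name in files:
              path = os.path.join(root, name)
              try:
                  size = os.path.getsize(path)
              except OSError:
                  continue
              if size > 50 * 1024 * 1024:
                  print('%.1f MB  %s' % (size / 1e6, path))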
    • JonB

      https://alexsosn.github.io/ml/2017/06/09/Core-ML-will-not-Work-for-Your-App.html

      This mentions that complex models can turn into several GB on the device. You might also try backing up afterwards and looking at what files get created and where. Possibly, I'm thinking that if you put your file into a folder starting with a period, that keeps Pythonista from showing/indexing it; see the sketch below.
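      A minimal sketch of that idea, assuming the compiled model ends up as OCR.mlmodelc next to the script (the '.mlmodels' folder name is just an example):

      import os
      import shutil

      hidden_dir = '.mlmodels'  # dot-prefixed, so Pythonista does not show/index it
      os.makedirs(hidden_dir, exist_ok=True)

      src = 'OCR.mlmodelc'
      dst = os.path.join(hidden_dir, 'OCR.mlmodelc')
      if os.path.isdir(src) and not os.path.exists(dst):
          shutil.move(src, dst)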
    • cvp

      If he uses the mlmodel I found, it is here,

      but I also used it without any problems, with images of short texts.
    • pavlinb

      Yes, I used the mentioned model.

      And with clean images with few symbols, all is OK.

      Probably the problem occurs with complex images.
    • cvp @JonB

      @JonB said:

      Have you already pre-compiled the model?

      The omz code compiles the model before using it.
    • cvp

      For each character, the compiled model is generated as
      file:///private/var/mobile/Containers/Data/Application/C285FD04-6489-45E5-A6C5-D4A44D300BBC/tmp/(A%20Document%20Being%20Saved%20By%20Pythonista3%20643)/OCR.mlmodelc/

      The omz code needs to be improved so this compilation is done only once, not once for each character... See the sketch below.
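      A minimal sketch of compiling the model once and caching the resulting .mlmodelc; the cache folder name and the NSURL/MLModel calls via objc_util are assumptions based on the snippets above, not omz's actual code:

      import os
      import shutil
      from objc_util import ObjCClass

      NSURL = ObjCClass('NSURL')
      MLModel = ObjCClass('MLModel')

      def load_model(model_path, cache_dir='.compiled_models'):
          # Reuse the compiled .mlmodelc if it already exists.
          cached = os.path.join(cache_dir, os.path.basename(model_path) + 'c')
          if not os.path.isdir(cached):
              src_url = NSURL.fileURLWithPath_(os.path.abspath(model_path))
              # Core ML compiles into a temporary directory...
              compiled_url = MLModel.compileModelAtURL_error_(src_url, None)
              os.makedirs(cache_dir, exist_ok=True)
              # ...so move the result somewhere permanent.
              shutil.move(str(compiled_url.path()), cached)
          compiled = NSURL.fileURLWithPath_(os.path.abspath(cached))
          return MLModel.modelWithContentsOfURL_error_(compiled, None)

      # Compile once, then reuse the same MLModel instance for every character.
      ml_model = load_model('OCR.mlmodel')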