omz:forum

    Videos

    Pythonista
    • Webmaster4o

      I found this code here

      -(void)getAllImagesFromVideo
      {
         // imagesArray, times and imageGenerator are instance variables
         imagesArray = [[NSMutableArray alloc] initWithCapacity:375];
         times = [[NSMutableArray alloc] initWithCapacity:375];
      
         for (Float64 i = 0; i < 15; i += 0.04) // 25 fps over 15 seconds of video
         {
             // Timescale 600 represents 1/25 s exactly
             [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(i, 600)]];
         }
      
         [imageGenerator generateCGImagesAsynchronouslyForTimes:times completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
             if (result == AVAssetImageGeneratorSucceeded)
             {
                 [imagesArray addObject:[UIImage imageWithCGImage:image]];
                 CGImageRelease(image);
             }
         }];
      }
      

      Can anyone help me convert it to use objc_util in Python? Is it even possible?

      • omz

        @JonB I've experimented in that direction. On a 64-bit device, I've actually been able to construct a block using just ctypes. There are probably some gotchas, but it seems to work... I haven't been able to make this work with the 32-bit runtime at all, though; it basically just crashes, and I'm not sure why exactly.
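
        For context, the layout that such a ctypes-built block has to reproduce is the block literal described in Clang's block implementation spec. A minimal sketch of just that layout (the names below are mine, not part of objc_util):

        from ctypes import Structure, POINTER, c_void_p, c_int, c_ulong

        class BlockDescriptor (Structure):
            # Sufficient for a block without copy/dispose helpers
            _fields_ = [('reserved', c_ulong),
                        ('size', c_ulong)]

        class BlockLiteral (Structure):
            _fields_ = [('isa', c_void_p),        # &_NSConcreteStackBlock or &_NSConcreteGlobalBlock
                        ('flags', c_int),         # e.g. BLOCK_HAS_STRET for struct-returning blocks
                        ('reserved', c_int),
                        ('invoke', c_void_p),     # plain C function; its first argument is the block itself
                        ('descriptor', POINTER(BlockDescriptor))]

        The function stored in invoke receives a pointer to the block literal first, followed by the block's declared arguments.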

        • JonB

          @omz ok, a little reading, and I got NSArray's

          enumerateObjectsUsingBlock_
          

          working on a 32-bit machine. Admittedly, that is a fairly simple block, without any imported variables, copy helpers, etc... but then again, it is not clear that such features are required for any block-based protocols. If we need references to other objects, we can create and keep them in Python.
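
          For anyone following along, here is a rough, untested sketch of that kind of minimal block, built by hand with ctypes and passed to enumerateObjectsUsingBlock: through objc_msgSend (JonB's actual working code is linked in the next post; the names and exact call sequence here are only illustrative):

          from ctypes import (CDLL, CFUNCTYPE, POINTER, Structure, addressof,
                              c_bool, c_char_p, c_int, c_ulong, c_void_p, sizeof)
          from objc_util import ns

          c = CDLL(None)
          c.sel_registerName.restype = c_void_p
          c.sel_registerName.argtypes = [c_char_p]
          c.objc_msgSend.restype = c_void_p
          c.objc_msgSend.argtypes = [c_void_p, c_void_p, c_void_p]

          # Callback for -[NSArray enumerateObjectsUsingBlock:]:
          # void (^)(id obj, NSUInteger idx, BOOL *stop), plus the implicit block pointer.
          InvokeFunc = CFUNCTYPE(None, c_void_p, c_void_p, c_ulong, POINTER(c_bool))

          class BlockDescriptor (Structure):
              _fields_ = [('reserved', c_ulong), ('size', c_ulong)]

          class BlockLiteral (Structure):
              _fields_ = [('isa', c_void_p), ('flags', c_int), ('reserved', c_int),
                          ('invoke', InvokeFunc), ('descriptor', POINTER(BlockDescriptor))]

          def print_item(_block, obj, idx, _stop):
              # obj is a raw object pointer; objc_util.ObjCInstance(obj) would wrap it for method calls
              print('item %d: 0x%x' % (idx, obj))

          # Keep Python references to everything the block points at, or it gets collected.
          invoke = InvokeFunc(print_item)
          descriptor = BlockDescriptor(0, sizeof(BlockLiteral))
          block = BlockLiteral(addressof((c_void_p * 32).in_dll(c, '_NSConcreteStackBlock')),
                               0, 0, invoke, POINTER(BlockDescriptor)(descriptor))

          array = ns(['a', 'b', 'c'])
          c.objc_msgSend(array.ptr, c.sel_registerName(b'enumerateObjectsUsingBlock:'),
                         addressof(block))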

          • JonB

            https://github.com/jsbain/objc_hacks/blob/master/blocktest.py
            Working 32-bit example of using blocks.

            • omz

              @JonB Wow, that's amazing!

              I'll see if I can put together something that works on 32-bit and 64-bit for the next beta...

              • JonB

                @Webmaster4o ok, I really had low expectations... but this actually worked:

                Use CaptureMedia to record a video (I don't think it works with existing videos, but that can change), then run this script. Right now it just cycles through frames once per second, but you could use this to extract an image at an arbitrary time. The CMTime takes an integer value and an integer timescale, where value/timescale = time in seconds. For some reason, I was only able to extract images at 1-second intervals; using a larger timescale to get finer resolution had no effect.
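
                Since the script itself isn't reproduced above, the following is an untested approximation of the idea rather than JonB's code: it grabs a single frame from a local video file with AVAssetImageGenerator through objc_util. The file name 'video.mov' and the hand-written CMTime struct are assumptions, and the explicit restype/argtypes are there because objc_util couldn't handle CMTime automatically at this point.

                from ctypes import CDLL, Structure, c_int64, c_int32, c_uint32, c_void_p
                from objc_util import ObjCClass

                c = CDLL(None)

                class CMTime (Structure):
                    # value/timescale = seconds; flags=1 is kCMTimeFlags_Valid
                    _fields_ = [('value', c_int64), ('timescale', c_int32),
                                ('flags', c_uint32), ('epoch', c_int64)]

                NSURL = ObjCClass('NSURL')
                AVAsset = ObjCClass('AVAsset')
                AVAssetImageGenerator = ObjCClass('AVAssetImageGenerator')
                UIImage = ObjCClass('UIImage')

                asset = AVAsset.assetWithURL_(NSURL.fileURLWithPath_('video.mov'))  # hypothetical file
                generator = AVAssetImageGenerator.assetImageGeneratorWithAsset_(asset)

                t = CMTime(5, 1, 1, 0)   # 5/1 = 5 seconds into the clip
                img_ref = generator.copyCGImageAtTime_actualTime_error_(
                    t, None, None, restype=c_void_p, argtypes=[CMTime, c_void_p, c_void_p])
                frame = UIImage.imageWithCGImage_(c_void_p(img_ref))
                print(frame)

                # copyCGImageAtTime: follows the Copy rule, so release the CGImage once UIImage has it
                c.CGImageRelease.restype = None
                c.CGImageRelease.argtypes = [c_void_p]
                c.CGImageRelease(img_ref)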

                • omz

                  @JonB Haven't tried this yet, but maybe you can get finer resolution by setting the requestedTimeToleranceBefore/After properties of AVAssetImageGenerator?
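
                  For reference, setting those tolerances from objc_util might look roughly like this (untested; it reuses the hand-written CMTime struct from the sketch above and assumes CoreMedia's kCMTimeZero symbol is resolvable in-process):

                  from ctypes import CDLL, Structure, c_int64, c_int32, c_uint32

                  class CMTime (Structure):
                      _fields_ = [('value', c_int64), ('timescale', c_int32),
                                  ('flags', c_uint32), ('epoch', c_int64)]

                  kCMTimeZero = CMTime.in_dll(CDLL(None), 'kCMTimeZero')

                  # 'generator' is an AVAssetImageGenerator instance (see the earlier sketch)
                  generator.setRequestedTimeToleranceBefore_(kCMTimeZero, restype=None, argtypes=[CMTime])
                  generator.setRequestedTimeToleranceAfter_(kCMTimeZero, restype=None, argtypes=[CMTime])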

                  • JonB

                    Tried that, as well as AVURLAssetPreferPreciseDurationAndTiming.
                    I have read about similar issues on SO; the solution that worked for some was to create a video composition for a narrow range and then use the method above to get an image, but I have not tried that yet.

                    As an aside, it seems that setting the restype manually to a struct type does not force the objc_msgSend_stret invocation on 32-bit. For instance, asset.duration() didn't work even with restype set manually; I had to resort to calling objc_msgSend_stret myself.
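
                    For anyone who hits the same thing, a rough sketch of what calling objc_msgSend_stret by hand looks like (untested; this applies to the 32-bit runtime only, since arm64 has no _stret variant, and the CMTime layout plus the asset variable are carried over from the sketches above):

                    from ctypes import (CDLL, Structure, addressof, c_char_p,
                                        c_int64, c_int32, c_uint32, c_void_p)

                    c = CDLL(None)
                    c.sel_registerName.restype = c_void_p
                    c.sel_registerName.argtypes = [c_char_p]
                    # void objc_msgSend_stret(void *stretAddr, id self, SEL op, ...)
                    c.objc_msgSend_stret.restype = None
                    c.objc_msgSend_stret.argtypes = [c_void_p, c_void_p, c_void_p]

                    class CMTime (Structure):
                        _fields_ = [('value', c_int64), ('timescale', c_int32),
                                    ('flags', c_uint32), ('epoch', c_int64)]

                    duration = CMTime()
                    # 'asset' is an ObjCInstance of an AVAsset (see the frame-extraction sketch above)
                    c.objc_msgSend_stret(addressof(duration), asset.ptr, c.sel_registerName(b'duration'))
                    print('%f seconds' % (duration.value / float(duration.timescale)))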

                    • Webmaster4o

                      Wow, it works! Didn't expect it to! I don't understand HOW it works, but it does!

                      • omz

                        As an aside, it seems that setting the restype manually to a struct type does not force the objc_msgSend_stret invocation on 32-bit. For instance, asset.duration() didn't work even with restype set manually; I had to resort to calling objc_msgSend_stret myself.

                        Thanks, I'll look into that. Btw, I've made some progress with the type encoding parser, and it should be able to handle most structs automatically in the next build, so you don't necessarily have to create Structure subclasses for things like CMTime.
