omz:forum


    Welcome!

    This is the community forum for my apps Pythonista and Editorial.

    For individual support questions, you can also send an email. If you have a very short question or just want to say hello — I'm @olemoritz on Twitter.


    Videos

    Pythonista
    • reefboy1
      reefboy1 last edited by

      @ccc I wasn't able to download the program from GitHub. What I want to do is pick a video and have it play on screen as an actual watchable video. I'm working on something with the ui module that might answer my question. I'll let you know how it works once I'm done with the finishing touches!

      • Webmaster4o
        Webmaster4o last edited by

        All the links are dead.

        • JonB
          JonB last edited by

          So far, the only method involves basically running an HTTP server, uploading from Photos to the server, and then showing the video in a webview. Not ideal.

          It may be possible with ctypes in the beta.

          tjferry's repo seems to be offline, but ccc's fork is still there:
          https://github.com/cclauss/CaptureMedia

          • Webmaster4o
            Webmaster4o last edited by Webmaster4o

            It'd be great if @omz could include a video module in Pythonista, like OpenCV or even pyffmpeg, which could be used to extract frames from a video for frame-by-frame processing with PIL, and then put them back together into video form. I don't know how feasible this is in objc_util.

            • omz
              omz last edited by

              I think a lot of interesting stuff in that direction might become possible with objc_util and AVFoundation, but I haven't really looked into this very much yet, mostly because I just don't have a lot of experience with video APIs.

              • Webmaster4o
                Webmaster4o last edited by Webmaster4o

                OK, thanks for pointing me in the right direction as far as that's concerned. I'll see what I can figure out, but it probably won't be much, as I have no experience with Objective-C.

                • techteej
                  techteej @JonB last edited by techteej

                  @JonB I had taken this repo down, since @omz had made a script that was a lot simpler right here.

                  • JonB
                    JonB last edited by

                    @omz AVFoundation seems to use a lot of blocks.

                    I haven't tried it yet, but is it possible to create NSBlocks at runtime from Python?

                    See, for example:
                    http://stackoverflow.com/questions/20134616/how-are-nsblock-objects-created
                    http://www.cocoawithlove.com/2009/10/how-blocks-are-implemented-and.html
                    http://stackoverflow.com/questions/17813870/how-does-a-block-capture-the-variables-outside-of-its-enclosing-scope

                    It sounds like blocks are just C structs, which we can create with ctypes: a pointer to an invoke function, plus a structure with pointers to (or values of) the captured variables. Memory management may be tricky.
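As an editorial aside: the layout JonB is describing here is documented in the clang blocks ABI, and can be declared with plain ctypes like this. A minimal sketch (runs anywhere, since it only declares the layout); on-device, isa would be set to _NSConcreteStackBlock or _NSConcreteGlobalBlock, and invoke to a ctypes function pointer:

```python
import ctypes

# Block descriptor: reserved word plus the total size of the block literal
# (per the clang blocks ABI; copy/dispose helpers would follow for blocks
# that capture object references).
class BlockDescriptor(ctypes.Structure):
    _fields_ = [('reserved', ctypes.c_ulong),
                ('size', ctypes.c_ulong)]

# The block literal itself. Captured variables, if any, are appended
# after the descriptor field.
class BlockLiteral(ctypes.Structure):
    _fields_ = [('isa', ctypes.c_void_p),       # e.g. _NSConcreteGlobalBlock
                ('flags', ctypes.c_int),
                ('reserved', ctypes.c_int),
                ('invoke', ctypes.c_void_p),    # C function pointer; its first
                                                # argument is the block itself
                ('descriptor', ctypes.POINTER(BlockDescriptor))]

desc = BlockDescriptor(0, ctypes.sizeof(BlockLiteral))
```

On the Objective-C side, anything that accepts a block just receives a pointer to such a struct, which is why a ctypes-built one can be passed where `id` is expected.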

                    • Webmaster4o
                      Webmaster4o last edited by Webmaster4o

                      I found this code here

                      -(void) getAllImagesFromVideo
                      {
                         imagesArray = [[NSMutableArray alloc] initWithCapacity:375];
                         times = [[NSMutableArray alloc] initWithCapacity:375];
                      
                         for (Float64 i = 0; i < 15; i += 0.033) // For 25 fps in 15 sec of Video
                         {
                             [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(i, 60)]];
                         } 
                      
                         [imageGenerator generateCGImagesAsynchronouslyForTimes:times completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
                             if (result == AVAssetImageGeneratorSucceeded)
                             {
                                 [imagesArray addObject:[UIImage imageWithCGImage:image]];
                                 CGImageRelease(image);
                             }
                         }];
                      }
                      

                      Can anyone help me convert it to use objc_util in Python? Is it even possible?
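Editorial note: an untested sketch of what a port might look like with Pythonista's objc_util. It sidesteps the block-based generateCGImagesAsynchronouslyForTimes: call by using the synchronous copyCGImageAtTime:actualTime:error: method instead, which needs no completion handler; the CMTime layout and the `frame_at` helper are assumptions, not a verified drop-in:

```python
import ctypes

try:
    from objc_util import ObjCClass, nsurl
    HAVE_OBJC = True
except ImportError:
    HAVE_OBJC = False  # not running inside Pythonista on iOS

# CMTime field layout, per CoreMedia
class CMTime(ctypes.Structure):
    _fields_ = [('value', ctypes.c_int64),
                ('timescale', ctypes.c_int32),
                ('flags', ctypes.c_uint32),
                ('epoch', ctypes.c_int64)]

kCMTimeFlags_Valid = 1

def frame_at(video_path, seconds, timescale=600):
    """Return a CGImage for the frame at `seconds` (sketch, iOS-only)."""
    asset = ObjCClass('AVAsset').assetWithURL_(nsurl(video_path))
    gen = ObjCClass('AVAssetImageGenerator').assetImageGeneratorWithAsset_(asset)
    # omz's later suggestion in this thread: the requestedTimeToleranceBefore/
    # After properties could be set here for frame-exact results (not shown).
    t = CMTime(int(seconds * timescale), timescale, kCMTimeFlags_Valid, 0)
    # Synchronous variant: no completion block needed
    return gen.copyCGImageAtTime_actualTime_error_(t, None, None)
```

Looping `frame_at` over a range of times would approximate the ObjC snippet's frame extraction, with the images collected in a plain Python list instead of an NSMutableArray.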

                      • omz
                        omz last edited by

                        @JonB I've experimented in that direction. On a 64-bit device, I've actually been able to construct a block using just ctypes. There are probably some gotchas, but it seems to work... I haven't been able to make this work with the 32-bit runtime at all though, it basically just crashes, not sure why exactly.

                        • JonB
                          JonB last edited by

                          @omz OK, after a little reading, I got NSArray's

                            enumerateObjectsUsingBlock_

                          working on a 32-bit machine. Admittedly, that is a fairly simple block, without any imported variables, copy helpers, etc., but then again, it is not clear that such features are required for any block-based protocols. If we need references to other objects, we can create and keep them in Python.
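Editorial note: the ObjCBlock helper that later Pythonista releases added to objc_util (after the experimentation in this thread) wraps exactly this pattern. A hedged sketch of what JonB describes, using that helper; it only actually runs inside Pythonista, so the import is guarded:

```python
import ctypes

try:
    from objc_util import ObjCBlock, ObjCInstance, ns
    HAVE_OBJC = True
except ImportError:
    HAVE_OBJC = False  # not running inside Pythonista

def enumerate_nsarray(items):
    """Print each element of an NSArray via a runtime-created block (sketch)."""
    arr = ns(items)  # bridge a Python list to an NSArray

    def block_func(_block, obj, idx, stop):
        # _block is the block pointer itself (the implicit first argument);
        # obj is the element (id), idx the index, stop a BOOL* to abort early.
        print(idx, ObjCInstance(obj))

    # ObjC signature: void (^)(id obj, NSUInteger idx, BOOL *stop)
    block = ObjCBlock(block_func, restype=None,
                      argtypes=[ctypes.c_void_p, ctypes.c_void_p,
                                ctypes.c_ulong, ctypes.c_void_p])
    arr.enumerateObjectsUsingBlock_(block)
```

On-device, `enumerate_nsarray(['a', 'b'])` would print each element with its index; JonB's blocktest.py linked below shows the same idea built by hand from the raw block struct.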

                          • JonB
                            JonB last edited by

                            A working 32-bit example of using blocks:
                            https://github.com/jsbain/objc_hacks/blob/master/blocktest.py

                            • omz
                              omz last edited by

                              @JonB Wow, that's amazing!

                              I'll see if I can put together something that works on 32 and 64 bit for the next beta...

                              • JonB
                                JonB @Webmaster4o last edited by

                                @Webmaster4o OK, I really had low expectations... but this actually worked:

                                Use CaptureMedia to record a video (I don't think it works with existing videos, but that can change), then run this script. Right now it just cycles through frames once per second, but you could use it to extract an image at an arbitrary time. CMTime takes an integer value and an integer timescale, where value/timescale = time in seconds. For some reason, I was only able to extract images at 1-second intervals; using a larger timescale to get finer resolution had no effect.
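Editorial note: the value/timescale arithmetic JonB describes can be illustrated with plain ctypes, independent of iOS. The field layout follows CoreMedia's CMTime; the `seconds` helper is just the division he states:

```python
import ctypes

# CMTime, per CoreMedia: a rational time value
class CMTime(ctypes.Structure):
    _fields_ = [('value', ctypes.c_int64),      # numerator
                ('timescale', ctypes.c_int32),  # units per second
                ('flags', ctypes.c_uint32),
                ('epoch', ctypes.c_int64)]

def seconds(t):
    # value/timescale = time in seconds
    return t.value / t.timescale

# e.g. frame 90 of a 30 fps clip, using a timescale of 30:
t = CMTime(value=90, timescale=30)
print(seconds(t))  # 3.0
```

A larger timescale (Apple commonly uses 600) lets the same struct address sub-frame instants, which is why JonB's 1-second granularity was surprising.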

                                • omz
                                  omz last edited by

                                  @JonB Haven't tried this yet, but maybe you can get finer resolution by setting the requestedTimeToleranceBefore/After properties of AVAssetImageGenerator?

                                  • JonB
                                    JonB last edited by

                                    Tried that, as well as AVURLAssetPreferPreciseDurationAndTiming.
                                    I have read about similar issues on SO; the solution that worked for some was to create a video composition for a narrow range, and then use the method above to get an image, but I have not tried that yet.

                                    As an aside, it seems that setting the restype manually to a struct type does not force the objc_msgSend_stret invocation on 32-bit. For instance, asset.duration() didn't work even with restype set manually; I had to resort to calling objc_msgSend_stret myself.

                                    • Webmaster4o
                                      Webmaster4o last edited by

                                      Wow, it works! Didn't expect it to! I don't understand HOW it works, but it does!

                                      • omz
                                        omz @JonB last edited by

                                        As an aside, it seems that setting the restype manually to a struct type does not force the objc_msgSend_stret invocation on 32-bit. For instance, asset.duration() didn't work even with restype set manually; I had to resort to calling objc_msgSend_stret myself.

                                        Thanks, I'll look into that. Btw, I've made some progress with the type encoding parser, and it should be able to handle most structs automatically in the next build, so you don't necessarily have to create Structure subclasses for things like CMTime.
