Videos
-
I found this code here:

    -(void)getAllImagesFromVideo {
        imagesArray = [[NSMutableArray alloc] initWithCapacity:375];
        times = [[NSMutableArray alloc] initWithCapacity:375];
        // For 25 fps in 15 sec of video
        for (Float64 i = 0; i < 15; i += 0.033) {
            [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(i, 60)]];
        }
        [imageGenerator generateCGImagesAsynchronouslyForTimes:times
            completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                                AVAssetImageGeneratorResult result, NSError *error) {
            if (result == AVAssetImageGeneratorSucceeded) {
                [imagesArray addObject:[UIImage imageWithCGImage:image]];
                CGImageRelease(image);
            }
        }];
    }

Can anyone help me convert it to use objc_util in Python? Is it even possible?
-
@JonB I've experimented in that direction. On a 64-bit device, I've actually been able to construct a block using just ctypes. There are probably some gotchas, but it seems to work... I haven't been able to make this work with the 32-bit runtime at all, though; it basically just crashes, and I'm not sure why exactly.
-
@omz OK, a little reading, and I got NSArray's enumerateObjectsUsingBlock_ working on a 32-bit machine. Admittedly, that is a fairly simple block, without any imported variables, copy helpers, etc. But then again, it is not clear that such features are required for any block-based protocols... If we need references to other objects, we can create and keep them in Python.
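For context, the layout a hand-built block has to reproduce is the documented Clang block ABI. A minimal ctypes sketch of the two structures involved (the field names and the BLOCK_HAS_SIGNATURE flag come from the Clang block ABI specification; the isa field would normally point to the runtime's _NSConcreteStackBlock, which only exists on Apple platforms, so it is left NULL here):

    import ctypes

    class BlockDescriptor(ctypes.Structure):
        # Descriptor shared by all copies of a block: reserved word plus total size.
        _fields_ = [
            ('reserved', ctypes.c_ulong),
            ('size', ctypes.c_ulong),
        ]

    class BlockLiteral(ctypes.Structure):
        # Layout of an Objective-C block literal per the Clang block ABI.
        _fields_ = [
            ('isa', ctypes.c_void_p),      # _NSConcreteStackBlock on-device; NULL here
            ('flags', ctypes.c_int),       # e.g. BLOCK_HAS_SIGNATURE == 1 << 30
            ('reserved', ctypes.c_int),
            ('invoke', ctypes.c_void_p),   # function pointer called when the block runs
            ('descriptor', ctypes.POINTER(BlockDescriptor)),
        ]

    desc = BlockDescriptor(0, ctypes.sizeof(BlockLiteral))
    block = BlockLiteral()
    block.flags = 1 << 30
    block.descriptor = ctypes.pointer(desc)

On-device, invoke would be set to a ctypes callback (CFUNCTYPE) whose first argument is a pointer to the block itself, which is how the runtime passes the block to its own implementation function.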
-
https://github.com/jsbain/objc_hacks/blob/master/blocktest.py
A working 32-bit example of using blocks.
-
@JonB Wow, that's amazing!
I'll see if I can put together something that works on 32 and 64 bit for the next beta...
-
@Webmaster4o OK, I really had low expectations... but this actually worked:
Use CaptureMedia to record a video (I don't think it works with existing videos, but that can change), then run this script. Right now it just cycles through frames once per second, but you could use this to extract an image from an arbitrary time. CMTime takes an integer value and an integer timescale; value/timescale = time in seconds. For some reason, I was only able to extract images at 1-second intervals; using a larger timescale to get finer resolution had no effect.
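To illustrate the value/timescale arithmetic in plain Python (no iOS APIs; the helper name is made up for this example):

    def cmtime_seconds(value, timescale):
        """A CMTime represents value/timescale seconds."""
        return value / timescale

    # One frame per second with timescale 1:
    times = [(i, 1) for i in range(5)]
    assert [cmtime_seconds(v, ts) for v, ts in times] == [0.0, 1.0, 2.0, 3.0, 4.0]

    # Finer resolution: 30 fps using timescale 600, a common media timescale
    # because 600 is evenly divisible by 24, 25, and 30:
    frame_times = [(i * 20, 600) for i in range(3)]  # 20/600 s = 1/30 s apart
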
-
@JonB Haven't tried this yet, but maybe you can get finer resolution by setting the requestedTimeToleranceBefore/After properties of AVAssetImageGenerator?
-
Tried that, as well as AVURLAssetPreferPreciseDurationAndTiming. I have read about similar issues on SO; the solution that worked for some was to create a video composition for a narrow range, then use the method above to get an image, but I have not tried that yet.
As an aside, it seems that setting the restype manually to a struct type does not force the objc_msgSend_stret invocation on 32-bit. For instance, asset.duration() didn't work even after manually setting restype; I had to resort to calling objc_msgSend_stret myself.
-
Wow, it works! Didn't expect it to! I don't understand HOW it works, but it does!
-
Thanks, I'll look into that. Btw, I've made some progress with the type encoding parser, and it should be able to handle most structs automatically in the next build, so you don't necessarily have to create Structure subclasses for things like CMTime.
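Until then, a CMTime Structure subclass is straightforward to define by hand. A sketch of the field layout (matching Apple's documented CMTime definition; plain ctypes, so the definition itself runs anywhere, though actually passing it through objc_msgSend_stret of course requires a device):

    import ctypes

    class CMTime(ctypes.Structure):
        # Field layout matching CoreMedia's CMTime struct.
        _fields_ = [
            ('value', ctypes.c_int64),      # CMTimeValue
            ('timescale', ctypes.c_int32),  # CMTimeScale
            ('flags', ctypes.c_uint32),     # CMTimeFlags (1 == valid)
            ('epoch', ctypes.c_int64),      # CMTimeEpoch
        ]

    def cmtime_make(value, timescale):
        # Python equivalent of CMTimeMake(value, timescale), marked valid.
        return CMTime(value, timescale, 1, 0)

    t = cmtime_make(3, 1)  # 3 seconds
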