Create and save Live Photo to Camera Roll
I have an image (a.heic) and a video file (a.mov) stored in Pythonista's Documents folder. I would like to save both files to Camera Roll so that iOS recognizes them as a Live Photo.
Is it possible to create an instance of the Objective-C class 'PHLivePhoto', use its method 'requestLivePhotoWithResourceFileURLs' to load the 'a.heic' and 'a.mov' resource files, and save the resulting 'PHLivePhoto' to Camera Roll? How would this be done using 'objc_util'?
Any help would be greatly appreciated. Thank you!
Thanks for that script. It adds a video file to Camera Roll. But how can I add an image and a video to Camera Roll so that they are displayed as a Live Photo?
What do you think of method 'requestLivePhotoWithResourceFileURLs' of class 'PHLivePhoto'? Is it possible to create a Live Photo that way in Pythonista?
Thank you for your help!
@riePat I've tried with this method for two hours without success.
I agree that my knowledge about heif/heic is zero and that does not help 😢
Edit: I have been able to create the PHLivePhoto, but not to turn it into a PHAsset in order to add it to the Camera Roll.
Are you sure that with both files copied to the Camera Roll you can't do anything?
I really appreciate your effort! How did you manage to create the PHLivePhoto? Could you please share the code?
I checked the link you shared: I also thought of some metadata issue, but I have checked 'a.heic' and 'a.mov' again, and both have the same dates. I really don't know why it is not working...
Do you have any other ideas? Thank you!
Not at home now, busy with grandchildren, later
@riePat Could you please try this? I can't, because I don't have a matching file pair.
```python
from objc_util import *
import threading
import ui

PHPhotoLibrary = ObjCClass('PHPhotoLibrary')
PHAssetCreationRequest = ObjCClass('PHAssetCreationRequest')

url_mov = nsurl('iphone6s_4k.mov')
url_heic = nsurl('iphone6s_4k.heic')

lib = PHPhotoLibrary.sharedPhotoLibrary()

def create_block():
    req = PHAssetCreationRequest.creationRequestForAsset()
    PHAssetResourceType = 1  # PHAssetResourceTypePhoto
    req.addResourceWithType_fileURL_options_(PHAssetResourceType, url_heic, None)
    PHAssetResourceType = 9  # PHAssetResourceTypePairedVideo
    req.addResourceWithType_fileURL_options_(PHAssetResourceType, url_mov, None)

def perform_changes():
    lib.performChangesAndWait_error_(create_block, None)

t = threading.Thread(target=perform_changes)
t.start()
t.join()
```
```python
def create():
    req = PHAssetCreationRequest.creationRequestForAsset()
    PHAssetResourceType = 1  # PHAssetResourceTypePhoto
    req.addResourceWithType_fileURL_options_(PHAssetResourceType, url_jpg, None)
    PHAssetResourceType = 9  # PHAssetResourceTypePairedVideo
    req.addResourceWithType_fileURL_options_(PHAssetResourceType, url_mov, None)

def perform_changes():
    create_block = ObjCBlock(create, restype=None, argtypes=None)
    err_ptr = c_void_p()
    ret = lib.performChangesAndWait_error_(create_block, byref(err_ptr))
    if err_ptr:
        err = ObjCInstance(err_ptr)
        print(err)
    print('ok=', ret)
```
It shows the process is not ok, and I don't know why 😭
Error Domain=NSCocoaErrorDomain Code=-1 "(null)"
Same error with the following, so I'm doing something wrong:
```python
from objc_util import *
import time
import ui

handler_done = False

def handler(_cmd, obj1_ptr, obj2_ptr):
    global handler_done
    if obj1_ptr:
        obj1 = ObjCInstance(obj1_ptr)
    if obj2_ptr:
        obj2 = ObjCInstance(obj2_ptr)
        print(obj2)
    handler_done = True

handler_block = ObjCBlock(handler, restype=None,
                          argtypes=[c_void_p, c_void_p, c_void_p])

urls = [url_mov, url_jpg]
PHLivePhoto = ObjCClass('PHLivePhoto')
uiimage = ObjCInstance(ui.Image.named('iob:alert_circled_256'))
PHImageContentMode = 0  # PHImageContentModeAspectFit
PHLivePhoto.requestLivePhotoWithResourceFileURLs_placeholderImage_targetSize_contentMode_resultHandler_(
    urls, uiimage, CGSize(0, 0), PHImageContentMode, handler_block)

while not handler_done:
    time.sleep(1)
```
I think we still have to add some metadata to both files to ensure they are recognized as the same pair...
Edit: I don't know where your two files come from, but perhaps the script with PHAssetCreationRequest could work for you if your files contain the right metadata, which you could check with a free Exif viewer app.
A live photo has two resources. They are tied together with an asset identifier (a UUID as a string).
A JPEG; this must have a metadata entry for kCGImagePropertyMakerAppleDictionary with [17 : assetIdentifier] (17 is the Apple Maker Note Asset Identifier key).
A Quicktime MOV encoded with H.264 at the appropriate framerate (12-15fps) and size (1080p). This MOV must have:
Top-level Quicktime Metadata entry for ["com.apple.quicktime.content.identifier" : assetIdentifier]. If using AVAsset you can get this from asset.metadataForFormat(AVMetadataFormatQuickTimeMetadata)
Timed Metadata track with ["com.apple.quicktime.still-image-time" : 0xFF]; The actual still image time matches up to the presentation timestamp for this metadata item. The payload seems to just be a single 0xFF byte (aka -1) and can be ignored. If using an AVAssetReader you can use CMSampleBufferGetOutputPresentationTimeStamp to get this time.
The assetIdentifier is what ties the two items together and the timed metadata track is what tells the system where the still image sits in the movie timeline.
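To make the pairing concrete, here is a plain-Python sketch (no iOS APIs involved) of the data described above. The dictionary literals are illustrative stand-ins for the real ImageIO and QuickTime metadata entries, not actual API calls; only the key names and the shared UUID come from the explanation.

```python
import uuid

# The single asset identifier shared by both resources (a UUID string).
asset_identifier = str(uuid.uuid4()).upper()

# What the still image needs: an Apple Maker Note entry
# (kCGImagePropertyMakerAppleDictionary) where key "17" holds
# the asset identifier.
maker_apple_dict = {'17': asset_identifier}

# What the MOV needs at the top level: a QuickTime metadata entry
# with the same identifier.
mov_metadata = {'com.apple.quicktime.content.identifier': asset_identifier}

# The timed metadata track entry; the payload is just the single
# byte 0xFF (-1). Its presentation timestamp, not the value, marks
# where the still frame sits in the movie timeline.
still_image_time_item = {'com.apple.quicktime.still-image-time': 0xFF}

# The system pairs the two resources only if the identifiers match exactly.
assert maker_apple_dict['17'] == \
    mov_metadata['com.apple.quicktime.content.identifier']
```

Writing these entries into actual files would still require ImageIO (for the HEIC/JPEG) and AVFoundation (for the MOV); the sketch only shows which values must agree.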
@cvp, you’re a genius! Thank you so much for the script you shared, it does exactly what it is supposed to do :-)
As the files were created by iOS I do not have to modify any metadata. But thanks anyway for the detailed explanation.
@riePat 😅 I didn't know what more I could do...