@zencuke Apple doesn't allow Pythonista to do iTunes file sharing? Do you have a source for that? (Although it's a little more accessible for abuse than "Documents in the Cloud", I'm not convinced it is totally forbidden for code.)
Posts made by mteep
-
RE: iOS 8.3 now breaks iFunbox and probably other similar programs
-
RE: Beta Status Update
@omz There seems to be some issue affecting the `canvas` module. In both the recent betas (160013 and 160014), your turtle.py example crashes Pythonista immediately. But other simple canvas usage does not. In released versions like 1.5, and the earlier betas, I have only observed canvas crashes when I edited the script while it was running.
-
RE: Beta Status Update
@polymerchm It seems that only crash reports with file names ending in ".ips" are shown in Xcode. Those that have been transferred are renamed on the device to ".ips-transferred" or similar. Those from the Pythonista beta end in ".ips-beta" or so. Fortunately, they can all be viewed on the device, if you go to Settings -> Privacy -> Diagnosis ... -> Diagnosis ... (or something close to that).
-
RE: Concept Question: iPad as industrial machine component?
Well, with the Pythonista 1.6 beta version that had the ctypes and BLE modules, something close to that was possible, but at that time somewhat tricky to implement. I was able to get a live camera view and detect iOS-supported metadata features such as barcodes (tested) and faces (not tested). The output to the relay could easily have been sent using BLE (a sketch of that part is at the end of this post).
Having said that, Pythonista is most likely not the ideal choice for something like this, at least not in the short term. First, it might take quite a while before the ctypes module comes to a released Pythonista version, due to problems with the 64-bit builds. Second, the implementation is tricky, in particular if you want to detect something not handled by the built-in detectors, and to do it fast enough.
The iPad hardware, particularly the Air 2, could most likely do whatever computer vision processing you need blazingly fast on the GPU if properly coded. But this would be a lot easier to develop in Swift or Objective-C in Xcode.
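For completeness, the BLE half mentioned above is the straightforward part. Below is a minimal, untested sketch of what sending a relay command with Pythonista's `cb` module could look like; the peripheral name, characteristic UUID, and "relay on" byte are made-up placeholders for whatever relay board would actually be used.

```
import cb

# Hypothetical values: the relay board's advertised name, the UUID of its
# writable characteristic, and the byte that closes the relay are all
# placeholders, not taken from any real product.
RELAY_NAME = 'RelayBoard'
RELAY_CHAR_UUID = 'FFE1'
RELAY_ON = chr(0x01)

class RelayDelegate (object):
    def __init__(self):
        self.peripheral = None
        self.relay_char = None

    def did_discover_peripheral(self, p):
        if p.name and RELAY_NAME in p.name and self.peripheral is None:
            self.peripheral = p
            cb.connect_peripheral(p)

    def did_connect_peripheral(self, p):
        p.discover_services()

    def did_discover_services(self, p, error):
        for s in p.services:
            p.discover_characteristics(s)

    def did_discover_characteristics(self, s, error):
        for c in s.characteristics:
            if c.uuid == RELAY_CHAR_UUID:
                self.relay_char = c

    def trigger_relay(self):
        # Called from the detection code, e.g. when a barcode matches.
        if self.peripheral and self.relay_char:
            self.peripheral.write_characteristic_value(self.relay_char, RELAY_ON, True)

delegate = RelayDelegate()
cb.set_central_delegate(delegate)
cb.scan_for_peripherals()
```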
-
RE: iCloud Sync support
@omz I could be wrong of course, but it just makes sense for personal content. From the perspective of users, iCloud is a simple feature that automatically makes their personal content available on all their devices, to quote some developer documentation. If a user can enter something into an app on one device, why shouldn't (s)he be able to access it in the same app on another device? After all, (s)he can restore an iCloud backup onto another device. Apple wouldn't gain anything by restricting that to some subset of personal content.
The key here is personal content. The guidelines refer to content created by other users as user-generated content, and that is completely different.
The apparent contradiction could be explained in that the app itself doesn't actually do the download over the network (as far as I understand it), it just saves the file in the Ubiquity container and reads it from there (using special APIs, but still). The actual up- and download to sync it across devices is handled in the background by the Ubiquity machinery, somewhat like backups to iCloud are handled behind the app's back.
I believe the "way, (shape,) or form" wording means that the timing, the network protocol, and any encoding, encryption, or obfuscation are irrelevant. Basically, there shouldn't be any surprises for the user in terms of code that the review team couldn't review.
-
RE: iCloud Sync support
@misha_turnbull I'm sorry, but I don't think that would help. I am pretty sure that any set of mechanisms that together enable code distribution is disallowed. Syncing user-created content (including code) among the user's devices is allowed. It doesn't really have anything to do with file types, extensions, or UTIs.
-
RE: Step-by-step instructions for backing up Pythonista if I can not run Pythonista
@donnieh Well, iTunes can back up the contents of the entire device. And from that backup it is possible to extract the user Python files from Pythonista. But it is nontrivial and might require questionable third party programs. The Xcode method works standalone, but presumably only with non-AppStore apps (TestFlight or your own).
-
RE: Step-by-step instructions for backing up Pythonista if I can not run Pythonista
If you have a Mac, you can use Xcode.
Just connect your device, go to the Devices window and click on your device. Under Installed Apps, you'll find Pythonista and probably any other TestFlight apps. Select Pythonista, click on the gear icon and select Download Container .... This saves the entire user portion of the app as a package. (That is, a directory which Finder shows as a single file. Context click on it and select Show package contents to show the files.)
Later, you could restore it using Replace Container ..., but if it is to a different version of the app, you should download that container first and then replace the `AppData/Documents` content with the one you saved initially.
-
RE: iCloud Sync support
@Stewartbracken
I agree on both feature requests. For iCloud storage, omz needs to be convinced to try it. While at it, Handoff support would be nice too.
Shader access might be possible in Pythonista 1.6 by using the `ctypes` module, if omz gets it to fully work in 64-bit builds (and if it is approved by Apple, see below). If I find the time, which seems unlikely, I might investigate it in the beta.
@ccc
iCloud storage of code is, AFAIK, not at all prohibited by the App Store review guidelines. I haven't read them in a while, but why would it be? In fact, all the code I have written in Pythonista is already on iCloud (automatically backed up by iOS).
What is prohibited are mechanisms that could be used to mass-distribute (app-like or possibly malicious) code to other users, thereby circumventing the App Store. (This policy is needed to maintain the benefits of the App Store for developers and users.)
That is, things like:
- Creating public links to iCloud documents containing code. (Only the app itself, or one from the same developer, has the ability to do this. So the review team can verify this. In particular if special entitlements are required.)
- Storing code in Dropbox or similar storage that, externally and unbeknownst to the app and the review team, could be made public.
- Enabling code import from other apps via the Open in ... mechanism, as Pythonista tried before.
This means that using the old Documents in the Cloud mechanism should be fine, but probably not the iOS 8 Document Picker that came with iCloud Drive, unless it can be restricted with (lack of) entitlements.
Note also that while code in iCloud likely is allowed by itself, it might not be allowed in combination with other functionality like `ctypes`, depending on the granularity of entitlements.
-
RE: Beta Build 160008
@wradcliffe Thanks, but I can't really say it was a very structured approach. I started on two different approaches: rewriting using PyBee, and logging the address and memory contents of the `ctypes` objects so I could compare them with the register contents in the crash reports. But while doing that, I came to realize that, the way I had written the code, nothing in Python (nor in Objective-C) could actually hold on to the function trampoline slot. So I added a direct reference to the Python object which allocated it. Luckily, it was that simple.
So I never completed the rewrite using PyBee, but that would most likely have eliminated the crashes too. Except that I could only have run the code once without restarting Pythonista, due to the Objective-C classes remaining registered. I currently deal with this by disposing of the old class pair if I cannot allocate a new one, but it would be better to clean up before exiting.
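To illustrate the kind of fix (a simplified sketch, not my actual code): the trampoline that ctypes allocates for a callback only stays alive as long as the CFUNCTYPE object itself, so some Python object has to keep a reference to it.

```
from ctypes import CFUNCTYPE, c_void_p

# Example signature only; the point is ownership, not the types.
DelegateMethod = CFUNCTYPE(None, c_void_p, c_void_p, c_void_p)

class MetadataDelegateWrapper (object):   # hypothetical wrapper object
    def __init__(self):
        def _captured_metadata(obj, cmd, output):
            pass                          # real handling omitted
        # Passing DelegateMethod(_captured_metadata) straight to something like
        # class_addMethod() leaves nothing on the Python side referencing the
        # trampoline, so it can be garbage collected and later calls jump into
        # freed memory. Storing it on self keeps it alive for the wrapper's
        # lifetime.
        self._imp = DelegateMethod(_captured_metadata)
```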
In parallel, I also did a stripped-down version without the metadata detection, in order to post it. However, when I added the ability to switch between the front and back cameras, it crashed again. This does not seem to be a GC thing, but rather that an `AVCaptureSession` should only be manipulated from a single thread (or serial queue). I'm looking at Apple example code, but haven't yet had the time to re-create the exact scheduling of the code across threads/queues. A `ui.in_foreground` would have made that easier.
-
RE: Beta Build 160008
I finally got the bar code detection to work. It was the Python method corresponding to the Objective-C method implementation (IMP) that got GC'd. Now I just have to clean the code up, a lot. In particular, I need to find a good place to dispose the Objective-C classes I registered. Otherwise, crashes are likely.
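For reference, the disposal itself should just be one Objective-C runtime call, roughly like this (the class name is hypothetical, and per Apple's documentation it must only be done once no instances of the class remain):

```
from ctypes import CDLL, c_void_p, c_char_p

objc = CDLL('/usr/lib/libobjc.dylib')
objc.objc_lookUpClass.restype = c_void_p
objc.objc_lookUpClass.argtypes = [c_char_p]
objc.objc_disposeClassPair.restype = None
objc.objc_disposeClassPair.argtypes = [c_void_p]

def dispose_custom_class(name):
    # Dispose a class registered earlier with objc_allocateClassPair /
    # objc_registerClassPair. Apple's docs say no instances (or subclasses)
    # may still exist when this is called.
    cls = objc.objc_lookUpClass(name)
    if cls:
        objc.objc_disposeClassPair(cls)

dispose_custom_class(b'MyMetadataDelegate')   # hypothetical class name
```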
-
RE: Beta Build 160008
@omz
Thanks, I now found the crash reports. But I find it slightly weird that they didn't show up in Xcode. Maybe it's because they end in `.ips.beta` instead of just `.ips`. Anyway, at least I now know which thread crashed and why, in some sense. Then I can hopefully detect differences when I change stuff. (Interesting to note that it is a 64-bit address on an A8X; everything else is 32-bit, but maybe that's how it works.)

```
Exception Type:       EXC_BAD_ACCESS (SIGSEGV)
Exception Subtype:    KERN_INVALID_ADDRESS at 0x0000000040000004
Triggered by Thread:  5

Thread 5 name:  Dispatch queue: com.apple.avfoundation.metadataoutput.objectqueue
Thread 5 Crashed:
0   Pythonista          0x006915e7 0xea000 + 5928423
1   Pythonista          0x0069406c 0xea000 + 5939308
2   AVFoundation        0x218bb6f5 0x217b0000 + 1095413
3   CoreMedia           0x234822c1 0x23455000 + 185025
4   CoreMedia           0x23494713 0x23455000 + 259859
5   libdispatch.dylib   0x30dc224f 0x30db0000 + 74319
```
And yes, I am fully aware that you said CFUNCTYPE didn't work in the 64-bit builds. That's why I had to try it now while it is still 32-bit. (Otherwise, I couldn't compare face/barcode detections between newer/64-bit and older/32-bit devices.) My motivation is twofold.
First, this is functionality I've long wished for in Pythonista, but always assumed it would be out of scope. (Over a year ago, I prototyped something similar in Codea using a custom GLSL shader for bar code decoding, but due to the lack of a native UI I never polished it enough.)
Second, being able to use almost any iOS framework is of course very powerful, and before you added the `ctypes` module, the thought hadn't really crossed my mind that such capabilities could come to Pythonista. But now I know it is possible, although I understand that it likely wouldn't be a high priority for you but rather an 'edge use case', as you wrote. If Pythonista has this capability, a lot of user requests for features could be solved in Python, and you wouldn't need to spend time on the edge cases (like, perhaps, the `cb` module, which I love too). I wanted to help make this a reality by providing both a usage example and a way in which it could be implemented.
As I wrote earlier, I think it is possible to enable access to a large part of the iOS frameworks without requiring a working CFUNCTYPE. I had hoped to be able to prototype such a solution, which ultimately would use a fixed set of predefined native dispatch functions. In the prototype, however, the dispatch functions would need to be defined using CFUNCTYPE, while that still works. (Alternatively, after seeing the `libffi` source code and the recent ARM64 patch, it doesn't seem implausible that CFUNCTYPE could soon work in 64-bit.)
@ccc Yes, but I had hoped I could make it do a little more. It is rather similar to omz's AudioRecorder. I just added an `AVCaptureVideoPreviewLayer` as a sublayer to the `CALayer` of a `ui.View`, whose pointer I got from the undocumented `ui.View._objc_ptr`. That part should also work in the 64-bit builds, as long as the pointer remains accessible.
-
RE: Beta Build 160008
I've been playing with the `ctypes` module for a while now and have, for instance, been able to integrate a live feed from the camera into a `ui.View`. I have also defined a custom Objective-C class that appears to work from the main interpreter thread. I can add an instance of that class as a delegate to an `AVCaptureMetadataOutput`, to be called on a newly created serial dispatch queue when "metadata" is detected in the stream. However, when metadata, such as a bar code or a face, is actually detected, Pythonista crashes.
I've looked over the code many times and tried many variations, but I can't get it to work. I hoped there would be crash reports that I could look at and get hints from, but since iOS 8 they seem to no longer be accessible on the device, only from Xcode. But even from Xcode, there are no recent crash reports from Pythonista. Why is that? (I have crash reports from older Pythonista versions, and recent ones from other apps.) Is there any way to enable them?
The only hint I've got is a few lines in the system log, captured when running my code while the iPad was tethered to Xcode. But it basically just said "segmentation fault", with no further details. I've tried to send `retain` to my delegate, but fear that maybe it is some Python object that gets garbage collected. (Comments in the PyBee Rubicon code suggest that CFUNCTYPE instances could be GC'd, but I don't quite see how that would happen in my code. That's about the only path I have left to explore.)
Has anyone else been able to get something similar to work? Or get crash reports? omz?
As a side note, I looked at the SWT PI Java code that I mentioned earlier, and it had comments suggesting that the Objective-C classes it defined would only work in the thread in which they were created (in this case the AppKit UI thread). But there was no explanation, and I can't really see why that would be the case.
-
RE: Programming at the metal
@polymerchm I agree on Swift v1.2. Time would be better spent learning the intricacies of Swift than those of Objective-C (although the former might change more in the near future). On the other hand, if you are already familiar with the particulars of C and C++, you basically just have to learn to accept the Objective-C syntax for message passing, object initialization, and method definition to be able to understand most sample code out there. That could be a useful addition. The rest of the work is learning the frameworks, and they are the same (apart from slight naming differences).
@ccc I think one of the longer term goals with Swift is to enable it to be used as a safe scripting language as well. I wouldn't be surprised if something like Swift Playgrounds shows up on iOS within a year and a half. Nothing prevents Apple from enabling development of real full apps on iOS either, as long as you are a paid member of the development program. (A relatively high fee is required to maintain the mutual benefits of the App Store.)
-
RE: Should UI actions run in the background by default?
@omz
Like JonB, I have also experienced problems modifying UI state from an `@in_background` action, and similarly from any other threads, such as in `ui.delay`ed functions or in callbacks from the `cb` module in the 1.6 beta. I think it happened when trying to append text to a `TextView`. But this is somewhat expected, since the underlying UIKit, like many other GUI toolkits, requires that most of its methods be called from the UI thread. So I believe something like the `@in_foreground` decorator dgelessus suggested is really needed already with the current model. (Unless I have misunderstood something.)
I would try this first, coupled with another decorator to use dispatch queues since, as you hinted, `@in_background` functions seem to be blocked by `View.wait_modal()`. Then, we could experiment with your idea without breaking anything.
But I suspect that using a serial queue (or a single thread) to process all actions would just cause it to block further actions from, say, a dialog. A better approach might be to do like AppKit (and presumably UIKit) and run the event loop within the dialog-presenting function, if it is called from the UI thread. Similarly with `View.wait_modal()`. Or wouldn't that work?
-
RE: Beta Build 160008
@omz
I see. I looked at the libffi source, and it seems they fixed something very close to this in the AArch64 port in a commit three weeks ago. They use a trampoline table as I suggested, but with PC-relative addressing and fancy virtual memory mirroring, so there's no limit to how many they can have. (Although there's a bug in that they forgot to unlock the `ffi_trampoline_lock` in case of failure, line 871 in `ffi.c`.) So maybe this hasn't been included in ctypes yet, causing your crashes.
However, as when using Objective-C APIs, the Python programmer indeed doesn't have to provide pure C functions in order to subclass Objective-C classes. The function you pass to class_addMethod() receives the selector `_cmd` as well as the `self` pointer. So you can always add the same single function (at least one per value of the `types` parameter), which further dispatches to Python code based on `[self class]` and `_cmd`. Not super efficient, but it works in most cases. (A rough sketch is at the end of this post.)
And for blocks, what you actually pass is a pointer to a data structure (an Objective-C object) which (after the header including the function pointer) can contain anything you like. So it is even simpler to dispatch to the correct Python code.
So while this will require some work, it is definitely doable.
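To make the idea concrete, here is a rough, untested sketch of the single-dispatcher approach with ctypes, limited to methods with the common `v@:@` type encoding and assuming a build where CFUNCTYPE callbacks work at all; the names `python_impls` and `add_python_method` are just made up for the illustration:

```
from ctypes import CDLL, CFUNCTYPE, c_void_p, c_char_p, cast

objc = CDLL('/usr/lib/libobjc.dylib')
objc.sel_registerName.restype = c_void_p
objc.sel_registerName.argtypes = [c_char_p]
objc.sel_getName.restype = c_char_p
objc.sel_getName.argtypes = [c_void_p]
objc.object_getClassName.restype = c_char_p
objc.object_getClassName.argtypes = [c_void_p]
objc.class_addMethod.argtypes = [c_void_p, c_void_p, c_void_p, c_char_p]

# (class name, selector name) -> Python callable; filled by add_python_method().
python_impls = {}

def _dispatch(receiver, cmd, arg):
    # The single native entry point for every method added this way: the real
    # target is looked up from the receiving object's class and the selector.
    key = (objc.object_getClassName(receiver), objc.sel_getName(cmd))
    handler = python_impls.get(key)
    if handler is not None:
        handler(receiver, arg)

# Module-level reference so the trampoline is never garbage collected.
DISPATCH_IMP = CFUNCTYPE(None, c_void_p, c_void_p, c_void_p)(_dispatch)

def add_python_method(cls_ptr, cls_name, sel_name, func):
    # cls_ptr is the Class pointer; cls_name and sel_name are byte strings.
    python_impls[(cls_name, sel_name)] = func
    objc.class_addMethod(cls_ptr, objc.sel_registerName(sel_name),
                         cast(DISPATCH_IMP, c_void_p), b'v@:@')
```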
-
RE: Beta Build 160008
@omz Very interesting indeed. I'm a bit curious as to what the 64-bit issue has to do with callbacks not being possible. I understand there are hurdles, but I thought they were a little different.
The obvious limitation on iOS is that you are not allowed to execute dynamically generated machine code. If that is how ctypes.CFUNCTYPE works, it cannot be used directly. But there are workarounds. For an arbitrary pure C callback, one may need to resort to tricks like defining lots of trampoline function slots. But in most cases (all?) in Objective-C, that won't be needed.
As you know, often in Cocoa you don't specify a C function pointer at all, but instead an Objective-C object and a selector. That is easily mapped to Python. You just need to supply an Objective-C wrapper class. And blocks, as you mentioned, are just a special case of this. A tiny bit of googling should tell you how. (Or start here.)
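Roughly, and untested (the class name `MyPythonTarget` and the selector `invoked:` are made up, and this still needs one working CFUNCTYPE callback behind the wrapper class's method), the wrapper-class idea looks like this:

```
from ctypes import CDLL, CFUNCTYPE, c_void_p, c_char_p, c_size_t, cast

objc = CDLL('/usr/lib/libobjc.dylib')
objc.objc_getClass.restype = c_void_p
objc.objc_getClass.argtypes = [c_char_p]
objc.objc_allocateClassPair.restype = c_void_p
objc.objc_allocateClassPair.argtypes = [c_void_p, c_char_p, c_size_t]
objc.objc_registerClassPair.restype = None
objc.objc_registerClassPair.argtypes = [c_void_p]
objc.sel_registerName.restype = c_void_p
objc.sel_registerName.argtypes = [c_char_p]
objc.class_addMethod.argtypes = [c_void_p, c_void_p, c_void_p, c_char_p]

ActionIMP = CFUNCTYPE(None, c_void_p, c_void_p, c_void_p)

def _invoked(self_ptr, cmd, sender):
    print('action fired')            # forward into whatever Python code you like

ACTION_IMP = ActionIMP(_invoked)     # keep a reference so it isn't garbage collected

# Define a small Objective-C class at runtime with a single action method.
NSObject = objc.objc_getClass(b'NSObject')
MyTarget = objc.objc_allocateClassPair(NSObject, b'MyPythonTarget', 0)
objc.class_addMethod(MyTarget, objc.sel_registerName(b'invoked:'),
                     cast(ACTION_IMP, c_void_p), b'v@:@')
objc.objc_registerClassPair(MyTarget)
# An instance of MyPythonTarget can now be handed to any API that expects a
# target object plus the selector 'invoked:'.
```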
-
RE: Pythonista 1.6 Beta
@wradcliffe
Not really, I just synchronized everything to be safe. (Absence of a crash doesn't make it safe.) I use the RLock for all potentially concurrently accessed state in MipManager, which is more than I included above. Also, the self.speed_updater state is accessed by my code from two threads (UI and main). I used RLock instead of Lock since it has nice properties that I am well used to.
-
RE: Pythonista 1.6 Beta
@wradcliffe
Sorry about the late reply. The forum was down last time I tried answering. My script is a little too long for this thread, so I have removed almost everything except the sending thread. (I will post the whole thing to GitHub once it's a little more complete.) The MipManager starts the SpeedUpdater thread once it gets the write characteristic in a callback. My UI calls action methods in MipManager (not shown), which either set the current speed or turn in the SpeedUpdater, or occasionally queue a special command. Currently, I only send other commands when the robot has fallen over, in order to attempt to get it back up (something the original app doesn't do at all), so the simple timing works. Finally, when my UI is closed, I call the shutdown method of SpeedUpdater via the MipManager.
Note that I'm not sure of the thread safety of this code or of Queue.Queue. I just noticed that the threading support was closely modeled after Java, and hoped the memory model would be too.
```
import cb
import threading
import Queue


class SpeedUpdater(threading.Thread):

    def __init__(self, peripheral, characteristic):
        threading.Thread.__init__(self)
        self.peripheral = peripheral
        self.characteristic = characteristic
        self.state_lock = threading.Condition()
        self.keep_alive = True
        self.speed_code = 0
        self.turn_code = 0
        self.queue = Queue.Queue()

    def run(self):
        keep_alive = True
        while keep_alive:
            try:
                msg = self.queue.get_nowait()
                self.peripheral.write_characteristic_value(self.characteristic, msg, False)
            except Queue.Empty:
                with self.state_lock:
                    self.state_lock.wait(0.05)
                    speed_code = self.speed_code
                    turn_code = self.turn_code
                if (speed_code != 0) or (turn_code != 0):
                    msg = chr(0x78) + chr(speed_code) + chr(turn_code)
                    self.peripheral.write_characteristic_value(self.characteristic, msg, False)
            with self.state_lock:
                keep_alive = self.keep_alive

    def set_speed(self, speed_code):
        with self.state_lock:
            self.speed_code = speed_code

    def set_turn(self, turn_code):
        with self.state_lock:
            self.turn_code = turn_code

    def queue_cmd(self, cmd):
        self.queue.put(cmd)

    def shutdown(self):
        with self.state_lock:
            self.keep_alive = False
            self.state_lock.notifyAll()
        self.join()


class MiPManager (object):

    def __init__(self):
        self.state_lock = threading.RLock()
        self.peripheral = None
        self.speed_updater = None

    def did_discover_characteristics(self, s, error):
        log('Did discover characteristics...')
        for c in s.characteristics:
            if c.uuid == 'FFE9':
                with self.state_lock:
                    self.speed_updater = SpeedUpdater(self.peripheral, c)
                    self.speed_updater.start()
```
-
RE: Pythonista 1.6 Beta
I got my MiP driving app to work like the original app once I set up a dedicated thread to send the speed every 50 ms.
@JonB
The ui.delay() seems to be more or less the same as creating a threading.Timer, which is a new thread. I had hoped/guessed that it would instead schedule the function to be run in the UI thread, as this is something most UI frameworks I am familiar with support and often require. (As a side note, that erroneous approach worked for longer when I tried it on a newer (A8X-class) device.)
The turning seems to have been caused by a misreading of the MiP protocol description. You send the robot a command byte followed by a number of argument bytes that depend on the command. The problem is that the robot often accepts commands with fewer arguments, if there is a sufficient delay afterwards. So I sent two too-short driving commands one after the other, and the second one was interpreted as the turning argument of the first.
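To make the "correct length" point concrete: a tiny helper like the one below never sends a truncated command. The 0x78 command byte is the one already used in the script above; the exact speed/turn encodings are whatever the protocol notes specify.

```
DRIVE_CONTINUOUS = 0x78   # same command byte as in the SpeedUpdater above

def drive_cmd(speed_code, turn_code):
    # Always include both argument bytes, even when one of them is zero, so the
    # robot can never mistake the next command byte for a missing argument.
    return chr(DRIVE_CONTINUOUS) + chr(speed_code) + chr(turn_code)

# e.g. peripheral.write_characteristic_value(characteristic, drive_cmd(5, 0), False)
```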
So, as long as you observe thread safety and send commands of the correct length, controlling the MiP using the cb module seems entirely deterministic.
@wradcliffe
I will see if I can come up with a test of your buffering theory, but in the MiP case, there is really no point in trying to push data faster than the robot physically can act. So throttling on the host side seems reasonable. (A possible exception is if you open up the robot and use the hacker port UART to communicate with some other micro-controller.)