Concept Question: iPad as industrial machine component?
-
I have a potential application where I want an iPad's camera to view an industrial process. Real-time images then get processed and a yes/no decision is made. I would then need to output that decision via the Lightning cable to a circuit that closes a real-world relay to divert a part on a conveyor belt.
Does Pythonista allow real-time access to the iPad camera? Can I control the frame rate of the camera?
Does Pythonista allow real-time input/output across the Lightning cable? -
Pythonista doesn't provide any of those features, unfortunately. Live camera access would technically be possible if you were to write a "real" app using Xcode, but there certainly is no way to send arbitrary data over the Lightning connection. In this case it would be a lot easier to use a normal webcam connected to a computer, though I don't know if there are any Python libraries for reading and processing live webcam images.
-
Well, with the Pythonista 1.6 Beta version that had the ctypes and BLE modules, something close to that was possible, but at that time somewhat tricky to implement. I was able to get a live camera view and detect iOS supported metadata/features such as barcodes (tested) and faces (not tested). The output to the relay could easily have been sent using BLE.
Having said that, Pythonista is most likely not the ideal choice to do something like this. At least not in the short term. First, it might take quite a while before the ctypes module comes to a released Pythonista version, due to problems with 64-bit versions. Second, implementation is tricky, in particular if you want to detect something not handled by the built in detectors, and do it fast enough.
The iPad hardware, particularly the Air 2, could most likely do whatever computer vision processing you need blazingly fast on the GPU if properly coded. But this would be a lot easier to develop in Swift or Objective-C in Xcode.
-
Thanks dgelessus,
I was afraid of that but thought it worth asking. It seems like the Lightning cable is connected to a fairly standard 'serial port' that can be written to, somehow. My application has strict cost, space and weight requirements. I'm eventually hoping to use outdated cell phones. Guess I'll have to search for a young NASA kid into robotics. I'm also looking into the Raspberry Pi, but it has no display screen, requires a box and requires a separate unit as the webcam. Cell phones would be the perfect machine, if only they could be modified to do the job. They could even call somebody if they detected a malfunction.
I don't s'pose there's anybody from Apple's hardware section on this board?
-
mteep,
There are industrial tools that use BLE to wirelessly connect to sensors.
Although not an imaging application this link is particularly interesting in that it makes full use of a lot of other cell phone features.
http://defelsko.com/smartlink/smartlink.htm -
You might have better luck with an Android device for such an application, as the Android hardware is somewhat more open.
The Raspberry Pi is another good option; you could use an iPad to interface to it. For instance, you could basically run a web service on the Pi that serves images, which you then retrieve with Pythonista.
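A minimal sketch of what the Pythonista side of that could look like, assuming the Pi exposes a hypothetical endpoint like `/latest.jpg` (the address and path here are made up; `requests` and PIL both ship with Pythonista):

```python
# Hedged sketch: poll an assumed image endpoint served by a Raspberry Pi.
import requests
from io import BytesIO
from PIL import Image

PI_URL = 'http://192.168.1.50:8080/latest.jpg'   # hypothetical address/path

def fetch_latest_frame():
    # Grab the most recent frame the Pi has captured and decode it
    resp = requests.get(PI_URL, timeout=5)
    resp.raise_for_status()
    return Image.open(BytesIO(resp.content))

frame = fetch_latest_frame()
print(frame.size, frame.mode)
```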
For that matter, there are a number of wireless webcams that essentially run webservers that you could access remotely.
Arduinos might be another option, though I'd think doing image processing on an Arduino might be tricky. An Arduino would let you interface directly to a relay.
-
Random comments.
Unless they changed something recently, Apple doesn't allow App Store apps to use Lightning serial.
A low cost Bluetooth device to use as an iPhone controlled switch is probably cheaper than a lightning serial cable.
Pythonista can look at any photo captured in other apps but it currently can't take pictures directly.
BLE is currently unavailable in Pythonista. It was in the beta so presumably it is targeted for the next release. That release sounds pretty ambitious feature wise which will be exciting when it happens but it will probably take a while to get ready. The first beta for this update was delivered 5 months ago which gives you an idea of the timescale involved.
-
By the time I can actually complete my project in full, the iPad Air 2 might just be the obsolete cheapo device I'm looking for.
I like the sound of mteep's comment:
"The iPad hardware, particularly the Air 2, could most likely do whatever computer vision processing you need blazingly fast on the GPU if properly coded. But this would be a lot easier to develop in Swift or Objective-C in Xcode."
So... The learning curve of another language? I started coding in machine language on an IBM machine in 1965. You had to input the boot program with binary paddle switches every time you started it up. If I remember correctly it was an IBM 1620. Then in Fortran, followed by C. There were hundreds of people like me with physics, math and engineering books at the left elbow and boxes and boxes of IBM punch cards underfoot. Industry needed subroutines for everything! When the 8080 8-bit processor came out I built my first computer from a Digital Group kit. It was back to machine language. Then BASIC, then C and C++ as the 8080 evolved to drive a real desktop. That fun ended as the pins on processors got so numerous and so tiny you needed multilayer boards, flow soldering and expensive stuff to check timing diagrams. I've been into and out of coding since those years. But current devices now offer so much power in such a small package I just can't resist tinkering with them. Pythonista seemed perfect! I repair and modify exotic machines for small manufacturers now, and I've used Pythonista to analyze and model processes and machines that were not doing what was wanted. Pythonista has served me well so far, mainly just scientific and statistical stuff, no web programming.
Now programming languages have evolved to a level that has passed me by. I would like to work in a language that is most similar to C++ and one that can be compiled into machine code for speed. I've often found it easier to write my own foundations rather than decipher packaged software. But the foundations are now so extensive that that approach is flawed.
Finally, I have a real question:
Which programming language has the clearest documentation with extensive examples? Swift? Xcode? Or another? What machines can they be run on? And a wild guess as to the cost of obtaining and maintaining the development suite. -
@Bruce42. As a fellow dinosaur, I am learning Swift. It's very Python-like. You will have to contend with Objective-C, as this is what the underlying frameworks (read: subroutine libraries) are implemented in.
PDP-8's, Monroebots and paper-tape readers are where I started. Kinda miss the old bootstrap loaders. Happy hacking.
-
PDP-8 with DECtape. Hand-coded machine code on the 6502-based KIM-1. ;-) Of course I am MUCH too young to know what any of that means.
Speaking of small packages I have about 15-20 microprocessor systems in my desk drawer. Many have an ARM, do Bluetooth, cost under $20 and are less than a square inch in size. Two of them (Raspberry PI and BeagleBone Black) run Linux. I'm not sure why I have so many. At that price sometimes I buy them like candy. ;-) I've got 5 on my bicycle just to automatically turn on all my lights when the bicycle starts to move. (I have lots of lights on my bike.) The ones in the wheels have accelerometers to detect motion and use Bluetooth to talk to the others. Sometimes I forget to be amazed.
On Apple products Objective-C is hard to avoid though Swift looks like a simpler front end to it.
-
On image processing I'd suggest a Raspberry Pi 2 + a good webcam. You'd be off quite a bit cheaper and could use OpenCV for image processing in Python.
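For what it's worth, a bare-bones capture loop on the Pi could look something like this (the device index, resolution and decision logic are all placeholders to adjust):

```python
# Rough sketch: grab frames from a USB webcam with OpenCV on the Pi.
import cv2

cap = cv2.VideoCapture(0)                     # first attached webcam
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)        # assumed 640x480 capture size
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)

while True:
    ok, frame = cap.read()                    # one BGR frame per pass
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # ... yes/no decision logic on `gray` goes here ...

cap.release()
```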
-
@Bruce42 if you started with the 1620 and missed the S/360 then you've missed a lot.
Hint: I have both sides of a replica S/360 Reference Card in a frame on my home office wall. :-)
Once somebody tried to "threaten" me with having to learn SAS. I said "bring it on; it'd be the 20th programming language I've had to get familiar with." :-)
So older dogs and newer tricks; I'm sure you could hack it. (Pun intended.) :-)
But, like Titus Andronicus, methinks I doth digress too much. :-)
-
Wow!
It's good to know I'm not the only old dog out here tinkering with the stuff of young'uns. I've noticed that young guys often miss really cheap and simple upgrades to old machinery. Like wire-wrapping up a hard-wired TTL logic replacement for complex relay logic. It shrinks down the physical size of the relay logic so that many new functions can be added without increasing the size of the old electrical box. Interfacing debounced input switches and real-world output power is the only tricky part, and the old relays can be repurposed to do that for many newly added functions. The parts cost for such an upgrade can be very low! I've upgraded several WWII machines so that they compete nicely with newer, super expensive industrial ladder-logic machines. -
I've decided that I'll probably never make a bunch of money with my idea, and in keeping with the concept of open source and with the existence of guys like you all, here it is.
It's an industrial worker safety issue. There always seems to be a trade off between worker safety and worker productivity. I've noticed one instance where drastic improvements can be made without that trade off.
You are all probably familiar with the relatively new laser-line crosshairs that can be added to drill presses for under $20.00. Also appropriate for punch presses and such. The machines they are added to mostly manipulate flat material stock. The laser lines are of a very distinctive color for the given laser. The lines are very straight lines... until a human hand enters the field of vision. Then the laser lines trace out the top curvature of the hands and/or fingers. So if image processing detects anything but 'straight laser lines', machine operation could be prevented. The speed at which a hand enters the field of vision and the time it takes to stop the descent of hand-mangling tooling would be crucial. Worldwide this could save many hands and fingers! It could also increase productivity! While there would be many problems along the way to a commercial version, such as UL lab approval and/or other safety standards, one has to start someplace. I'm thinking the proper starting place is with old dogs and new toys.
-
Here's my Dropbox link to some still photos of a hand under laser cross hairs.
https://www.dropbox.com/sh/gi6rej7mqa7edz4/AAD5WcHp8hXVt132peINxIUHa?dl=0
If anybody's interested I would email you copies. Maybe the computer language or speed of processing is initially not as important as the logic of analyzing individual frames.
-
More than a distinctive color, a distinctive frequency, although a digital camera won't know the difference with only three color sensors.
Sounds like an interesting problem. A filter pass to isolate the laser frequency, making the image monochrome and removing uninteresting things (noise), followed by a Hough transform to find straight lines? The filter pass could probably generate a number indicating how much laser is visible. In fact, depending on the actual setup, a change in laser illumination might be enough to tell you that the beam is being interfered with. I've sometimes found in situations like this that the problem can be reduced to a simpler algorithm. Us old guys often needed to find "simpler" because "simpler" === "possible" on the underpowered machines we used.
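A rough OpenCV sketch of that idea, just to make the steps concrete (the HSV bounds and Hough parameters are guesses that would need tuning against real images of the laser):

```python
# Hedged sketch of "filter, then Hough": isolate a red laser line in HSV,
# count how much laser is visible, and look for long straight segments.
import cv2
import numpy as np

def laser_lines(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Red wraps around hue 0 in OpenCV's 0-179 hue range, so OR two bands
    mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255)) | \
           cv2.inRange(hsv, (170, 120, 120), (180, 255, 255))
    laser_pixels = cv2.countNonZero(mask)      # "how much laser is visible"
    segments = cv2.HoughLinesP(mask, 1, np.pi / 180, threshold=80,
                               minLineLength=100, maxLineGap=10)
    return laser_pixels, segments

# A sudden drop in laser_pixels, or segments that no longer span the table,
# would both suggest something (a hand?) is breaking up the line.
```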
I'd definitely start with lots of real images from realistic scenarios and develop the algorithms on a desktop first. I suspect the algorithm development will be the hardest part. Heck you could probably hire an app developer to convert it to an iPhone app on one of these software job bidding sites, probably for cheaper than you think. Unfortunately there are lots of underemployed programmers out there.
-
If the level of laser illumination turned out to be enough you might not even need a camera. Just an optical filter that passes only the laser frequency and a photodiode to say how much. Cheap pointer LED lasers (red) are usually 650 nm. They are so common that there are probably standard filters available cheap. Those laser levels at the hardware store probably have them which means someone makes them cheap.
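If that turned out to be enough, the software side is almost trivial. A sketch under an assumed setup: photodiode behind a 650 nm bandpass filter, read through an MCP3008 ADC on a Raspberry Pi via gpiozero; the baseline and drop values are placeholders to calibrate.

```python
# Hypothetical photodiode watchdog: assumes an MCP3008 ADC on a Raspberry Pi
# with the (filtered) photodiode on channel 0. Thresholds need calibration.
from gpiozero import MCP3008
from time import sleep

photodiode = MCP3008(channel=0)   # .value is normalised to 0.0 .. 1.0
BASELINE = 0.6                    # reading with nothing in the beam
DROP = 0.2                        # loss that counts as "beam blocked"

while True:
    if photodiode.value < BASELINE - DROP:
        print('Beam interrupted - inhibit the machine')   # e.g. drop a relay
    sleep(0.01)                   # roughly 100 checks per second
```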
-
Here's another approach: once the drill starts descending, the workpiece sure better be stationary. So, you could look for any moving objects in the FOV (maybe masking out the drill path itself). Anything moving ==> time to stop. Moving objects are pretty easy to identify by simply differencing images. Illumination would need to be such that shadows from the tool don't cause false positives (i.e. illumination from at least a few different directions).
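A minimal frame-differencing sketch along those lines, with OpenCV (the blur kernel and the 25/500 thresholds are placeholder numbers to tune on real footage):

```python
# Hedged sketch: flag motion by differencing consecutive grayscale frames.
import cv2

def motion_detected(prev_gray, curr_gray, pixel_thresh=25, area_thresh=500):
    diff = cv2.absdiff(prev_gray, curr_gray)           # per-pixel change
    diff = cv2.GaussianBlur(diff, (5, 5), 0)           # suppress sensor noise
    _, mask = cv2.threshold(diff, pixel_thresh, 255, cv2.THRESH_BINARY)
    return cv2.countNonZero(mask) > area_thresh        # True => something moved
```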
-
Hmmm, the shutdown function could 'or' up multiple algorithms. The motion detection on a wide view could reveal how fast a hand is approaching; if too fast to always avoid a descending tool, shut down. A stationary hand holding the material would be much more common, though. Many materials, textiles for example, are punched with a foot-operated switch without clamping, and both hands are often in danger. Even the operators, whose limbs are in danger, despise safety devices that interfere with their work. I can't count the times that I've seen disabled safety devices. That's what I like about an out-of-the-way camera or a light-level detector being the decision maker.