omz:forum



    In ui how to implement gestures

    Pythonista
    • zencuke
      zencuke last edited by

      I can see how to subclass View and I see how to turn on multi-touch. In my simple example I see the touch events. What I don't understand is how to process only some multi-touch events (say, a swipe in the top enclosing view) but leave the rest for the individual subviews to handle. How are touch events passed through the view hierarchy? Do all enabled views see all events that happen within their frame?

      Also, I see mcsquaredjr's moose gesture example gist. Does anyone know of a gesture-processor module that handles more of the standard iOS gestures?

      • JonB
        JonB last edited by

        The problem that you will have is that events don't bubble, and you can't generate your own events to pass up or down. The topmost view that is hit gets the event, and that's where the event stops.

        Now, with enough work, it is possible to create your own custom versions of many of the built-in classes in ui. Then you can implement the touch events and build your own sort of callback system.
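
        For illustration, here is a minimal sketch of that idea (my own toy example, not the PopupButton code linked below): a ui.View subclass that behaves like a simple button and drives an action callback from its own touch_began/touch_ended handlers.

        import ui
        import console

        class CustomButton(ui.View):
            # A tiny custom "button": as a custom View subclass it receives
            # touch_began/touch_ended itself and can run any callback logic.
            def __init__(self, title='Tap me', action=None):
                self.action = action
                self.background_color = '#dddddd'
                self.label = ui.Label()
                self.label.text = title
                self.label.alignment = ui.ALIGN_CENTER
                self.label.touch_enabled = False  # let touches reach this view
                self.add_subview(self.label)

            def layout(self):
                # Called by ui whenever the view is resized.
                self.label.frame = self.bounds

            def touch_began(self, touch):
                self.background_color = '#bbbbbb'  # pressed-state feedback

            def touch_ended(self, touch):
                self.background_color = '#dddddd'
                loc = touch.location
                # Only fire if the touch ended inside the view, like a real button.
                if 0 <= loc.x <= self.width and 0 <= loc.y <= self.height:
                    if callable(self.action):
                        self.action(self)

        def tapped(sender):
            console.hud_alert('custom button tapped')

        v = ui.View(frame=(0, 0, 320, 480), background_color='white')
        btn = CustomButton(action=tapped)
        btn.frame = (60, 60, 200, 44)
        v.add_subview(btn)
        v.present('sheet')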

        See for example https://github.com/jsbain/uicomponents/blob/master/PopupButton.py

        I had to implement a custom view to handle a long-touch event for a button. Also, in this case I wanted to pop up a view with another list of buttons, and if your touch ended on one of those buttons, to call that button's action. See my touch_moved and touch_ended... Basically, the class that handled the initial touch event is the one whose callbacks keep getting called, so that function needs to check whether the touch has moved onto some other button, and then decide to call that action, etc. If you wanted to be able to pinch-zoom, etc., I think you'd have to implement all custom views and somehow manage sending events up to some master touch-controller class.
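
        As a rough, hypothetical sketch of just the long-touch part (this is not the PopupButton code, and the 0.6-second threshold is an arbitrary choice): record when the touch started in touch_began and decide in touch_ended whether it was a long press or a normal tap.

        import time
        import ui
        import console

        class LongPressButton(ui.View):
            # Hypothetical sketch: distinguish a tap from a long press when the
            # touch ends, based on how long the finger was held down.
            LONG_PRESS_SECONDS = 0.6

            def __init__(self, action=None, long_press_action=None):
                self.action = action
                self.long_press_action = long_press_action
                self.background_color = '#cccccc'
                self._touch_started = None

            def touch_began(self, touch):
                self._touch_started = time.time()

            def touch_ended(self, touch):
                if self._touch_started is None:
                    return
                held = time.time() - self._touch_started
                self._touch_started = None
                loc = touch.location
                if not (0 <= loc.x <= self.width and 0 <= loc.y <= self.height):
                    return  # finger slid off the control; treat as cancelled
                if held >= self.LONG_PRESS_SECONDS and self.long_press_action:
                    self.long_press_action(self)  # e.g. pop up a list of extra buttons
                elif self.action:
                    self.action(self)

        def tap(sender):
            console.hud_alert('tap')

        def long_press(sender):
            console.hud_alert('long press')

        v = ui.View(frame=(0, 0, 320, 480), background_color='white')
        b = LongPressButton(action=tap, long_press_action=long_press)
        b.frame = (60, 60, 200, 44)
        v.add_subview(b)
        v.present('sheet')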

        The other option would be to have your main view be a WebView using JavaScript, and use jQuery etc. to do everything. Not very Pythonic, but it can work, and in some cases works better. For example, in panel display, touch_moved stops firing because the panel wants to swipe back to the console, but if you have a WebView or ScrollView, the panel doesn't steal those events. You use a custom URI scheme to communicate back to Python.
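
        A small sketch of that WebView route (the bridge:// scheme and the markup are made up for illustration): JavaScript navigates to a custom URI, and the delegate's webview_should_start_load intercepts it in Python.

        import ui

        HTML = '''
        <html><body>
        <div style="width:100%;height:300px;background:#eee"
             ontouchend="location.href='bridge://touched'">
          touch me
        </div>
        </body></html>
        '''

        class Bridge(object):
            # WebView delegate: navigation to the made-up bridge:// scheme is
            # intercepted and handled in Python instead of being loaded.
            def webview_should_start_load(self, webview, url, nav_type):
                if url.startswith('bridge://'):
                    print('JavaScript sent: ' + url)  # parse the URL and react here
                    return False  # cancel the navigation
                return True

        wv = ui.WebView()
        wv.delegate = Bridge()
        wv.load_html(HTML)
        wv.present()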

        • omz
          omz last edited by

          I'm thinking about adding support for gesture recognizers, but right now, there's not really a good way to accomplish this with the existing functionality in the ui module.

          • zencuke
            zencuke last edited by

            Thanks guys,

            If I read these comments correctly, it sounds like the current implementation of multi-touch is really only useful in a view that doesn't contain other ui elements which need touch events, for example a view that implements an interactive game, a picture-drawing frame, or a subview that lets you zoom/scroll an image. The parent view (and elsewhere in the hierarchy) could still use buttons and other controls.

            Can a custom view access keypad events? I.e., would it be possible to implement a custom TextField-like control which also supports multi-touch? I'm not sure I'd want to, but...

            OMZ: I would use multi-touch/gestures if you enabled them, but they are not at the top of my list of needed features.

            JonB: If I used jQuery for "everything" I would worry about potential brain damage, plus there are probably better platforms. I like Python because of the clean implementation; JavaScript gives me headaches. ;-)

            • JonB
              JonB last edited by

              The tricky part of a custom textfield is getting the keyboard to pop up and intercepting key events...
              I agree, JavaScript makes my head hurt too.

              What would be really awesome would be the ability to register event listeners, or some method to allow event bubbling, so that we can really customize things. I.e., we don't need gesture support per se, but the ability to run our own touch handlers on built-in components would let one implement whatever gestures one desires....

              (Actually, I'm not sure what happens if you multitouch such that one touch is on, say, a button, and another is just hitting the containing view.... Does the button get both touches? Or are they two separate events? Time to experiment)

              • JonB
                JonB last edited by

                Ok, interesting discoveries...

                1. touch_began, touch_moved, etc. are always called with a single ui.Touch, not with a list or tuple of touches.
                  That makes things a little trickier, since you basically need to keep track of each touch on your own. Deciding what to do when a touch ends, for example, depends on how many touches you were tracking... You might need to wait until all touches end, or have some sort of timer which switches things back to the single-touch case, etc... Tricky. (A rough tracking sketch follows this list.)

                2. If you add a generic ui.View to your view, with its background color set to None, it effectively shields any built-in components underneath it (buttons, text views, etc.) from getting touch events. The surprising bit: the events go to the containing view. This allows implementation of custom event handling!
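
                Regarding point 1, here is a rough sketch (my own guess at a reasonable structure, untested) of tracking several concurrent touches by touch_id:

                import ui

                class MultiTouchView(ui.View):
                    # Each callback delivers a single ui.Touch, so keep a dict of
                    # live touches keyed by touch_id and act when the last one ends.
                    def __init__(self):
                        self.background_color = 'white'
                        self.multitouch_enabled = True
                        self.touches = {}  # touch_id -> latest location

                    def touch_began(self, touch):
                        self.touches[touch.touch_id] = touch.location

                    def touch_moved(self, touch):
                        self.touches[touch.touch_id] = touch.location
                        if len(self.touches) == 2:
                            print('two-finger drag: %s' % list(self.touches.values()))

                    def touch_ended(self, touch):
                        self.touches.pop(touch.touch_id, None)
                        if not self.touches:
                            print('all touches ended')

                MultiTouchView().present()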

                Proof of concept is here.
                I didn't try to do anything with multitouch, but here is something which simply interjects some prints before taking the normal action on a button or beginning to edit a textfield. But you should be able to see how this is nestable and customizable.
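
                I can't reproduce the actual proof of concept here, but a minimal sketch along those lines (hypothetical, and relying entirely on the shield behavior described in point 2) could look like this: a plain ui.View with background_color=None sits on top of a normal Button, the containing custom view receives the touches, prints something, and then calls the button's normal action.

                import ui
                import console

                class InterceptingView(ui.View):
                    # Sketch of the "shield" trick: the transparent plain ui.View on
                    # top keeps touches away from the button, and (per point 2) they
                    # arrive at this containing view instead.
                    def __init__(self):
                        self.background_color = 'white'
                        self.button = ui.Button(title='normal button')
                        self.button.frame = (60, 60, 200, 44)
                        self.button.bg_color = '#eeeeee'
                        self.button.action = lambda sender: console.hud_alert('button action')
                        self.add_subview(self.button)
                        self.shield = ui.View()
                        self.shield.background_color = None
                        self.add_subview(self.shield)

                    def layout(self):
                        self.shield.frame = self.bounds  # keep the shield covering everything

                    def touch_ended(self, touch):
                        loc = touch.location  # in this view's coordinates
                        b = self.button
                        if b.x <= loc.x <= b.x + b.width and b.y <= loc.y <= b.y + b.height:
                            print('intercepted a tap on the button')  # custom handling first...
                            b.action(b)                               # ...then the normal action

                InterceptingView().present()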

                TextFields/TextViews are going to be especially tricky to handle correctly this way. You are going to have to implement some custom way of figuring out where in the text stream a particular location corresponds to, if you want to be able to move the cursor around properly, etc. That means figuring out how the text is going to wrap, etc. I think it might be doable, but it will be a real pain!

                • zencuke
                  zencuke last edited by

                  I assumed that was what touch_id was for. I figured I'd just maintain a database of statuses for all open touches. At each touch event, log the event, then call a common gesture-detector function that considers the entire pending touch database to decide whether a gesture has completed or action is required. No timer should be needed. When a gesture finishes, clear the database. There must be standard gesture algorithms out there one could lift. In fact, OMZ might be able to figure out how to use the built-in iOS gesture detector.

                  One thing I'd really like is sliding views, like the output/editor/library scrolling views in Pythonista, implemented with simple swipe gestures. That's what I was going to try first. The animation in Pythonista is nice, but I'd do without it if I could just switch between views with a swipe. Sliding a small view might be a useful way of implementing popup behavior as well.
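
                  A rough sketch of that kind of detector (hypothetical, deliberately minimal; the 80-point threshold is arbitrary): remember where each touch_id started and classify the movement as a swipe when it ends.

                  import ui

                  class SwipeView(ui.View):
                      # Minimal "touch database": record the start location of each
                      # touch and classify it in touch_ended; clear it afterwards.
                      SWIPE_DISTANCE = 80  # points of horizontal travel that count as a swipe

                      def __init__(self):
                          self.background_color = 'white'
                          self.start_locations = {}  # touch_id -> start location

                      def touch_began(self, touch):
                          self.start_locations[touch.touch_id] = touch.location

                      def touch_ended(self, touch):
                          start = self.start_locations.pop(touch.touch_id, None)
                          if start is None:
                              return
                          dx = touch.location.x - start.x
                          dy = touch.location.y - start.y
                          if abs(dx) > self.SWIPE_DISTANCE and abs(dx) > abs(dy):
                              print('swipe right' if dx > 0 else 'swipe left')
                              # here you would slide to the next/previous subview

                  SwipeView().present()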

                  • zencuke
                    zencuke last edited by

                    We don't really need a complex new event mechanism. I suspect the built-in iOS "bubbling" mechanism would be fine if it can be made visible in Python. Higher level views should be allowed to see events destined for subviews. I think the rest could be built on that.

                    Even filtering isn't necessary; it would be useful if it's simple to implement, but that could also be done in Python. I wouldn't even extend the existing view/widget classes for multi-touch. If you want to extend a Button or TextField etc. to handle multi-touch, just create a custom wrapper/container View to handle the extension and add_subview the base object into the wrapper's contents. Let the base object handle only the default behavior.

                    • JonB
                      JonB last edited by

                      If all you want is a sliding gesture to slide between views, a ScrollView with paging_enabled is probably a close approximation. It uses a one-finger slide only, but events do bubble... so you can have buttons, text views, etc. that all work normally, but you can slide them around. Add > and < buttons in the corners to complete the effect. Also, scrollview_did_scroll gets fired while you are dragging, allowing you to jump, for example if you drag by more than a certain amount.

                      import ui
                      import this
                      import console

                      # Two pages in a paging ScrollView: page one holds a big red button,
                      # page two holds a text view. Swipe, or use the corner chevrons, to
                      # slide between them.
                      root = ui.ScrollView()
                      root.present()
                      root.content_size = (2 * root.width, root.height)
                      root.paging_enabled = True
                      root.bg_color = 'white'

                      b = ui.Button(frame=(50, 50, 200, 200), bg_color='red')

                      # Second page: fill a TextView with the Zen of Python, decoded from
                      # the rot13 text in the `this` module.
                      tv = ui.TextView(frame=(root.width + 20, 20, root.width - 40, root.height - 40),
                                       bg_color=(0.95, 0.95, 0.95))
                      root.add_subview(tv)
                      tv.text = ''.join([this.d.get(c, c) for c in this.s])

                      # Chevron buttons in the top corners for jumping between pages.
                      cornerbutton1 = ui.Button(frame=(root.width - 32, 0, 32, 32))
                      cornerbutton1.image = ui.Image.named('ionicons-chevron-right-32')
                      cornerbutton1.name = 'right'

                      cornerbutton2 = ui.Button(frame=(root.width, 0, 32, 32))
                      cornerbutton2.image = ui.Image.named('ionicons-chevron-left-32')
                      cornerbutton2.name = 'left'

                      root.add_subview(cornerbutton1)
                      root.add_subview(cornerbutton2)

                      def a(sender):
                          console.hud_alert('i am a button')

                      def slide(sender):
                          # Animate the content offset one page to the left or right.
                          def ani():
                              if sender.name == 'right':
                                  sender.superview.content_offset = (root.content_offset[0] + root.width, 0)
                              else:
                                  sender.superview.content_offset = (root.content_offset[0] - root.width, 0)
                          ui.animate(ani, duration=0.5)

                      cornerbutton1.action = cornerbutton2.action = slide
                      b.action = a
                      root.add_subview(b)
                      root.add_subview(ui.Slider(frame=(20, 300, 500, 100)))  # just to see if normal controls work
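
                      Since scrollview_did_scroll is mentioned above but not shown, here is a small, hedged sketch of watching it while the user drags (it just reports the current page; you could instead set content_offset yourself once a drag passes some threshold):

                      import ui

                      class PageWatcher(object):
                          # scrollview_did_scroll fires continuously while dragging, so the
                          # delegate can watch the offset and react to it.
                          def __init__(self, label):
                              self.label = label

                          def scrollview_did_scroll(self, scrollview):
                              page = int(round(scrollview.content_offset[0] / scrollview.width))
                              self.label.text = 'page %d' % page

                      sv = ui.ScrollView(bg_color='white')
                      sv.paging_enabled = True
                      sv.present()
                      sv.content_size = (2 * sv.width, sv.height)
                      status = ui.Label(frame=(10, 10, 150, 32))
                      sv.add_subview(status)
                      sv.delegate = PageWatcher(status)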
                      
                      • zencuke
                        zencuke last edited by

                        Cool. Thanks.

                        Note: There is a minor typo you might fix for the next person to read this thread.

                        ,root.height-40)0,

                        The trailing 0 before the comma.
