Home AMXForums Archive Threads AMX Hardware Feature Requests

Support for touch gestures

s.strahammer Junior Member Posts: 17
Hi,

it would be really nice if TPControl supported touch gestures - swipes, for example. One way to implement this would be to handle touch gestures like hard buttons on other panels - in TPDesign I'd like to select "Show external controls" > "Left to right swipe" and assign pageflips, channel codes, etc. to it.

This would be a great way to improve the user experience when using TPControl!

Comments

  • Touch Panel Control Team AMX Authorized Product Partner Posts: 337
    I am pleased to say that we are already working on this very thing. We hope to add support using the same method AMX has developed for their new panels, so that the experience is consistent, but we will release further details when it is available.

    Do you have any thoughts on what gestures you might like to see in the solution?

    Kind Regards
    Touch Panel Control Team
  • a_riot42 AMX Wizard Posts: 1,619
    Touch Panel Control Team wrote: »
    Do you have any thoughts on what gestures you might like to see in the solution?

    Is list scrolling using finger swiping possible, à la Sonos, iPhone, etc.?
    Thanks,
    Paul
  • hodeyp Junior Member Posts: 104
    It would be great to simulate cursor control with swipes. This way a user doesn't have to be as accurate with clicking up/down/left/right, but can just swipe the cursor area in a particular direction.
  • s.strahammer Junior Member Posts: 17
    Touch Panel Control Team wrote: »
    Do you have any thoughts on what gestures you might like to see in the solution?

    Hi,

    I especially thought about list scrolling and getting from one page to the next or previous one.
  • DHawthorne Junior Member Posts: 4,584
    s.strahammer wrote: »
    I especially thought about list scrolling and getting from one page to the next or previous one.

    Agreed. Because of the prevalence of Apple devices, this is getting to be the expected behavior. I find myself swiping lists all the time and getting frustrated when they don't respond, until I remember it's not supported (yet).
  • ericmedley Senior Member - 4000+ posts Posts: 4,177
    DHawthorne wrote: »
    Agreed. Because of the prevalence of Apple devices, this is getting to be the expected behavior. I find myself swiping lists all the time and getting frustrated when they don't respond, until I remember it's not supported (yet).

    Yeah, this has happened to us repeatedly with new client prospects during demos. Everyone just expects to be able to swipe menus or lists. It's funny how they seem to feel that having to hit a button is 'klunky'.
  • PhreaK Senior Member Posts: 966
    ericmedley wrote: »
    It's funny how they seem to feel having to hit a button feels 'klunky'
    It's not funny, it's human. Apple have invested a lot in ensuring their devices provide a consistent experience, both with the internal OS guff and the apps users can download - it's one of the advantages of having a closed system where everything must get overlord Jobs' (or his minions') approval. The main issue with emulating an AMX UI on these devices is that when anyone picks up an iDevice their brains are expecting those same interaction patterns and the standardised experience.
  • sridley Junior Member Posts: 21
    Gesture Support

    I have seen that AMX has released TPD4 with gesture support for the new 9-inch AMX panel. Does this mean we are any closer to seeing it on the iPad with TPControl? It's the last missing link, as far as I can see, in what is a great product.
  • Jorde_V UX Scientist Posts: 393
    Any updates on this?
  • John Nagy CineTouch Product Manager Posts: 1,546
    While I have no information on the TPControl plan for gestures, I find the AMX implementation unusable.

    For gestures to be recognized, you need to start the touch in a location with no buttons, and that's not how users expect to use gestures. If you attempt to swipe but happen to touch a button at the moment of contact, that button fires. The panel is unable to discern a swipe from a touch. All iThings do this fine. While I can hope this will change, history does not encourage me.
  • gsmith Ex AMX Engineering Posts: 59
    We have tried to thread a needle between supporting gestures and maintaining compatibility with existing expected behavior. Quite a bit of time was spent analyzing gesture/button press behavior on Apple and Android devices. We should be similar but further refinement may be necessary based on user feedback.

    With all that said, here is what should happen: If you start on a button, then recognition of a button press is delayed for .15 second. During that time if you move more than 60 pixels in any direction, the button press and release will not be processed and the motion will be processed as a gesture. If you don't move or move less than that, then after .15 second the button press will be processed, the release will be processed when you release and no gesture will be processed.

    Some screen objects by their nature preclude any kind of gesture processing such as computer control, joystick and movable popups.

    AMX is interested in your feedback since we do want to get this right. It is certainly possible to tighten up the distance but our primary concern was that button presses are not missed since that would probably annoy the user more.
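The timing rule gsmith describes can be sketched as a small state machine. This is a minimal illustration in Python, not AMX's actual firmware logic; the constant and class names are invented, while the 0.15-second delay and 60-pixel threshold are the values given in the post:

```python
import math

# Values from gsmith's description; the names are my own.
PRESS_DELAY_S = 0.15   # button press recognition is delayed this long
GESTURE_DIST_PX = 60   # moving farther than this within the delay => gesture

class TouchClassifier:
    """Decides whether a touch that starts on a button is a press or a gesture."""

    def __init__(self):
        self.start = None     # (x, y, t) of the initial contact
        self.decided = None   # None until classified

    def touch_down(self, x, y, t):
        self.start = (x, y, t)
        self.decided = None

    def touch_move(self, x, y, t):
        if self.decided is not None or self.start is None:
            return self.decided
        x0, y0, t0 = self.start
        dist = math.hypot(x - x0, y - y0)
        if t - t0 <= PRESS_DELAY_S and dist > GESTURE_DIST_PX:
            # Large movement inside the delay window: suppress the press.
            self.decided = "gesture"
        elif t - t0 > PRESS_DELAY_S:
            # Delay elapsed without enough movement: process as a press.
            self.decided = "button_press"
        return self.decided

    def touch_up(self, t):
        if self.decided is None:
            # Released before the delay elapsed, without gesturing: still a press.
            self.decided = "button_press"
        return self.decided
```

Note how the press/release is only suppressed when the movement threshold is crossed inside the delay window, which matches the stated goal of never missing a deliberate button press.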
  • mpullin Obvious Troll Account, Marked for Deletion Posts: 949
    gsmith wrote: »
    We have tried to thread a needle between supporting gestures and maintaining compatibility with existing expected behavior. Quite a bit of time was spent analyzing gesture/button press behavior on Apple and Android devices. We should be similar but further refinement may be necessary based on user feedback.

    With all that said, here is what should happen: If you start on a button, then recognition of a button press is delayed for .15 second. During that time if you move more than 60 pixels in any direction, the button press and release will not be processed and the motion will be processed as a gesture. If you don't move or move less than that, then after .15 second the button press will be processed, the release will be processed when you release and no gesture will be processed.

    Some screen objects by their nature preclude any kind of gesture processing such as computer control, joystick and movable popups.

    AMX is interested in your feedback since we do want to get this right. It is certainly possible to tighten up the distance but our primary concern was that button presses are not missed since that would probably annoy the user more.

    Would it be possible to make this an option in TPDesign4? Make a new General option for a button, right under "Touch Style", named "Gesture Recognition", with three possible values: 1) Push Only (same as it's always been), 2) Gesture Only (always process as a gesture), 3) Push and Gesture (do what you described above).

    That would make it easier to meet everyone's needs. I would personally strive to avoid using type 3 buttons whenever possible, because they would inevitably lead to adventures in user training and possible frustration.
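mpullin's proposed per-button option amounts to a three-way dispatch. A hypothetical sketch (the names are invented for illustration; this is not TPDesign4's actual API):

```python
from enum import Enum

# Hypothetical per-button setting, mirroring mpullin's three proposed values.
class GestureRecognition(Enum):
    PUSH_ONLY = 1         # always process as a press, never a gesture
    GESTURE_ONLY = 2      # always process as a gesture, never a press
    PUSH_AND_GESTURE = 3  # disambiguate using the delay/distance rule

def handle_touch(mode, moved_far_within_delay):
    """Return which event to fire for a touch that started on this button.

    moved_far_within_delay: True if the finger crossed the movement
    threshold inside the recognition-delay window.
    """
    if mode is GestureRecognition.PUSH_ONLY:
        return "press"
    if mode is GestureRecognition.GESTURE_ONLY:
        return "gesture"
    # PUSH_AND_GESTURE: fall back to movement-based disambiguation.
    return "gesture" if moved_far_within_delay else "press"
```

Only the third mode carries the disambiguation ambiguity; the other two are unconditional, which is why mpullin expects them to be the less confusing choices.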
  • John Nagy CineTouch Product Manager Posts: 1,546
    mpullin wrote: »
    3 values possible: 1) Push Only (same as it's always been) 2) Gesture only (always process as a gesture) 3) Push and Gesture (do what you described above)

    I don't think this addresses the problem. The issue is that if you want to gesture, there is no "button" you want to "gesture on". You incidentally encounter a button you don't actually want to press when you touch the panel to make the gesture... so you'd have to mark every button type 3 to get gestures - ever.
  • John Nagy CineTouch Product Manager Posts: 1,546
    gsmith wrote: »
    It is certainly possible to tighten up the distance but our primary concern was that button presses are not missed since that would probably annoy the user more.

    I heartily suggest reducing the distance for detection. In 20 minutes of testing without knowing these parameters, we were perplexed as to why gestures occasionally, but apparently randomly, worked when a button was the landing point. We didn't realize we had to start moving quickly after a touch.

    Perhaps these parameters could be adjustable at the panel, or via telnet to the panel. The delay seems reasonable, although this also explains why our MVP-9000i customers have mentioned that the panel seems sluggish compared to their other panels. But I'd think that if even 20 pixels of movement were detected within .15 second, it should be taken as a gesture. Easy to say that without actual testing, of course, and easy to be wrong about it.

    Another suggestion: To reduce error and missed pushes, if you think you detect a gesture, but there is no value for channel (i.e. 1,0) for the gesture, abort the gesture process and take the initial button press instead.

    Thanks for the details, but still not sure if I can use it as it is without customer training. Which seldom works.
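John Nagy's fallback suggestion, aborting a detected gesture when nothing is assigned to it and taking the button press instead, can be sketched as follows (all names hypothetical, for illustration only):

```python
# Hypothetical sketch of the suggested fallback: if a swipe is detected but no
# channel code is assigned to that gesture, process the original press instead.
def resolve_event(detected_gesture, gesture_channels, pressed_button):
    """gesture_channels maps gesture name -> assigned channel code."""
    channel = gesture_channels.get(detected_gesture)
    if channel is not None:
        return ("gesture", channel)
    # No channel assigned for this gesture: abort it, take the button press.
    return ("press", pressed_button)
```

The appeal of this rule is that an unconfigured gesture can never swallow a push, so panels that use no gestures at all would behave exactly as before.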
  • Auser Junior Member Posts: 506
    gsmith wrote: »
    With all that said, here is what should happen: If you start on a button, then recognition of a button press is delayed for .15 second.

    Seems like this is likely the culprit for an "annoying delay" in button presses which has been noted in the MVP-9000i thread:
    a_riot42 wrote: »
    [...]

    I've noticed an annoying delay or insensitivity in the panel as well. I can touch a button and have my finger on it and still not activate the button. If I successfully activate the button there is a 200-300 ms delay before the button changes its state from off to on. Other panels don't do this running the same code/panel file. Anyone else noticed anything similar?
    Paul

    Looks like, with the addition of the push notification -> master processing -> feedback visualisation delay, the lag is enough to reduce users' confidence in the UI.
  • PhreaK Senior Member Posts: 966
    Auser wrote: »
    Looks like, with the addition of the push notification -> master processing -> feedback visualisation delay, the lag is enough to reduce users' confidence in the UI.

    The 150ms delay in push notification itself is significant enough to cause a disconnect between the action and the response, let alone the additional delay as the messaging and control passes through the system. Were there any usability studies done on this prior to deployment? If not, I'd highly recommend utilizing the services of a UX geek to get the gestural control playing nicely. If the dev budget doesn't have room for that and you have some keen employees, there are masses of info and loads of studies freely available on different perceptual models and some pseudo golden rules that will help it move in the right direction. Have a look at this post for a start.