NX-3200: camera control oddity with a timeline-based replacement for Define_Program
fogled@mizzou
Posts: 549
I've always used the Define_Program section to handle all my buffer processing, and it has worked well. It continues to work well even after moving to the NX series and pulling all that processing out of Define_Program into a repeating timeline. Except for camera control.
Standard VISCA control: the PTZ buttons are programmed to start the pan on push and end it on release, and holding for a second sends another, faster-rate pan command. This was all fine and seemed to operate the cameras at their minimum pan/tilt/zoom step. But since moving to the timeline with a specified wait time, the buffer processing suddenly makes the "end on release" part much, much slower. So a quick tap on a pan button now produces 2-3 degrees of motion instead of just a degree or so. I can change the timeline interval (I've tried everything from 25 to 250 ms) and see a corresponding change in how long the camera spends on the command.
I suppose the fix for this is to take the camera command buffer processing out of that queue loop and send the commands directly. But I rely on that buffer processing to make sure I get a response back before sending another command; without it, the controls sometimes overdrive the camera and confuse it.
Right now I've got my program loop at 50ms, and the camera response seems reasonably good (can't tell a big difference between 25 and 50), and I don't see any other ill effects in the controller.
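For reference, here's a stripped-down sketch of the shape of it. The device numbers, channel number, queue size, and the simple "$FF means the reply is done" check are placeholders and simplifications, not my actual code (the hold-for-faster-speed repeat is left out):

DEFINE_DEVICE
dvCAM = 5001:1:0    // camera serial port (placeholder)
dvTP  = 10001:1:0   // touch panel (placeholder)

DEFINE_CONSTANT
LONG TL_CAM_QUEUE = 1
INTEGER BTN_PAN_LEFT = 1    // placeholder channel

DEFINE_VARIABLE
VOLATILE LONG lQueueTimes[] = {50}     // the repeat interval I keep tweaking (25-250)
VOLATILE CHAR cCamQueue[20][16]        // simple FIFO of VISCA strings
VOLATILE INTEGER nQueueCount
VOLATILE INTEGER nWaitingForReply

// push a VISCA string onto the FIFO; the timeline sends it later
DEFINE_FUNCTION fnQueueCmd(CHAR cCmd[])
{
    IF (nQueueCount < 20)
    {
        nQueueCount++
        cCamQueue[nQueueCount] = cCmd
    }
}

DEFINE_START
TIMELINE_CREATE(TL_CAM_QUEUE, lQueueTimes, 1, TIMELINE_RELATIVE, TIMELINE_REPEAT)

DEFINE_EVENT

BUTTON_EVENT[dvTP, BTN_PAN_LEFT]
{
    PUSH:
    {
        fnQueueCmd("$81,$01,$06,$01,$01,$01,$01,$03,$FF")   // pan left at minimum speed
    }
    RELEASE:
    {
        fnQueueCmd("$81,$01,$06,$01,$01,$01,$03,$03,$FF")   // pan/tilt stop
    }
}

DATA_EVENT[dvCAM]
{
    STRING:
    {
        // any $FF-terminated reply clears the gate for the next command
        IF (FIND_STRING(DATA.TEXT, "$FF", 1))
            nWaitingForReply = 0
    }
}

TIMELINE_EVENT[TL_CAM_QUEUE]
{
    STACK_VAR INTEGER i

    // drain one command per tick, but only after the previous one was answered
    IF (nQueueCount && !nWaitingForReply)
    {
        SEND_STRING dvCAM, cCamQueue[1]
        nWaitingForReply = 1

        FOR (i = 1; i < nQueueCount; i++)       // shuffle the FIFO down
            cCamQueue[i] = cCamQueue[i + 1]
        nQueueCount--
    }
}

Written that way, a RELEASE that lands just after a tick sits in the queue for up to a full interval (plus the reply gate) before the stop string goes out, which is presumably where the extra travel is coming from.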
Any comments on this from the peanut gallery?
Comments
Maybe I should put buffer processing in a really tight timeline, but panel feedback in another slower one?
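Something like this, maybe: one tight timeline that only drains the camera buffer and a slower one that only does panel feedback (the IDs and intervals are just placeholders):

DEFINE_CONSTANT
LONG TL_CAM_QUEUE = 1
LONG TL_FEEDBACK  = 2

DEFINE_VARIABLE
VOLATILE LONG lCamQueueTimes[] = {25}    // tight: camera buffer only
VOLATILE LONG lFeedbackTimes[] = {250}   // slow: panel feedback only

DEFINE_START
TIMELINE_CREATE(TL_CAM_QUEUE, lCamQueueTimes, 1, TIMELINE_RELATIVE, TIMELINE_REPEAT)
TIMELINE_CREATE(TL_FEEDBACK, lFeedbackTimes, 1, TIMELINE_RELATIVE, TIMELINE_REPEAT)

DEFINE_EVENT

TIMELINE_EVENT[TL_CAM_QUEUE]
{
    // send the next buffered camera command here, nothing else
}

TIMELINE_EVENT[TL_FEEDBACK]
{
    // touch panel feedback only
}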
For VISCA cameras I usually do the sending of control strings in the button_event, without any dependence on getting a reply before sending. If the camera didn't start or stop moving, the user will normally just hit the button again without giving it much thought (but this rarely happens). I think VISCA cameras are used to being attached to some very dumb camera controllers.
The reply processing happens in the data_event string section, with no timelines other than the normal 250ms feedback timeline, which updates anything that didn't get set by the data_event.
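A skeleton of that shape (device and channel numbers are placeholders, and the VISCA strings are the generic pan-left/stop ones):

DEFINE_DEVICE
dvCAM = 5001:1:0    // camera serial port (placeholder)
dvTP  = 10001:1:0   // touch panel (placeholder)

DEFINE_CONSTANT
LONG TL_FEEDBACK = 1
INTEGER BTN_PAN_LEFT = 1    // placeholder channel

DEFINE_VARIABLE
VOLATILE LONG lFeedbackTimes[] = {250}

DEFINE_START
TIMELINE_CREATE(TL_FEEDBACK, lFeedbackTimes, 1, TIMELINE_RELATIVE, TIMELINE_REPEAT)

DEFINE_EVENT

BUTTON_EVENT[dvTP, BTN_PAN_LEFT]
{
    // sent immediately, no waiting on a reply
    PUSH:
    {
        SEND_STRING dvCAM, "$81,$01,$06,$01,$01,$01,$01,$03,$FF"   // pan left, minimum speed
    }
    RELEASE:
    {
        SEND_STRING dvCAM, "$81,$01,$06,$01,$01,$01,$03,$03,$FF"   // pan/tilt stop
    }
}

DATA_EVENT[dvCAM]
{
    STRING:
    {
        // parse ACK/completion/inquiry replies here and update state as they arrive
    }
}

TIMELINE_EVENT[TL_FEEDBACK]
{
    // catch any panel feedback the data_event didn't already set
}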
The most obvious, easy, and sure-fire thing to do would be to move the buffer processing directly into the data_event instead of one-offing it as a separate process. Thanks for pointing it out!
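So roughly this, I think. Same placeholder style as before, and the "$FF ends the reply" check is a simplification:

DEFINE_DEVICE
dvCAM = 5001:1:0    // camera serial port (placeholder)
dvTP  = 10001:1:0   // touch panel (placeholder)

DEFINE_CONSTANT
INTEGER BTN_PAN_LEFT = 1    // placeholder channel

DEFINE_VARIABLE
VOLATILE CHAR cCamQueue[20][16]     // FIFO of pending VISCA strings
VOLATILE INTEGER nQueueCount
VOLATILE INTEGER nWaitingForReply

// send the head of the queue (if any) and shift the rest down one slot
DEFINE_FUNCTION fnSendNext()
{
    STACK_VAR INTEGER i

    IF (nQueueCount && !nWaitingForReply)
    {
        SEND_STRING dvCAM, cCamQueue[1]
        nWaitingForReply = 1

        FOR (i = 1; i < nQueueCount; i++)
            cCamQueue[i] = cCamQueue[i + 1]
        nQueueCount--
    }
}

// queue a command and kick the queue; if nothing is outstanding it goes out right away
DEFINE_FUNCTION fnQueueCmd(CHAR cCmd[])
{
    IF (nQueueCount < 20)
    {
        nQueueCount++
        cCamQueue[nQueueCount] = cCmd
    }
    fnSendNext()
}

DEFINE_EVENT

BUTTON_EVENT[dvTP, BTN_PAN_LEFT]
{
    PUSH:
    {
        fnQueueCmd("$81,$01,$06,$01,$01,$01,$01,$03,$FF")   // pan left, minimum speed
    }
    RELEASE:
    {
        fnQueueCmd("$81,$01,$06,$01,$01,$01,$03,$03,$FF")   // pan/tilt stop
    }
}

DATA_EVENT[dvCAM]
{
    STRING:
    {
        IF (FIND_STRING(DATA.TEXT, "$FF", 1))   // camera finished replying
        {
            nWaitingForReply = 0
            fnSendNext()    // next queued command goes out now, no timeline tick to wait for
        }
    }
}

That keeps the one-outstanding-command behavior but takes the timeline interval out of the release path entirely.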