Quick serial slowdown / buffer?
fogled@mizzou
Posts: 549
I've inherited a system that drives a 3-unit BiAmp audio DSP. I've recently discovered that it is swamping the BiAmp with commands and the BiAmp is dropping commands occasionally. I know I need to implement a serial buffer, and I'm doing that in an update I'm working on. I found some pretty good code examples in the forum already.
But the existing codebase uses literally hundreds of discrete SEND_STRING dvBiAmp,"'Blah Blah',$0D" commands. Is there a way to slow down the serial commands without having to do what amounts to a sizeable code rewrite on a codebase I'm abandoning anyway?
Thanks,
Comments
One strategy would be to make a big buffer bufBIAMP, and have the string event do nothing but append the incoming data to it. Then you call some function every second or so in your DEFINE_PROGRAM to search bufBIAMP for a complete command and send THAT command on to dvBiAmp. That would be one way to get what you want.
Nice idea. You don't actually have to do the string replace though, you can just change the device name in the define_device section.
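For what it's worth, the name swap can be done entirely in DEFINE_DEVICE, something like this (a sketch; the port and virtual device numbers here are assumptions):

```netlinx
DEFINE_DEVICE

// Real RS-232 port, renamed so the existing code no longer targets it directly
rdvBiAmp = 5001:1:0   // port number is an assumption

// Virtual device that keeps the ORIGINAL name, so the hundreds of existing
// SEND_STRING dvBiAmp,"..." lines compile unchanged and now feed the virtual
dvBiAmp = 33001:1:0
```

With that in place, a DATA_EVENT on dvBiAmp can collect everything the old code sends and dole it out to rdvBiAmp at a controlled pace.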
However I don't see how this will work because the buffer will just get full. What is causing the swamping? Is it perhaps volume adjustments or is it status requests?
No status requests. The swamping is caused by poor code, really. It's sending a "RECALL 0 PRESET XXXX" then up to 10 individual "SET X INPMUTECS X XX 1" commands in a row. I bet I only have to give the BiAmp a few extra ticks between commands and it will keep up OK.
Thanks for the idea guys - I think the VDV is do-able.
OK, I'm working on this and not getting anywhere. Does the "string" device data event handler see strings sent, or just ones received? Here's my code so far:
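The code itself was lost from this archive; judging by the replies that follow, the attempt looked roughly like this (a reconstruction with assumed device numbers, and note that it watches the wrong device):

```netlinx
DEFINE_DEVICE
rdvBiAmp = 5001:1:0    // real serial port (number assumed)
dvBiAmp  = 41001:1:0   // virtual device keeping the original name

DEFINE_VARIABLE
CHAR cBiAmpBuf[2048]

DEFINE_START
CREATE_BUFFER rdvBiAmp, cBiAmpBuf   // runtime fills the buffer automatically

DATA_EVENT[rdvBiAmp]   // fires only on data received FROM the BiAmp,
{                      // never on strings sent TO it
    STRING:
    {
        cBiAmpBuf = "cBiAmpBuf, DATA.TEXT"   // double-buffers on top of CREATE_BUFFER
    }
}
```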
So a send_string gets sent to the virtual device. The virtual device should be putting the command string into the buffer. But in the data_event for the virtual device, I never get any feedback that indicates the device is picking up the string.
Do I need to use "send_command" for the string send, and the "command" data event handler to get stuff sent to the virtual device?
Thanks for any help,
Or you could just try upping the baud rate on the BiAmp. I recently finished a system that fires out a shedload of commands to the BiAmp and gets a heap back, and it never had a problem with dropping commands; it works flawlessly. But I did have to put it up to 115200; at 9600 it had issues.
Worth a try
What model BiAmp is it, and what firmware version is it running? This is a 3-unit Nexia stack. AFAIK, this was never a problem with the BiAmp before I updated it to v1.4.20. Now I see there's another firmware release for the Nexia units: v1.4.30.
When I talked to BiAmp engineers about it, they insisted that the controller needed to wait for the "+OK" reply before sending the next command. I also asked them about changing the baud rate, but they didn't think it would make any difference.
I initially thought that slowing down the baud rate (not speeding it up) might give the biamp enough time to respond to one command before the next one came in. But maybe speeding up the baud rate will create a larger timing gap between commands. I'll certainly give it a try.
I'll see if I can find out, but that job has been signed off and accepted so I don't see myself going back there anytime soon
The CREATE_BUFFER call asks the runtime system to do the same job that appending DATA.TEXT in the STRING handler would do manually; as a result the strings are getting into the buffer automagically and your debug code never sees them.
Are you saying that if you have a CREATE_BUFFER for a device, then the STRING handler for the DATA_EVENT of said device won't fire? The code you posted will actually double buffer all the data coming in.
CREATE_BUFFER will do its thing first and then the STRING handler will fire. This can be demonstrated with the following code:
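The demonstration code was lost from this thread; an equivalent sketch follows (the device numbers and the 'hello' test string are assumptions, and the original added a line counter to its debug output):

```netlinx
DEFINE_DEVICE
dvTP    = 10001:1:0   // touch panel (assumed)
vdvTest = 33002:1:0   // virtual device used as the test target

DEFINE_VARIABLE
CHAR cBuffer[1024]

DEFINE_START
CREATE_BUFFER vdvTest, cBuffer   // the runtime appends incoming data first...

DATA_EVENT[vdvTest]
{
    STRING:
    {
        // ...and THEN this handler fires; appending DATA.TEXT again
        // double-buffers the data
        cBuffer = "cBuffer, DATA.TEXT"
        SEND_STRING 0, "'DATA.TEXT= ', DATA.TEXT"
        SEND_STRING 0, "'cBuffer = ', cBuffer"
    }
}

BUTTON_EVENT[dvTP,1]
{
    PUSH:
    {
        SEND_STRING vdvTest, 'hello'
    }
}
```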
And here is the output after 2 pushes of [dvTP,1]
Line 1 :: DATA.TEXT= hello - 15:41:15
Line 2 :: cBuffer = hello chello - 15:41:15
Line 3 :: DATA.TEXT= hello - 15:41:21
Line 4 :: cBuffer = hello chello hello chello - 15:41:21
If I misinterpreted what you were trying to say then never mind.
OK, I understand the double buffer issue - my code would have done that. Problem is, I never get ANYTHING in my buffer. I believe that's because I'm actually sending the string, not receiving it. The "string" event only captures incoming strings, not outgoing strings, right?
So is there any way to redirect a string sent to a device to a buffer instead without changing all the lines of code (literally hundreds) that send the string to the device?
Correct.
Try changing your virtual device to something like 33001:1:0 and see if that gets you any further.
Oh, I actually copied a virtual device definition from a system that uses a TOA amplifier and the TOA module. That uses 41001-41020 for its virtual devices. Is that address range reserved for virtual devices used in modules?
I'll try the 33001 address and see what happens.
I reread this thread and I'd like to try and clear a few things up.
1) SEND_STRING dvRealDevice, "'some data'" sends 'some data' OUT TO the real device. The STRING handler in the DATA_EVENT for dvRealDevice will NOT fire. If that real device returns data THEN the STRING handler for dvRealDevice will fire.
2) SEND_STRING vdvVirtualDevice, "'some data'" sends 'some data' TO the virtual device and the STRING handler for vdvVirtualDevice WILL fire.
3) Given points 1 and 2, your original code in post #6 will not work as intended because it is looking for data FROM the real device, putting that data into a buffer (twice), and then sending that same data back OUT TO the real device after it's been passed INTO a virtual device.
I believe you already have figured out this tangled web but I just want to make sure.
Not directly, but you can accomplish what you need by taking Matt's sound advice in post #2 along with what NMarkRoberts alluded to in post #3 (if you want to avoid the search and replace). Here's an example using the devices you posted in your code:
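The example itself didn't survive in this archive; a sketch of what's being described (the rename-plus-virtual redirect, with assumed device numbers and a $0D command terminator):

```netlinx
DEFINE_DEVICE
rdvBiAmp = 5001:1:0    // real serial port, renamed (number assumed)
dvBiAmp  = 33001:1:0   // virtual device with the ORIGINAL name, so the
                       // existing SEND_STRINGs land here unchanged

DEFINE_VARIABLE
CHAR cQueue[4096]
INTEGER nPacing

DATA_EVENT[dvBiAmp]
{
    STRING:
    {
        cQueue = "cQueue, DATA.TEXT"   // queue everything the old code sends
    }
}

DEFINE_PROGRAM
// Drip one complete command out at a time; $0D matches the terminator the
// existing code appends (use $0A instead if that's what your code sends)
IF (!nPacing && FIND_STRING(cQueue, "$0D", 1))
{
    SEND_STRING rdvBiAmp, REMOVE_STRING(cQueue, "$0D", 1)
    nPacing = 1
    WAIT 2 { nPacing = 0 }   // roughly 0.2 s between commands
}
```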
HTH
As the code was clearly wrong in double dipping I guessed that was causing the problem - looks like it wasn't.
Thank you for your explanation of string handling. I was confused about that. Now I'm not. ;-)
But... your code above really is backwards from what I need. The existing code (I didn't write it) has an absurd number of discrete SEND_STRING dvBiAmp,"'Blah blah blah',$0A" commands in it. Bunches of them, right in a row. The communication to the devices just isn't function-ized.
The BiAmp system, a 3-unit Nexia stack, is occasionally returning errors when I throw more than 3 commands in a row at it; it returns errors about half the time with more than 5 commands in a row, and almost always if I'm up over 8 commands - especially if one of them is a RECALL PRESET. BiAmp engineers insist that their device be fed one command at a time, and the unit must return a "+OK" before it can be sent another command.
So, what I was doing was this:
RENAME the real device from dvBiAmp to rdvBiAmp
CREATE a virtual device with the original dvBiAmp name that would get all the send_string commands from the rest of the code
CAPTURE strings now being sent to the virtual "dvBiAmp" device into a buffer
CAPTURE replies coming back from the real "rdvBiAmp" device
USE a simple flag to control the flow of command strings from the buffer: flag initializes as true (can send). When the flag is true and there is a complete command string in the buffer, set the flag to false and send the command string. When the required "+OK" comes back from the real BiAmp, set the flag to true again. Lather, rinse, repeat.
I have not been able to really test the baud rate change - it turns out our IT department jacked up the wireless network over the weekend so the touchpanel is offline. Might be able to do that this morning.
I will also try using the correct range for virtual devices.
You may want to consider implementing some sort of timeout for the '+OK' as a fail-safe (and/or check the buffer length before adding to it). If the BiAmp ever becomes disconnected for any reason, you don't want to overflow your buffer and possibly muck up operations.
As a side note, since you're sending an internal message to a virtual device you're guaranteed to receive complete packets (assuming the original SEND_STRINGs are complete to begin with). It's one of the advantages of virtual devices.
Have fun and let us know how it turns out. If you're lucky you might not have to change a thing other than up the baud rate as trav suggested.
Erm... I'm flailing on a good implementation of this. I can:
1. Put some kind of loop in DEFINE_PROGRAM that eventually hits a limit and resets the flag. I've got it working, but it's got to be about the least efficient method of doing this...
2. Trap for an OnChange for the flag, then a wait, and a reset? That should be fairly efficient
3. Other suggestions?
BTW, I'm actually working on another system right now - this system's BiAmp is returning errors too, but despite the errors, it looks like all the commands are being executed in the BiAmp. Upping the baud rate (only) didn't eliminate the error responses.
Upping the baud rate to 115200 did not solve the problem; I was still getting occasional errors. The code I ended up with that worked is:
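The posted code didn't make it into this archive; from the description earlier in the thread, the working version was roughly this (a reconstruction; device numbers, names, and the $0D terminator are assumptions):

```netlinx
DEFINE_DEVICE
rdvBiAmp = 5001:1:0    // real serial port, renamed (number assumed)
dvBiAmp  = 33001:1:0   // virtual device with the original name

DEFINE_VARIABLE
CHAR cQueue[8192]
INTEGER nOkToSend = 1  // start out ready to send

DATA_EVENT[dvBiAmp]
{
    STRING:
    {
        cQueue = "cQueue, DATA.TEXT"   // queue outgoing commands
    }
}

DATA_EVENT[rdvBiAmp]
{
    STRING:
    {
        IF (FIND_STRING(DATA.TEXT, '+OK', 1))   // BiAmp acknowledged
            nOkToSend = 1
    }
}

DEFINE_PROGRAM
// Release exactly one complete command per '+OK' received
IF (nOkToSend && FIND_STRING(cQueue, "$0D", 1))
{
    nOkToSend = 0
    SEND_STRING rdvBiAmp, REMOVE_STRING(cQueue, "$0D", 1)
}
```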
No, there is not a timeout. I kept having trouble with the timeout function, and the reality is that if the BiAmp isn't responding correctly, I'm gonna get a phone call anyway. I tested this quite a bit yesterday, and I never got any errors or had the queue get gummed up.
Just to keep this and the "more stupid questions" thread at the top, I had to post an update. I know everyone must have started laughing after reading my quip, "I'll get a phone call anyway," about not implementing a timeout. Of COURSE I got the phone call! The BiAmp is just a bit too flaky with its responses. So, I finally got my head wrapped around the timeout:
In DEFINE_PROGRAM, I added a named wait to the end of the sending routine. In the STRING handler for the real device that reads responses from the actual BiAmp, I put a CANCEL_WAIT. 2 seconds is plenty long to wait - as someone else said, "an eternity for a serial device." Especially running at 115200 baud! I also added a couple more conditions that would be accepted as a valid response (the BiAmp would sometimes fail to send a CR or an LF with its response).
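That addition can be sketched like this (assuming a send routine guarded by an nOkToSend flag as described; 2 seconds is 20 tenths in NetLinx WAIT units, and the device number and function name are assumptions):

```netlinx
DEFINE_DEVICE
rdvBiAmp = 5001:1:0    // real serial port (number assumed)

DEFINE_VARIABLE
VOLATILE INTEGER nOkToSend = 1

DEFINE_FUNCTION fnSendNext(CHAR cCmd[])
{
    nOkToSend = 0
    SEND_STRING rdvBiAmp, cCmd
    // Fail-safe: if no recognizable reply arrives in 2 seconds,
    // free the queue anyway
    WAIT 20 'BIAMP TIMEOUT'
    {
        nOkToSend = 1
    }
}

DATA_EVENT[rdvBiAmp]
{
    STRING:
    {
        // Accept '+OK' even when the trailing CR/LF is missing
        IF (FIND_STRING(DATA.TEXT, '+OK', 1))
        {
            CANCEL_WAIT 'BIAMP TIMEOUT'
            nOkToSend = 1
        }
    }
}
```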
Very easy, and works well (no, I haven't gotten another phone call) - once I got the named wait and cancel part figured out. Now I'm really digging the virtual devices and 2-way serial communications. MUCH better control over the devices! I've already updated 2 other projects, solving a similar problem with CATV tuners and implementing some really cool feedback on panels controlling Polycom systems.
Thanks everyone for the ideas and help!