AMX User Forum - NetLinx Studio

JSON Parsing and DirecTV DVR

Does anyone have a working JSON Parser built in Netlinx?

Creating a module for 2-way control of DirecTV HR boxes is next on my todo list and all of their replies are in the standard JSON format.

It would be really nice to not have to reinvent the wheel for a web standard.

Thanks.

Comments

  • jjames Posts: 2,908
    Just curious if you've worked very much on this yet?

    I have a working parser - however, it seems painfully slow (10 seconds to parse a 17,000+ character response) on an NI-2000. Many of the parsers I've seen, including the Duet parser in AMX Tools, do it character by character - which is where I started, though I've since removed the evaluation of every character when parsing keys and their values. The Duet version *seems* to parse the string returned from the DTV much more efficiently and *MUCH* more quickly: with DEBUG-4 turned on, it reports that "parsing" takes only about a second - quite a difference in speed, I'd say.

    Currently, I'm working on a generic parser which sends a string back for every key / value pair. For example, this is the output of querying program info:
    JSON Response:
    
    {
      "callsign": "CNNHD",
      "contentId": "1 1 10D9B 0",
      "duration": 3600,
      "isOffAir": false,
      "isPclocked": 1,
      "isPpv": false,
      "isRecording": false,
      "isVod": false,
      "major": 202,
      "minor": 65535,
      "programId": "6630951",
      "rating": "No Rating",
      "startTime": 1305435600,
      "stationId": 3900947,
      "status": {
        "code": 200,
        "msg": "OK.",
        "query": "/tv/getProgInfo?major=202"
      },
      "title": "CNN Newsroom"
    }
    
    
    Line      6 (01:14:17.626):: String From [36001:1:8]-[DATA=callsign:CNNHD]
    Line      7 (01:14:17.632):: String From [36001:1:8]-[DATA=contentId:1 1 10D9B 0]
    Line      8 (01:14:17.635):: String From [36001:1:8]-[DATA=duration:3600]
    Line      9 (01:14:17.638):: String From [36001:1:8]-[DATA=isOffAir:false]
    Line     10 (01:14:17.643):: String From [36001:1:8]-[DATA=isPclocked:1]
    Line     11 (01:14:17.647):: String From [36001:1:8]-[DATA=isPpv:false]
    Line     12 (01:14:17.651):: String From [36001:1:8]-[DATA=isRecording:false]
    Line     13 (01:14:17.655):: String From [36001:1:8]-[DATA=isVod:false]
    Line     14 (01:14:17.659):: String From [36001:1:8]-[DATA=major:202]
    Line     15 (01:14:17.663):: String From [36001:1:8]-[DATA=minor:65535]
    Line     16 (01:14:17.667):: String From [36001:1:8]-[DATA=programId:6630951]
    Line     17 (01:14:17.675):: String From [36001:1:8]-[DATA=rating:No Rating]
    Line     18 (01:14:17.686):: String From [36001:1:8]-[DATA=startTime:1305435600]
    Line     19 (01:14:17.691):: String From [36001:1:8]-[DATA=stationId:3900947]
    Line     20 (01:14:17.691):: String From [36001:1:8]-[OBJECT_OPEN=status]
    Line     21 (01:14:17.695):: String From [36001:1:8]-[DATA=code:200]
    Line     22 (01:14:17.699):: String From [36001:1:8]-[DATA=msg:OK.]
    Line     23 (01:14:17.707):: String From [36001:1:8]-[DATA=query:/tv/getProgInfo?major=202&minor=65535]
    Line     24 (01:14:17.711):: String From [36001:1:8]-[OBJECT_CLOSE=status]
    Line     25 (01:14:17.715):: String From [36001:1:8]-[DATA=title:CNN Newsroom]
    

    The code I have does not store any information whatsoever; it merely parses and then passes back the output.
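For anyone following along outside NetLinx, the event-style parser described above can be sketched in Python (a hypothetical illustration, not the attached code): walk the text character by character and emit one DATA / OBJECT_OPEN / OBJECT_CLOSE string per key, mirroring the debug output above. String escapes and arrays are omitted to keep the sketch short.

```python
# Hypothetical sketch of an event-emitting (store-nothing) JSON walker.
def parse_events(text):
    events = []
    pos = 0

    def skip_ws():
        nonlocal pos
        while pos < len(text) and text[pos] in ' \t\r\n':
            pos += 1

    def read_string():
        nonlocal pos
        pos += 1                      # opening quote
        start = pos
        while text[pos] != '"':       # no escape handling in this sketch
            pos += 1
        s = text[start:pos]
        pos += 1                      # closing quote
        return s

    def read_scalar():
        nonlocal pos
        start = pos
        while text[pos] not in ',}]':
            pos += 1
        return text[start:pos].strip()

    def parse_object(name=None):
        nonlocal pos
        if name is not None:
            events.append('OBJECT_OPEN=%s' % name)
        pos += 1                      # consume '{'
        skip_ws()
        while text[pos] != '}':
            key = read_string()
            skip_ws()
            pos += 1                  # consume ':'
            skip_ws()
            if text[pos] == '{':      # nested object -> recurse
                parse_object(key)
            elif text[pos] == '"':
                events.append('DATA=%s:%s' % (key, read_string()))
            else:
                events.append('DATA=%s:%s' % (key, read_scalar()))
            skip_ws()
            if text[pos] == ',':
                pos += 1
                skip_ws()
        pos += 1                      # consume '}'
        if name is not None:
            events.append('OBJECT_CLOSE=%s' % name)

    skip_ws()
    parse_object()
    return events
```

Fed the program-info response shown earlier, this emits the same DATA/OBJECT_OPEN/OBJECT_CLOSE sequence as the debug log, in document order.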
  • the8thst Posts: 470
    The reason the Java-based JSON parsers are so much faster is that the JSON structure is natively implemented in Java (http://json.org/java/).

    I have not had any time to work on the JSON parsing in NetLinx since starting this thread. I really wish we had regular expressions in NetLinx; it would make the task much easier, but we don't.

    My idea was to use find_string() to locate the beginning and ending tokens of each data pair to create pointers. A little simple math on the pointer values then lets you find and pull out nested values and embedded arrays.
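That find_string/pointer idea can be roughed out in Python, with str.find standing in for find_string (the helper name find_value is invented for illustration; escape handling and duplicate keys are ignored):

```python
# Hypothetical token-pointer lookup: find the quoted key, then slice out the
# value that follows, without scanning every character of the document.
def find_value(json_text, key):
    token = '"%s"' % key                 # beginning token of the pair
    start = json_text.find(token)
    if start < 0:
        return None
    colon = json_text.find(':', start + len(token))
    pos = colon + 1
    while json_text[pos] in ' \t\r\n':   # skip whitespace after the colon
        pos += 1
    if json_text[pos] == '"':            # quoted value: ends at the next quote
        end = json_text.find('"', pos + 1)
        return json_text[pos + 1:end]
    # bare value (number/bool): ends at the next delimiter
    end = min(i for i in (json_text.find(',', pos), json_text.find('}', pos))
              if i >= 0)
    return json_text[pos:end].strip()
```

The pointer math stays cheap because only the delimiters are searched for, which is the same reason the find_string approach beats a full character walk in NetLinx.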

    I think writing this module in Duet is the ideal way to do it, but I don't have a Duet license, or the money and time to purchase one and teach myself Duet.

    I think you are working in Duet a little bit, so my recommendation is to have your NetLinx module save the JSON responses to a file, then pass the file name and location to a Duet-based JSON parser, which sends the sanitized data pairs and arrays back to NetLinx in standard string events.

    There are a lot of different examples online for how to easily parse JSON/RSS/XML, etc in Java.

    It would be extremely nice to have a couple Duet Parser libraries for the standard data structures used on the web so we can quickly send a file to Duet for parsing and get the results back in Netlinx.
  • jjames Posts: 2,908
    Yeah - AMX Tools has a JSON parser in it, so one would only have to write an interface for it at that point. I started on an interface with RegEx, which works pretty well. When I get some more time, I'll put it up here.

    I know of someone on another site who wrote their own JSON parser; I should ask them what their times are like when processing large files. Parsing small responses is fine - it's the large responses that take too long.
  • jjames Posts: 2,908
    OK - I created two different ways to parse a JSON object. The first is a character-by-character analysis; the other searches for the next possible character, essentially destroying the integrity of the string by using remove_string() to pull out pairs.

    It turns out that the first rev (character-by-character) is a full second and a half faster than the search & destroy on the large response. Unfortunately I can't quite think of another way that might be a faster NetLinx implementation. Mind you, these were run on an NI-2000 running 3.60.453 FW. I ran the char-by-char one on an NI-900 and it was 2 seconds faster than on my NI-2000. If I can't figure out a faster way, I'll probably wind up posting both versions here and allow for collaboration on them. For now though, I'm taking a break from this and working on real work.

    Each test was run 5 times. Here are the results.

    Test 1 - 454 Character JSON Response (Program Info Request)
    Parser 1 (Character-by-Character): Average 0.171 seconds
    Parser 2 (Search & Destroy): Average 0.149 seconds

    Test 2 - 17,329 Character JSON Response (DVR Listing Request)
    Parser 1 (Character-by-Character): Average 9.363 seconds
    Parser 2 (Search & Destroy): Average 10.883 seconds
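For comparison, the "search & destroy" strategy can be sketched in Python (the pop_pair helper is a made-up name): like NetLinx remove_string(), each call consumes the front of the buffer up to and including the pair it extracts, so the string shrinks as parsing proceeds.

```python
# Hypothetical destructive pair extractor: returns the next ("key", "value")
# from a flat object body and the shortened remainder of the buffer.
def pop_pair(buf):
    q1 = buf.find('"')
    if q1 < 0:
        return None, buf
    q2 = buf.find('"', q1 + 1)
    key = buf[q1 + 1:q2]                 # text between the first two quotes
    colon = buf.find(':', q2)
    rest = buf[colon + 1:].lstrip()
    if rest.startswith('"'):             # quoted value
        q3 = rest.find('"', 1)
        value, rest = rest[1:q3], rest[q3 + 1:]
    else:                                # bare value: stop at , or }
        end = len(rest)
        for d in (',', '}'):
            i = rest.find(d)
            if 0 <= i < end:
                end = i
        value, rest = rest[:end].strip(), rest[end:]
    return (key, value), rest.lstrip(', \r\n')

# Each call "destroys" the parsed front of the buffer:
buf = '"callsign": "CNNHD", "duration": 3600}'
pair, buf = pop_pair(buf)
```

The trade-off jjames measured makes sense: the repeated searching and re-copying of the shrinking buffer costs more on large responses, even though it wins slightly on small ones.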
  • jjames Posts: 2,908
    Here we go!

    After several weeks of working on this off and on, I've trimmed it down and made it more of a generic JSON parser. Here are the results:
    example1.json        :  3,554 characters - average time: 0.703 seconds
    example2.json        :    888 characters - average time: 0.497 seconds
    example3.json        :  1,725 characters - average time: 0.550 seconds
    proginfo.json        :    454 characters - average time: 0.138 seconds
    playlist.json        : 17,875 characters - average time: 4.771 seconds
    yahoo_stocks.json    : 22,448 characters - average time: 4.336 seconds
    yahoo_weather.json   :  1,969 characters - average time: 0.426 seconds
    

    You might be questioning how the Yahoo stocks example parsed faster than the DirecTV playlist file: the Yahoo return has no white-space to skip (except the CR/LFs from the server chunking the data). This makes sense, as running through each character is slower than searching for specific characters and performing mid_string. My tests were performed using files rather than actually receiving the data from the servers, since I wanted to eliminate that variable. So the attachment has several ".json" files in it - put those in the main directory of the master to run the tests as I have.

    I would love to see results from an NI-x100 and NI-x00 as well; if anyone improves on the code, please don't be selfish - share. Also, it's Rev. 8 that is the quickest; Rev. 7 is what it was built off of - that's probably the best revision to start with if anyone decides to improve upon it.
  • PhreaK Posts: 966
    the8thst wrote: »
    It would be extremely nice to have a couple Duet Parser libraries for the standard data structures used on the web so we can quickly send a file to Duet for parsing and get the results back in Netlinx.
    If I get the time over the next couple of weeks I'll try and throw together a Duet parser and accompanying wrapper include. Hopefully it should be possible to use string_to_variable to load the data back from Duet into NetLinx. From the help file:
    The Decode variable must match the type of the encoded variable. In the case where the Encode variable was a structure then the Decode variable members must match in type and order. However if the number of members of the structures doesn’t match then the routine will fill all it can or skip any unused data members.
    This should let you build a structure containing the keys you are interested in, pass it to the wrapper function along with the JSON, and have it returned populated.
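In Python the same round-trip idea looks roughly like this (a sketch only - NetLinx string_to_variable has no direct Python analogue, and ProgInfo / decode_into are invented names): declare a structure holding just the keys you care about, and let the decoder fill matching members while skipping unused data, exactly as the help-file quote describes.

```python
import json
from dataclasses import dataclass, fields

# Structure containing only the keys we are interested in.
@dataclass
class ProgInfo:
    callsign: str = ''
    major: int = 0
    title: str = ''

def decode_into(cls, json_text):
    data = json.loads(json_text)
    wanted = {f.name for f in fields(cls)}
    # Fill all the members we can; skip any unused data members.
    return cls(**{k: v for k, v in data.items() if k in wanted})
```

The caller gets back a typed structure rather than a stream of strings, which is the appeal of doing the parse in Duet and round-tripping through an encoded variable.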
  • jjames Posts: 2,908
    There is already a JSON parser in AMX Tools.
  • the8thst Posts: 470
    jjames wrote: »
    After several weeks of working on this off and on, I've trimmed it down and made it more of a generic JSON parser. Here are the results:
    example1.json        :  3,554 characters - average time: 0.703 seconds
    example2.json        :    888 characters - average time: 0.497 seconds
    example3.json        :  1,725 characters - average time: 0.550 seconds
    proginfo.json        :    454 characters - average time: 0.138 seconds
    playlist.json        : 17,875 characters - average time: 4.771 seconds
    yahoo_stocks.json    : 22,448 characters - average time: 4.336 seconds
    yahoo_weather.json   :  1,969 characters - average time: 0.426 seconds
    

    You might be questioning how the Yahoo stocks example parsed faster than the DirecTV playlist file: the Yahoo return has no white-space to skip (except the CR/LFs from the server chunking the data). This makes sense, as running through each character is slower than searching for specific characters and performing mid_string. My tests were performed using files rather than actually receiving the data from the servers, since I wanted to eliminate that variable. So the attachment has several ".json" files in it - put those in the main directory of the master to run the tests as I have.

    I would love to see results from an NI-x100 and NI-x00 as well; if anyone improves on the code, please don't be selfish - share. Also, it's Rev. 8 that is the quickest; Rev. 7 is what it was built off of - that's probably the best revision to start with if anyone decides to improve upon it.

    I have both an NI-3000 and NI-3100 on my desk. I will try to run the tests tomorrow.
  • the8thst Posts: 470
    NI-3100:
    example1.json        :  3,554 characters - average time: 0.238 seconds
    example2.json        :    888 characters - average time: 0.215 seconds
    example3.json        :  1,725 characters - average time: 0.196 seconds
    proginfo.json        :    454 characters - average time: 0.057 seconds
    playlist.json        : 17,875 characters - average time: 1.468 seconds
    yahoo_stocks.json    : 22,448 characters - average time: 1.273 seconds
    yahoo_weather.json   :  1,969 characters - average time: 0.155 seconds
    

    NI-3000:
    example1.json        :  3,554 characters - average time: 0.673 seconds
    example2.json        :    888 characters - average time: 0.475 seconds
    example3.json        :  1,725 characters - average time: 0.470 seconds
    proginfo.json        :    454 characters - average time: 0.133 seconds
    playlist.json        : 17,875 characters - average time: 4.585 seconds
    yahoo_stocks.json    : 22,448 characters - average time: 4.155 seconds
    yahoo_weather.json   :  1,969 characters - average time: 0.400 seconds
    
  • jjames Posts: 2,908
    Thank you! I'm quite happy with the results on the NI-3100. I knew the NI-x000 was slower - if anyone has an NI-700/900 to try these out on, that'd be cool.

    Again, this is only meant as a starting point for quick & effective JSON parsing; you could potentially use it as-is, but I intend to add some features to it. With the source all right here, anyone is free to make changes. (BTW - sorry for the lack of commenting! Sometimes my mind gets going so quickly I don't have time to jot down what I'm doing.)
  • the8thst Posts: 470
    I have a 700 at home that I can run the tests on later this week.

    I misread "NI-X100 and NI-X00" as "NI-X100 and NI-X000".
  • jjames Posts: 2,908
    No problem & thanks!

    I'm going to start writing a DTV module that'll parse the results into structures that would be easier to manage.
  • the8thst Posts: 470
    NI-700
    example1.json        :  3,554 characters - average time: 0.282 seconds
    example2.json        :    888 characters - average time: 0.261 seconds
    example3.json        :  1,725 characters - average time: 0.235 seconds
    proginfo.json        :    454 characters - average time: 0.121 seconds
    playlist.json        : 17,875 characters - average time: 1.610 seconds
    yahoo_stocks.json    : 22,448 characters - average time: 1.502 seconds
    yahoo_weather.json   :  1,969 characters - average time: 0.187 seconds
    

    I was expecting these results to be slower than they are.
  • jjames Posts: 2,908
    Indeed! You must have a newer NI-700, because I thought its MIPS rating used to be lower than the NI-x100 series.

    Anyway - attached is some code for DirecTV JSON parsing. I changed the format of the strings returned from the parser, and they are a bit unconventional as far as AMX standards go. I have the parsing working in the example DTV module for the playlist & program info responses; it should be easy enough to add more responses if needed. Also, because of the way DTV responds, I had to move the status object to the beginning, so that the DTV module knows what kind of response it got while parsing the rest of the info. I did this in the main source file. This example is by no means complete, and the attached code assumes that anyone using it knows how to get complete results to send to the parser.

    I think I'm about done with this and probably won't be taking it too much further until DirecTV solidifies what they're doing, so if anyone adds GUI to it, or makes it more complete - I'd be interested as I'm sure everyone else would be.

    As always, feedback would be nice.
  • troberts Posts: 228
    Sending strings via JSON format

    I did not want to start a new thread, since this seems like a GREAT one for parsing JSON. Is anyone familiar with how to send a string in JSON format? Is it all ASCII characters?
    Taking an example from http://en.wikipedia.org: if I wanted to send the following to an IP device, is it all ASCII string(s) plus 13,10s?
    {
      "firstName": "John",
      "lastName": "Smith",
      "age": 25,
      "address": {
        "streetAddress": "21 2nd Street",
        "city": "New York",
        "state": "NY",
        "postalCode": 10021
      },
      "phoneNumber": [
        {
          "type": "home",
          "number": "212 555-1234"
        },
        {
          "type": "fax",
          "number": "646 555-4567"
        }
      ]
    }
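To the question above: yes - on the wire, JSON is just the text of the document, and the 13,10s (CR/LF) inside it are optional formatting, not part of the data. A small Python sketch of producing the byte string you would hand to an IP socket:

```python
import json

person = {
    "firstName": "John",
    "lastName": "Smith",
    "age": 25,
    "address": {"streetAddress": "21 2nd Street", "city": "New York",
                "state": "NY", "postalCode": 10021},
}

wire = json.dumps(person)        # a single line - no CR/LFs required
payload = wire.encode('ascii')   # plain ASCII bytes, ready for sock.send(payload)
```

Any receiver that understands JSON will parse the one-line form and the pretty-printed form identically; whitespace between tokens carries no meaning.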
  • jjames Posts: 2,908
    JSON is just data; it's the transport that matters. Telnet, HTTP, ICSP, etc. are the transport, and JSON is the payload.

    Sent from my Nexus S 4G using Tapatalk 2
  • troberts Posts: 228
    If the transport is HTTP, how much does the header matter - or is it based ONLY on the requirements of the device?
  • jjames Posts: 2,908
    The header is very important in HTTP, since it determines the structure of the information being sent. The header can inform the recipient that the data is "chunked", or whether it's making a GET or POST request.

    What type of application are you looking to interact with? A protocol document would be helpful in this situation to better answer your questions. Using HTTP as the protocol is open ended since there are many different things that could (or could not) be required.

    If you use Firefox, Live HTTP Headers is a great add-on to "sniff" the headers being exchanged.
    https://addons.mozilla.org/en-us/firefox/addon/live-http-headers/

    Wireshark is another great tool to have, not just for HTTP but in general to make sure that your responses are correct.
    http://www.wireshark.org/
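As a concrete illustration of why the header matters, here is what a minimal hand-built HTTP POST carrying a JSON body looks like (the host, path, and body are made up for this sketch; the headers a given device actually requires depend on its protocol document):

```python
# Hypothetical request to a device at 192.168.1.50:8080.
body = '{"major": 202}'
request = (
    'POST /tv/tune HTTP/1.1\r\n'
    'Host: 192.168.1.50:8080\r\n'
    'Content-Type: application/json\r\n'
    'Content-Length: %d\r\n'        # tells the server where the body ends
    '\r\n'                          # blank line separates headers from body
    '%s'
) % (len(body), body)
```

Without the request line and Content-Length (or a chunked Transfer-Encoding), the server has no way to know what the bytes are or when the body is complete - the same JSON is meaningless on its own.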
  • jjames Posts: 2,908
    example1.json        :  3,554 characters - average time: 0.086 seconds
    example2.json        :    888 characters - average time: 0.051 seconds
    example3.json        :  1,725 characters - average time: 0.050 seconds
    proginfo.json        :    454 characters - average time: 0.025 seconds
    playlist.json        : 17,875 characters - average time: 0.649 seconds
    yahoo_stocks.json    : 22,448 characters - average time: 0.640 seconds
    yahoo_weather.json   :  1,969 characters - average time: 0.054 seconds
    

    I know it's been a while, but just wanted to share the results of running the tests on a new NX-4200.
  • DFroberg Posts: 7
    I think this is going to save me tons of time - thanks for posting! I need to use JSON-RPC 2.0 for some Qsys gear. Have you done anything with that? I'll download this and see if I can get it to work for me.
  • jjames Posts: 2,908
    I haven't messed with JSON in quite some time, and could probably implement a better parser now, but it's at the very bottom of my to-do list - pretty much in the "will never get to it" bucket.

    The plus side is that it's open source, so feel free to hack away at it.
  • DFroberg Posts: 7
    Questions about Modules and a funny diagnostic message

    JJames,
    This seems to be working pretty well in my little program, but I noticed something odd, and I've gotten a system message I've never seen before.
    It looks like you left in a couple of send_string 0s for timing purposes, with "START..." and "..END", in the module. These BOTH show up in my diagnostics window before I start seeing the diagnostic messages I placed in my data event to receive the messages from the JSON module. It seems like the module chews through the whole JSON message and "ends" before the data event in the main program is sent a single string. How do messages queue up between modules? How multithreaded are the new processors? (This is running on an NX-2200.)
    On a related note, as my JSON strings get longer I've started getting "(Reader= Writer=)- CMessagePipe::MAX = 25" messages in my diagnostics window. Any idea what that is? I'm getting more than 25 data events and nothing seems to be dropped, but I'm still testing.
  • jjames Posts: 2,908
    While there are other threads running on the NX, the NetLinx code itself is single-threaded.

    That being said, you should only send the PARSE command to the module when you are certain you have all of the data to be parsed. The JSON variable is shared between the main program and the module, so when you're done populating it, just call PARSE and the module should handle the rest.
  • DFroberg Posts: 7
    Right - I'm getting a data event from my device, passing Data.txt to the JSON variable, then sending "PARSE". The incoming string is complete when it is sent.
    Do you know what the "(Reader= Writer=)- CMessagePipe::MAX = 25" messages are about? Is there a setting I should change?
  • jjames Posts: 2,908
    It's just a buffer hitting a specific mark - I think they increment in 25s. If the buffer isn't cleared out fast enough, or the master gets stuck processing them, then that's a problem.

    You can use "show buffers" to see where they're at. Everything should read zero for a normal system.
  • DFroberg Posts: 7
    Hmmm - what should I do if show buffers for the interpreter sits over 1,300 and the max buffer can reach 4,000? The master is sending JSON strings to a Qsys for status and getting a response. The strings are short - 200 characters or less. I can Wireshark the master sending out the request and getting a reply back in 10 ms or so, but it then works on parsing the message for seconds, and at the moment the master is getting farther and farther behind. Is there a way to speed up the internal send_strings for virtual devices? Thanks!
  • Jorde_V Posts: 393
    DFroberg wrote: »
    Hmmm - what should I do if show buffers for the interpreter sits over 1,300 and the max buffer can reach 4,000? The master is sending JSON strings to a Qsys for status and getting a response. The strings are short - 200 characters or less. I can Wireshark the master sending out the request and getting a reply back in 10 ms or so, but it then works on parsing the message for seconds, and at the moment the master is getting farther and farther behind. Is there a way to speed up the internal send_strings for virtual devices? Thanks!


    If you mean the rate at which they are displayed to you, there is:

    - Preferences
    - Diagnostics
    - Diagnostics and Notifications Output Displays
    Read X line(s) from the buffer every 1/4 second.

    Granted, this only helps if there are more lines coming in than it can display. (You can't change the processing speed itself.)
  • DFroberg Posts: 7
    Thanks, but what I'm interested in is changing the rate or number of messages the data event can process before it switches to something else. The JSON module takes a 200-character string, breaks it up into 20-40 short strings, and sends them one by one back to the main program. It seems to take 100 ms for a new NX-2200 to get through a single short JSON string, so it looks like I'm running into buffer limits on how virtual devices can send strings to the main program. Using the telnet diagnostics, I can see that I'm hitting the limits (I think), and the buffer is running much higher than normal all the time. I wonder if the NI units would behave this way... when I get in, I'll have to experiment.
  • Jorde_V Posts: 393
    DFroberg wrote: »
    Thanks, but what I'm interested in is changing the rate or number of messages the data event can process before it switches to something else. The JSON module takes a 200-character string, breaks it up into 20-40 short strings, and sends them one by one back to the main program. It seems to take 100 ms for a new NX-2200 to get through a single short JSON string, so it looks like I'm running into buffer limits on how virtual devices can send strings to the main program. Using the telnet diagnostics, I can see that I'm hitting the limits (I think), and the buffer is running much higher than normal all the time. I wonder if the NI units would behave this way... when I get in, I'll have to experiment.

    Can you post the code? That way we can look for things to optimize.
  • GregG Posts: 251
    DFroberg wrote: »
    Thanks, but what I'm interested in is changing the rate or number of messages the data event can process before it switches to something else. The JSON module takes a 200-character string, breaks it up into 20-40 short strings, and sends them one by one back to the main program. It seems to take 100 ms for a new NX-2200 to get through a single short JSON string, so it looks like I'm running into buffer limits on how virtual devices can send strings to the main program. Using the telnet diagnostics, I can see that I'm hitting the limits (I think), and the buffer is running much higher than normal all the time. I wonder if the NI units would behave this way... when I get in, I'll have to experiment.

    In my XML module I have a CONTINUE command that the receiving master code has to send after it has saved each element into the structure; that way the reader doesn't fill up the interpreter queue trying to read the whole file in one shot. You'll probably need something like this to throttle the JSON reader. It's less fun to write an asynchronous file/stream reader, but it keeps you from losing data.
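The CONTINUE throttle described above maps naturally onto a Python generator, sketched here with made-up message strings: the reader parks after handing over each element and resumes only when the consumer asks for the next one, so no more than one message is ever queued at a time.

```python
# Hypothetical throttled reader: yields one parsed element per request.
def element_reader(elements):
    for elem in elements:
        yield elem                  # parked here until the consumer "CONTINUEs"

reader = element_reader(['DATA=code:200', 'DATA=msg:OK.', 'OBJECT_CLOSE=status'])
received = []
for msg in reader:                  # each loop iteration is an implicit CONTINUE
    received.append(msg)            # "save the element into the structure" first
```

In NetLinx the same handshake would be explicit - the module sends one string, then waits for a CONTINUE command before sending the next - but the flow-control principle is identical.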