
Big string to widechar string - error

Hi everyone!
I can't convert a big string buffer to a widechar string.
The incoming string buffer is rather big (sHTTPBuffer length = 39708), and if I do this
wcHTTPBuffer = WC_DECODE(sHTTPBuffer, WC_FORMAT_UTF8, 1)
OR
wcHTTPBuffer = ch_to_wc(sHTTPBuffer)
I get this error looping in the diagnostics:
Line 11 (12:51:32):: Ref Error ? Index to large Line=3649
Line 12 (12:51:32):: SetVariable - Error 1 Tk=0x0000
I tried to change the array size:
volatile widechar wcHTTPBuffer[50000] or
volatile widechar wcHTTPBuffer[100000] or
volatile widechar wcHTTPBuffer[1000000]
...
But if the incoming char buffer is smaller (sHTTPBuffer length = 4262), there are no errors and the widechar string is converted correctly.
Please advise.

Comments

  • ericmedley Posts: 4,177
    The function call might have a size limit. Not at my computer to check. You may have to do the buffer in 4K chunks or one char at a time with a loop.
  •
    Thanks Eric.
    I think you're right.
    I also tried
    sTempBuffer = "sTempBuffer, data.text" in the string event, but sTempBuffer always stays at 15999 in length even though it's a global var.
    I should try a char-by-char copy, something like the sketch below ...
    If anyone has a working example I would be very thankful.
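    Roughly what I have in mind (untested sketch; the function name is made up, and a plain element-by-element copy like this only works for single-byte ASCII content, since it does no real UTF-8 decoding):

    DEFINE_FUNCTION CopyCharsToWideChars(CHAR cSource[])
    {
        STACK_VAR LONG i
        STACK_VAR LONG nLen

        nLen = LENGTH_STRING(cSource)

        // assumes wcHTTPBuffer is declared at least as large as the source
        FOR (i = 1; i <= nLen; i++)
        {
            wcHTTPBuffer[i] = cSource[i]    // each byte becomes one widechar
        }

        SET_LENGTH_ARRAY(wcHTTPBuffer, nLen)    // set the logical length of the widechar array
    }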
  • ericmedley Posts: 4,177
    Yeah, using data.text is not going to work due to its size limit (2K).

    you might try using CREATE_BUFFER. Its size limit is much bigger.
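    Something along these lines (sketch from memory; the device number and buffer size are just placeholders):

    DEFINE_DEVICE

    dvHTTPClient = 0:3:0    // IP client port on the master

    DEFINE_VARIABLE

    VOLATILE CHAR cHTTPBuffer[65535]

    DEFINE_START

    // incoming data is appended to cHTTPBuffer automatically, instead of relying on data.text
    CREATE_BUFFER dvHTTPClient, cHTTPBuffer

    DEFINE_EVENT

    DATA_EVENT[dvHTTPClient]
    {
        STRING:
        {
            // by the time this fires, the new data is already in cHTTPBuffer;
            // wait for the complete response before converting/parsing it
        }
    }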
  •
    Actually I do.

    CREATE_BUFFER dvIP_TCP_Client_Http, sHTTPBuffer
    and it grows to a length of 65000.

    But unfortunately I can't do this >>
    wcHTTPBuffer = WC_DECODE(sHTTPBuffer, WC_FORMAT_UTF8, 1)


    This doesn't work either if sHTTPBuffer is longer than 16000:
    for (i=1; i<=length_string(sHTTPBuffer); i=i+9999)
    {
        wcTemp = ch_to_wc(mid_string(sHTTPBuffer, i, 9999))
        wcHTTPBuffer = wc_concat_string(wcHTTPBuffer, wcTemp)
    }
  •
    The only way I see to store the widechar buffer
    is to make an array of widechar strings.
    But extracting the strings from the array and parsing on the fly can be a challenge.
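    Roughly what I mean (untested sketch; the chunk size and array dimensions are arbitrary, and CH_TO_WC just maps bytes to widechars here, so it's only safe for single-byte content):

    DEFINE_VARIABLE

    VOLATILE WIDECHAR wcChunks[7][9999]    // one row per chunk of the source buffer
    VOLATILE INTEGER nChunkCount

    DEFINE_FUNCTION SplitToWideChunks(CHAR cSource[])
    {
        STACK_VAR LONG nPos
        STACK_VAR LONG nLen

        nLen = LENGTH_STRING(cSource)
        nChunkCount = 0
        nPos = 1

        // no bounds check on nChunkCount in this sketch
        WHILE (nPos <= nLen)
        {
            nChunkCount++
            wcChunks[nChunkCount] = CH_TO_WC(MID_STRING(cSource, nPos, 9999))
            nPos = nPos + 9999
        }
    }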
  • GregG Posts: 251
    I do my large HTTP downloads to a temp file on the master, then pull out the data to process it after it has all come in.
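    Roughly like this (simplified sketch; the file name is just an example and error handling is left out):

    DEFINE_FUNCTION AppendToTempFile(CHAR cData[])
    {
        STACK_VAR SLONG hFile

        hFile = FILE_OPEN('/http_download.tmp', FILE_RW_APPEND)
        IF (hFile > 0)
        {
            FILE_WRITE(hFile, cData, LENGTH_STRING(cData))
            FILE_CLOSE(hFile)
        }
    }

    DEFINE_FUNCTION ProcessTempFile()
    {
        STACK_VAR SLONG hFile
        STACK_VAR SLONG nRead
        STACK_VAR CHAR cChunk[8192]

        hFile = FILE_OPEN('/http_download.tmp', FILE_READ_ONLY)
        IF (hFile > 0)
        {
            nRead = FILE_READ(hFile, cChunk, 8192)
            WHILE (nRead > 0)
            {
                // parse this chunk of the download here
                nRead = FILE_READ(hFile, cChunk, 8192)
            }
            FILE_CLOSE(hFile)
        }
    }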
  •
    GregG,
    Do you use an ASCII string or a UTF string when parsing?
  • GregG Posts: 251
    The data stream coming from any modern HTTP 1.1 compliant web server should be US-ASCII, so I just handle all the raw data as plain NetLinx chars.

    The buffer I use is defined like this:
    Volatile Char cDataBuffer[8192]

    And that is probably twice as large as it would need to be.

    I used the RFC as the protocol document when I wrote my module:
    https://www.ietf.org/rfc/rfc2616.txt
  •
    I did it!!!
    I changed WC_MAX_STRING_SIZE in UnicodeLib.axi from 16,000 to 160,000 and functions
    like WC_DECODE and CH_TO_WC started to work with large text buffers!
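    For reference, the change in UnicodeLib.axi amounts to something like this (the exact form of the declaration may differ slightly in your version of the library):

    DEFINE_CONSTANT

    WC_MAX_STRING_SIZE = 160000    // the stock library ships with 16000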
  • ericmedley Posts: 4,177
    karageur wrote: »
    I did it!!!
    I changed WC_MAX_STRING_SIZE in UnicodeLib.axi from 16,000 to 160,000 and functions
    like WC_DECODE and CH_TO_WC started to work with large text buffers!

    I'd be very careful tweaking under the hood like that. It could have unforeseen consequences that can bite you later.