Workaround for the http.Fetch string length limit?

Hi, I recently read that http.Fetch returns a maximum of 4096 characters. However, I will be using it to read from sources that may be significantly longer, so I have a few questions:

  1. Is 4096 a limit of http.Fetch or of Lua strings themselves?
  2. Is there a workaround? Either a way to split the body into multiple strings and collect them in a table, or perhaps a way to cut off part of the body as required?

Edit: Just checked on http://www.lua.org/cgi-bin/demo and there doesn't appear to be a 4096-character limit for Lua strings themselves; however, I would still like to know if anybody knows of a workaround for http.Fetch.
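For reference, a minimal sketch of the kind of call in question, with a length check on the body http.Fetch hands back (the URL is just a placeholder):

```lua
-- Minimal sketch: fetch a page and report how long the returned body is.
-- The URL is a placeholder; error handling is kept to a bare minimum.
http.Fetch("http://example.com/large.txt",
    function(body, size, headers, code)
        -- #body is the real Lua string length of what http.Fetch handed back
        print("HTTP " .. code .. ", body length: " .. #body)
    end,
    function(err)
        print("http.Fetch failed: " .. err)
    end
)
```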


Looking at the wiki, this is potentially possible using Panel:SetURL() and TextEntry:GetValue(), which allows roughly 8,100 characters to be read at once. Combined with JavaScript, it could be possible to read off 8,000 characters at a time and then delete them from the page. I will test this when I'm at a testing environment.
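A rough, untested sketch of that chunked idea, using a DHTML panel and Panel:AddFunction to hand the page text back to Lua in slices (the gmod.* function names are made up, and the 8,000 slice size is just the rough figure mentioned above, not a documented limit):

```lua
-- Untested sketch of the chunked DHTML idea described above.
local html = vgui.Create("DHTML")
html:SetSize(1, 1) -- the panel does not need to be visible
local chunks = {}

-- Lua-side receivers that JavaScript can call as gmod.ReceiveChunk / gmod.FinishChunks
html:AddFunction("gmod", "ReceiveChunk", function(chunk)
    table.insert(chunks, chunk)
end)

html:AddFunction("gmod", "FinishChunks", function()
    local body = table.concat(chunks)
    print("received " .. #body .. " characters")
    html:Remove()
end)

function html:OnDocumentReady(url)
    -- Slice the page text in JavaScript and hand it back 8,000 characters at a time.
    self:QueueJavascript([[
        var text = document.body.innerText;
        for (var i = 0; i < text.length; i += 8000) {
            gmod.ReceiveChunk(text.substr(i, 8000));
        }
        gmod.FinishChunks();
    ]])
end

html:OpenURL("http://example.com/large.txt")
```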

It is possibly a restriction of the HTTP client itself (it may only have a 4096-byte buffer).

I recall someone testing http.Fetch with millions of characters and it worked fine. Can't remember the reason for the test or the thread it was posted in.

I'm getting ~55,000 characters through http.Fetch and it works fine for me.

Are you sure it was actually receiving that many characters? The string's length metadata might report that many characters when the content isn't all there, i.e. string.len will say 55,000 when it's actually still 4,000.

What makes you think it's 4,000? Have you print()ed it out? It's possible print() has limitations of its own.

I pasted my 2,000-line code into an online character counter, and it also gets RunString'ed and works fully as expected.

Found my issue >.<

Long story short, the ~4,000-character limit was on print(), not on http.Fetch or on Lua strings.
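In case anyone else hits this, here's a quick sketch of checking a long string without trusting a single print() call (the slice size and file name are arbitrary):

```lua
-- Sketch: verify a long string without relying on one print() call.
local function DumpLong(str)
    print("length according to #: " .. #str)

    -- print in slices so any console truncation can't hide the tail
    for i = 1, #str, 3000 do
        print(string.sub(str, i, i + 2999))
    end

    -- or write it to garrysmod/data/http_dump.txt and inspect it directly
    file.Write("http_dump.txt", str)
end
```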

Guess you never expect your debugging to need debugging…

Anyway thanks for the help guys! :smile: