For tokenizing a string, standard Lua recommends using pattern
matching (its regex-like facility). The following example splits
words:
for word in string.gmatch(example, "%S+") do
    print(word)
end
This is a little bit overkill for simply splitting words. This patch
adds a tokenize function which is quick and does not use regexes.
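
A minimal usage sketch, assuming the core.tokenize(str, separators[,
noblanks]) signature documented in the HAProxy Lua API; the input
string is only an illustration:

-- Split on spaces; passing true as the third argument drops the
-- empty tokens produced by consecutive separators.
local tokens = core.tokenize("GET  /index.html HTTP/1.1", " ", true)
for i, token in ipairs(tokens) do
    print(i, token)
end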