The go.net/html tokenizer does not limit buffering: Tokenizer.Next accumulates the current token in an internal buffer, so input that never finishes a token (for example, a "<" that is never followed by ">") makes that buffer grow without bound. For example, the following program will crash with an out-of-memory error:
package main

import (
	"io"
	"strings"

	"code.google.com/p/go.net/html"
)

func main() {
	pr, pw := io.Pipe()
	z := html.NewTokenizer(pr)
	go z.Next() // blocks, buffering every byte while it scans for the tag's closing ">"
	pw.Write([]byte("<"))
	b := []byte(strings.Repeat("t", 1024*1024))
	for {
		pw.Write(b) // each write grows the tokenizer's internal buffer; it is never drained
	}
}
To improve robustness, the tokenizer should limit how much it buffers, and report an error rather than allocating indefinitely when a token exceeds that limit.
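
As a sketch of what that could look like at the call site, suppose the Tokenizer gained a SetMaxBuf method and a corresponding buffer-exceeded error; both are hypothetical here, not part of the current go.net/html API. Next would then return ErrorToken once a single token outgrows the cap:

func parseCapped(r io.Reader) error {
	z := html.NewTokenizer(r)
	z.SetMaxBuf(1024 * 1024) // hypothetical knob: refuse to buffer more than 1 MiB per token
	for {
		if z.Next() == html.ErrorToken {
			// z.Err() is io.EOF at normal end of input; for runaway
			// input like the program above it would instead be the
			// buffer-exceeded error.
			return z.Err()
		}
	}
}

With such a cap, the program above would get an error from the tokenizer after roughly a megabyte of input instead of allocating until the process is killed.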