All part-of-speech groups working together – tokenize the whole Internet so AIs can work with real languages
Paul Rayson, the reason the OpenAI Bing ChatGPT fails is that it uses a bad tokenizer. If the part-of-speech community worked together, it could standardize the part-of-speech tokens and code the entire Internet, so it would not have to be scanned and parsed every time. A pre-tokenized, pre-coded Internet would …
Read More »
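For a concrete picture of the idea, the sketch below tags a sentence once with a shared tagset and stores the result so it need not be re-parsed. It is a minimal illustration, assuming spaCy with its small English model (en_core_web_sm) installed; the output file name pretagged.json is purely illustrative. spaCy's pos_ attribute uses the Universal Dependencies tagset, one existing candidate for the kind of standardized part-of-speech codes the post calls for.

```python
import json

import spacy

# Assumes the small English model is installed:
#   pip install spacy && python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

text = "Tokenize the whole Internet so AIs can work with real languages."
doc = nlp(text)

# token.pos_ is a Universal Dependencies tag (NOUN, VERB, DET, ...),
# a part-of-speech code shared across tools and languages.
tagged = [(token.text, token.pos_) for token in doc]

# Persist the tagged form once so downstream tools can load it
# instead of re-scanning and re-parsing the raw text every time.
with open("pretagged.json", "w", encoding="utf-8") as f:
    json.dump(tagged, f)
```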