Hacker News
8note | 17 days ago | on: AI will make formal verification go mainstream
The question remains: is the tokenizer going to be a fundamental limit for my task? How do I know ahead of time?
worldsayshi | 17 days ago
Would it limit a person receiving your instructions in Chinese? Tokenization pretty much means that the LLM is reading symbols instead of phonemes.
This makes me wonder if LLMs work better in Chinese.
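To make the "reading symbols" point concrete: subword tokenizers emit vocabulary pieces, not sounds or letters, so an English word is often split into several fragments while a Chinese character may map more directly to a symbol (though real byte-level tokenizers can also split CJK characters into multiple tokens). A toy greedy longest-match over a made-up vocabulary sketches the idea; this is an illustration, not the actual BPE merge algorithm any real model uses:

```python
def greedy_tokenize(text, vocab):
    """Greedy longest-prefix match against a toy vocabulary.

    Real BPE tokenizers merge frequent byte pairs iteratively;
    this sketch only illustrates that output units are symbols,
    not phonemes or whole words.
    """
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible piece first, shrinking toward 1 char.
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in vocab:
                tokens.append(piece)
                i = j
                break
        else:
            # Unknown character: fall back to emitting it on its own.
            tokens.append(text[i])
            i += 1
    return tokens

# Hypothetical vocabulary, chosen purely for demonstration.
vocab = {"token", "ization", "izer"}
print(greedy_tokenize("tokenization", vocab))  # ['token', 'ization']
print(greedy_tokenize("你好", vocab))           # ['你', '好']
```

Under this toy scheme the English word arrives as two opaque subword symbols, while each Chinese character passes through as its own symbol, which is roughly the intuition behind the comment.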