ra | 16 days ago | on: Writing a good Claude.md
This is exactly right. Attention is all you need. It's all about attention. Attention is finite.
The more data you load into context, the more you dilute attention.
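A minimal sketch of that dilution point, assuming standard softmax attention with one relevant token competing against uniform distractor scores (the scores, context lengths, and function name here are illustrative, not from any particular model):

    # Softmax attention assigns a fixed probability mass (1.0) across the context,
    # so the weight any single token can receive shrinks as more tokens compete.
    import numpy as np

    def attention_weight_on_target(context_len, target_score=2.0, distractor_score=1.0):
        """Softmax weight on one 'relevant' token vs. (context_len - 1) distractors."""
        scores = np.full(context_len, distractor_score)
        scores[0] = target_score  # the token we actually care about
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        return weights[0]

    for n in (10, 100, 1000, 10000):
        print(f"context length {n:>6}: weight on relevant token = {attention_weight_on_target(n):.4f}")

With these toy numbers, the weight on the relevant token drops roughly tenfold each time the context grows tenfold, which is the sense in which attention is finite.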
throwuxiytayq | 16 days ago
people who criticize LLMs for merely regurgitating statistically related token sequences have very clearly never read a single HN comment