Hacker News
gbrown
on April 5, 2021
| on:
Are deep neural networks dramatically overfitted? ...
But what if we drop the word comprehension, and we just go with “function approximation X -> Y, computed from a finite dataset, which minimizes a predictive risk”?
It’s unclear why compression is necessary there, except as a practical benefit.
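The framing in the comment can be sketched concretely. Below is a minimal illustration (my own, not from the thread): choose f: X -> Y from a hypothesis class, here linear maps, to minimize empirical risk on a finite dataset. Nothing in the procedure invokes compression; the dataset, hypothesis class, and noise level are all made-up assumptions for the sketch.

```python
import numpy as np

# Sketch of "function approximation X -> Y, computed from a finite
# dataset, which minimizes a predictive risk". Hypothesis class: linear
# maps. All quantities below are illustrative assumptions.
rng = np.random.default_rng(0)

# Finite dataset drawn from a rule unknown to the learner, plus noise.
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)

def empirical_risk(w, X, y):
    # Mean squared error over the finite sample: the empirical proxy
    # for predictive risk.
    return np.mean((X @ w - y) ** 2)

# Minimize the empirical risk in closed form (ordinary least squares).
w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# The fitted map approximates the underlying rule; no notion of
# compression appears anywhere in the objective.
print(empirical_risk(w_hat, X, y))
```

Whether the learned parameters happen to be a compressed description of the data is a separate question from whether the risk-minimization objective requires it.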