Hacker News
xiaodai on June 16, 2020 | on: Gated Linear Networks
What stops you from using a few convolution layers and then using a GLN as the last layer? If your theory is true, you'd get the best of both worlds.
fxtentacle on June 16, 2020
That would work, but it would likely reintroduce the catastrophic-forgetting problem: the GLN would now depend on an intermediate representation determined by the conv layers, and that representation shifts as the conv layers train.
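For readers unfamiliar with the mechanism under discussion, here is a minimal numpy sketch of the proposed architecture: a single gated-linear layer (halfspace gating over side information selects per-neuron weight vectors that mix incoming probabilities in log-odds space) sitting on top of a frozen random feature map. The feature map is a hypothetical stand-in for "a few convolution layers", and the whole thing is a simplified illustration, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-np.clip(x, -30.0, 30.0)))

def logit(p):
    p = np.clip(p, 1e-6, 1.0 - 1e-6)
    return np.log(p / (1.0 - p))

class GLNLayer:
    """Simplified gated linear layer.

    Each neuron hashes the side information z through k fixed random
    hyperplanes; the resulting bit pattern selects one of 2**k weight
    vectors, which mixes the incoming probabilities in log-odds space.
    Training is per-example online gradient descent on log loss,
    touching only the selected weights.
    """

    def __init__(self, n_neurons, fan_in, side_dim, k=2, lr=0.05):
        self.hyper = rng.standard_normal((n_neurons, k, side_dim))  # fixed gates
        self.W = np.zeros((n_neurons, 2 ** k, fan_in))
        self.lr = lr
        self.pows = 2 ** np.arange(k)

    def context(self, z):
        bits = (self.hyper @ z > 0).astype(int)   # (n_neurons, k)
        return bits @ self.pows                   # one context id per neuron

    def update(self, p_in, z, y):
        c = self.context(z)
        w = self.W[np.arange(len(c)), c]          # (n_neurons, fan_in)
        p = sigmoid(w @ logit(p_in))
        # log-loss gradient, applied only to the gated weight vectors
        self.W[np.arange(len(c)), c] -= self.lr * (p - y)[:, None] * logit(p_in)[None, :]
        return p

def demo(n=4000):
    d_raw, d_feat = 16, 8
    # Frozen random linear map standing in for pretrained conv features.
    # If these weights kept training, the GLN's inputs would drift,
    # which is the forgetting risk raised in the reply above.
    conv_like = rng.standard_normal((d_feat, d_raw)) / np.sqrt(d_raw)
    layer = GLNLayer(n_neurons=1, fan_in=d_feat + 1, side_dim=d_feat)
    w_true = rng.standard_normal(d_raw)           # synthetic linear target
    correct = 0
    for t in range(n):
        x = rng.standard_normal(d_raw)
        y = float(w_true @ x > 0)
        feat = np.tanh(conv_like @ x)
        p_in = np.concatenate([sigmoid(feat), [0.5]])  # base predictions + bias
        p = layer.update(p_in, feat, y)[0]
        if t >= n // 2:                                # score the second half
            correct += int((p > 0.5) == bool(y))
    return correct / (n - n // 2)

if __name__ == "__main__":
    print(f"online accuracy over second half: {demo():.3f}")
```

Because only the gated weight vectors are touched by each update, and the gates themselves never move, this setup avoids interference between contexts as long as its inputs stay fixed, which is exactly why a trainable conv front end undermines the guarantee.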