Keep learning streams complex enough to avoid boring the learning algorithms

Alpha Alimamy Kamara @alpha_alimamy Graph Neural Networks as gradient flows by @mmbronstein in @TDataScience https://towardsdatascience.com/graph-neural-networks-as-gradient-flows-4dae41fb2e8a?source=social.tw
Replying to @alpha_alimamy @mmbronstein and @TDataScience

If you expose your neural nets to increasingly complex streams, they will not get bored and fall into simple patterns. It is not the algorithm that gets lazy; it is the input that is boring. "Over-smoothing" can be handled by keeping the learning streams complex enough to avoid boredom.
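The "over-smoothing" the reply refers to can be shown in a few lines. This is a hypothetical minimal sketch, not from the linked article: repeated neighborhood averaging (the core of many graph neural network layers) drives all node features toward the same value, which is one standard way over-smoothing is described.

```python
import numpy as np

# Minimal sketch of GNN "over-smoothing" (illustrative assumption, not
# the linked article's code): repeated neighborhood averaging on a small
# graph collapses distinct node features toward a single constant.

# Adjacency of a 4-node path graph (0-1-2-3), with self-loops added.
A = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
], dtype=float)
P = A / A.sum(axis=1, keepdims=True)  # row-normalized averaging operator

x = np.array([1.0, 0.0, 0.0, -1.0])   # initially distinct node features
for _ in range(50):                   # 50 rounds of message passing
    x = P @ x                         # each node averages over its neighborhood

# After many rounds the spread across nodes is essentially zero:
spread = float(x.max() - x.min())
```

After 50 rounds the spread shrinks from 2.0 to nearly nothing, so the nodes become indistinguishable; in the tweet's terms, a too-simple input stream lets the features settle into one boring pattern.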


Russ Poldrack @russpoldrack  “People with very high expectations have very low resilience.” https://sfgate.com/tech/article/jensen-huang-nvidia-stanford-suffering-19023273.php
Replying to @russpoldrack

All of low-low, low-high, high-low and high-high exist. Ignoring possibilities kills off species. Best wishes for an interesting life. Challenge does not have to mean suffering. There are 8 billion responses to life, no two exactly the same, nor constant over time.

Richard K Collins

About: Richard K Collins

The Internet Foundation Internet policies, global issues, global open lossless data, global open collaboration

