I still have to wrap my head around @screwlisp’s “deep learning in #Lisp” post <screwlisp.small-web.org/fundam…>, but I’m already hyped up:
• If we can do deep learning without numbers, using only symbol/tree transformations,
• And if there are systems that specialize in tree transformations and don’t have native numbers,
• Then we can use these systems to do deep learning without actually needing any of the “modern” scaffolding for it! (A small rewrite sketch follows this list.)
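To make that last bullet concrete: in a symbolic network, even the “arithmetic” of the classic +1/-1 states (where multiplying two -1s gives +1) can become a plain tree-rewrite rule. A minimal, hypothetical Common Lisp sketch (the names are mine, not from the post):

```lisp
(defun simplify (tree)
  "Rewrite (COMPLEMENT (COMPLEMENT X)) => X, recursively.
This plays the role of (-1) * (-1) = +1, with no numbers involved."
  (if (and (consp tree)
           (eq (first tree) 'complement)
           (consp (second tree))
           (eq (first (second tree)) 'complement))
      (simplify (second (second tree)))
      tree))

(simplify '(complement (complement foo))) ; => FOO
```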
Things that come to mind:
• @june@social.nouveau.community’s Modal and maybe Nova?
• @neauoire’s experiments with neural nets as embedded notepad connections
• ed(1) as a symbolic playground. C’mon, did you think I’d go even one post without mentioning ed? Hell no! I’m going to understand this symbolic neural nets thing and start doing deep learning in ed! 🤩
screwlisp
in reply to Artyom Bologov
I had this feeling when I noticed I could just change +1 and -1 to foo and (complement foo) in a Hopfield network, which is an implementation of (is dual to) deep learning (it turned out; they didn't know this originally).
Then, all the conventional wisdom about the disadvantages and advantages of deep learning seems mostly to have been an artifact of the string encoding being used. (A minimal Lisp sketch of this relabeling follows this post.)
@neauoire
what should I read about your connexions (or dare I say, neurons!)?
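For readers who want the relabeling screwlisp describes spelled out: here is a minimal Common Lisp sketch of one Hopfield update step, with the states renamed to foo / (complement foo) and the couplings to :same / :opposite. All names here are hypothetical illustrations, not screwlisp's code, and the majority count below still uses a numeric tally.

```lisp
(defun flip (state)
  "Return the complement of a symbolic state."
  (if (eq state 'foo)
      '(complement foo)
      'foo))

(defun vote (weight state)
  "A :SAME coupling passes the state through; :OPPOSITE flips it.
This stands in for multiplying by +1 / -1."
  (ecase weight
    (:same state)
    (:opposite (flip state))))

(defun update-neuron (i weights states)
  "Majority vote over the other neurons' (possibly flipped) states,
standing in for sign(sum over j of w_ij * s_j)."
  (let ((tally 0))
    (dotimes (j (length states))
      (unless (= i j)
        (if (eq (vote (aref weights i j) (aref states j)) 'foo)
            (incf tally)
            (decf tally))))
    ;; Ties break toward FOO in this toy.
    (if (>= tally 0)
        'foo
        '(complement foo))))

;; Three neurons, all coupled :SAME; the stray third neuron gets
;; pulled back in line with the other two:
(update-neuron 2
               (make-array '(3 3) :initial-element :same)
               (vector 'foo 'foo '(complement foo)))
;; => FOO
```

The point of the sketch is that the update rule only ever compares and flips symbols; the numeric sign function survives only as a majority vote.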
Devine Lu Linvega
in reply to screwlisp
I wrote about it a bit here: wiki.xxiivv.com/site/neural_ne…
I realized that it could power a sort of "smart highlight" that runs the thoughts of a program, a bit like livecoding, but it's only a work in progress at the moment:
youtube.com/watch?v=nIlwdTIbTI…
youtube.com/watch?v=b2Tg7RPBve…
This is all heavily inspired by Minsky's work in Finite & Infinite Machines.