Why #DeepNeuralNetworks need #Logic:

Nick Shea (#UCL/#Oxford) suggests

(1) Generating novel stuff (e.g., #Dalle's art, #GPT's writing) is cool, but slow and inconsistent.

(2) Just a handful of logical inferences can be used *across* loads of situations (e.g., #modusPonens works the same way every time).

So (3) by #learning Logic, #DNNs would be able to recycle a few logical moves on a MASSIVE number of problems (rather than generating a novel solution from scratch for each one).

#CompSci #AI

I feel happy
I think happy thoughts
Modus ponens

          — § —

I am sad
I think dark thoughts
Modus ponens

          — § —

Modus ponens
What a silly prompt
Silly haiku

#WOTDHaikuPrompt #ModusPonens
#Haiku #Senryu #FreeHaiku @freehaiku

- Implementing this function shows that the type ((a -> b), a) -> b is #inhabited; therefore #modusponens holds in our logic.
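A minimal Haskell sketch of such an inhabitant (the name `mp` is my own; the implementation is essentially forced by the type):

```haskell
-- Modus ponens as a program: given a proof of (a -> b) and a proof of a,
-- produce a proof of b. The only thing we can do is apply the function.
mp :: (a -> b, a) -> b
mp (f, x) = f x
```

For example, `mp (show, 42 :: Int)` applies `show` to `42`, yielding the string `"42"`.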

- Function application in the #lambda calculus is expressed by β-reduction.
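For instance, a single β-reduction step substitutes the argument for the bound variable (a plain untyped-lambda-calculus sketch):

```
(λx. x + 1) 2  →β  2 + 1
```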

The Curry–Howard correspondence relates function application to the logical rule of #modusponens:
Eval : morphism
Currying : isomorphism
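In Haskell terms, a sketch of both halves: `eval` below is my own uncurried application morphism, while the Prelude's `curry` and `uncurry` are the standard mutually inverse witnesses of the currying isomorphism between `(a, b) -> c` and `a -> b -> c`:

```haskell
-- Eval as a morphism: uncurried function application, (a -> b, a) -> b.
eval :: (a -> b, a) -> b
eval (f, x) = f x

-- Currying as an isomorphism: curry and uncurry convert back and forth
-- between the paired and the curried form, and compose to the identity.
toCurried :: ((a, b) -> c) -> (a -> b -> c)
toCurried = curry

fromCurried :: (a -> b -> c) -> ((a, b) -> c)
fromCurried = uncurry
```

For example, `uncurry (+) (1, 2)` evaluates to `3`, and `curry fst 'a' 'b'` evaluates to `'a'`.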

- Besides atoms like numbers and strings, the only way to form new terms in #Lisp is function application, i.e. #modusponens under Curry–Howard.