Shawn Simister

168 Followers
250 Following
69 Posts
Building @GitHub Copilot. Interested in program synthesis, knowledge graphs, tools for thought 🇨🇦
Location: San Francisco
I've got prompts from 4 related papers marked up now. You can try them out here: explainprompt.com
Working on a little side project to visualize prompting techniques like Chain of Thought. I like the idea of being able to break down a paper into bite-sized pieces and then step through the prompt like an interactive debugger.
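The "step through the prompt like an interactive debugger" idea can be sketched in a few lines. This is a toy illustration, not the actual tool's code: the segment labels, the example Chain-of-Thought prompt, and the `step_through` helper are all made up for demonstration.

```python
# Toy sketch of stepping through a Chain-of-Thought prompt one
# bite-sized segment at a time, like an interactive debugger.
from typing import Iterator, List, Tuple

# A CoT prompt broken into labeled segments (illustrative example).
COT_PROMPT_SEGMENTS: List[Tuple[str, str]] = [
    ("question", "Q: A juggler has 16 balls. Half are golf balls, and "
                 "half of the golf balls are blue. How many are blue?"),
    ("reasoning step 1", "A: There are 16 / 2 = 8 golf balls."),
    ("reasoning step 2", "Half of the golf balls are blue, so 8 / 2 = 4."),
    ("answer", "The answer is 4."),
]

def step_through(segments: List[Tuple[str, str]]) -> Iterator[str]:
    """Yield the prompt as it grows, one segment per 'step'."""
    accumulated: List[str] = []
    for label, text in segments:
        accumulated.append(text)
        yield f"[{label}]\n" + "\n".join(accumulated)

if __name__ == "__main__":
    for snapshot in step_through(COT_PROMPT_SEGMENTS):
        print(snapshot)
        print("-" * 40)
```

Each yielded snapshot shows the prompt as the model would see it at that point, which is what makes the debugger framing natural.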
Turing Award winner Alan Perlis wrote: “When we write programs that ‘learn’, it turns out that we do and they don’t.” Forty years ago this was true; now it is the reverse.

did a rewrite of the writing flow tool and added...

✨ sections,
✨ a diff view for reviewing & editing suggested changes,
✨ the start of a flow editor view,
✨ and a way to edit the generated paragraph summaries

coming up next: adding drag & drop reordering in the flow editor view 🙌

starting the day with @andy_matuschak's article on "Cultivating depth and stillness in research" was exactly what I needed to remind myself to move more slowly and deliberately this year.

https://andymatuschak.org/stillness

Leading to a lovely day of printing words and marking them up over tea - feels like the best work is created with a blend of frenzied creating and slow marinating


It’s so fun training a GPT-3 bot on your own data via embeddings and being able to ask it questions empowered with your context.

And federating with all kinds of data (e.g., for me: Remix, Shopify, web.dev, etc.) #genai
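The embeddings-plus-context pattern above can be sketched in a few lines: embed your documents, retrieve the one closest to a question, and prepend it to the prompt. This is a minimal toy sketch; the bag-of-words "embedding" stands in for a real embeddings API, and the document strings are just examples.

```python
# Minimal retrieval-augmented prompting sketch: find the document
# most similar to a question and use it as context for a completion.
import math
from collections import Counter

# Example "personal data" to federate over (illustrative only).
DOCS = [
    "Remix is a full stack web framework focused on web standards.",
    "Shopify lets merchants build and run online stores.",
    "web.dev publishes guidance on modern web performance.",
]

def embed(text: str) -> Counter:
    """Toy embedding: lowercase bag-of-words counts (a real setup
    would call an embeddings API here)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def best_context(question: str) -> str:
    """Pick the stored document most similar to the question."""
    q = embed(question)
    return max(DOCS, key=lambda d: cosine(q, embed(d)))

def build_prompt(question: str) -> str:
    """Prepend the retrieved context so the model answers 'empowered
    with your context'."""
    return f"Context: {best_context(question)}\n\nQuestion: {question}\nAnswer:"

print(build_prompt("What web framework focuses on web standards?"))
```

In a real version, the final prompt would be sent to a GPT-3-style completion endpoint; only the retrieval step changes when you swap in proper embeddings.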

Happy Birthday, Wikipedia! 22 years, while I'm 44 - half of my life (although I didn't join until two years later). For an entire generation, the world has always been a world with free knowledge that everyone can contribute to. I hope there is no going back from that achievement. 8/n

Some reflections on my experience building the Twemex browser extension, and why tweaking existing software can be nice:

https://www.geoffreylitt.com/2023/01/08/for-your-next-side-project-make-a-browser-extension.html


"A consistent challenge in my development as a researcher has been: how to cultivate deep, stable concentration in the face of complex, ill-structured creative problems?"

Newly unlocked Letter from the Lab in the spirit of new year reflections/planning: https://andymatuschak.org/stillness/

Memory Augmented Large Language Models are Computationally Universal

abs: https://arxiv.org/abs/2301.04589


We show that transformer-based large language models are computationally universal when augmented with an external memory. Any deterministic language model that conditions on strings of bounded length is equivalent to a finite automaton, hence computationally limited. However, augmenting such models with a read-write memory creates the possibility of processing arbitrarily large inputs and, potentially, simulating any algorithm. We establish that an existing large language model, Flan-U-PaLM 540B, can be combined with an associative read-write memory to exactly simulate the execution of a universal Turing machine, $U_{15,2}$. A key aspect of the finding is that it does not require any modification of the language model weights. Instead, the construction relies solely on designing a form of stored instruction computer that can subsequently be programmed with a specific set of prompts.
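The core construction, a fixed model driving an associative read-write memory as a stored-instruction computer, can be illustrated with a toy loop. This is a drastic simplification, not the paper's prompt scheme: the `model` stub stands in for Flan-U-PaLM, and the tiny bit-flipping machine is illustrative, not the paper's $U_{15,2}$.

```python
# Toy sketch: a fixed "language model" (a stub lookup table) plus an
# associative read-write memory. The model's weights never change;
# each step it sees only a bounded input (state + tape symbol) and
# emits the next write/move/state, so the unbounded external memory
# is what lifts it beyond a finite automaton.
from collections import defaultdict

# Transition table for a tiny machine that flips bits until it reads
# a blank ("_"): (state, symbol) -> (write, move, next_state).
PROGRAM = {
    ("flip", "0"): ("1", 1, "flip"),
    ("flip", "1"): ("0", 1, "flip"),
    ("flip", "_"): ("_", 0, "halt"),
}

def model(prompt: str) -> str:
    """Stub LLM: deterministically maps a bounded prompt to an action."""
    state, symbol = prompt.split()
    write, move, nxt = PROGRAM[(state, symbol)]
    return f"{write} {move} {nxt}"

def run(tape_str: str) -> str:
    memory = defaultdict(lambda: "_")        # associative read-write memory
    for i, c in enumerate(tape_str):
        memory[i] = c
    head, state = 0, "flip"
    while state != "halt":
        out = model(f"{state} {memory[head]}")   # read from memory
        write, move, state = out.split()
        memory[head] = write                     # write back to memory
        head += int(move)
    return "".join(memory[i] for i in sorted(k for k in memory if memory[k] != "_"))

print(run("0110"))  # → 1001
```

The point mirrors the abstract: the controller is fixed and bounded, and all the computational power comes from looping it against an external memory.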
