We hosted our second workshop of term on OpenAI’s GPT-3 and Codex, looking at how these models work and at their possible applications.

Many thanks to the Department of Engineering!

GPT-3 is a text generator that made headlines a year ago by producing essays and news articles indistinguishable from human-written ones.

Codex is similar, but it speaks programming languages fluently: it can take a single comment and generate an entire function or more (prompts like “a function that adds the odd numbers in a list” or “a webpage with a big red button in the center that gives you a random cat gif when pressed” can actually work!).
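To give a flavour of the first kind of prompt, here is a minimal sketch (in Python) of the sort of completion Codex can produce from a one-line comment; the function name and exact code are illustrative, not a verbatim transcript from the workshop:

```python
# a function that adds the odd numbers in a list
def sum_odd_numbers(numbers):
    """Return the sum of the odd numbers in the given list."""
    return sum(n for n in numbers if n % 2 != 0)


print(sum_odd_numbers([1, 2, 3, 4, 5]))  # 1 + 3 + 5 = 9
```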

During our workshop, we:
– Tried working with Codex on programming challenges
– Got creative using GPT-3 to solve real-world tasks (see the sketch after this list)
– Learned how such models work and about their important limitations
– Brainstormed applications (and perhaps invented the next Google) to win valuable prizes such as dark chocolate!
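As a taste of the real-world tasks mentioned above, here is a minimal sketch of calling GPT-3 through the openai Python package’s legacy Completion endpoint; the API key placeholder, model name, and email-summarisation prompt are illustrative assumptions, not what was run at the workshop:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder: substitute your own key

email_text = (
    "Hi team, the demo is moved to Friday at 3pm because the projector "
    "is being repaired. Please update your calendars. Thanks, Sam."
)

# Ask GPT-3 to summarise the email in a single sentence
response = openai.Completion.create(
    model="text-davinci-002",  # assumed model name; any GPT-3 completion model would do
    prompt="Summarise the following email in one sentence:\n\n" + email_text,
    max_tokens=60,
    temperature=0.3,  # low temperature keeps the summary focused
)

print(response["choices"][0]["text"].strip())
```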