GPT-2 Wikipedia Movie Plots -> Story Generation! (featuring Spiderman stopping 9/11)

Our Artificial Intelligence Society mentor, Vignesh, had us build a story generation application using the Wikipedia Movie Plots Kaggle dataset!

The Wikipedia Movie Plots dataset we played with is linked here.

My first interactions with GPTs (GPT-2 and DistilBERT)

Welcome to my first ever blog post, where my team showcased generative pretrained transformers trained on the Wikipedia Movie Plots dataset.

An AI-Augmented Future:

When it comes to scriptwriting, we wanted to ease a scriptwriter's writer's block by generating ideas based on previous successful movies. We wanted an AI-augmented approach to writing, where one could iterate over hundreds of successful movie scripts by genre and spitball ideas for a human to then perfect. Even as the technology improves over time, we see the human still making the final decisions, since AI can still make many mistakes and inconsistencies.

Our Models' Training Period:

We had some doubts about how well it would generate, since this was our first time working with these techniques. Over the course of a semester, each model took approximately 10-24 hours to train. Eventually, we hit a sweet spot that we could showcase at AIM night, where dozens of other AI projects get a chance at the spotlight. In December 2021 we finally got to show off our favorite story, in which we generated a tale of Spiderman stopping 9/11 (which got some laughs from the crowd).
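For readers curious what one of those training runs looked like, here is a minimal sketch of fine-tuning GPT-2 on the movie plots with the Hugging Face Trainer. The CSV filename, column names, and hyperparameters below are illustrative assumptions, not our exact configuration.

```python
# Minimal sketch: fine-tune GPT-2 on Wikipedia Movie Plots (illustrative only).
import pandas as pd
from datasets import Dataset
from transformers import (GPT2LMHeadModel, GPT2TokenizerFast,
                          DataCollatorForLanguageModeling,
                          Trainer, TrainingArguments)

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Turn each movie into a single "Genre: ... Plot: ..." training example.
# (Assumed filename and column names from the Kaggle CSV.)
df = pd.read_csv("wiki_movie_plots_deduped.csv").fillna("")
texts = ("Genre: " + df["Genre"] + "\nPlot: " + df["Plot"]).tolist()
dataset = Dataset.from_dict({"text": texts})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="gpt2-movie-plots",   # placeholder output path
        num_train_epochs=3,
        per_device_train_batch_size=2,
        save_steps=5000,
    ),
    train_dataset=tokenized,
    # mlm=False gives causal (next-token) language modeling labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

A run shaped like this, on a modest GPU, is roughly the scale that ate those 10-24 hour stretches for us.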

A success? Will these (G)enerative (P)retrained (T)ransformers ever see the light of day in a commercial setting?

The team spent many weeks reading and studying Hugging Face's tooling and techniques, since these were the early days of LLMs. As I make edits to this blog, I believe we need to prepare for LLMs' impact on the writing economy.
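If you want to poke at a checkpoint like ours, sampling a story from it with the Hugging Face pipeline looks roughly like this; the model path and prompt are placeholders, not our published model.

```python
# Rough sketch of sampling a story from a fine-tuned checkpoint.
# "gpt2-movie-plots" is a placeholder path, not our actual model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2-movie-plots")
story = generator(
    "Genre: superhero\nPlot: Spider-Man swings over Manhattan when",
    max_new_tokens=200,   # length of the generated continuation
    do_sample=True,       # sample instead of greedy decoding
    top_p=0.95,
    temperature=0.9,
)[0]["generated_text"]
print(story)
```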