Generate Conversational Podcasts With GPT-2 and Google WaveNet

Listen to your favorite podcast — forever

Sanjeet Chatterjee
Published in Better Programming
3 min read · Aug 11, 2021


Logo (image by author)

There have been many generative experiments with GPT-2, ranging from lifelike chatbots to replicating Twitter profiles.

As described on the OpenAI blog, GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset of 8 million web pages.
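To give a sense of what generating text with GPT-2 looks like in practice, here is a minimal sketch using the Hugging Face transformers library. The model name ("gpt2") and the podcast-style prompt are illustrative assumptions, not necessarily the exact setup used in this article.

```python
# Minimal sketch: text generation with GPT-2 via Hugging Face transformers.
# The model name and prompt are illustrative assumptions, not the article's
# exact configuration.
from transformers import pipeline

# Load the pretrained GPT-2 model behind a text-generation pipeline.
generator = pipeline("text-generation", model="gpt2")

# Generate a short continuation of a podcast-style prompt.
prompt = "Host: Welcome back to the show. Today we're talking about"
outputs = generator(prompt, max_length=60, num_return_sequences=1)

print(outputs[0]["generated_text"])
```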
