Controllable Text Generation in Deep Learning with Transformers (GPT3) using TensorFlow & Keras Tutorial Series
This is the index page of the “Controllable Text Generation in Deep Learning with Transformers (GPT3) using TensorFlow & Keras” tutorial series.
We will cover all the topics related to Controllable Text Generation, with sample implementations in Python using TensorFlow and Keras.
You can access the code, videos, and posts via the links below.
You may want to start by learning the Text Generation methods in Deep Learning with TensorFlow (TF) & Keras covered in this tutorial series.
If you would like to learn more about Deep Learning with practical coding examples, please subscribe to the Murat Karakaya Akademi YouTube channel or follow my blog on muratkarakaya.net. Remember to turn on notifications so you are notified when new parts are uploaded.
Last updated: 14/05/2022
BLOG LINKS:
Part A: Fundamentals of Controllable Text Generation:
- A1 A Review of Text Generation
- A2 An Introduction to Controllable Text Generation
- A3 Controllable Text Generation Approaches: Updating Sequential Inputs
Part B: A TensorFlow Data Pipeline for Word-Level Controllable Text Generation
Part C: Building An Efficient TensorFlow Input Pipeline For Character-Level Text Generation
Part D: Sample Implementations of Controllable Text Generation with TensorFlow & Keras:
- D1 Approach: Input Update + Language Model: LSTM
- D2 Approach: Input Update + Language Model: Encoder-Decoder
- D3 Approach: Input Update + Language Model: Transformer (GPT3)
FULL CODE LINKS:
You can access the complete code as Colab Notebooks using the links given in each video description below, or you can visit the Murat Karakaya Akademi GitHub repo.
YOUTUBE VIDEOS LINKS:
You can watch all of these parts on the Murat Karakaya Akademi YouTube channel in English or Turkish.
VIDEOS IN ENGLISH
VIDEOS IN TURKISH
Comments or Questions?
Please share your comments or questions below.
Thank you in advance.
Do not forget to check out the next parts!
Take care!