Multi-Topic (Multi-Class) Text Classification With Various Deep Learning Models Tutorial Series
Index Page
This is the index page of the “Multi-Topic (Multi-Class) Text Classification With Various Deep Learning Models” tutorial series.
Author: Murat Karakaya
Date created: 17 Sept 2021
Date published: 11 March 2022
Last modified: 09 April 2023
Description: This is a tutorial series that covers all the phases of text classification: Exploratory Data Analysis (EDA) of text, text preprocessing, and multi-class (multi-topic) text classification using the TF Data Pipeline and the Keras TextVectorization preprocessing layer.
We will design various Deep Learning models using the Keras Embedding layer, a Convolutional (Conv1D) layer, a Recurrent (LSTM) layer, a Transformer Encoder block, and a pre-trained transformer (BERT).
We will use a Kaggle Dataset with 32 topics and more than 400K reviews.
We will cover all the topics related to solving Multi-Class Text Classification problems, with sample implementations in Python using TensorFlow and Keras.
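To make the workflow concrete, here is a minimal sketch of the approach the series follows: raw texts are vectorized with the Keras TextVectorization layer inside a tf.data pipeline and fed to a simple Embedding-based classifier. The example sentences, vocabulary size, sequence length, and layer sizes below are illustrative assumptions, not the values used in the tutorial notebooks; loading the actual Kaggle dataset is covered in Part C.

```python
# A minimal sketch, assuming TensorFlow 2.x. All sizes and example texts are
# illustrative placeholders; see the tutorial parts for the real dataset values.
import tensorflow as tf
from tensorflow.keras import layers

NUM_CLASSES = 32      # number of topics in the dataset
MAX_TOKENS = 20000    # assumed vocabulary size
SEQ_LEN = 128         # assumed fixed sequence length

# Toy raw data standing in for the Kaggle reviews and their topic labels.
texts = tf.constant(["great product, arrived quickly",
                     "the delivery was late",
                     "reasonable price for the quality"])
labels = tf.constant([0, 1, 2])

# Build a tf.data pipeline from raw texts and integer topic labels.
train_ds = tf.data.Dataset.from_tensor_slices((texts, labels)).batch(2)

# Keras TextVectorization: standardizes, tokenizes, and maps tokens to integer ids.
vectorizer = layers.TextVectorization(
    max_tokens=MAX_TOKENS,
    output_mode="int",
    output_sequence_length=SEQ_LEN,
)
# Adapt the layer on the raw text only (labels are dropped).
vectorizer.adapt(train_ds.map(lambda text, label: text))

# Apply vectorization inside the tf.data pipeline so the model receives integer ids.
train_ds = train_ds.map(lambda text, label: (vectorizer(text), label))
train_ds = train_ds.prefetch(tf.data.AUTOTUNE)

# A minimal Embedding-based feed-forward classifier (in the spirit of Part E).
model = tf.keras.Sequential([
    layers.Embedding(input_dim=MAX_TOKENS, output_dim=64),
    layers.GlobalAveragePooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=1)
```

GlobalAveragePooling1D is used here only to keep the sketch short; the tutorial parts build richer classifiers (Conv1D, LSTM, Transformer Encoder, BERT) on top of the vectorized input.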
You can access the code, videos, and posts from the links below.
If you would like to learn more about Deep Learning with practical coding examples, please subscribe to the Murat Karakaya Akademi YouTube Channel or follow my blog on muratkarakaya.net. Remember to turn on notifications so that you will be notified when new parts are uploaded.
PARTS
This tutorial series consists of several parts that cover “Text Classification with Various Deep Learning Models” in detail, as follows.
You can access all these parts on YouTube in ENGLISH or TURKISH!
You can access the complete code as Colab Notebooks via the links given in each video description (Eng/TR), or you can visit the Murat Karakaya Akademi GitHub Repo.
- PART A: A PRACTICAL INTRODUCTION TO TEXT CLASSIFICATION
- PART B: EXPLORATORY DATA ANALYSIS (EDA) OF THE DATASET
- PART C: PREPARE THE DATASET
- PART D: PREPROCESSING TEXT WITH TF DATA PIPELINE AND KERAS TEXT VECTORIZATION LAYER
- PART E: MULTI-CLASS TEXT CLASSIFICATION WITH A FEED-FORWARD NETWORK (FFN) USING AN EMBEDDING LAYER
- PART F: MULTI-CLASS TEXT CLASSIFICATION WITH A FEED-FORWARD NETWORK (FFN) USING A 1-DIMENSIONAL CONVOLUTION (CONV1D) LAYER
- PART G: MULTI-CLASS TEXT CLASSIFICATION WITH A FEED-FORWARD NETWORK (FFN) USING A RECURRENT (LSTM) LAYER
- PART H: MULTI-CLASS TEXT CLASSIFICATION WITH A TRANSFORMER ENCODER BLOCK
- PART I: MULTI-CLASS TEXT CLASSIFICATION WITH A PRE-TRAINED (BERT) TRANSFORMER
- PART J: THE IMPACT OF TRAIN DATA SIZE ON THE PERFORMANCE OF MULTI-CLASS TEXT CLASSIFIERS
- PART K: HYPERPARAMETER OPTIMIZATION (TUNING), UNDERFITTING, AND OVERFITTING
Comments or Questions?
Please share your Comments or Questions.
Thank you in advance.
Do not forget to check out the following parts!
Take care!