Text generation is a type of natural language processing that uses computational linguistics and artificial intelligence to automatically produce text that can meet specific communicative needs. This demo uses the Generative Pre-trained Transformer 2 (GPT-2) model for text prediction.
The complete pipeline of this demo's notebook is shown below.
In this demonstration, you can type the beginning of a text and the network will generate a continuation. This procedure can be repeated as many times as you like.
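The repeated "predict the next tokens, append them, predict again" procedure can be sketched as a simple greedy decoding loop. The snippet below is only an illustration: `next_token_logits` is a hypothetical stand-in for the logits the GPT-2 model would produce, and `VOCAB` is a toy vocabulary, neither of which comes from the actual demo.

```python
import numpy as np

# Toy vocabulary; a real tokenizer has ~50k entries.
VOCAB = ["<eos>", "the", "cat", "sat", "on", "mat"]

def next_token_logits(token_ids):
    # Hypothetical stand-in for the model: in the demo, these logits
    # would come from running GPT-2 on the token sequence so far.
    logits = np.full(len(VOCAB), -1.0)
    logits[(token_ids[-1] + 1) % len(VOCAB)] = 1.0
    return logits

def generate(prompt_ids, max_new_tokens=3):
    """Greedy decoding: repeatedly pick the most likely next token."""
    ids = list(prompt_ids)
    for _ in range(max_new_tokens):
        logits = next_token_logits(ids)
        next_id = int(np.argmax(logits))
        if VOCAB[next_id] == "<eos>":  # stop at end-of-sequence
            break
        ids.append(next_id)
    return ids

ids = generate([1, 2])  # prompt: "the cat"
print(" ".join(VOCAB[i] for i in ids))  # → "the cat sat on mat"
```

Running the loop again on its own output continues the text further, which is exactly what repeating the procedure in the demo does.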
The following image shows an example of an input sequence and the corresponding predicted sequence.
This notebook demonstrates text prediction with OpenVINO, using the GPT-2 model from Hugging Face Transformers.
If you have not installed all required dependencies, follow the Installation Guide.