


Poem Generator Web Application With Keras, React, and Flask


Natural Language Processing (NLP) is an exciting branch of machine learning and artificial intelligence, with applications in speech recognition, language translation, human-computer interaction, sentiment analysis, and more. One interesting area is text generation, and of particular interest to me is poem generation.

In this article, I describe a poem generator web application that I built using deep learning with Keras, Flask, and React. The core algorithm comes from a TensorFlow example notebook. The data it needs is an existing set of poems; for my application, the data are in three text files containing:

  1. Poems of Erica Jong.
  2. Poems of Lavanya Nukavarapu.
  3. Poems of Erica Jong and Lavanya Nukavarapu together.


The TensorFlow example notebook has the model building and training code, as well as the prediction code. I took the model building and training code into my notebooks and executed it on Google Colab to generate the models for each of the three datasets.

The neural network is structured as follows:

The network starts with a Sequential model, which is appropriate here because each layer has exactly one input tensor and one output tensor. The first layer is an Embedding layer that turns ‘positive integers (indexes) into dense vectors of fixed size.’ The second layer is a Bidirectional wrapper around an LSTM layer with 150 units; LSTMs are building blocks of recurrent neural networks. Finally, a Dense output layer applies the softmax activation function to produce a probability distribution over the vocabulary. The model is compiled with the categorical_crossentropy loss function, which computes the loss between labels and predictions, and the ‘adam’ optimizer. It is then trained for 150 epochs by calling the fit method.
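Based on the description above, a minimal sketch of this network in Keras might look like the following. The values of total_words and max_sequence_len are placeholders; in the real notebook they are derived from the tokenized poem corpus.

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, Dense

# Placeholder values; in the real notebook these come from the tokenized corpus.
total_words = 2000        # vocabulary size
max_sequence_len = 20     # longest n-gram sequence

model = Sequential([
    # Turns word indexes into dense vectors of fixed size (100 here); the
    # original notebook also fixes the input length to max_sequence_len - 1.
    Embedding(total_words, 100),
    # Bidirectional wrapper around a single LSTM layer with 150 units.
    Bidirectional(LSTM(150)),
    # Output layer: softmax probability distribution over the vocabulary.
    Dense(total_words, activation='softmax'),
])
model.compile(loss='categorical_crossentropy', optimizer='adam',
              metrics=['accuracy'])
# Training is then: model.fit(xs, ys, epochs=150)
```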

To this base code, I added two callbacks:

  1. ModelCheckpoint, for saving the model only if its accuracy in the current epoch is higher than the best seen so far. By the end of training, we therefore have the model with the highest accuracy.
  2. ReduceLROnPlateau, for monitoring the loss and multiplying the learning rate by a factor of 0.2 if learning stagnates, that is, if no improvement is seen for one epoch.
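These two callbacks can be sketched as shown below; the checkpoint filepath and verbosity settings are my assumptions, not the author's exact values.

```python
from tensorflow.keras.callbacks import ModelCheckpoint, ReduceLROnPlateau

callbacks = [
    # Save the model only when training accuracy improves on the best so far.
    ModelCheckpoint('poem_model.h5', monitor='accuracy',
                    save_best_only=True, verbose=1),
    # If the loss does not improve for 1 epoch, multiply the learning rate by 0.2.
    ReduceLROnPlateau(monitor='loss', factor=0.2, patience=1, verbose=1),
]
# Passed to training as: model.fit(xs, ys, epochs=150, callbacks=callbacks)
```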


The prediction part of the TensorFlow example became run-time Flask code in my application. I encapsulated it in a class called PoemGenerator. This class has the following key methods:


The constructor takes as arguments a seed-text string, a list of strings called data, which is simply the cleaned poem corpus, and a model. These argument values are copied into instance variables of the same names. The instance variable max_sequence_len is set to the maximum length of the n-gram sequences generated from each line after its text is converted to a sequence of numbers; shorter sequences are left-padded with zeros to this length.
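The n-gram construction and left-padding can be illustrated in plain Python. The word_index dictionary here is a tiny stand-in for the Keras Tokenizer's word index, and left_pad mimics pad_sequences with pre-padding:

```python
# Stand-in for the Keras Tokenizer's word index (1-based; 0 is reserved for padding).
word_index = {'the': 1, 'moon': 2, 'rises': 3, 'over': 4, 'water': 5}

def line_to_ngrams(line):
    """Convert a line to a numeric sequence, then expand it into n-gram prefixes."""
    seq = [word_index[w] for w in line.lower().split() if w in word_index]
    return [seq[:i + 1] for i in range(1, len(seq))]

lines = ['The moon rises', 'over the water']
sequences = [g for line in lines for g in line_to_ngrams(line)]
max_sequence_len = max(len(s) for s in sequences)

def left_pad(seq, length):
    """Left-pad with zeros, as pad_sequences(..., padding='pre') does."""
    return [0] * (length - len(seq)) + seq

padded = [left_pad(s, max_sequence_len) for s in sequences]
# 'The moon rises' yields the n-grams [1, 2] and [1, 2, 3];
# after padding, [1, 2] becomes [0, 1, 2].
```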


The generate_poem method implements the main poem-generation functionality. The seed text is converted to a numeric sequence, left-padded with zeros, and passed to the model to predict the next word. If the predicted index maps to a word in the tokenizer, which is an instance variable, the word is accepted and appended to the seed text. The seed text with the appended word becomes the new seed text, which is passed to the model to predict the next word, and the process repeats 100 times, resulting in a single output string.
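The generation loop can be sketched as follows. Here predict_next_index is a trivial deterministic stand-in for the trained model's prediction (model.predict followed by an argmax), and index_word stands in for the tokenizer's reverse lookup; both are my inventions for illustration.

```python
index_word = {1: 'the', 2: 'moon', 3: 'rises', 4: 'over', 5: 'water'}

def predict_next_index(seed_words):
    """Stand-in for model.predict + argmax over the softmax output."""
    # Trivial deterministic rule, for illustration only.
    return (len(seed_words) % 5) + 1

def generate_text(seed_text, n_words=100):
    words = seed_text.lower().split()
    for _ in range(n_words):
        predicted = predict_next_index(words)
        # Accept the word only if the index is in the tokenizer's vocabulary.
        if predicted in index_word:
            words.append(index_word[predicted])
        # The extended text becomes the seed for the next prediction.
    return ' '.join(words)

output = generate_text('the moon', n_words=5)
# → 'the moon rises over water the moon'
```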


The strToPoem method takes the generated string from the previous method and gives it the shape of a poem. It first removes artifacts such as words consisting of just a backquote or a backslash. Then it removes adjacent duplicate words. In the third step, it takes a random number between 5 and 8 and slices that many words off the front of the string, storing them as the first string element of a list; effectively, this is the first line of the generated poem. This slicing of random runs of 5 to 8 words is repeated until all the words in the generated string are consumed. The poem is thus transformed from a string into a list of strings.

Next, there are two cleanup steps:

  1. If the last line has fewer than 5 words, it is dropped. This is repeated until the last line has 5 or more words.
  2. If the last word of the last line has fewer than 4 characters, then that word is dropped.

Finally, the poem is returned as a list of strings.

The code of the strToPoem method is given below:
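A minimal reconstruction of strToPoem based on the description above; the exact variable names and details are my guesses, not the author's code.

```python
import random

def strToPoem(text):
    """Shape a flat generated string into a list of poem lines."""
    # 1. Drop artifacts such as lone backquotes and backslashes.
    words = [w for w in text.split() if w not in ('`', '\\')]
    # 2. Remove adjacent duplicate words.
    deduped = []
    for w in words:
        if not deduped or deduped[-1] != w:
            deduped.append(w)
    # 3. Slice off a random run of 5-8 words at a time to form each line.
    lines = []
    while deduped:
        n = random.randint(5, 8)
        lines.append(' '.join(deduped[:n]))
        deduped = deduped[n:]
    # 4. Drop trailing lines shorter than 5 words.
    while lines and len(lines[-1].split()) < 5:
        lines.pop()
    # 5. Drop the last word of the last line if it has fewer than 4 characters.
    if lines:
        last = lines[-1].split()
        if len(last[-1]) < 4:
            lines[-1] = ' '.join(last[:-1])
    return lines
```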


In the UI, the user has to:

  • Enter a set of words in a text field as seed text;
  • Select a poet; and
  • Click the ‘Generate Poem’ button.

MH Poem Generator

I encapsulated the text field, select drop-down, and button in one React component called PoemComponent. The code is in the file Poem.js and is loaded as a Babel script; Babel compiles the JSX into browser-compatible JavaScript.

Flask serves public assets from the static directory, so Poem.js is placed in that folder. Since this is a simple screen, I did not use tooling like create-react-app, npm, or the Node runtime.

PoemComponent’s key functions and functionalities are given below.

The constructor sets the state with two variables: poem_header and poem, both arrays. The render function has:

  1. An h5 label.
  2. An input text field with ID ‘seed_text’ and the placeholder text ‘Enter seed text: 3 to 5 words.’
  3. A select element with ID ‘poet’, with ‘-- Please choose a poet --’ as the first option and the names ‘Erica Jong,’ ‘Lavanya Nukavarapu,’ and ‘Erica+Lavanya’ as the subsequent options.
  4. A button with the text ‘Generate Poem.’

The button’s onClick event is bound to the component and invokes the function getPoem.


This function collects the seed text and poet name by calling document.getElementById and uses them to build a URL. It invokes fetch with this URL, which targets the ‘/getpoem’ endpoint on the Flask application. After the response is received, the function updates the state by setting the values of poem_header and poem. This in turn updates the poem_header and poem values rendered in the divs with IDs ‘generated_poem_header’ and ‘generated_poem.’

Finally, the last two lines in Poem.js render PoemComponent at the ‘poem_container‘ div in index.html.

Given below are important snippets of PoemComponent code:



This file has the entire backend run-time code. At startup, the three text files containing the poetry datasets are read into a list, and all words are converted to lower case. This data list is one of the arguments passed to the constructor of PoemGenerator. The root endpoint (‘/’) is the index method, which simply serves index.html from the templates folder.


This function is invoked at the endpoint ‘/getpoem’. From the GET request parameters, it grabs the user-entered seed text and poet name. It uses the seed text and the matching data list and model (based on the poet name) to instantiate a PoemGenerator object. On this object, it calls the generate_poem method to generate the poem, which is stored in the list poem, and the makeHeader method to create the poem’s metadata, which is stored in the list poem_header. Both lists are returned as JSON to the client browser.
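A sketch of this endpoint is shown below. The PoemGenerator here is a stand-in stub (the real class wraps the Keras prediction code), and the DATA and MODELS dictionaries are my assumptions about how the per-poet corpora and models might be held after startup.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

# Stand-ins for the per-poet corpora and trained models loaded at startup.
DATA = {'erica': ['sample corpus line'], 'lavanya': ['sample corpus line']}
MODELS = {'erica': None, 'lavanya': None}  # would hold loaded Keras models

class PoemGenerator:
    """Stub standing in for the real class that wraps the prediction code."""
    def __init__(self, seed_text, data, model):
        self.seed_text = seed_text

    def generate_poem(self):
        return [self.seed_text + ' ...generated line one',
                self.seed_text + ' ...generated line two']

    def makeHeader(self):
        return ['A poem generated from the seed text']

@app.route('/getpoem')
def getpoem():
    seed_text = request.args.get('seed_text', '')
    poet = request.args.get('poet', 'erica')
    generator = PoemGenerator(seed_text, DATA[poet], MODELS[poet])
    # Both lists are returned as JSON to the client browser.
    return jsonify(poem_header=generator.makeHeader(),
                   poem=generator.generate_poem())
```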

Repository and Deployment

The code of this application is available in my GitHub repository mh-poem-generator.

I deployed the application on a cloud Ubuntu 18.04 server. Since TensorFlow 2.2.0 is required, I installed conda and used gunicorn from the conda environment to run the application as a systemd service. The application is collocated with other Flask and Ruby on Rails applications and served via Nginx.

The systemd configuration is given below:
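A representative unit file for such a setup might look like this; the paths, user, service name, and port are placeholders, not the author's actual values.

```ini
# /etc/systemd/system/poem-generator.service
[Unit]
Description=Poem Generator Flask application
After=network.target

[Service]
User=ubuntu
WorkingDirectory=/home/ubuntu/mh-poem-generator
# gunicorn from the conda environment that has TensorFlow 2.2.0 installed
ExecStart=/home/ubuntu/miniconda3/envs/poemgen/bin/gunicorn \
          --workers 2 --bind 127.0.0.1:8000 app:app
Restart=always

[Install]
WantedBy=multi-user.target
```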


The Nginx configuration is as follows: 
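A representative location block for proxying the /pg path to the application might look like this; the upstream port is a placeholder matching nothing in the article.

```nginx
# Inside the existing server block for mahboob.xyz
location /pg {
    # Forward /pg requests to the gunicorn instance serving the Flask app.
    proxy_pass http://127.0.0.1:8000;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
}
```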


You can access the application at https://mahboob.xyz/pg.


As of now, the generated poems have the shape of poems but don’t make much sense as actual poems. Sometimes a few lines come out well, with good figurative expressions, but that’s all. To improve poem quality, I will have to add layers to the neural network, fine-tune the hyperparameters, and enrich the poem lines into better sentences, as MontyLingua does.
