Keras text gen with RNN: how to predict new characters in the reverse direction?

Using this notebook tutorial:
https://colab.research.google.com/drive/1ysEKrw_LE2jMndo1snrZUh5w87LQsCxk#forceEdit=true&sandboxMode=true

It’s a tutorial that predicts the next character of a text n times to tell a story.

It’s trained on shift-by-one chunks such as ‘hell’ → ‘ello’, so the model learns to predict the next letter, as in this example with the word ‘hello’.
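To make that concrete, here’s a minimal sketch of the shift-by-one pairing described above (the helper name is illustrative, not necessarily what the notebook calls it):

```python
# Build an (input, target) pair by shifting a chunk one character:
# the model sees 'hell' and is trained to output 'ello'.
def split_input_target(chunk):
    input_text = chunk[:-1]   # 'hell'
    target_text = chunk[1:]   # 'ello'
    return input_text, target_text

print(split_input_target('hello'))  # ('hell', 'ello')
```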

My question is: since we have the data to go forwards, we technically also have the information to predict backwards, from ‘ello’ to ‘hell’. I would like to accomplish this.

As an example: at the end of the tutorial, you get a prompt to begin the text generation. I would type the word ‘dream.’ and the model would predict and build my sentence in reverse, into something like ‘I have a dream.’.

I’m not sure how to interact with the existing model to accomplish this, and I need a bit of hand-holding.


Hi @Genji.Tapia:

It’s been a long time since you posted. Did you ever figure it out? I’m curious to find out myself.

@masterryan.prof Here’s a post-mortem. The model didn’t have bidirectional information. If you think of it as a Markov chain, it only stores the probabilities of what comes next; there’s no stored information for going backwards. So, as a hacky solution, I created a second model trained on the same data reversed. Essentially this model learned to type backwards; as a side effect, if you reverse its output string, it appears to predict characters from right to left.
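Here’s a minimal sketch of what I mean (the corpus path, architecture, and hyperparameters below are illustrative placeholders, not the notebook’s exact setup):

```python
import numpy as np
import tensorflow as tf

# 'corpus.txt' is a placeholder for whatever text the notebook trains on.
text = open('corpus.txt').read()
reversed_text = text[::-1]            # same data, characters in reverse order

chars = sorted(set(reversed_text))
char2idx = {c: i for i, c in enumerate(chars)}
idx2char = np.array(chars)
encoded = np.array([char2idx[c] for c in reversed_text])

seq_length = 100
ds = tf.data.Dataset.from_tensor_slices(encoded)
ds = ds.batch(seq_length + 1, drop_remainder=True)
ds = ds.map(lambda chunk: (chunk[:-1], chunk[1:]))  # shift-by-one pairs
ds = ds.shuffle(10_000).batch(64, drop_remainder=True)

# Same kind of architecture as the forward model; sizes are illustrative.
backward_model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(chars), 256),
    tf.keras.layers.GRU(1024, return_sequences=True),
    tf.keras.layers.Dense(len(chars)),
])
backward_model.compile(
    optimizer='adam',
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
backward_model.fit(ds, epochs=10)
```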

Using both models I was able to generate the kind of output I was going for.
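For the backward direction, generation looks roughly like this (reusing `char2idx`/`idx2char` from the sketch above; a simplified, non-stateful sampling loop rather than the notebook’s exact one): reverse the prompt, let the backward model extend it, then reverse the whole thing back.

```python
def generate_backwards(model, prompt, num_chars=100, temperature=1.0):
    # The backward model works in reversed space, so reverse the prompt first:
    # 'dream.' becomes '.maerd', which the model then extends to the "right".
    input_ids = [char2idx[c] for c in prompt[::-1]]
    generated = []
    for _ in range(num_chars):
        logits = model(tf.constant([input_ids]))   # shape: (1, T, vocab)
        logits = logits[0, -1] / temperature       # last step's distribution
        next_id = int(tf.random.categorical(logits[tf.newaxis, :], 1)[0, 0])
        generated.append(next_id)
        input_ids.append(next_id)
    # Reversing the full reversed-space string flips it back to left-to-right,
    # so the sampled text ends up *before* the prompt.
    reversed_space = prompt[::-1] + ''.join(idx2char[generated])
    return reversed_space[::-1]

# e.g. prints something like 'I have a dream.' if the model cooperates
print(generate_backwards(backward_model, 'dream.'))
```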

But overall it’s far from ideal. I’m currently studying GPT-2, GPT-3, and bidirectional Transformers (BERT) to go further with my ideas.

BERT

GPT-2


Looks good! Thanks for the links @Genji.Tapia, I’ll check them out in my free time.