Cannot add new data on the Go_Bot

Hi!
I am currently working with DeepPavlov, specifically on the go_bot model.
I added new data by adding new discussions to the DSTC2 dataset, like this:
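Schematically, each discussion I add is a sequence of turns in the dstc2 jsonlist format, with a user turn looking something like this (I'm quoting the key names from memory, so they may not match your files exactly):

{"text": "i want argentinian food", "dialog_acts": [{"act": "inform", "slots": [["food", "argentinian"]]}]}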


As you can see, I added a new restaurant with a food origin that had never been mentioned before, so I also added the new restaurant to the database.
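For illustration, the database record is a JSON object along these lines (the field names follow the usual DSTC2 schema, and the values here are made up, not my actual data):

{
  "name": "la pampa",
  "food": "argentinian",
  "pricerange": "moderate",
  "area": "centre",
  "addr": "12 example street",
  "phone": "01223 000000",
  "postcode": "cb 1"
}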
But when I run the command python -m deeppavlov train PATH/gobot_dstc2-1.json -d
to train the new model on the newly added data, I get this error:

I am sure that this error is caused by the new word “argentinian”, because it has never appeared before; normally, when I add, for example, a new “indian” restaurant, it works without any problem.
I have tried modifying the variable “obs_size” in the Python script, but then I get another error about the input size of the neural network.
I would like to know what I have to modify to make this work, please.
Thank you in advance for your help :smile:

Probably this is because the model was trained and saved with a different set of parameters. I’m not an expert in dstc2, but one possible way may be to re-train the model with your updated data.


Hi!
Thank you for your answer!
That’s actually what I am trying to do: as you can see in my previous message above, I am trying to train my model again from scratch with this newly added data, but I have the feeling that it keeps the old saved parameters, falls back to them, and outputs an error.

Hi!
There are several ways to fix your issue.

  1. If you intend to train the model on new data from scratch, then remove the pretrained model’s files from disk and rerun the usual train command (without -d):
rm -r ~/.deeppavlov/models/gobot_dstc2/
python -m deeppavlov train PATH/gobot_dstc2-1.json
  2. If you wish to fine-tune the existing model on new data, then just remove the line "fit_on": ["x_tokens"] from the config section with "id": "word_vocab" (see the sketch below). The problem with fit_on is that the vocabulary is then rebuilt every time you add new words to the dataset. Removing fit_on means only the old vocabulary of words will be loaded from disk; hence, the input dimensions of the policy network won’t change with new data.
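For reference, that section looks roughly like the following sketch (the class name and paths may differ in your DeepPavlov version, so treat it as illustrative rather than exact):

{
  "id": "word_vocab",
  "class_name": "simple_vocab",
  "fit_on": ["x_tokens"],
  "save_path": "{MODELS_PATH}/gobot_dstc2/word.dict",
  "load_path": "{MODELS_PATH}/gobot_dstc2/word.dict"
}

Removing the fit_on line leaves the save/load paths intact, so the old word.dict is simply loaded from disk.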

Hi!
Thank you very much for your answers, that’s very helpful!
1- I will try that and get back to you ASAP!
2- For the fine-tuning method: I don’t know if the config section you’re talking about is the one in the “gobot_dstc2-4.json” file:

After removing the two lines "fit_on": ["x_tokens"] and "id": "word_vocab", I got this error:

I also have another question about the fine-tuning technique: you said that removing fit_on will only load the old vocabulary, so the input dimensions of the policy network will not change with new data. But if I understood correctly, this is not what I want, right?
I want my model to be able to detect this newly added data and be able to answer according to it.
Can you please explain this concept to me? Thank you in advance for your help.

Yes, the second method implies that the vocab stays the same and new words will be interpreted as the unknown token UNK.
For the method to work, please remove only the line with fit_on without deleting "id": "word_vocab".
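I.e. the section sketched above should end up looking like this (again illustrative; class name and paths depend on your version):

{
  "id": "word_vocab",
  "class_name": "simple_vocab",
  "save_path": "{MODELS_PATH}/gobot_dstc2/word.dict",
  "load_path": "{MODELS_PATH}/gobot_dstc2/word.dict"
}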


Hi!
Thank you for your answers. I’m coming back to you because I tried the two solutions, but neither of them worked for me:

1- For the first method, I deleted all the files contained in ~/.deeppavlov/models/gobot_dstc2/ and trained the chatbot again.
After the training, when I ran the command python -m deeppavlov interact PATH/gobot_dstc2-1.json, the chatbot still didn’t recognize the word argentinian.
When I checked the word.dict file in the gobot_dstc2 folder, I saw that the word argentinian was indeed there, but counted only two times, which was normal because I only added one discussion to each file (train/valid/test).
I tried duplicating the discussion in all the files, but the chatbot was still not able to recognize the new word.
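If it helps, each line of word.dict is, as far as I can tell, just a token followed by its count, so the entry presumably looked something like:

argentinian	2

(I’m citing that format from memory; it may differ between versions.)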
After that, I checked the word.dict file in ~/.deeppavlov/models/slotfill_dstc2/ and the word argentinian was not there, so I think this is the origin of the problem.
Do I have to modify something in the gobot_dstc2-1.json file or in slotfill_dstc2.json?

2- For the second method, when I only delete the line "fit_on": ["x_tokens"] and keep all the remaining lines the same, I get this error:


After doing method 1 and coming back to the second one, I get the old error about the size of the variable obs_size.

Thank you very much for your help :blush: