python - Is maximum vocabulary count related to word vector dimensions in the GloVe model?


I have implemented the GloVe model following the implementation at https://github.com/stanfordnlp/glove/tree/master/src. I specified the max vocab parameter as 100000000 while generating the vocab.txt file and a word vector dimension of 100 while training the model, which produced vectors.txt with 100-dimensional vectors. When I try to evaluate with word_analogy.py from the eval folder in the link above, I get the following error:

  file "c:\users\jayashree\anaconda2\lib\site-packages\spyderlib\widgets\externalshell\sitecustomize.py", line 714, in runfile     execfile(filename, namespace)    file "c:\users\jayashree\anaconda2\lib\site-packages\spyderlib\widgets\externalshell\sitecustomize.py", line 74, in execfile     exec(compile(scripttext, filename, 'exec'), glob, loc)    file "c:/users/jayashree/documents/1 billion words/word_analogy.py", line 77, in <module>     w, vocab, ivocab = generate()    file "c:/users/jayashree/documents/1 billion words/word_analogy.py", line 32, in generate     w[vocab[word], :] = v  valueerror: cannot copy sequence size 66 array axis dimension 100 

I want to know whether the parameters specified during vocabulary file creation have any effect on the dimensions of the vectors file.
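As a sanity check on the generated file, here is a minimal sketch that scans vectors.txt and reports any line whose vector length differs from the expected dimension. It assumes vectors.txt is in the standard GloVe text format (a token followed by whitespace-separated floats on each line) and sits in the working directory; the file name and the expected dimension of 100 are taken from the setup described above, not from the evaluation script itself.

    # Report lines in vectors.txt whose vector length differs from the
    # expected dimension. Assumes the standard GloVe text output format.
    expected_dim = 100  # assumption: matches the training setting above

    with open('vectors.txt', 'r') as f:
        for line_no, line in enumerate(f, start=1):
            parts = line.rstrip().split(' ')
            word, values = parts[0], parts[1:]
            if len(values) != expected_dim:
                print('line %d: token %r has %d values (expected %d)'
                      % (line_no, word, len(values), expected_dim))

A line reported by this check would explain the ValueError above, since the script pre-allocates an array of width 100 and then assigns each parsed vector into a row of it.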

