[developers] training parse selection models using the fine system

Yi Zhang yzhang at CoLi.Uni-SB.DE
Tue Apr 11 11:53:02 CEST 2006


Hi Stephan,

Tim, Valia, and I are still running some experiments with the fine system
and the ERG. We have treebanked with different versions of the grammar,
and now I am trying to train the disambiguation models. I started with the
fine system from http://lingo.stanford.edu/ftp/builds/2006-03-05/,
following the guide on the wiki. But when I click Trees/Train, I get an
error saying:
podium-loop(): Symbol TRAIN does not have a function definition.

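Could it be that the training code is simply not loaded in that build?
From the Lisp prompt, something like the following should show whether
the function is there at all (that TRAIN lives in the tsdb package is
only my guess from the error message):

;; check whether a function definition for TRAIN exists in the tsdb
;; package; if not, the training module was presumably never loaded
(let ((train (find-symbol "TRAIN" :tsdb)))
  (format t "~a ~:[is not~;is~] fbound~%"
          train (and train (fboundp train))))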

I also tried following the `load' and `fc.lisp' files in $HINOKI, using
the LOGON tree. The feature caching seems to work, in that I get large
`fc.abt' files afterwards. But when I try Trees/Train:

[11:07:26] operate-on-profiles(): reading `jan-06/jh0'
operate-on-profiles(): caching `jan-06/jh0' [11 - 511].
[11:07:26] open-fc(): new BTree `fc.bdb'.
podium-loop(): Attempt to call #("db_open" 11440496 0 2 11440496) for
                which the definition has not yet been (or is no longer)
                loaded.

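The `db_open' call looks like an unresolved foreign function, i.e. the
Berkeley DB shared library apparently never got loaded into the running
image. Would loading it by hand, before the feature cache is opened, be
the right idea? Something like the sketch below; the path is only a
placeholder, since I do not know where the LOGON tree keeps its Berkeley
DB object:

;; hypothetical remedy: load the Berkeley DB shared object by hand
;; before open-fc(); the file name and location are placeholders and
;; will differ per installation (Allegro CL can load .so files directly)
(load "/path/to/logon/lib/libdb.so")
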
So my questions are: 1) how do I train the parse selection model with
either the DELPH-IN tree or the LOGON tree?
2) can I cache only some basic features, for a model to be used by PET? I
tried setting:
(setf tsdb::*feature-grandparenting* 0)            ; no grandparenting
(setf tsdb::*feature-use-preterminal-types-p* nil) ; surface forms at preterminals
(setf tsdb::*feature-lexicalization-p* 0)          ; no lexicalization (but see below)
(setf tsdb::*feature-active-edges-p* nil)          ; no active-edge features
(setf tsdb::*feature-ngram-size* 0)                ; no n-gram features
(setf tsdb::*feature-ngram-tag* :type)             ; n-grams (if any) over lexical types
(setf tsdb::*feature-ngram-back-off-p* nil)        ; no back-off n-grams
but `fc.abt' is still huge, and feature caching still takes a long time.
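
One thing I am not sure about: since only nil is false in Common Lisp,
do the boolean-valued parameters actually get turned off by 0? For
instance,

;; in Common Lisp any non-nil value, including 0, is true
(when 0 (print "0 still counts as true"))

does print, so should *feature-lexicalization-p* perhaps be nil rather
than 0?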

Best,
Yi


