[developers] Input chart and gesture

Emily M. Bender ebender at uw.edu
Tue Aug 23 06:54:34 CEST 2011


Dear all,

Katya is here for HPSG and we started talking today
about how she might integrate the gesture information
into the input string so she can test her analyses
by building implemented grammars.  I made a suggestion
that relies on some assumptions about the input chart,
and so I thought I would ask here if the assumptions
hold.

The basic idea is to add a list-valued (actually diff-list-valued)
feature "GESTURE" to all signs.  No lexical entries would constrain
this feature, but the tokens in the input chart would use that slot
to store information about gestures accompanying the words.  The
phrasal types would concatenate the GESTURE values of the
daughters to make the GESTURE value of the mother.
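
In TDL, a first sketch of what I have in mind might look roughly
like the following (all of the type and feature names here are just
placeholders, loosely following Matrix conventions, not anything
from an existing grammar):

  gesture := avm &
    [ GES-PRED string ].

  ;; GESTURE is a diff-list on all signs; lexical entries leave it
  ;; unconstrained.
  sign :+ [ GESTURE *diff-list* ].

  ;; Binary headed phrases append the daughters' GESTURE diff-lists
  ;; to form the mother's GESTURE (the usual diff-list append).
  binary-headed-phrase :+
    [ GESTURE [ LIST #gfirst, LAST #glast ],
      HEAD-DTR.GESTURE [ LIST #gfirst, LAST #gmiddle ],
      NON-HEAD-DTR.GESTURE [ LIST #gmiddle, LAST #glast ] ].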

The purpose of this feature is to record information that
non-branching gesture rules are sensitive to.  These rules would
take a daughter with specific gesture information and produce a
mother with the semantics of that gesture (added through their
C-CONT).
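
Again just as a sketch, a non-branching rule for, say, a pointing
gesture could look something like this (the rule name, the GES-PRED
value, and the PRED string are all made up for illustration, and I'm
leaving out the HOOK/HCONS bookkeeping):

  ;; Unary rule: requires a daughter whose GESTURE list records a
  ;; pointing gesture, and contributes that gesture's relation to
  ;; the semantics via C-CONT.
  pointing-gesture-rule := unary-phrase &
    [ ARGS < [ GESTURE <! [ GES-PRED "pointing" ] !> ] >,
      C-CONT.RELS <! [ PRED "_point_g_rel" ] !> ].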

So the input chart question is:  Can input tokens introduce
feature values that unify with lexical entries but are not
actually given in the lexical entries?
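
Concretely, the lexical entry would say nothing at all about
GESTURE, e.g. (again, placeholder names):

  dog_n := noun-lex &
    [ STEM < "dog" >,
      SYNSEM.LKEYS.KEYREL.PRED "_dog_n_rel" ].

and the question is whether the token in the input chart can
contribute something like [ GESTURE <! [ GES-PRED "pointing" ] !> ]
so that the instantiated lexical edge ends up carrying it via
unification, even though the entry itself never mentions GESTURE.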

Thanks,
Emily



-- 
Emily M. Bender
Associate Professor
Department of Linguistics
Check out CLMA on facebook! http://www.facebook.com/uwclma


