[developers] Input chart and gesture
sweaglesw at sweaglesw.org
Tue Aug 23 07:10:20 CEST 2011
Sounds to me like it should work -- just like CFROM and CTO.
On Aug 22, 2011, at 9:54 PM, "Emily M. Bender" <ebender at uw.edu> wrote:
> Dear all,
> Katya is here for HPSG and we started talking today
> about how she might integrate the gesture information
> into the input string so she can test her analyses
> by building implemented grammars. I made a suggestion
> that relies on some assumptions about the input chart,
> and so I thought I would ask here if the assumptions hold.
> The basic idea is to add a list (actually diff-list) valued
> feature "GESTURE", associated with all signs. No lexical
> entries would constrain this feature, but the tokens
> in the input chart would use that slot to store information
> about gestures accompanying the words. The phrasal
> types would concatenate the GESTURE value of the
> daughters to make the GESTURE value of the mother.
> The purpose of this feature is to record information
> that non-branching gesture rules are sensitive to.
> These rules would take a daughter with specific gesture
> information and produce a mother with the semantics
> of that gesture (added through their C-CONT).
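> A rough TDL sketch of what I have in mind (all type and
> feature names here are illustrative only, not from any
> existing grammar):
>
>   sign :+ [ GESTURE diff-list ].
>
>   ; Phrasal types append the daughters' GESTURE lists
>   ; via the usual diff-list threading.
>   binary-phrase := phrase &
>     [ GESTURE [ LIST #front, LAST #back ],
>       ARGS < [ GESTURE [ LIST #front, LAST #middle ] ],
>              [ GESTURE [ LIST #middle, LAST #back ] ] > ].
>
>   ; A non-branching rule sensitive to a particular gesture,
>   ; contributing its semantics through C-CONT.
>   pointing-gesture-rule := unary-phrase &
>     [ ARGS < [ GESTURE <! pointing !> ] >,
>       C-CONT [ RELS <! demonstrative_rel !> ] ].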
> So the input chart question is: Can input tokens introduce
> feature values that unify with lexical entries but are not
> actually given in the lexical entries?
> Emily M. Bender
> Associate Professor
> Department of Linguistics
> Check out CLMA on facebook! http://www.facebook.com/uwclma