[developers] Defaults in TDL

goodman.m.w at gmail.com goodman.m.w at gmail.com
Wed Sep 5 18:45:41 CEST 2018


Thanks for sharing, Emily.

If that was in Paris in 2010, did anything change after the StanfordDefaults
discussion? In particular, I didn't see any clear mention in the
StanfordDefaults wiki of the "collateral file" of default constraints that
your notes describe. Is that still the preferred method? I would have thought
that keeping these constraints separate from the relevant type descriptions
would be confusing, but it sounds like there were convincing arguments for it.

If we do have the separate file, I hope it would be in a format supported by
all processors, rather than specified separately in, e.g., ace/config.tdl and
lkb/script.

On Tue, Sep 4, 2018 at 3:11 PM Emily M. Bender <ebender at uw.edu> wrote:

> Dear Mike,
>
> Thanks for bringing up this issue.  At the 2010 DELPH-IN Paris Summit, Ann
> and I had a further conversation about this, from which I took the homework
> of typing up what it is I'd like to have (as a grammar developer, and
> especially from the point of view of the Matrix) with respect to defeasible
> constraints.  Here's what I wrote down later that year (Oct 27):
>
> Dear Ann,
>
> Here, with much more delay than I intended, is the write up
> I promised of my (reconstruction of my) understanding of where
> we ended up in our discussion of defeasible identity constraints
> over crepes in Paris.
>
> First, why I want it:
>
> In lexical rules, we want to be able to say (as in SWB) that
> the value of certain features (HOOK, CAT, ARG-ST) is shared
> between the mother and the daughter unless the rule contradicts
> this.  If the rule does contradict it, then we want only the
> information specifically stated as such to change, and the rest
> "around" it to be shared.
>
> For a concrete example, take a hypothetical lexical rule that
> changes the case on the first complement from acc to dat.
>
> First, here's the general lex rule type:
>
> lex-rule := phrase-or-lexrule & word-or-lexrule &
>   [ NEEDS-AFFIX bool,
>     SYNSEM.LOCAL.CONT [ RELS [ LIST #first,
>                              LAST #last ],
>                         HCONS [ LIST #hfirst,
>                                 LAST #hlast ] ],
>     DTR #dtr & word-or-lexrule &
>         [ SYNSEM.LOCAL.CONT [ RELS [ LIST #first,
>                                      LAST #middle ],
>                               HCONS [ LIST #hfirst,
>                                       LAST #hmiddle ] ],
>           ALTS #alts ],
>     C-CONT [ RELS [ LIST #middle,
>                     LAST #last ],
>              HCONS [ LIST #hmiddle,
>                      LAST #hlast ]],
>     ALTS #alts,
>     ARGS < #dtr > ].
>
> And a subtype with the defeasible identity indicated
> (using /# for now):
>
> defeasible-identity-lex-rule := lex-rule &
>   [ SYNSEM.LOCAL.CAT /#cat,
>     ARG-ST /#arg-st,
>     C-CONT.HOOK /#hook,
>     DTR [ SYNSEM.LOCAL [ CAT /#cat,
>                          CONT.HOOK /#hook ],
>           ARG-ST /#arg-st ]].
>
> The lex rule definition itself would just look like this:
>
> acc-to-dat-obj-lex-rule := defeasible-identity-lex-rule &
>  [ SYNSEM.LOCAL.CAT.VAL.COMPS.FIRST.LOCAL.CAT.HEAD.CASE dat,
>    DTR.SYNSEM.LOCAL.CAT.VAL.COMPS.FIRST.LOCAL.CAT.HEAD.CASE acc ].
>
> The intended behavior is for that to compile into a rule that
> includes constraints like these (I'm sure I'm missing some here):
>
> acc-to-dat-obj-lex-rule (expanded):
>  [ SYNSEM.LOCAL.CAT [ HEAD #head,
>                       VAL [ SPR #spr,
>                             SPEC #spec,
>                             SUBJ #subj,
>                             COMPS [ REST #rest,
>                                     FIRST [ NON-LOCAL #non-local,
>                                             LOCAL [ CONT #cont,
>                                                     CAT [ VAL #val,
>                                                           AGR #agr,
>                                                           HEAD.CASE dat ]]]]]],
>    C-CONT.HOOK #hook,
>    ARG-ST #arg-st,
>    DTR [ SYNSEM.LOCAL [ CONT.HOOK #hook,
>                         CAT [ HEAD #head,
>                               VAL [ SPR #spr,
>                                     SPEC #spec,
>                                     SUBJ #subj,
>                                     COMPS [ REST #rest,
>                                             FIRST [ NON-LOCAL #non-local,
>                                                     LOCAL [ CONT #cont,
>                                                             CAT [ VAL #val,
>                                                                   AGR #agr,
>                                                                   HEAD.CASE acc ]]]]]]],
>          ARG-ST #arg-st ]].
>
>
> What I remember from Paris is that we decided it would be best to
> encode these constraints not directly in the type definition as
> I did above in defeasible-identity-lex-rule but in a collateral file that
> instructs the LKB to do something special with certain feature paths
> on instances of certain types at compile time.
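>
> Just to make that idea concrete, I imagine the collateral file would
> need little more than a list of type names plus the mother feature
> paths to treat as default-identified with the corresponding daughter
> paths.  Something like the following, where both the file name and
> the syntax are entirely made up:
>
> ;;; defeasible-identities.tdl -- hypothetical collateral file
> ;;; For instances of the named type, make the value at each mother
> ;;; path default-identical to the value at the paired daughter path.
> defeasible-identity-lex-rule:
>   SYNSEM.LOCAL.CAT = DTR.SYNSEM.LOCAL.CAT
>   ARG-ST           = DTR.ARG-ST
>   C-CONT.HOOK      = DTR.SYNSEM.LOCAL.CONT.HOOK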
>
> We also worked out that we would only be able to "push down" the
> identity constraint to features that were necessitated by the types
> invoked in the rule.  Thus in the example above, we know that SPR, SPEC
> and SUBJ need to be identified because the value of VAL is necessarily
> "valence" (and not valence-min), since we've mentioned COMPS.  But if
> CASE were appropriate for both noun and comp (for example), then we
> wouldn't be able to know to put in identity constraints for any other
> features of noun (or comp).  If the daughter in fact had a constraint
> on one of these other features, it wouldn't be copied up to the mother.
> Relatedly, we lose the actual HEAD value of the complement, because we
> can't identify its HEAD while changing CASE.  (So here, the grammar
> writer would need to stipulate [HEAD noun], say, on the mother.)
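>
> For concreteness, with the complement's HEAD stipulated on the mother
> the rule would look roughly like this (illustrative only; noun stands
> in for whatever HEAD value the complement actually has):
>
> acc-to-dat-obj-lex-rule := defeasible-identity-lex-rule &
>  [ SYNSEM.LOCAL.CAT.VAL.COMPS.FIRST.LOCAL.CAT.HEAD noun & [ CASE dat ],
>    DTR.SYNSEM.LOCAL.CAT.VAL.COMPS.FIRST.LOCAL.CAT.HEAD.CASE acc ].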
>
> In Paris, I remember being convinced that the added simplicity in
> defining lexical rules would outweigh the lack of transparency noted
> above.  And I'm still pretty sure I agree with that.  One thing in
> favor of that view is that if a rule defined using the defeasible
> identity type didn't have the expected behavior, the grammar engineer
> could always either add constraints or side-step that type and
> hand-specify all the desired identities.
>
> A further complication I noticed while writing out this example is the
> interaction between defeasible and indefeasible identity tags.  Two
> conditions to consider:
>
> 1) The rule inherits a constraint (e.g., from the type of the DTR
> value) that the REST of the ARG-ST is the same as the COMPS list.
> 2) The rule doesn't inherit such a constraint, but the constituent
> that serves as the daughter identifies its ARG-ST.REST and its COMPS.
>
> I think (2) isn't a problem (this is very similar to things that
> confused Tom, Ivan, and me as we designed the lex rules in the
> textbook, though, so I'm not feeling very confident just now!).  As
> for (1), it could entail a similar push down of identity inside the
> ARG-ST.  But what if the ARG-ST to DTR.ARG-ST identification were a
> non-defeasible identity constraint?  Maybe that's just a broken
> grammar that either shouldn't compile or would just have surprising
> behavior.
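>
> To make (1) concrete, the inherited constraint I have in mind would be
> a hard (indefeasible) identity on the daughter's type, something like
> this (the type name basic-word is made up for illustration):
>
> basic-word := word-or-lexrule &
>   [ SYNSEM.LOCAL.CAT.VAL.COMPS #comps,
>     ARG-ST.REST #comps ].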
>
> I hope you are still interested in this problem.  Let me know if/when
> it would be useful to have a grammar to play with.
>
> Thanks!
>
> On Tue, Sep 4, 2018 at 10:42 AM, goodman.m.w at gmail.com <
> goodman.m.w at gmail.com> wrote:
>
>> Hello everyone,
>>
>> I appreciate the feedback I've received in previous messages in my
>> attempts to dust off neglected corners of TDL syntax, and I'd now like to
>> bring up "defaults", or "defeasible constraints" (I believe these refer to
>> the same thing). Are we prepared to start supporting
>> defaults/defeasible-constraints in our processors and using them in our
>> grammars? Or should we discard them as an undesired experimental feature
>> (i.e., declare them to *not* be part of DELPH-IN TDL)?
>>
>> Further information:
>>
>> Currently, only the LKB supports them (and maybe PET?). As I
>> understand it, they are a compile-time feature, meaning that they
>> change how the grammar is compiled and that there is no longer a
>> notion of "defaults" at run time. I don't think the use of defaults
>> causes any change in the competence or performance of a grammar.
>>
>> The benefit of defaults is for the grammar engineer, as they can
>> reduce the amount of boilerplate code and make the grammar source
>> code more intuitive. I think anything that makes grammar writing
>> easier is a big win. The differences they create between the
>> source-code form of the grammar and the compiled hierarchy, however,
>> can complicate debugging (e.g., interactive unification).
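>>
>> For anyone who hasn't used them: the defeasible constraints are
>> written with a slash notation, if I'm remembering it right (see the
>> links below for the details), and a more specific type simply
>> overrides the default when the grammar is compiled, e.g.:
>>
>> ; hypothetical types for illustration; /l is meant to mark a
>> ; defeasible constraint, which the subtype's hard constraint
>> ; overrides at compile time
>> basic-verb-lex := verb-lex &
>>   [ SYNSEM.LOCAL.CAT.HEAD.AUX /l - ].
>>
>> aux-verb-lex := basic-verb-lex &
>>   [ SYNSEM.LOCAL.CAT.HEAD.AUX + ].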
>>
>> Some links:
>>   - http://www.aclweb.org/anthology/J99-1002
>>   - http://moin.delph-in.net/ParisDefeasibleConstraints
>>   - http://moin.delph-in.net/StanfordDefaults
>>
>> --
>> -Michael Wayne Goodman
>>
>
>
>
> --
> Emily M. Bender
> Professor, Department of Linguistics
> University of Washington
> Twitter: @emilymbender
>


-- 
-Michael Wayne Goodman