MACCENT (KUL)

System MACCENT
Code C and Prolog executable for Sun SparcStations
References [2]
Pointers http://www.cs.kuleuven.ac.be/~ldh/#software

MACCENT addresses the novel task of stochastic MAximum ENTropy modeling with Clausal Constraints. The maximum entropy method is a Bayesian method based on the principle that the target stochastic model should be as uniform as possible, subject to the known constraints. MACCENT incorporates clausal constraints, which are based on the evaluation of Prolog clauses on examples represented as Prolog programs.
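
To give a flavour of what a clausal constraint amounts to, the sketch below shows a toy example represented as a Prolog program and a single constraint defined by a Prolog clause. The predicate names (atom_in/3, bond/3, feature_f1/1) are illustrative only and do not reflect MACCENT's actual input format.

  % A toy example, represented as a Prolog program
  % (illustrative predicates, not MACCENT's input syntax).
  atom_in(mol1, a1, carbon).
  atom_in(mol1, a2, oxygen).
  bond(mol1, a1, a2).

  % A clausal constraint: the clause body either succeeds or fails
  % on a given example, yielding a binary feature value.
  feature_f1(Mol) :-
      atom_in(Mol, A, carbon),
      atom_in(Mol, B, oxygen),
      bond(Mol, A, B).

  % ?- feature_f1(mol1).   succeeds, so the feature is 1 for mol1.

The empirical frequencies of such binary features over the training examples supply the known constraints to which the maximum entropy model is fitted.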

MACCENT can operate in two modes, each based on a different existing maximum-likelihood approach to maximum entropy modeling. Both approaches are upgraded to handle richer first-order logic representations.
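
For reference, both modes fit a model of the standard conditional maximum entropy form, with the features f_i defined by clausal constraints as sketched above (the notation here is generic, not taken from [2]):

  p(c \mid e) = \frac{1}{Z(e)} \exp\Big( \sum_i \lambda_i f_i(e,c) \Big),
  \qquad Z(e) = \sum_{c'} \exp\Big( \sum_i \lambda_i f_i(e,c') \Big)

where e is an example, c a class label, and the weights \lambda_i are chosen by maximum likelihood, subject to the constraint that the model expectation of each f_i matches its empirical expectation.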

In Mode 1, which upgrades [1], constraints are added one at a time. At each step, the constraint is added that yields the highest gain in log-likelihood on the training data.
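
Schematically, if L(p_S) denotes the log-likelihood of the training data under the maximum entropy model fitted with constraint set S, the greedy step in Mode 1 can be written as (the notation is illustrative, not taken from [1] or [2]):

  f^{*} = \arg\max_{f} \big[ L(p_{S \cup \{f\}}) - L(p_S) \big],
  \qquad S \leftarrow S \cup \{f^{*}\}

In [1] this gain is only approximated, by optimizing the weight of the candidate constraint while keeping the other weights fixed, so that the selection step remains tractable.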

In Mode 2, which upgrades [4], all constraints are selected in a preliminary stage using WARMR [3]. In contrast to Mode 1, the model is computed only once, for the full set of selected constraints.
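
Once the constraint set is fixed, the weights can be fitted by an iterative scaling procedure; [4], for instance, uses Generalized Iterative Scaling. Assuming such a fitter (this entry does not say which one MACCENT uses), the update for each weight takes roughly the form:

  \lambda_i \leftarrow \lambda_i + \frac{1}{C} \log \frac{\tilde{E}[f_i]}{E_{p}[f_i]}

where \tilde{E}[f_i] is the empirical expectation of feature f_i, E_p[f_i] its expectation under the current model, and C a constant such that the features sum to C on every example.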

References

  1. A.L. Berger, V.J. Della Pietra, and S.A. Della Pietra. A maximum entropy approach to natural language processing. Computational Linguistics, 22(1):39-71, 1996.

  2. L. Dehaspe. Maximum entropy modeling with clausal constraints. In Proceedings of the 7th International Workshop on Inductive Logic Programming, volume 1297 of Lecture Notes in Artificial Intelligence, pages 109-124. Springer-Verlag, 1997.

  3. L. Dehaspe and L. De Raedt. Mining association rules in multiple relations. In Proceedings of the 7th International Workshop on Inductive Logic Programming, volume 1297 of Lecture Notes in Artificial Intelligence, pages 125-132. Springer-Verlag, 1997.

  4. A. Ratnaparkhi. A maximum entropy part-of-speech tagger. In Proceedings of the Empirical Methods in Natural Language Processing Conference. University of Pennsylvania, 1996.

