Abstract
We explore the notion of generalization in the setting of symbolic mathematical computing. By "generalization" we mean the process of taking several instances of mathematical expressions and producing new expressions that may be specialized to each of the instances. We first identify a number of ways in which generalization may be useful in the setting of computer algebra, and formalize generalization as an antiunification problem. We present a single-pass algorithm for antiunification and give some examples.
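To illustrate the idea of antiunification described in the abstract, here is a minimal sketch of first-order antiunification (least general generalization) in Python. This is not the paper's single-pass algorithm; it is an illustrative recursive version. Terms are assumed to be nested tuples, `("f", arg1, ...)` for function applications and plain atoms for constants; the variable naming scheme `?x0`, `?x1`, ... is hypothetical.

```python
def antiunify(s, t, subst=None, counter=None):
    """Return a generalization g of terms s and t, together with a map
    from mismatched subterm pairs to the variables introduced for them.
    Specializing g via that map recovers s (or t)."""
    if subst is None:
        subst = {}       # (subterm_of_s, subterm_of_t) -> variable name
    if counter is None:
        counter = [0]    # mutable counter for fresh variable names
    if s == t:
        # Identical subterms generalize to themselves.
        return s, subst
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)):
        # Same head symbol and arity: generalize argument-wise.
        args = tuple(antiunify(a, b, subst, counter)[0]
                     for a, b in zip(s[1:], t[1:]))
        return (s[0],) + args, subst
    # Structural mismatch: introduce a variable, reusing the same one
    # for repeated occurrences of the same pair (this is what makes
    # the result a *least* general generalization).
    key = (s, t)
    if key not in subst:
        subst[key] = "?x%d" % counter[0]
        counter[0] += 1
    return subst[key], subst


# Example: plus(sq(a), 1) and plus(sq(b), 1)
# generalize to plus(sq(?x0), 1).
g, sub = antiunify(("plus", ("sq", "a"), 1),
                   ("plus", ("sq", "b"), 1))
print(g)    # ('plus', ('sq', '?x0'), 1)
print(sub)  # {('a', 'b'): '?x0'}
```

Substituting `a` (respectively `b`) for `?x0` in the result recovers each original instance, which is the sense of "specialized to all the instances" used in the abstract.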
| Original language | English |
| --- | --- |
| Title of host publication | Maple Conference |
| Number of pages | 7 |
| Publication date | 2005 |
| Pages | 277-382 |
| Publication status | Published - 2005 |
| Externally published | Yes |