Abstract
We explore the notion of generalization in the setting of symbolic mathematical computing. By "generalization" we mean the process of taking several instances of mathematical expressions and producing new expressions that may be specialized to all the instances. We first identify a number of ways in which generalization may be useful in the setting of computer algebra, and formalize this generalization as an antiunification problem. We present a single-pass algorithm for antiunification and give some examples.
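To make the notion concrete: a standard syntactic antiunification computes a least general generalization of two terms, replacing disagreeing subterms with shared variables. The sketch below is a minimal illustration of that generic idea, not the paper's single-pass algorithm; the term representation (nested tuples with the operator as head) and the variable-naming scheme are assumptions for the example.

```python
# Minimal sketch of syntactic antiunification (least general
# generalization). Terms are nested tuples such as ("+", "x", "1");
# this representation is illustrative, not taken from the paper.
from itertools import count

def antiunify(s, t, subst=None, fresh=None):
    """Return a term of which both s and t are instances."""
    if subst is None:
        subst, fresh = {}, count()
    # Identical subterms generalize to themselves.
    if s == t:
        return s
    # Same head and arity: generalize argument-wise.
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)):
        return (s[0],) + tuple(
            antiunify(a, b, subst, fresh) for a, b in zip(s[1:], t[1:]))
    # Disagreement: introduce a variable, reusing it for repeated
    # disagreement pairs so shared structure stays shared.
    if (s, t) not in subst:
        subst[(s, t)] = "_v%d" % next(fresh)
    return subst[(s, t)]
```

For example, antiunifying `("+", "x", "1")` with `("+", "y", "1")` yields `("+", "_v0", "1")`, a term that specializes to both inputs by substituting for `_v0`.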
Original language | English |
---|---|
Title | Maple Conference |
Number of pages | 7 |
Publication date | 2005 |
Pages | 277-382 |
Status | Published - 2005 |
Published externally | Yes |