
eval'd macros (the other point of view)



    Date: Mon, 3 Feb 1986  20:36 EST
    From: "Scott E. Fahlman" <Fahlman@C.CS.CMU.EDU>

	(defun fn () (mac))            ;==> FN
	(defmacro mac () `'foo)        ;==> MAC
	(fn)                           ;==> ***Error***

I wish we'd required that this signal an error. I like the behavior shown above.

    You don't say whether the problem you described occurs in the
    interpreter or the compiler.  Your example certainly is not required to
    work when compiled because the macro definition follows the use.  In my
    opinion, this should work without an error in the interpreter.  Hacking
    in the interpreter would be fairly awkward if it didn't.  However, the
    fourth paragraph on page 143 seems to give implementors permission to do
    compiler-like macro expansion when a defun is first seen.  I wouldn't
    want to use such an implementation, but it probably is legal as the
    manual currently stands.

Well, people frequently don't like to use something that behaves differently
from what they're used to, but that doesn't make the `new' behavior wrong.
In the interest of fairness, let me outline the alternate point of view -- the
view to which I subscribe...

Having macros expanded when the defun is first seen offers the nice feature
that you can't easily break functions you've already written once you enter a
debugging phase. For example, consider how redefining LET would break the world
in an environment where LET wasn't resolved at definition time. Compare that
to a world where you simply had to define macros before using them. You're
always trading one thing for another.
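
Here is a rough sketch of the kind of breakage I have in mind, using made-up
names (SQUARE and AREA) and an ordinary macro rather than LET itself:

    ;; sketch only -- made-up names, not from the example above
    (defmacro square (x) `(* ,x ,x))        ;==> SQUARE
    (defun area (r) (* pi (square r)))      ;==> AREA
    (area 2)                                ;==> 12.566...
    ;; a buggy redefinition made while debugging:
    (defmacro square (x) `(* ,x ,x ,x))     ;==> SQUARE
    (area 2)                                ;==> ?

With early expansion, AREA's body was rewritten in terms of (* R R) when the
defun was seen, so the buggy SQUARE can't touch it; with lazy expansion, the
next call to AREA picks up the broken definition and silently computes the
wrong answer.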

Also, in your preferred interpreter, allowing displacing macros to expand
lazily means that code which is fully loaded but only partially exercised
may behave inconsistently after a macro is redefined, since only the calls
that have not yet been displaced will see the update. In an early-binding
interpreter, the effect of the redefinition is trivially predictable.
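
To make the displacing case concrete, here is a sketch with made-up names
(LIMIT, F, G), assuming an interpreter that destructively replaces each macro
call with its expansion the first time that call is actually executed:

    ;; sketch only -- made-up names, lazy displacing interpreter assumed
    (defmacro limit () 10)                  ;==> LIMIT
    (defun f () (limit))                    ;==> F
    (defun g () (limit))                    ;==> G
    (f)                                     ;==> 10   F's call is now displaced to 10
    (defmacro limit () 20)                  ;==> LIMIT
    (f)                                     ;==> 10   already displaced, never re-expanded
    (g)                                     ;==> 20   not yet run, so it sees the new LIMIT

F and G were loaded together, yet after the redefinition they disagree about
what LIMIT means.  With early binding both would still say 10, and you would
know exactly why.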

Personally, I would prefer not to work in an interpreter that did lazy 
expansion. The only weakness I see in the spec is that it doesn't guarantee
early expansion.