I've always had a vague, lingering feeling that mathematical notation is at times inconsistent, excessive, duplicated, and so on. Some of it is of course due to me not knowing enough math, or not knowing it well enough, but some of it, I was recently relieved to find out, is a concern shared by people smarter than me too.

Unfortunately, I'm now *more* confused, since the two proposed solutions are (IMO) at odds with each other. Both share the common goal of replacing outdated notation with something *"executable"*, and thus more precise, but they go about it in different ways.

Or so it seems to me, at least. Perhaps someone out there can explain it better to me someday. The two people I came across are (of course?) Ken Iverson and Gerald Sussman, and the two ideas can be experienced by reading *"Notation as a Tool of Thought"* and watching *"The Role of Programming in the Formulation of Ideas"*. Enjoy.
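If you want a tiny taste of what "executable" buys you before diving into either, here's a toy sketch of my own (not from Iverson or Sussman, just an illustration): the blackboard formula for a mean, $\bar{x} = \frac{1}{n}\sum_i x_i$, quietly leaves things unspecified (what is $n$? what happens when the sequence is empty?), while the executable version is forced to pin all of that down.

```python
def mean(xs):
    """Arithmetic mean of a sequence of numbers.

    The executable form must decide what the blackboard formula leaves
    implicit: n is len(xs), and the empty case is an explicit error
    rather than an undefined expression.
    """
    if not xs:
        raise ValueError("mean of an empty sequence is undefined")
    return sum(xs) / len(xs)


print(mean([1, 2, 3, 4]))  # 2.5
```

Nothing deep, but it's the kind of forced precision both men are after, by rather different routes.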