Wednesday, September 30, 2009

Rethinking the fundamental aspects of functions

A Class is a "minimizing" tool, a shorthand for otherwise impossibly repetitive, incomprehensible and error-prone declarations of methods and data.

Functions fall into the same category, as do global data declarations: with them, we create a necessary shorthand. We resolve nagging details within functions, so we can move on, and think larger thoughts. Functions reflect the compression we perform naturally, while we think, and while we express our thoughts.

There is, however, a problem with large, intertwined libraries, frameworks and cultures of shorthand.

One problem is that, in applying existing Libraries and Functions to a new problem, or a new set of goals, you can't find the important features of your new set in a natural way ... natural either to the problem at hand or to your deepest thoughts about it. Instead, you need to think of it in terms of the machine, and the culture of using the machine represented by the Libraries and Functions, Classes and Methods ...

What you want to do, to quote Plato (from the frontispiece of Christopher Alexander's "Notes on the Synthesis of Form"):

" ... the separation of the Idea into parts, by dividing it at the joints, as nature directs, not breaking any limb in half as a bad carver might ..."


I have always felt that, while there are important ideas hidden in these libraries, they have a tendency to "chop up" the problem you're working on in a most unnatural way. Some are a better fit to your problem than others, to be sure. But there is something fundamentally wrong about how we apply this existing work to our own problems.

Mathematicians have understood the difference for centuries: the application of known results to new problems is either done well, and naturally, unfolding beautifully for the reader of the proof, or else it is hacked together: possibly correct, but unreadable. There needs to be a plot, a gradient of applied resolutions bearing down on the problem.

What I want to do with Grogix is make it easy to create this gradient. But to do so, we need to think differently about Functions and Classes.

Functions are shorthand, and collected together make a kind of fantasy world. But they represent the fantasy world of an operating computer ... so there is utility in cooperating with it. Yes, the fantasy world collapses when you hit it with a sledgehammer. But it also contacts a billion people when you use it well.

The DNA->RNA->protein mechanisms of nature could also be called a fantasy logical system ... boil a cell, and the machinery disappears. But there is great power in this machinery. In fact the point of the present work is to find a natural way to use an analogy to morphogenesis as a means of cooperating with computing systems.

Let's look at one particular aspect of functions: functions that call other functions, or higher-order functions. In Object-Oriented Design, these are methods that call other methods or classes. They are pervasive in computing; every programmer uses them constantly.

One of the consequences of higher-order functions is a kind of cascading effect: a "tree of applied functions" that is invoked any time one of them is called.

But what if we want this tree to be explicit? What if I'd like every program I write to "show its work", and also to "minimize all higher-order functions" by eliminating buried functional similarities?

Why would I want to do this? So we could write more reliable, robust, and natural programs. If we can disentangle the trees of higher-order functions, and lay them side-by-side, and order them by principles, we'll have some real understanding and clarity in computing. Instead, I find that higher-order functions simply hide more, and more, and more, until we cannot move naturally towards our goals. It's like using the specific musical knowledge of, say, a bass player, to choreograph a ballet. It's good stuff, but it requires too many unique transformations to get where you want to go.
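To make the idea of "showing its work" concrete, here is a minimal Python sketch of one way to record the tree of applied functions explicitly. The decorator name, the example functions, and the flat (depth, name) log format are all my own invention, not anything from Grogix:

```python
import functools

call_tree = []   # flat log of (depth, function name) pairs
_depth = 0       # current nesting level

def traced(fn):
    """Record every call of fn, with its nesting depth, in call_tree."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        global _depth
        call_tree.append((_depth, fn.__name__))
        _depth += 1
        try:
            return fn(*args, **kwargs)
        finally:
            _depth -= 1
    return wrapper

@traced
def render_page():
    return render_header() + render_body()

@traced
def render_header():
    return "<head/>"

@traced
def render_body():
    return "<body/>"

render_page()
# call_tree is now [(0, 'render_page'), (1, 'render_header'), (1, 'render_body')]
```

The point is only that the cascade of calls, normally invisible, can be laid out as data ... which is the first step toward comparing and disentangling such trees.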

If you look at this first working Grogix example, ListItem.gx, you can see that the morphogenetic sequence for unfolding the program (steps 1 through 19) lays the principles side-by-side ... a complete cross-cutting, holistic application ... but that's easy when there's very little structural repetition, in such a simple web application.

So, let's say that we're creating a web application with multiple features that are organized nearly identically -- a simple example is a single dynamic web page with multiple updatable and extendable content sections.

I've created a structure in Grogix which I call "mux/demux" to deal with repeated distribution across several steps in a sequence.

In one step, one defines the "multiplexer" production:

.step 6: 'Global structure'

.muxP : 'global structure'
.mP : 'main page'
.mP : 'news section'
.mP : 'news item'
.mP : 'link section'
.mP : 'link item'
.mP : 'user content section'
.mP : 'user content item'
.mP : 'group content section'
.mP : 'group content item'
.dnt : 'db declarations'
.dnt : 'handler declarations'
.dnt : 'handler map entries'
.dnt : 'template dictionaries'
.dnt : 'templates'
.muxP end
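One way to read the .muxP block above is as a cross product: every section (.mP line) receives every distributed non-terminal (.dnt line). A minimal Python sketch of that reading, with the lists copied from the production (the interpretation as a flat list of work items is my own, not Grogix's internal representation):

```python
# Sections (.mP) and distributed non-terminals (.dnt) from the mux step.
sections = ['main page', 'news section', 'news item', 'link section',
            'link item', 'user content section', 'user content item',
            'group content section', 'group content item']
patterns = ['db declarations', 'handler declarations', 'handler map entries',
            'template dictionaries', 'templates']

# The multiplexer pairs every pattern with every section; each later
# "demux" step then supplies the concrete text for one pattern.
work_items = [(pattern, section) for pattern in patterns for section in sections]

print(len(work_items))   # 45 pairs: 5 patterns distributed over 9 sections
```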

And then in five other "demultiplexing" steps, one applies the application-wide pattern ('db declarations' etc.) to each of the similar application sections:

.step 9: 'handler map entries'

.demux : 'handler map entries'

.template: '' : 'inside_string', 'routine_name', KEY
<<<
('/inside_string/KEY', routine_name)
>>>

# add (.*)/ for KEY when you need it

.dmP : 'main page'
.t : typical_entry ('','Main','')
.t : ','

.dmP : 'news section'
.t : typical_entry ('news','News','')
.t : ','

.dmP : 'news item'
.t : typical_entry ('news_item','News_Item','')
.t : ','

...
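For readers unfamiliar with the target output: each .dmP production above substitutes one section's values into the template, yielding one handler map entry. A Python sketch of that substitution, where the collapsing of empty path segments (so the main page maps to '/') is my assumption about how Grogix treats empty fields, not documented behavior:

```python
def handler_map_entry(inside_string, routine_name, key=''):
    """Expand the demux template ('/inside_string/KEY', routine_name).

    Assumption: empty fields simply disappear from the path, so the
    main page (all-empty inside_string and KEY) maps to '/'.
    """
    path = '/' + '/'.join(s for s in (inside_string, key) if s)
    return (path, routine_name)

# The three .dmP productions shown above would expand, under that
# assumption, to:
entries = [handler_map_entry('', 'Main'),
           handler_map_entry('news', 'News'),
           handler_map_entry('news_item', 'News_Item')]
# → [('/', 'Main'), ('/news', 'News'), ('/news_item', 'News_Item')]
```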


Since the .dnt's correspond to .steps, which are patterns from a pattern gradient, this effectively lets me program "holistically" ... I can turn a formerly functional application inside out, with a generative grammar, so that the principles are applied explicitly wherever needed. I can add unique concerns to this step, and more than one demux, thereby concentrating the global "what" and "how" together for anyone who might subsequently read the program, wanting to understand the principles applied.

It also maintains a global minimalism. Which is nature's strength: the point of Grogix.

This isn't quite sufficient, though. We've dealt only with multiple applications of functions, not with higher-order functions themselves. We need these too ... you could write a complex program without them, but you still need them to minimize the encoding of action. A good example is this multiple-web-section application: there will be further patterns to apply to all the "mux" sections, long after the first "demux".

And so, we need an arbitrary number of levels of "demuxing".

Essentially, we need to apply the original sets of .muxP production targets to any production with a non-terminal that appears in the first demux section. This maintains the global principle aspect, while allowing the power of higher-order functions and multiple application of functions.
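A toy Python sketch of what multi-level demuxing amounts to: a grammar where a non-terminal's targets may themselves demux into further targets, and expansion distributes over every level. The grammar, its symbol names, and the string representation are all invented for illustration:

```python
# A toy grammar: each non-terminal maps either to a list of sub-targets
# (its own "demux" targets) or to a terminal string.
grammar = {
    'handler map entries': ['main page', 'news section'],
    'news section':        ['news list', 'news item'],   # a second-level demux
    'main page':           'entry(main)',
    'news list':           'entry(news_list)',
    'news item':           'entry(news_item)',
}

def expand(symbol):
    """Recursively demux: distribute expansion over every target,
    however deeply nested, so every level of the tree stays explicit."""
    body = grammar[symbol]
    if isinstance(body, str):        # terminal production
        return [body]
    result = []
    for target in body:              # demux over each target
        result.extend(expand(target))
    return result

print(expand('handler map entries'))
# → ['entry(main)', 'entry(news_list)', 'entry(news_item)']
```

The recursion is what preserves the global-principle aspect: the same distribution rule applies at every depth, instead of being buried inside ad hoc helper functions.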

I'll provide a full example in the next post.
