"Moving front" is from Noordzij; I think he sees
it as a useful term because it refers not only to the
edges but also the body, which (sorry :-) is painted.
Your observation that the main thing it produces
of interest to type designers is the edges and not the
body is significant; and to me that leads straight to
the need to separate the edges, basically obviating
the -formative- relevance of the body. And this is
how the white really becomes liberated.
So the term "moving front" to me is in a way
rigged, and that's what's limiting creativity.
I agree, Hrant. To me the "edges" are a boundary of sorts; they require an opposing force.
The "leading front" forgets about the "tailing behind." Once again it is about the white. You cannot forget about the "behind" that is essentially a "white bumper." :-)
Michael: I do not understand how you can call it the [moving] front when pen manipulation affects both the left and right sides of the form.
Noordzij is asking the question 'What happens in the stroke when the pen is manipulated?', and tries to come up with a systematic terminology, independent of particular tools or techniques, within which to answer that question. I don't think that's 'obfuscation'. I also think that it is an approach that is of very little use or interest to a calligrapher, which is perhaps what you are expressing. Noordzij's analysis is of use and interest to people who need to look at and think about writing, not to writers, and still less to readers. I've found it useful in two areas in which I'm involved: palaeography and type design, the latter involving making letters from thought rather than from movement.
Bottom line, John... are not the forms we bend, fold, staple and mutilate in the name of NEW fonts based on those very forms originated by a pen?
I do not care if it is Helvetica... it is still dependent on a concept that is over 2000 years old... pen/brush influenced!
The "thought" that you speak of cannot be disassociated from those "very old preconceptions" in which movement was a major player.
Actually I think when one is designing the border between black & white the pen can in fact only be an illusion. A powerful illusion, and sometimes even an effective one (for example in display type), but I can't help believing that there's something better out there, especially in terms of readability. And a design like Legato seems to show that.
Michael, I don't think we're disagreeing. It is precisely because I don't think the thought of letter design can be disassociated from the movements of the tool -- although the association can be stretched and bent in various ways, some of which may be fruitful -- that I find Noordzij's analysis of the results of those movements to be useful.
You have the particular facility of being able to perform with your hand what I can only perform with my mind and translate into technology. I have found Noordzij's analysis of the stroke very helpful in understanding what is possible in terms of the results of manipulation: he helped me to see and understand things in writing. I can understand how, to you, as a scribe, his analysis seems unnecessary and probably a complication of things that are easy or self-evident. But I don't think he is presenting these ideas for the benefit of scribes and calligraphers.
[Hrant] Actually I think when one is designing the border between black & white the pen can in fact only be an illusion.
I know what drawing an outline is, and I know how adjusting it in font creation software is done. I also know what marking the boundary of a form — where form and counterform meet — is or amounts to, but I must confess, I don't know what designing the boundary is.
In drawing an outline or marking and adjusting the boundary of a form (“the black”), I think “the pen” can provide a notional reference axis. This is different from setting up the shapes delivered by the pen as the norm. What I've been calling norm-violation (vis-à-vis the pen — systematic, across the character set; or ad hoc, in a glyph-by-glyph manner) I can conceive of as enhancing readability.
To me Legato introduces non-incidental norm-violation of a complex and systematic sort, and I think there is at least a theoretical basis for thinking it enhances readability.
My point has been: why start violating from a flawed standpoint? Because it's easier than starting from scratch? OK, I can accept that, but only if there's an admission that the starting point is flawed. Even Bloemsma started from an imperfect point: Modernism (which he had long been a fan of). But at least he made something truly new.

Ideally, we start from the right place: reading. The problem is it's ill-defined (and probably will stay that way). But not knowing something perfectly is no excuse to look away. That's why I'm a Post-Modernist, just not a hooligan-style 90s one.
I don’t think of what we have as flawed, but as something singular and evolved that has — in terms of reading — reached a benchmark level of performance and efficiency. This doesn't mean performance can’t be optimized, and efficiency enhanced.
OK, but staying on the chirographic continent limits the optimization; you can't get to an (although never the) ideal from there.
I think a lack of inventiveness and of a sense of direction in norm-violation, combined with misapprehensions of what the reading system needs, limits optimization. The sense of what the reading system needs can be intuitive and become instinctive. It can also be evidence-based.
Peter, I don't think the notion of 'norm-violation' is very helpful, at least not without the kind of nuance in the interpretation of the words that requires quite a lot of explication. I think it is possible to approach type design in relationship to written forms in a variety of ways without evoking 'norms' or suggesting that some of these ways represent violations. My own approach is to recognise that -- presuming one is designing for a cultural script, i.e. one that has evolved over time in the literary life of communities with particular histories -- almost all scripts have originated in and/or passed through writing, and this process has established the shapes of the signs as expressed by particular tools and movements. Hence, reference to those tools and movements is a legitimate and sensible aspect of typeface design, and what is important here is not that this anchors types to 'norms' of letter shape or stroke modulation but that it anchors them within the script culture.
One of the things that is interesting to me about Legato is just how comfortably it sits within the Latin script culture, despite doing things that would be considered norm-violations in your analysis. I think this comes from the strength of structure -- the sense of a skeleton even though the edge relationships plainly demonstrate that none can exist -- and also the quality and measure of tension in the "strokes", which although being differently arrived at is of a similar tenor to that found elsewhere in the line of 'humanist sans' derived from more traditional constructions.
I believe Legato seems to exhibit this paradox only because our conscious perception of the existence of "skeletons" is in effect illusory. To me Legato in essence proves — or at least demonstrates — that any expanded-skeleton approach to type design (chirography being the pinnacle of this) is an arbitrary constraint, at least in terms of reading.
I didn't say anything about an expanded skeleton. What I was talking about is the strong sense of structure, and the fact that even though the edges do not follow a traditional or tool-derived relationship, there is no sense of structural wobble in Legato. Contrast this with types that attempt a traditional edge relationship but fail because of lack of experience or skill of the designer (the Typophile critique forums have plenty of examples): they're less a 'norm-violation' than Legato, but also less structurally sound, because what happens in their edge relationships introduces wobble in the perception of the shapes. This is important, and I think it is key to the success of Legato's systematic approach to edge relationships: even though no writing tool or manipulation can create quite those 'strokes', one can impose a hairline skeleton on the Legato letters and see that a) it is a strong structural form and b) the relationships of the edges to that skeleton are systematic and consistent. They're just not tool based.
Now here is an interesting idea: since the relationships of edges to a structural skeleton in Legato do seem to follow a consistent, systematic approach, would it be possible to algorithmically expand that skeleton to produce these kinds of relationships? That seems to me a practical question that could be practically explored. Why would one bother? Well, it would settle the question of how illusory or not the perception of a skeletal structure in Legato is. What it wouldn't do is settle how important that perception — illusory or not — is to the readability of the type. I theorise that it is important because what we recognise in reading — at text sizes in any case — is shapes not edge relationships, and hence edge relationships need to support the perception of shape, not undermine it (which is why wobble is a problem).
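[As an aside, the algorithmic expansion John proposes can be made concrete with a toy sketch (in Python; everything here, from `expand_skeleton` to the particular width functions, is invented for illustration and has nothing to do with Legato's actual construction). The point it demonstrates is the one at issue: once the skeleton is explicit, each edge can follow its own systematic rule, decoupled in a way no physical nib allows.]

```python
import math

def expand_skeleton(points, width_left, width_right):
    """Offset a polyline skeleton into two edge contours.

    width_left / width_right are functions of t in [0, 1] along the
    stroke, so the two edges can follow different, independently
    systematic rules -- unlike a physical pen, whose two edges are
    rigidly coupled by the nib.
    """
    n = len(points)
    left, right = [], []
    for i, (x, y) in enumerate(points):
        # Tangent estimated from neighbouring skeleton points.
        x0, y0 = points[max(i - 1, 0)]
        x1, y1 = points[min(i + 1, n - 1)]
        dx, dy = x1 - x0, y1 - y0
        length = math.hypot(dx, dy) or 1.0
        # Unit normal (perpendicular to the tangent).
        nx, ny = -dy / length, dx / length
        t = i / (n - 1) if n > 1 else 0.0
        left.append((x + nx * width_left(t), y + ny * width_left(t)))
        right.append((x - nx * width_right(t), y - ny * width_right(t)))
    return left, right

# A vertical hairline stem, expanded with deliberately asymmetric,
# non-pen-like edge rules: the left edge swells toward the middle,
# the right edge stays dead straight.
stem = [(0.0, float(y)) for y in range(5)]
left, right = expand_skeleton(
    stem,
    width_left=lambda t: 10 + 4 * math.sin(math.pi * t),
    width_right=lambda t: 10,
)
```

[A real experiment would of course need curves rather than polylines and per-glyph rules fitted to Legato's outlines, but the structure of the test — hairline in, systematic edges out, compare with the published design — would be the same.]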
John, when I use norm in these contexts, I'm not using it in the sense of standard or model (an ‘ought’), but in the sense of conventional pattern (or ‘is’) that has accrued over time. Violation in these contexts is just deviation from that evolved conventional pattern. Any feature manipulation away from a conventional base is norm violation.
My use of the term stems from my reading of Douglas Hofstadter's “Letter Spirit: Esthetic Perception and Creative Play in the Rich Microcosm of the Roman Alphabet” in Fluid Concepts and Creative Analogies: Computer Models of the Fundamental Mechanisms of Thought, and papers he did with Gary McGraw and John Rehling in the aftermath of his critique of Donald Knuth's Visible Language essay “The Concept of a Meta-Font”.
For Hofstadter, “norm violation is the name of the game in creation.”
I once did an analysis of Legato on Typophile that shows how, starting from an expansion-with-contrast-reduction base (the sans serif), a clever series of norm violations is introduced which has the result of producing a humanistic sans, namely one that approaches the conditions of translation.
As I said, a usage that requires quite a lot of explication. :)
Three sentences and a source attribution is a lot?
I do like what you say about script culture, and that what is remarkable about Legato is just how comfortably it sits within the Latin script culture, despite doing things that I would consider strong norm-violations. What I said about my analysis of Legato provides an elaboration of just that.
In the context of this thread it might be interesting to point out that Bloemsma's vantage point in the introduction of his norm violations is the consideration of how the whites sit inside the black and how they co-relate in the context of the word. The outcome is a distinctive contribution to the consolidation of the word, which is his avowed aim.