Alan Kay on “Should web browsers have stuck to being document viewers?” and a discussion of Smalltalk, HyperCard, NeWS, and HyperLook

Don Hopkins
33 min read · Jan 8, 2023



Alan Kay Wrote:

Actually quite the opposite, if “document” means an imitation of old static text media (and later including pictures, and audio and video recordings).

It was being willing to settle for an overly simple text format and formatting scheme — “for convenience” — that started the web media architecture off in entirely the wrong direction (including the too simple reference scheme, cf. Doug Engelbart and Ted Nelson). Circa early 90s, it had the look and feel of an atavistic hack. I expected that Netscape would fix this rather than just try to dominate what was there (I expected a better architecture both for “thinking about media in the age of computing” and also something not like “an app” but more like an operating system to deal with the actual systems requirements, demands, and scalings of the world-wide Internet-in-progress).

It’s both surprisingly and dismayingly difficult to get people — especially computerists — to criticize the web and the web browsers — even more so perhaps today.

This is despite the glaring fact that the interactive media provided by the web and browsers has always been a hefty and qualitative subset of the media on the very personal computers that run the web browsers.

At the time of the WWW’s inception — in the early 90s — I made several recommendations — especially to Apple where I and my research group had been for a number of years — and generally to the field. These were partially based on the scope and scalings that the Internet was starting to expand into.

  1. Apple’s Hypercard was a terrific and highly successful end-user authoring system whose media was scripted, WYSIWYG, and “symmetric” (in the sense that the “reader” could turn around and “author” in the same high-level terms and forms). It should be the start of — and the guide for — the “User Experience” of encountering and dealing with web content.
  2. The underlying system for a browser should not be that of an “app” but of an Operating System whose job would be to protectively and safely run encapsulated systems (i.e. “real objects”) gotten from the web. It should be the way that web content could be open-ended, and not tied to functional subsets in the browser.

I pointed out that — as with the Macintosh itself — these two recommendations — which seem to be somewhat at odds — have to be reconciled. The first recommendation would be the next stage in the excellent Macintosh “guidelines” about its user experience (Chris Espinosa and others have never been praised highly enough for this important work). These guidelines laid out the conventions to be followed for any app of any functionality — they are the parts that must be similar.

The second recommendation was to reinforce the idea that the content to be run within the system had to be as free from the tools of the OS as absolutely possible (because special needs often require special designs etc). An example was that the content needed to be able to generate its own graphics if necessary (even if the OS supplied some graphics tools). The more the content wanted to go its own way, the more its presentation to the users had to be made to conform to the standards in (1). As with any decent OS, it has to allow for new ideas while also providing the resources for safety, efficiency, and manifesting user experiences.

If we squint at some of the implications of both of these, we can find a number of good principles from the past. One of them — as a real principle — I trace to the first Unix systems at Bell Labs. The design was partly a reaction against the extremely complex organization of the Multics OS at MIT. One of the great realizations of the early Unix was that the *kernel* of an OS — and essentially the only part that should be in “supervisor mode” — would only manage time (quanta for interleaved computations) and space (memory allocation and levels) and encapsulation (processes) — everything else should be expressible in the general vanilla processes of the system. More functionality could be supplied by the resources that came along with the OS, but these should easily be replaceable by developer processes when desired.

The original idea was to instigate as much progress as possible without incurring lock-in to a huge OS, but to protect what needed to be protected and ensure a threshold of system integrity and reliability.

Sidebar: perhaps the best early structuring and next stage design of Unix was Locus by Gerry Popek and his researchers at UCLA in the early 80s. Locus allowed live Unix processes to migrate not just from one machine to another on a network, but to a variety of machine types. This was done by combining the safety required for interrupts with multiple code hooks in each process, so an “interrupt” could allow the process to be moved to a different machine and resumed with different (equivalent) code. It was easy to see that combining this with an end-user language would provide a network-wide system that would run compatibly over the entire Internet. Soon after arriving at Apple ca 1984, I tried to get them to buy Locus, but the “powers that be” at the time couldn’t see it.

Note that when such a system is made interactive — e.g. using the sweeping ideas from the ARPA/Parc research community — the end-users need to have a user interface framework that is generically similar as much as possible over all applications — and that this can conflict with the freedoms needed for new ideas and often new functionalities.

So this is an important, large, and difficult design problem.

My complaints about the web and the web browsers have been about how poorly they were thought about and implemented, and how weak are both the functionalities of web content and the means for going forward and fixing as many of the most critical mistakes as possible.

One way to look at where things are today is that the circumstances of the Internet forced the web browsers to be more and more like operating systems, but without the design and the look-aheads that are needed.

  1. There is now a huge range of conventions both internally and externally, and some of them require and do use a dynamic language. However, neither the architecture of this nor the form of the language, or the forms of how one gets to the language, etc. are remotely organized for the end-users. The thresholds are ridiculous when compared to both the needs and the possibilities.
  2. There is now something like a terribly designed OS that is the organizer and provider of “features” for the non-encapsulated web content. This is a disaster of lock-in, and with actually very little bang for the buck.

This was all done after — sometimes considerably after — much better conceptions of what the web experience and powers should be like. It looks like “a hack that grew”, in part because most users and developers were happy with what it did do, and had no idea of what else it *should do* (and especially the larger destinies of computer media on world-wide networks).

To try to answer the question, let me use “Licklider’s Vision” from the early 60s: “the destiny of computing is to become interactive intellectual amplifiers for all humanity pervasively networked worldwide”.

This doesn’t work if you only try to imitate old media, and especially the difficult to compose and edit properties of old media. You have to include *all media* that computers can give rise to, and you have to do it in a form that allows both “reading” and “writing” and the “equivalent of literature” for all users.

Examples of how to do some of this existed before the web and the web browser, so what has happened is that a critically weak subset has managed to dominate the imaginations of most people — including computer people — to the point that what is possible and what is needed has for all intents and purposes disappeared.

Footnote about “Ever expanding requirements at Parc” (prompted by Phillip Remaker’s comment and question)

When Gary Starkweather invented and got the first laser printer going very quickly, and at astounding speeds (a page per second, 500 pixels per inch), there was a push to get one of these on the networked Altos (for which the Ethernet had been invented). The idea was to use an Alto as a server that could set up and run a laser printer to rapidly print high quality documents.

Several of the best graphics people at Parc created an excellent “printing standard” for how a document was to be sent to the printer. This data structure was parsed at the printer side and followed to set up printing.

But just a few weeks after this, more document requirements surfaced and with them additional printing requirements.

This led to a “sad realization” that sending a data structure to a server is a terrible idea if the degrees of freedom needed on the sending side are large.

And eventually, this led to a “happy realization”, that sending a program to a server is a very good idea if the degrees of freedom needed on the sending side are large.

John Warnock and Martin Newell were experimenting with a simple flexible language that could express arbitrary resolution-independent images — called “JAM” (for “John And Martin”) — and it was realized that sending JAM programs — i.e. “real objects” — to the printer was a much better idea than sending a data structure.

This is because a universal interpreter can both be quite small and also can have more degrees of freedom than any data structure (that is not a program). The program has to be run in a protected address space in the printer computer, but it can be granted access to a bit-buffer, and whatever it does to it can then be printed out “blindly”.

This provides a much better match up between a desktop publishing system (which will want to print on any of the printers available, and shouldn’t have to know about their resolutions and other properties), and a printer (which shouldn’t have to know anything about the app that made the document).

“JAM” eventually became Postscript (but that’s another story).

Key Point: “sending a program, not a data structure” is a very big idea (and also scales really well if some thought is put into just how the program is set up).
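To make the key point concrete, here is a minimal PostScript sketch of the idea (PostScript stands in for JAM here; this is an illustration, not Parc’s actual code): the “document” sent to the printer is a program, which the printer’s interpreter renders at whatever resolution the device happens to have.

    % Illustrative sketch, not JAM: the document is a program the printer runs,
    % so the sender never needs to know the device's pixels per inch.

    /inch { 72 mul } def                    % device-independent units (points)

    newpath
    4.25 inch 5.5 inch 2 inch 0 360 arc     % a circle centered on a Letter page
    0.05 inch setlinewidth
    stroke

    /Helvetica findfont 0.5 inch scalefont setfont
    3 inch 5.4 inch moveto
    (sent as a program) show

    showpage

The same program prints on a 300 dpi or a 2400 dpi engine without change, which is exactly the degree of freedom a fixed data structure cannot offer.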

Phillip Remaker replied:

This is fantastic. Did you have an opinion on Sun’s Network Extensible Window System? It seemed like that was the winning idea, but licensing fees killed it. X was much worse, but it was free.

Economics seems to rule the day — things that are cheap/free get more mindshare and traction than anything you have to buy, even when the ultimate costs end up much higher.

Alan Kay Replied:

I liked NEWS as far as it went. I don’t know why it was so cobbled together — Sun could have done a lot more. For example, the scalable pixel-independent Postscript imaging model, geometry and rendering was a good thing to try to use (it had been used in the Andrew system by Gosling at CMU) and Sun had the resources to optimize both HW and SW for this.

But Postscript was not well set up to be a general programming language, especially for making windows oriented frameworks or OS extensions. And Sun was very intertwined with both “university Unix” and C — so not enough was done to make the high-level part of NEWS either high-level enough or comprehensive enough.

A really good thing they should have tried is to make a Smalltalk from the “Blue Book” and use the Postscript imaging model as a next step for Bitblt.

Also, Hypercard was very much in evidence for a goodly portion of the NEWS era — somehow Sun missed its significance.

Etc.

Don Hopkins Replied:

Hi Alan! Outside of Sun, at the Turing Institute in Glasgow, Arthur van Hoff developed a NeWS based reimagination of HyperCard in PostScript, first called GoodNeWS, then HyperNeWS, and finally HyperLook. It used PostScript for code, graphics, and data (the axis of eval).
Like HyperCard, when a user clicked on a button, the Click message could delegate from the button, to the card, to the background, then to the stack. Any of them could have a script that handled the Click message, or it could bubble up the chain. But HyperLook extended that chain over the network by then delegating to the NeWS client, sending Postscript data over a socket, so you could use HyperLook stacks as front-ends for networked applications and games, like SimCity, a cellular automata machine simulator, a Lisp or Prolog interpreter, etc.
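Roughly, that delegation chain can be sketched in plain PostScript like this (a simplified illustration of the idea, not the actual HyperLook or NeWS toolkit code): each object is a dictionary with an optional handler and a /parent link, and sending a message walks up the chain until someone handles it.

    % Simplified sketch of HyperCard/HyperLook-style delegation (illustrative
    % only): objects are dicts; /send walks the /parent chain.

    /send {                          % object /message  send  -
      2 copy known
      { get exec }                   % the object handles the message itself
      {                              % otherwise delegate to its parent
        exch dup /parent known
        { /parent get exch send }
        { pop pop }                  % end of chain: message goes unhandled
        ifelse
      } ifelse
    } def

    /Stack  2 dict dup begin /Click { (the stack handled Click\n) print } def end def
    /Card   2 dict dup begin /parent Stack def end def
    /Button 2 dict dup begin /parent Card def end def

    Button /Click send               % delegates Button -> Card -> Stack

HyperLook’s extra step was that the end of the chain could forward the message over the socket to the NeWS client process, so the same delegation carried on across the network.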

Alan Kay Replied:

Hi Don

Thanks for this info (I wasn’t aware of it). I wish he had done a real end-user language for this: it would have been quite something to behold I think!

Don Hopkins Replied:

If you prefer C-syntax-like languages over PostScript, Arthur van Hoff also wrote “PdB”, an “object oriented” C to PostScript compiler. You could subclass PostScript classes from C and subclass C classes from PostScript! Conceptually it was a lot like the Objective-C and TypeScript source-to-source compilers, and we used it to write some of the HyperLook widgets, and other people like Leigh Klotz used it at Xerox too. Here’s a paper Arthur wrote about it:

Syntactic Extensions to PdB to Support TNT Classing Mechanisms

Leigh Klotz has written more PostScript than Jamie too, while working at Xerox!

Leigh Klotz’s comment on the regex article:

>OK, I think I’ve written more PostScript by hand than Jamie, so I assume he thinks I’m not reading this. Back in the old days, I designed a system that used incredible amounts of PostScript. One thing that made it easier for us was a C-like syntax to PS compiler, done by a fellow at the Turing Institute. We licensed it and used it heavily, and I extended it a bit to be able to handle uneven stack-armed IF, and added varieties of inheritance. The project was called PdB and eventually it folded, and the author left and went to First Person Software, where he wrote a very similar language syntax for something called Oak, and it compiled to bytecodes instead of PostScript. Oak got renamed Java.

Here’s the PdB object oriented C to PostScript compiler and lots of other NeWS stuff:

Open Look Base Window (.h file, .pdb file, and resulting .ps file):

Owen Densmore at Sun was very much inspired by Smalltalk’s object oriented programming system, and wrote “class.ps” which was the basis for NeWS’s object oriented user interface toolkit. Here’s his original 1986 Usenix Monterey Graphics Conference paper about it, “Object Oriented Programming in NeWS”:

Object Oriented Programming in NeWS, by Owen Densmore

PostScript’s dynamic binding over the dictionary stack made it very easy to implement prototypical objects as PostScript dictionaries, including multiple inheritance. You can dynamically promote methods and variables to instances (instance methods and dynamic properties are essential for implementing HyperCard-like and Smalltalk-like prototype based systems).
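As a concrete taste of that (an illustrative sketch, not Owen Densmore’s actual class.ps), a “class” dictionary and an “instance” dictionary can both sit on the dictionary stack, so name lookup finds the instance first and falls back to the class, and defining a name in the instance “promotes” it without touching the class:

    % Illustrative only -- not class.ps. Dynamic binding through the dictionary
    % stack: lookups search the instance dict first, then the class dict.

    /Label 4 dict def
    Label begin
      /text (default) def
      /draw { text show } def
    end

    /myLabel 4 dict def              % an "instance" with no definitions yet

    /Helvetica findfont 12 scalefont setfont

    Label begin myLabel begin        % instance on top of class on the dict stack
      /text (hello) def              % promoted to the instance; Label unchanged
      100 100 moveto
      draw                           % /draw found in Label, /text in myLabel
    end end

    showpage

Pushing more than one “class” dictionary under the instance gives a simple form of multiple inheritance, since lookup just searches the dictionary stack from the top down.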

Kragen is right that PostScript is a lot more like Lisp or Smalltalk than Forth, especially when you use Owen Densmore’s object oriented PostScript programming system (which NeWS was based on). PostScript is semantically very different from and much higher level than Forth, and syntactically similar to Forth but uses totally different names (exch instead of swap, pop instead of drop, etc).
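A one-line taste of the “same shape, different names” point (Forth shown as a comment for comparison):

    % Forth:  2 3 swap drop .        ( swap and drop; prints 3 )
    2 3 exch pop =                   % PostScript: exch ~ swap, pop ~ drop; prints 3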

Owen Densmore recounted John Warnock’s idea that PostScript was actually a “linguistic motherboard”. (This was part of a discussion with Owen about NeFS, which was a proposal for the next version of NFS to run a PostScript interpreter in the kernel. More about that here:)

https://donhopkins.com/home/archive/NeWS/linguistic-motherboard-owen.txt

Window System? ..NeWS ain’ no stinkin’ Window System!
-or-
Swiss Army NeWS: A Programmable Network Facility

Introduction

NeWS is difficult to understand simply because it is *not* just a window system. It is a “Swiss Army Knife” containing several components, some of which contribute to its use as a window system, others which provide the networking facilities for implementing the client-server model, all embedded in a programmable substrate allowing extremely flexible and creative combination of these elements.

During the initial implementation phase of the Macintosh LaserWriter software, I temporarily transferred from Apple to Adobe, working closely with John Warnock and other Adobe engineers. At lunch one day, I asked: “John, what do you plan to do after LaserWriter?” His answer was interesting:

>PostScript is a linguistic “mother board”, which has “slots” for several “cards”. The first card we (Adobe) built was a graphics card. We’re considering other cards. In particular, we’ve thought about other network services, such as a file server card.

He went on to say how a programmable network was really his goal, and that the printing work was just the first component. His mentioning using PostScript for a file server is particularly interesting: Sun’s next version of NFS is going to use PostScript with file extensions as the client-server protocol!

This paper explores NeWS in this light: as a Programmable Network Facility, a major part of Sun’s future networking strategy.

Alan Kay Replied:

Hi Don

This work is so good — for any time — and especially for its time — that I don’t want to sully it with any criticisms in the same reply that contains this praise.

I will confess to not knowing about most of this work until your comments here — and this lack of knowledge was a minus in a number of ways wrt some of the work that we did at Viewpoints since ca 2000.

(Separate reply) My only real regret about this terrific work is that your group missed the significance for personal computing of the design of Hypertalk in Hypercard.

It’s not even that Hypertalk is the very best possible way to solve the problems and goals it took on — hard to say one way or another — but I think it is the best example ever actually done and given to millions of end users. And by quite a distance.

Dan Winkler and Bill Atkinson violated a lot of important principles of “good programming language design”, but they achieved the first overall system in which end-users “could see their own faces”, and could do many projects, and learn as they went.

For many reasons, a second pass at the end-user programming problem — that takes advantage of what was learned from Hypercard and Hypertalk — has never been done (AFAIK). The Etoys system in Squeak Smalltalk in the early 2000s was very successful, but the design was purposely limited to 8–11 year olds (in part because of constraints from working at Disney).

It’s interesting to contemplate that the follow on system might not have a close resemblance to Hypertalk — perhaps only a vague one ….

Don Hopkins Replied:

I came into PostScript from a FORTH background, so it was natural for me, but PostScript is much more like Lisp and Smalltalk in its higher level semantics. Programming NeWS in PostScript was kind of like programming in bytecode (as a lot of bytecode VMs are stack machines). But the PostScript dictionary stack gave you Self-like total control over the dynamic binding environment that enabled you to “roll your own” object systems (which Owen Densmore did), and even do weird things like stacking up nested sub-objects (that was how the PSIBER Space Deck pretty printing layout worked, so views could inherit graphical properties like size, scale factor, etc, from their parents). I think what you’re suggesting is that Sun could have used the lower level PostScript language as a low level “bytecode” to implement a higher level user friendly scripting language, which is a wonderful idea!
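Here is roughly what I mean by stacking up nested sub-objects (a toy sketch, not the actual PSIBER code): each view pushes its dictionary onto the dictionary stack while it lays itself out, so children see the parent’s graphical properties unless they define their own.

    % Toy sketch of dict-stack "inheritance" of graphical properties; the real
    % PSIBER Space Deck layout code is more involved.

    /outerView 2 dict dup begin /scalefactor 2.0 def end def
    /innerView 2 dict def            % defines no /scalefactor of its own

    outerView begin
      innerView begin
        scalefactor ==               % prints 2.0, inherited from outerView
        /scalefactor 0.5 def         % a child can override locally...
        scalefactor ==               % ...and now prints 0.5
      end
    end
    outerView /scalefactor get ==    % still 2.0: the parent is untouched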

I think Dave Winer’s work is a fruitful approach. It started with the MORE outliner on the Mac, which he developed into UserLand Frontier’s “UserTalk” scripting language, whose syntax and data structure format is an actual outliner (notably quite well designed and easy to use — getting that right counts for A LOT).

He actually released Frontier before Apple released AppleScript, and it fully supported Apple’s Open Scripting Architecture. Wikipedia: “At the time of its original release, Frontier was the only system-level scripting environment for the Macintosh, but Apple was working on its own scripting language, AppleScript.” I’ve written more about its fascinating history on Hacker News, and linked to some of his demos:

I wrote some stuff in previous HN discussions about outliners and spreadsheets, and also some stuff about Dave Winer’s Frontier, which I’ll quote and link to here:

Userland Software and Frontier:

Alan Kay Replied:

Hi Don

The Frontier site (pointed at in the Wikipedia article) seems to have been taken over by Indonesians. Any other links to suggest?

The biggest problem with AppleScript — maybe also with Frontier — is that the raw APIs and “messaged values” of the apps were poorly organized and ad hoc. There needed to be mappings into a better designed semantics to make most of these useable by an end-user (or even most pro-users).

Don Hopkins Replied:

Tom Stambaugh described how Smalltalk inspired Owen Densmore’s object oriented PostScript system in NeWS, there’s more about the “another story” you mention about JAM/Interpress/PostScript, and I linked to a great story by Brian Reid about PostScript history, all in a Hacker News posting here:

I’ll manually transclude the HN posting here:

Tom Stambaugh described how Smalltalk inspired Owen Densmore’s PostScript object oriented system in NeWS.

A point he didn’t mention is that PostScript is a direct descendant of Interpress, which was developed at Xerox PARC and reincarnated as PostScript at Adobe by Chuck Geschke and John Warnock:

Brian Reid’s deep detailed historic dive “PostScript and Interpress: a comparison”:

I also think PostScript owes a lot to Lisp (it’s dynamic, homoiconic, polymorphic, symbolic), even more so than Forth.

C2 Wiki: Forth PostScript Relationship

ForthLanguage and PostScript appear superficially similar, since both languages use a mostly postfix-based syntax (see PostfixNotation). Since Forth predates PostScript by quite a few years, PostScript is often assumed to be descended from Forth.

Tom Stambaugh wrote:

It seems to me that Forth is to stacks what LispLanguage is to lists. Forth demonstrated the advantages of a stack-centric paradigm in which each pushed or popped item could be evaluated as an expression or a primitive. Postscript reflects the application of that paradigm to the world of typography, 2-d graphics, and page layout. My own recollection is that Postscript’s primary contribution was the use of splines to describe character glyphs, allowing them to be effectively rendered at virtually any resolution desired. If anything, Postscript owes more to TexLanguage and DonaldKnuth than to Forth. I view the stack-based language paradigm as a convenient afterthought rather than a central organizing principle.

I also think we should note the contribution that OwenDensmore, at Sun, made in demonstrating how to use Postscript dictionaries to create a dynamically-bound object-oriented runtime environment. This was the fundamental premise of the Sun window server that ultimately became the NetworkExtensibleWindowSystem. Owen and I discussed his “crazy” idea at a poolside table at the now-demolished Hyatt Palo Alto, on El Camino. I told him that it made sense to me, we scribbled furiously on napkins, and I helped him see how he might adopt some learnings from Smalltalk. It was one of those afternoons that could only have happened at that time in that place in that culture. — TomStambaugh

I’ve extracted Owen Densmore’s paper from the news.tape.tar (marked PD), “Object Oriented programming in NeWS”, and uploaded it:

Object Oriented Programming in NeWS, by Owen Densmore

It would require some modification to run in other postscript environments, but not much, I think. It was developed after the 1st Edition Postscript manual but before the second, so it’s considered a Level 1.5 Postscript environment. It uses dictionaries freely, but the << /Level-2 (syntax) >> had not yet been invented. So it uses a number of nonstandard operators for dictionary construction. These would need to be simulated or the routines rewritten to use Level 2 syntax. — luserdroog
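For readers who haven’t seen the difference luserdroog is pointing at, here is a small illustration of the two dictionary-construction styles (assuming a Level 2 interpreter for the second form):

    % Level 1 style: allocate a dictionary and fill it with explicit puts.
    /colors1 2 dict def
    colors1 /red   (ff0000) put
    colors1 /green (00ff00) put

    % Level 2 style: the same dictionary written with << >> literal syntax,
    % which did not exist yet when NeWS and class.ps were written.
    /colors2 << /red (ff0000) /green (00ff00) >> def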

comp.lang.forth discussion on “Why is Postscript not Forth?”:

Why is Postscript not Forth? by Don Hopkins

Forth/PostScript discussion with Mitch Bradley:

Forth/PostScript between Don Hopkins and Mitch Bradley

Here’s more about James Gosling’s NeWS window system:

FWIW, here’s a visual PostScript programming and debugging interface I made for NeWS, called “PSIBER”, which was inspired by a lot of great ideas from Smalltalk and ThingLab, among others, and is kind of like Self’s Morphic visual object editing interface:

The Shape of PSIBER Space: PostScript Interactive Bug Eradication Routines — October 1989

Pseudo Scientific Visualization of the NeWS root menu object.

Abstract: The PSIBER Space Deck is an interactive visual user interface to a graphical programming environment, the NeWS window system. It lets you display, manipulate, and navigate the data structures, programs, and processes living in the virtual memory space of NeWS. It is useful as a debugging tool, and as a hands on way to learn about programming in PostScript and NeWS.

The PSIBER Space Deck is a programming tool that lets you graphically display, manipulate, and navigate the many PostScript data structures, programs, and processes living in the virtual memory space of NeWS.

The Network extensible Window System (NeWS) is a multitasking object oriented PostScript programming environment. NeWS programs and data structures make up the window system kernel, the user interface toolkit, and even entire applications.

The PSIBER Space Deck is one such application, written entirely in PostScript, the result of an experiment in using a graphical programming environment to construct an interactive visual user interface to itself.

It displays views of structured data objects in overlapping windows that can be moved around on the screen, and manipulated with the mouse: you can copy and paste data structures from place to place, execute them, edit them, open up compound objects to see their internal structure, adjust the scale to shrink or magnify parts of the display, and pop up menus of other useful commands. Deep or complex data structures can be more easily grasped by applying various views to them. […]

Data and Objects on the NeWS PostScript Stack
NeWS PostScript Object and Class Browsers
Pseudo Scientific Visualization of a Map of Adventure
Pseudo Scientific Visualization of a Map of the ARPANET

Mark Miller Replied:

Your article mentions that city employees were being partly trained using SimCity. It makes me curious enough to ask, “In what way?” In my mind, SC modeled the behaviors in a city in only a shallow aspect. So, I could see it would give people some basic ideas in systems thinking, so that they could begin to think about the particulars of the city they were participating in governing (which would be different from the game). How did cities find this game useful? Did you understand specific benefits they were getting out of it?

I remember once watching a TV show where, just as a lark, they had a child (maybe 12 years old) compete against a real life mayor, playing SimCity. The show “came back later” to see how each was doing (since the game took a while to play, before seeing meaningful results). The child was doing a lot better than the mayor! :) He had a sustaining city that had well-ordered layout, good financials, and a good economy, while the mayor’s city looked like a mess all the way around, and was not doing well. It made me wonder if perhaps the mayor was not familiar with the game, and so didn’t understand how to implement his thoughts well in it, or if perhaps the game was less like a real city than I might have imagined, and so its dynamics worked very differently from his experience, or maybe he just didn’t understand municipal governance well at all (which is possible).

Don Hopkins Replied:

I totally agree with you. Perhaps it was not clear from the formatting, but the quote “The game is currently being used by many government offices to train their city planners.” was from an email from a Sun marketing representative (the wife of the owner of the small Unix game company who licensed the rights for SimCity on Unix, actually), that was circulated internally at Sun to gauge interest in SimCity. Chaim Gingold excerpted that email, which I provided to him, in his dissertation on “Play Design”, where he wrote about open sourcing SimCity (I excerpted that part here, including a link to his dissertation, which is interesting reading):

Open Sourcing SimCity, by Chaim Gingold

Chaim wrote that:

The potential for a serious application, such as GIS, is overblown — it is hard to imagine SimCity enabling Sun to offer a “true ‘desktop’ GIS solution.” The email, however, speaks to the appeal of SimCity as a compelling representation of a city. Not only was it serious enough for GIS, but it was fun, qualities which enabled it to become absorbed into the current of Sun’s marketing agenda.

I agree that the idea of using SimCity to educate people about city planning is a silly idea, and that the idea of a game like SimCity being used as a “GIS solution” is overblown. But I do believe that SimCity can be used educationally to teach many other important things other than city planning itself.

Alan Kay has wisely criticized SimCity for hiding the internal simulation away in a “black box” that players can’t see and change. I share that burning desire to see inside, understand how it works, and change it: ever since playing it the first time, I always wanted to see the source code, understand how it ticked, and implement pie menus for its user interface. That was what motivated me to respond to the overhyped email looking for developers at Sun who were interested in porting SimCity to Sun workstations.

The first time I saw SimCity, I immediately started “Reverse Over-Engineering” it in my mind, playing around with it, testing its boundaries and limitations, trying to figure out how it worked, and what its models were, and how they interacted.

Chaim Gingold’s “SimCity Reverse Diagrams” beautifully illustrate all that stuff about how SimCity actually works, but those didn’t exist yet. Although they’re fascinating to look at now, seeing those diagrams before actually playing the game would have spoiled the “Simulator Effect”.

Chaim Gingold’s SimCity Reverse Diagrams

Will Wright defined the “Simulator Effect” as how game players imagine a simulation is vastly more detailed, deep, rich, and complex than it actually is: a magical misunderstanding that you (as a game designer) shouldn’t talk them out of.

He describes the Simulator Effect and other concepts in his master class on game design:

https://www.masterclass.com/classes/will-wright-teaches-game-design-and-theory/chapters/game-mechanics

Will designs games to run on two computers at once: the electronic one on the player’s desk, running his shallow tame simulation, and the biological one in the player’s head, running their deep wild imagination.

“Reverse Over-Engineering” is a desirable outcome of the Simulator Effect: what game players (and game developers trying to clone the game) do when they use their imagination to extrapolate how a game works, and totally overestimate how much work and modeling the simulator is actually doing, because they filled in the gaps with their imagination and preconceptions and assumptions, instead of realizing how many simplifications and shortcuts and illusions it actually used.

There’s a name for what Wright calls “the simulator effect”: Apophenia.

Apophenia (/æpoʊˈfiːniə/) is the tendency to mistakenly perceive connections and meaning between unrelated things. The term (German: Apophänie) was coined by psychiatrist Klaus Conrad in his 1958 publication on the beginning stages of schizophrenia. He defined it as “unmotivated seeing of connections [accompanied by] a specific feeling of abnormal meaningfulness”. He described the early stages of delusional thought as self-referential, over-interpretations of actual sensory perceptions, as opposed to hallucinations.


A better approach to using SimCity educationally is through “Constructionist Education” or “Constructionism”, as defined by Seymour Papert, and a passionate interest of Alan Kay and others on the OLPC project:

We were able to talk EA/Maxis into relicensing the original source code of SimCity for free under GPL-3, for educational use by the OLPC project and anyone else. I’ve cleaned up the code and redeveloped it under SimCity’s original title, “Micropolis”. Here is a talk I gave at HAR 2009 about “Micropolis: Constructionist Educational Open Source SimCity”.

The OLPC is About Constructionist Education

Seymour Papert with a Logo Turtle

>This OLPC project is based on Seymour Papert’s ideas about teaching children to program, even when they’re very young, and Alan Kay’s ideas about the Dynabook and object oriented programming, and how kids learn.

>One of the important things about constructionism is kids helping each other, communicating, and learning from each other.

A cartoon from Alan Kay’s original research paper on the DynaBook

>I adapted the old X11/Unix version of SimCity to the OLPC XO-1 Children’s Laptop. The original X11 version supported multi-player mode, but I disabled that feature to simplify the game, because it didn’t scale well, was complex to configure, and not well designed. OLPC games for kids need to be very easy to use.

>A demonstration of OLPC SimCity running on the One Laptop Per Child XO-1 laptop.

Here’s a summary of Will Wright’s 1996 talk about “Designing User Interfaces to Simulation Games”, which he gave to Terry Winograd’s user interface class at Stanford:

Some educators have asked Maxis to make SimCity expose more about the actual simulation itself, instead of hiding its inner workings from the user. They want to see how it works and what it depends on, so it is less of a game, and more educational. But what’s really going on inside is not as realistic as they would want to believe: because of its nature as a game, and the constraint that it must run on low end home computers, it tries to fool people into thinking it’s doing more than it really is, by taking advantage of the knowledge and expectations people already have about how a city is supposed to work. Implication is more efficient than simulation.

>People naturally attribute cause and effect relationships to events in SimCity that Will as the programmer knows are not actually related. Perhaps it is more educational for SimCity players to integrate what they already know to fill in the gaps, than letting them in on the secret of how simple and discrete it really is. As an educational game, SimCity stimulates students to learn more about the real world, without revealing the internals of its artificial simulation. The implementation details of SimCity are quite interesting for a programmer or game designer to study, but not your average high school social studies class.

>Educators who want to expose the internals of SimCity to students may not realize how brittle and shallow it really is. I don’t mean that as criticism of Will, SimCity, or the educators who are seeking open, realistic, general purpose simulators for use in teaching. SimCity does what it was designed to and much more, but it’s not that. Their goals are noble, but the software’s not there yet. Once kids master SimCity, they could learn Logo, or some high level visual programming language like KidSim, and write their own simulations and games!

>Other people wanted to use SimCity for the less noble goal of teaching people what to think, instead of just teaching them to think.

>Everyone notices the obvious built-in political bias, whatever that is. But everyone sees it from a different perspective, so nobody agrees what its real political agenda actually is. I don’t think it’s all that important, since SimCity’s political agenda pales in comparison to the political agenda in the eye of the beholder.

>Some muckety-muck architecture magazine was interviewing Will Wright about SimCity, and they asked him a question something like “which ontological urban paradigm most influenced your design of the simulator, the Exo-Hamiltonian Pattern Language Movement, or the Intra-Urban Deconstructionist Sub-Culture Hypothesis?” He replied, “I just kind of optimized for game play.”

Then there was the oil company who wanted “Sim Refinery”, so you could use it to lay out oil tanker ports and petroleum storage and piping systems, because they thought that it would give their employees useful experience in toxic waste disaster management, in the same way SimCity gives kids useful experience in being the mayor of a city.

They didn’t realize that the real lessons of SimCity are much more subtle than teaching people how to be good mayors. But the oil company hoped they could use it to teach any other lessons on their agenda just by plugging in a new set of graphics, a few rules, and a bunch of disasters.

[…more at the links…]

Mark Miller Replied:

Thanks for your involved response. :)

The point about the Simulator Effect got me thinking of what Nicholas Meyer said about filmmaking: that the filmmaker, rather than filling in all the detail, should use the audience’s intelligence to fill it in.

I definitely get the sense from what Wright said that he was saying, “Look, I was just writing a game, and trying to make it fun,” nothing more than that. Don’t make it into anything more than that.

From watching people play it, it seemed like there was at least the effect of feedback within the game, that depending on how you set your parameters, you would see some effects that “fed” on them. The response was not purely random. I saw that in your video, “Jack up taxes, and watch everyone leave. Lower taxes, and see the economy boom.” :)

I remember reading part of a column in Compute! Magazine when I was a teenager that I think was on “the future of gaming.” It talked about how, rather than the game setting up the playfield and the player having to learn the rules within that playfield and how to win within it, the player would play an active part in constructing the playfield. The example the author used was a game concept with a railroad. The game would contain different track shapes, and part of the game would be the player laying these shapes down, so as to create a track that a locomotive could run on. The train would run on whatever the player set up.

I’d seen something kind of like this with Bill Budge’s Pinball Construction Set, but the concept the author was talking about didn’t make sense to me until I saw people playing SimCity.

Don Hopkins Replied via Email to Alan Kay, David Rosenthal, and James Gosling (the authors of NeWS):

Hi, Alan, and David and James!

David and James: here’s a great thread on Quora about browsers with Alan Kay’s answer and a discussion we had about NeWS (you have to log in to see the discussion).

Alan: here’s an old posting to slashdot in which David explained some of the history of NeWS, including how it was influenced by Smalltalk and other systems, and he mentioned the book Methodology of Window Management, one of my favorite books that I HIGHLY recommend.

When I was learning about window systems and researching pie menus, I ran across it in the UMD engineering library and could not believe that there was such a deep and specialized book about just what I was researching, including an article about SunDew! I checked it out, then lost it for months, and had to pay the fine to replace it (it was one of those outrageously expensive silver Springer-Verlag books), so I kept it and still have and cherish it! But now it’s all online for free! So many extremely interesting articles, especially Warren Teitelman’s history, and the classic SunDew article by Gosling and Rosenthal that was about the earliest version of NeWS.

Methodology of Window Management

4. Ten Years of Window Systems — A Retrospective View
Warren Teitelman

5. SunDew — A Distributed and Extensible Window System
James Gosling

  • Don

Alan Kay Replied:

Hi Don and all

Windows didn’t start with Smalltalk. The first *real* windowing system I know of was ca 1962, in Ivan Sutherland’s Sketchpad (as with so many other firsts). The logical “paper” was about 1/3 mile on a side and the system clipped, zoomed, and panned in real time. Almost the same year — and using much of the same code — “Sketchpad III” had 4 windows showing front, side, top, and 3D view of the object being made. These two systems set up the way of thinking about windows in the ARPA research community. One of the big goals from the start was to include the ability to do multiple views of the same objects, and to edit them from any view, etc.

When Ivan went ca 1967 to Harvard to start on the first VR system, he and Bob Sproull wrote a paper about the general uses of windows for most things, including 3D. This paper included Danny Cohen’s “mid-point algorithm” for fast clipping of vectors. The scheme in the paper had much of what later was called “Models-Views-and-Controllers” in my group at Parc. A view in the Sutherland-Sproull scheme had two ends (like a telescope). One end looked at the virtual world, and the other end was mapped to the screen. It is fun to note that the rectangle on the screen was called a “viewport” and the other end in the virtual world was called “the window”. (This got changed at Parc, via some confusions demoing to Xerox execs).
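(To make the two-ended view concrete, here is an illustrative PostScript fragment, not Sketchpad’s code: mapping a “window” rectangle in a huge virtual world onto a “viewport” rectangle on the page is a translate-and-scale, and everything drawn in world coordinates then lands in the viewport.)

    % Illustrative window-to-viewport mapping with made-up numbers: a
    % 5000x5000-unit window somewhere in a vast virtual "paper", mapped onto
    % a 4x4-inch viewport on the page.

    /wx 100000 def  /wy 250000 def  /ww 5000 def  /wh 5000 def   % window (world)
    /vx 72 def      /vy 72 def      /vw 288 def   /vh 288 def    % viewport (page)

    vx vy translate                  % origin to the viewport corner
    vw ww div vh wh div scale        % world units -> viewport units
    wx neg wy neg translate          % window corner to the origin

    % The window's own outline, drawn in world coordinates, exactly fills
    % the viewport.
    newpath wx wy moveto ww 0 rlineto 0 wh rlineto ww neg 0 rlineto
    closepath 0 setlinewidth stroke
    showpage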

In 1967, Ed Cheadle and I were doing “The Flex Machine”, a desktop personal computer that also had multiple windows (and Cheadle independently developed the mid-point algorithm for this) — our viewing scheme was a bit simpler.

The first few paragraphs of Teitelman’s “history” are quite wrong (however, he was a good guy, and never got the recognition he deserved for the PILOT system he did at MIT with many of these ideas winding up in Interlisp).

Cheers and best wishes

Alan

David Rosenthal Replied:

Alan, thank you for these important details. I’d like to write a blog post correcting my view of this history — may I quote your e-mail?

Is this paper, “A Clipping Divider”:

The one you refer to?

David.

Alan Kay Replied:

Hi David

Thanks very much! Your blog is a real addition to the history and context needed to really understand and criticize and improve today.

I would like to encourage you to expand it a bit more (even though you do give quite a few references).

I had very high hopes for Sun. After Parc, I wanted something better than Smalltalk, and thought Sun had a good chance to do the “next great thing” in all of these directions. And I think a number of real advances were made despite the “low-pass filters” and exigencies of business.

So please do write some more.

Cheers and best wishes to all

Alan

Don Hopkins Replied:

Yeah, it was very sad that Sun ended up in Larry Ellison’s grubby hands. And I sure liked the Sun logo designed by Vaughan Pratt and tilted 45 degrees by John Gage (almost as great as Scott Kim’s design of the SGI logo), which he just sent out to the garbage dump. (At least Facebook kept the Sun logo on the back of their sign as a warning to their developers.)

I truly believe that in some other alternate dimension, there is a Flying Logo Heaven where the souls of dead flying logos go, where they dramatically promenade and swoop and spin around each other in pomp and pageantry to bombastic theme music, reliving their glory days on the trade show floors and promotional videos.

It would make a great screen saver, at least!

  • Don

David Rosenthal Posted on his Blog:

History of Window Systems

Alan Kay’s Should web browsers have stuck to being document viewers? makes important points about the architecture of the infrastructure for user interfaces, but also sparked comments and an email exchange that clarified the early history of window systems. This is something I’ve written about previously, so below the fold I go into considerable detail.


Written by Don Hopkins

User interface flower child. Pie menus, PizzaTool, SimCity, The Sims, Visual Programming, VR, AR, Unity3D / JavaScript bridge.
