
Architectures of Control: How Design Influences the Ways We Use and Do Things

Just stumbled across a REALLY interesting blog called The Architectures of Control. It's maintained by Dan Lockton who says:

Increasingly, many products are being designed with features that intentionally restrict the way the user can behave, or enforce certain modes of behaviour. The same intentions are also evident in the design of many systems and environments.

Dan goes on to define architectures of control this way:

Architectures of control are features, structures or methods of operation designed into physical products, software, buildings, city layouts—or indeed any planned system with which a user interacts—which are intended to enforce, reinforce, or restrict certain modes of user behaviour.

While the use of architectures of control in computing is well-known, and a current issue of much debate (in terms of digital rights management, ‘trusted’ computing and network infrastructures themselves), it is apparent that technology—and a mindset that favours controlling users—is also offering increased opportunities for such architectures to be designed into a wide range of consumer products; yet, this trend has not been commonly recognised.

This, of course, isn't really new--control has often been inherent in the design of products and systems. We have only to look at the Windows operating system and the Catholic Church to see this principle at work.

What is new, I suspect (although I may be wrong), is the widespread use of intentional design to control and restrict user behavior. That is, it seems that as we become more aware of how the design of things influences how we interact with them, designers are choosing more deliberately to use features that control user behavior.

This raises a few questions and thoughts for me:

  • I'm not sure how I feel about the idea of "architectures of control." I instinctively recoil from the idea of "controlling" anything, yet I also recognize that it's inevitable--whatever we create invariably shapes how we interact with it, the customs that develop around the object or system, etc. It's unavoidable. And one person's "restriction" is another person's "liberation."

Take a look, for example, at Dan's post on (Anti)public Seating, which includes a number of examples of how seating in parks, train stations and other public areas actually discourages people from sitting in or using those areas. Apparently one set of benches has been specifically designed so that homeless people won't sleep on them, with the effect that no one wants to sit on them at all. I personally see this as a restriction that serves no one well, but others obviously see it as a perfectly legitimate use of design principles for the public good.

  • What is the larger societal impact of a design approach that seeks to control and restrict? This, to me, ties back into the scarcity mindset I explored a few months ago. Scarcity is about control and restrictions. It would seem, then, that we have a mutually reinforcing dynamic going on where the more we see scarcity, the more we seek to control through design and the more we control through design, the smaller and more restricted the world becomes. Is using design to restrict behavior the symptom or the disease? Or both? And how does this really influence all the ways in which we see and interact with the world?
  • In looking at technology, the interplay between the open source movement, the growth of user-centric Web 2.0 tools, etc. makes things more confusing and difficult to discern. The hoopla around Facebook as a platform seems a perfect example of how the design of a thing can appear open yet actually control, as this Read/Write Web post shows. Where else does this go on? Does it matter?
  • Despite what we know about design, it's still very often unintentional. Because how we interact with a product or system often happens below any real level of consciousness, how is it shaping our behavior in ways we don't really understand? In my system development work with organizations, we often run up against the fact that the design of their customer systems is actually reinforcing the very behaviors staff want to change. So, for example, in a system that's supposed to help people independently prepare for and find employment, most of the service functions actually discourage people from taking any independent action. One of the challenges is getting people to recognize how design really does influence behavior. It's something that happens below the radar, so they don't really notice.

I see that this has become one of those rambling posts with no real conclusion--just some questions and observations. One immediate impact for me of reading this stuff is a greater awareness of the design of things and how they influence what I do. It also reminds me about the need to assume nothing and to question everything, including how objects and systems may encourage certain behaviors that I consider to be negative, but have not noticed.

I'd be curious to hear other people's responses to Dan's blog and how you see this playing out in your corner of the world. Is this something we should be paying attention to or is it just an academic exercise?



There was an article I read many years ago, but don't see cited very often, about designing learning environments. This is Fulton's SPATIAL model (Satisfaction, Participation, Achievement, Transcendent Attributes, Immanent Attributes, Authority, and Layout of the Physical Environment). Speaking of the last component, to which you refer in your post, Physical Environment, Fulton remarks, "For example, an environment can be authoritarian or institutionalized in nature, affording learners little power for change..."


Anyway, more stuff to ramble on about ;-)

I just finished reading Douglas Rushkoff's novel _Exit Strategy_, which explores some of these "architectures of control" issues. In it, an intelligent internet interface is created that learns from each individual user's online behavior and adapts in such a way that the person feels rewarded for clicking the "buy" button. The book is clever and funny, and it takes some of the ideas you're talking about to an extreme (although in a believable way - this future is imaginable, if scary), in order to raise important issues about control and greed.

I'll never look at a website the same way again.
