Designing Culture | Jacobin

Design plays a central role in cultural reproduction. This isn’t necessarily a good thing, for anyone.

Want to hear a really pretentious definition of design? Probably not, but I have to listen to this stuff almost constantly and misery loves company, so here it is: “Giving form to culture.”

I hear people actually say those words from time to time, and it never puts me in a particularly good mood. My main beef with that definition is that after a year in a postgraduate design program and too many hours spent between stacks of anthropology textbooks, I still can’t figure out what “form” and “culture” even mean.

My other beef is that the above definition is delusional. It seems to be gesturing toward the all-too-common notion that designers have some kind of sociocultural superpower: by shaping the physical objects that mediate and regulate people’s behaviors and interactions, they are shaping society itself! It’s a classic credit-hogging move on the part of the design world’s plentiful narcissists, who would like you to believe that material culture emerges fully formed from the depths of their magical sketchbooks.

The reality is that most designers work under some pretty heavy constraints: There’s a client or employer who gives them a mandate and makes the final call on what will actually be manufactured, printed or constructed. There are precedents set by existing designs that simultaneously inspire and circumscribe the designer’s work and limit the range of possibilities that clients and users will find acceptable. Finally, designed objects, spaces and images are frequently reinterpreted and repurposed by people who have no idea what the designer had in mind. In short, design is subject to the same limitations as any other so-called creative practice, and designers are no more authors than, well, authors are.

But despite the limited influence that designers themselves are able to exert over culture at large, design as a practice plays a central role in cultural reproduction.

Industrial design in particular has been especially important in the creation and maintenance of class divisions. Here’s a second, much different definition of industrial design specifically: it’s the profession of creating instructions for factory workers. Design is one of the linchpins of capitalism, because it makes alienated labor possible.

Starting in the mid-eighteenth century, some factory owners realized that they could increase the efficiency of their operations by allowing customers to order their wares from catalogs and samples rather than selling them directly off the shelf in stores. But first they had to solve an unprecedented problem: customers buying from a catalog would expect their goods to look just like the picture, or else they’d return the goods and probably start buying from a competitor. This meant that factory output would have to be made almost perfectly uniform, which had never been done before.

Originally, factory craftsmen had a fair amount of creative license over what they produced, which meant that individual products in the same style could vary quite a bit. Now that freedom had to be taken away. Complex, varied jobs originally performed by a single craftsman were chunked into simpler, more easily standardized units. Each of these subtasks was then assigned to a different artisan, with the goal of eliminating any creative decision making on the part of the people actually making the wares.

The most famous documented example of this process occurred in the factory of the pottery tycoon Josiah Wedgwood, described in Adrian Forty’s design history classic Objects of Desire. Forty quotes Wedgwood boasting that he would “make such Machines of the Men as cannot Err.” But having stripped his men’s work down to the most inane, repetitive tasks possible, Wedgwood needed to pay someone else to do the creative work of preparing the original models that the rest of the artisans would then bore themselves stiff trying to replicate.

Who would be good for such a job? The ideal candidate would be good with their hands and broke enough to need employment, but still conversant in the tastes of the upper classes, whose purchases supplied most of the factory’s revenue. What Wedgwood needed, obviously, was an artist. So he hired one, and the field of industrial design was born.

As manufacturing shifted away from handicrafts and became increasingly mechanized, design as a distinct form of labor, and designs themselves as a form of intellectual property, became more and more important to sustaining relations of production.

The historical lesson here is that the idea that designing something should be done independently from making it – in other words, the idea that design should even exist as a profession in its own right – has been foundational both to the formation of the modern working class and to capitalist production period. This is not to hate on designers, who don’t get much say in the matter either.

All of that, though, is only what goes on in the factory and the studio. Designed objects don’t exert their full influence over cultural reproduction until they get out into the world of our homes, offices, and schools.

Most criticism of industrial design’s impact on everyday life amounts to a lamentation of consumerism. I think that sort of misses the point, but let’s run with it for a moment. Design is often decried as a tool for creating false needs through unnecessary product differentiation, promoting a pandemic obsession with individuality and newness. As the popular argument goes, design enforces and reproduces existing social hierarchies by making the lower class waste their money on goods they otherwise wouldn’t want. This traps them in poverty by preventing them from accumulating capital, and also creates a feeling of inferiority to the higher classes, who are able to afford the material signifiers of status that poorer people are tricked into craving.

My attitude toward that line of reasoning could be characterized as seasick agreement. There’s a lot of truth in there somewhere, but such a facile explanation leaves me feeling queasy. Yes, everyone buys too much shit and poor people get exploited in the process, but forty-two years after Baudrillard’s Consumer Society we know it’s not that simple. The ideas of waste and need are monumentally more complicated than a lot of leftists are willing to admit. Who can I trust to tell me which of my needs are real? How can I know whether I’m wasting money or investing in symbolic capital?

In any case, when it comes to design’s influence on social structures, the focus on consumerism distracts from something more significant and interesting. Design’s real power is that it makes relationships and divisions between people concrete. Without physical stuff to remind us of how we supposedly differ from one another, our hierarchies would be awfully ramshackle; stripped of our possessions, categories like “class” start to look like just a bunch of learned behaviors and confused ideas. Whether prohibitively priced cars, gendered garments, or separate schools for blacks and whites, social hierarchies are always maintained with the help of physical objects and spaces designed to reflect those hierarchies. Otherwise everyone’s claims of superiority and difference would be quite literally immaterial.

This is why women’s rights groups were so pissed off when LEGO released its dumbed-down “LadyFigs” line targeted at young girls. By simplifying a common toy for girls to use, LEGO was not only insulting girls by implying that they are technically inept, uninterested in challenges and generally stupider than boys; more importantly, the company was also proliferating objects that obviously embodied some blatantly discriminatory ideas about differences between the sexes. The point would not be lost on a five-year-old, who would realize immediately that compared to her brother’s LEGOs, hers look like they were made for an idiot.

This is a big deal because one of the main ways that people are socialized is through using, observing and contemplating material objects. The idea that people learn their places in society by engaging with the physical stuff around them has a long history in anthropology, but it was finally cemented into the theoretical mainstream in 1972 when Pierre Bourdieu published his Outline of a Theory of Practice. Bourdieu makes the case that we come to internalize the expectations of our particular social group by analogy with categories, orders and relations of things. Spatial arrangements of objects in the home, for example, or the use of different farming tools at different times of year, come to stand for intangible relationships between genders, social strata and the like, thereby anchoring abstract ideas about social organization to the physical world.

Regardless of whether you buy what Bourdieu has to say about it, it’s interesting to note that people often really do act like objects and spaces are actual concrete instantiations of their relationships with other groups of people. A particularly good example of this sort of behavior comes again from Forty, who details the measures taken by Victorian elites to maintain a sense of superiority to their servants.

In nineteenth-century England, domestic servitude was one of the few lines of work in which employees still lived with their employers, a practice that had been common on farms and in workshops a century earlier. Servants, whose social peers in other professions had more of a life outside of work, were growing frustrated with what they saw as an anachronistic form of labor that offered little in the way of personal independence. Upper-class households read their servants’ disgruntlement as a crisis of disobedience, and they reacted by systematically degrading servants’ living standards, just to make sure everyone knew who was who.

In addition to creating a bunch of new rules for servants’ conduct (stuff like, don’t hand the master anything unless it’s on a silver tray), wealthy families began to build homes with separate living quarters and work areas for servants, which were decidedly shabbier than the rest of the house. Homewares companies started designing extra-low-quality furniture and crockery and marketing them to the rich as items for their servants to use, the idea being that anyone who ate and slept on stuff that bad couldn’t help but know their place.

Of course the servants knew what was going on. Forty cites the autobiography of one housemaid who complains about her “lumpy mattress, specifically manufactured for the use of maids, I suspect.” But it wasn’t particularly important whether the servants were savvy to the situation or not, because their employers had fulfilled their real goal: they’d successfully created material environments that reassured them that they were better than the people who worked for them, which enabled them to keep acting like they actually were better.

Once you realize that all designed objects carry this sort of encrypted information about the organization of society, something amazing happens: you suddenly stop feeling bored in home furnishings stores. Washing machines and cooking implements have a lot to say about norms surrounding domestic labor; office trash cans embody the values of a middle class that can’t deal with its own waste; alarm systems and porch lights offer a crash course in the popular phenomenology of crime. But these objects are not just passive representations of ideas about how society should run. They actively promote those ideas, validating certain prejudices and chastising us when our behavior deviates from certain norms.

Maybe the problem with designers who boast that they are “giving form to culture” is that they don’t realize how big a responsibility they’re claiming. The chicken-and-egg relationship between systems of stuff and systems of people is very real, and with the world as it is, anyone who could legitimately claim control over either would have to be a pretty unthinkable asshole. Rather than glorifying themselves as cultural architects, perhaps designers should be relieved that they are such a small part of the apparatus that actually gives rise to the stuff all around us.

That’s not to say that designers are powerless. Far from it. They occupy a nodal position in the capitalist mode of production, and they’ll be important for getting out of it. Stuff – objects, spaces, images, technologies – plays just as critical a role in restructuring relations between people as it does in maintaining them, and a solar cooker or a free software application requires way more design work than a Philippe Starck lemon squeezer. But any kind of progressive work is difficult if we’re deluded about what we actually do. As designers, we’d do well to abandon preoccupations with our own ability to generate solutions, and start being more aware of the ways that we participate in the problems.

Very interesting polemic on the value and relevance of design.

10 reasons the U.S. is no longer the land of the free - The Washington Post


By Jonathan Turley, Published: January 13

Every year, the State Department issues reports on individual rights in other countries, monitoring the passage of restrictive laws and regulations around the world. Iran, for example, has been criticized for denying fair public trials and limiting privacy, while Russia has been taken to task for undermining due process. Other countries have been condemned for the use of secret evidence and torture.

Even as we pass judgment on countries we consider unfree, Americans remain confident that any definition of a free nation must include their own — the land of the free. Yet, the laws and practices of the land should shake that confidence. In the decade since Sept. 11, 2001, this country has comprehensively reduced civil liberties in the name of an expanded security state. The most recent example of this was the National Defense Authorization Act, signed Dec. 31, which allows for the indefinite detention of citizens. At what point does the reduction of individual rights in our country change how we define ourselves?

While each new national security power Washington has embraced was controversial when enacted, they are often discussed in isolation. But they don’t operate in isolation. They form a mosaic of powers under which our country could be considered, at least in part, authoritarian. Americans often proclaim our nation as a symbol of freedom to the world while dismissing nations such as Cuba and China as categorically unfree. Yet, objectively, we may be only half right. Those countries do lack basic individual rights such as due process, placing them outside any reasonable definition of “free,” but the United States now has much more in common with such regimes than anyone may like to admit.

These countries also have constitutions that purport to guarantee freedoms and rights. But their governments have broad discretion in denying those rights and few real avenues for challenges by citizens — precisely the problem with the new laws in this country.

The list of powers acquired by the U.S. government since 9/11 puts us in rather troubling company.

Assassination of U.S. citizens

President Obama has claimed, as President George W. Bush did before him, the right to order the killing of any citizen considered a terrorist or an abettor of terrorism. Last year, he approved the killing of U.S. citizen Anwar al-Awlaqi and another citizen under this claimed inherent authority. Last month, administration officials affirmed that power, stating that the president can order the assassination of any citizen whom he considers allied with terrorists. (Nations such as Nigeria, Iran and Syria have been routinely criticized for extrajudicial killings of enemies of the state.)

Indefinite detention

Under the law signed last month, terrorism suspects are to be held by the military; the president also has the authority to indefinitely detain citizens accused of terrorism. While the administration claims that this provision only codified existing law, experts widely contest this view, and the administration has opposed efforts to challenge such authority in federal courts. The government continues to claim the right to strip citizens of legal protections based on its sole discretion. (China recently codified a more limited detention law for its citizens, while countries such as Cambodia have been singled out by the United States for “prolonged detention.”)

Arbitrary justice

The president now decides whether a person will receive a trial in the federal courts or in a military tribunal, a system that has been ridiculed around the world for lacking basic due process protections. Bush claimed this authority in 2001, and Obama has continued the practice. (Egypt and China have been denounced for maintaining separate military justice systems for selected defendants, including civilians.)

Warrantless searches

The president may now order warrantless surveillance, including a new capability to force companies and organizations to turn over information on citizens’ finances, communications and associations. Bush acquired this sweeping power under the Patriot Act in 2001, and in 2011, Obama extended the power, including searches of everything from business documents to library records. The government can use “national security letters” to demand, without probable cause, that organizations turn over information on citizens — and order them not to reveal the disclosure to the affected party. (Saudi Arabia and Pakistan operate under laws that allow the government to engage in widespread discretionary surveillance.)

Secret evidence

The government now routinely uses secret evidence to detain individuals and employs secret evidence in federal and military courts. It also forces the dismissal of cases against the United States by simply filing declarations that the cases would make the government reveal classified information that would harm national security — a claim made in a variety of privacy lawsuits and largely accepted by federal judges without question. Even legal opinions, cited as the basis for the government’s actions under the Bush and Obama administrations, have been classified. This allows the government to claim secret legal arguments to support secret proceedings using secret evidence. In addition, some cases never make it to court at all. The federal courts routinely deny constitutional challenges to policies and programs under a narrow definition of standing to bring a case.

War crimes

The world clamored for prosecutions of those responsible for waterboarding terrorism suspects during the Bush administration, but the Obama administration said in 2009 that it would not allow CIA employees to be investigated or prosecuted for such actions. This gutted not just treaty obligations but the Nuremberg principles of international law. When courts in countries such as Spain moved to investigate Bush officials for war crimes, the Obama administration reportedly urged foreign officials not to allow such cases to proceed, despite the fact that the United States has long claimed the same authority with regard to alleged war criminals in other countries. (Various nations have resisted investigations of officials accused of war crimes and torture. Some, such as Serbia and Chile, eventually relented to comply with international law; countries that have denied independent investigations include Iran, Syria and China.)

Secret court

The government has increased its use of the secret Foreign Intelligence Surveillance Court, which has expanded its secret warrants to include individuals deemed to be aiding or abetting hostile foreign governments or organizations. In 2011, Obama renewed these powers, including allowing secret searches of individuals who are not part of an identifiable terrorist group. The administration has asserted the right to ignore congressional limits on such surveillance. (Pakistan places national security surveillance under the unchecked powers of the military or intelligence services.)

Immunity from judicial review

Like the Bush administration, the Obama administration has successfully pushed for immunity for companies that assist in warrantless surveillance of citizens, blocking the ability of citizens to challenge the violation of privacy. (Similarly, China has maintained sweeping immunity claims both inside and outside the country and routinely blocks lawsuits against private companies.)

Continual monitoring of citizens

The Obama administration has successfully defended its claim that it can use GPS devices to monitor every move of targeted citizens without securing any court order or review. (Saudi Arabia has installed massive public surveillance systems, while Cuba is notorious for active monitoring of selected citizens.)

Extraordinary renditions

The government now has the ability to transfer both citizens and noncitizens to another country under a system known as extraordinary rendition, which has been denounced as using other countries, such as Syria, Saudi Arabia, Egypt and Pakistan, to torture suspects. The Obama administration says it is not continuing the abuses of this practice under Bush, but it insists on the unfettered right to order such transfers — including the possible transfer of U.S. citizens.

These new laws have come with an infusion of money into an expanded security system on the state and federal levels, including more public surveillance cameras, tens of thousands of security personnel and a massive expansion of a terrorist-chasing bureaucracy.

Some politicians shrug and say these increased powers are merely a response to the times we live in. Thus, Sen. Lindsey Graham (R-S.C.) could declare in an interview last spring without objection that “free speech is a great idea, but we’re in a war.” Of course, terrorism will never “surrender” and end this particular “war.”

Other politicians rationalize that, while such powers may exist, it really comes down to how they are used. This is a common response by liberals who cannot bring themselves to denounce Obama as they did Bush. Sen. Carl Levin (D-Mich.), for instance, has insisted that Congress is not making any decision on indefinite detention: “That is a decision which we leave where it belongs — in the executive branch.”

And in a signing statement with the defense authorization bill, Obama said he does not intend to use the latest power to indefinitely imprison citizens. Yet, he still accepted the power as a sort of regretful autocrat.

An authoritarian nation is defined not just by the use of authoritarian powers, but by the ability to use them. If a president can take away your freedom or your life on his own authority, all rights become little more than a discretionary grant subject to executive will.

The framers lived under autocratic rule and understood this danger better than we do. James Madison famously warned that we needed a system that did not depend on the good intentions or motivations of our rulers: “If men were angels, no government would be necessary.”

Benjamin Franklin was more direct. In 1787, a Mrs. Powel confronted Franklin after the signing of the Constitution and asked, “Well, Doctor, what have we got — a republic or a monarchy?” His response was a bit chilling: “A republic, Madam, if you can keep it.”

Since 9/11, we have created the very government the framers feared: a government with sweeping and largely unchecked powers resting on the hope that they will be used wisely.

The indefinite-detention provision in the defense authorization bill seemed to many civil libertarians like a betrayal by Obama. While the president had promised to veto the law over that provision, Levin, a sponsor of the bill, disclosed on the Senate floor that it was in fact the White House that approved the removal of any exception for citizens from indefinite detention.

Dishonesty from politicians is nothing new for Americans. The real question is whether we are lying to ourselves when we call this country the land of the free.

Jonathan Turley is the Shapiro professor of public interest law at George Washington University.


Puts the extradition of UK citizens to the US into perspective, as well as the US criticism of other states for human rights abuses...

Possessive Apostrophe: Why You Shouldn't Care

So a while ago there was some furor when it became public that the city of Birmingham, England was planning to drop apostrophes from street signs. The general response from the Anglophone public seemed to be that this was a clear sign of the apocalypse and the apostrophes must be defended at all costs.

Really, people? Really?

Language Log has posted before about the illogicality of word rage, and apostrophe obsession is one of the reasons I refuse to read Eats, Shoots and Leaves no matter how many people recommend it to me. If you (in general, not Lynne Truss specifically) are going to suggest, even humorously, that people who misuse apostrophes should be mutilated or murdered, you obviously need to get a different hobby.

Because I'm going to tell you a little story.

Old English--the language that the Angles and the Saxons and the Jutes spoke when they showed up in Great Britain a few centuries back--had cases. If you have studied languages like German, Russian, Latin, Greek, or Sanskrit, you know all about cases and can go hide under a table and cry for a bit. If you don't know about cases, well, they're just changes that you make to nouns and adjectives--new endings, usually--based on what those words are doing in a sentence. A subject gets one case, an object gets another, and so on. (They're why modern English makes a distinction between I and me.)

Cases have two functions: one, they make it clear what words are doing in a sentence, even if you've gone and put them all out of order for artistic purposes, and two, they make the people trying to learn your language hide under tables and cry. (Because they don't know what case they're in, or how they got there, or how to get out of it again, and just when they think they've got it sorted they discover it's a special case with an irregular stem. Not that I know anything about this, Russian secondary locative.) I suspect that Old English cases were used as a form of psychological warfare against the Welsh, possibly in retaliation for the initial mutation.

Now, specifically, Old English had a genitive case. This case is the ancestor of Our English's possessive form: it let Old Englishmen (and -women) tack one noun onto another to indicate a relationship between the two. The ending for the Old English genitive was -es, pronounced, rather logically, as "ess." It gave us words like:

"king" = cyng --> cynges /kyNes/, where /N/ is the "ng" sound and /y/ is basically German ü
"cat" = catt --> cattes /kates/
"fish" = fisc --> fisces /fiSes/, where /S/ is "sh"

So you could say things like "cynges catt" for "The king's cat," "cattes fisc" for "the cat's fish (which she is eating at present)," even "fisces cyng" in the event you discovered that fish had royalty. Straightforward, right?

Well, somewhere on its way to becoming Middle English, Old English cases started to fade away. The vowels turned to schwas and then the schwas dropped off, like sixth fingers after you've tied a string around the bottom for a while. (Yes, that's a gross analogy, but I'm standing by it.) The consonants in the endings mostly fell off, but not all of them; /s/ is in fact a very persistent little bugger that hung on, though it sometimes changed to a /z/ in order to blend in better with the new neighbors. So those words up top became the more familiar:

king --> kings /kiNz/
cat --> cats /kats/
fish --> fishes /fiSez/

Wait a minute! you cry. Why didn't the vowel disappear from "fishes"? Well, because "fishs" is stupid. No, really; /s/ belongs to a family of consonants called "coronal fricatives," and if you try to pronounce two of those in a row, you will naturally insert a little schwa vowel in between to distinguish them. (The coronal fricatives are /s z S ch/ and /Z/, which is like the "z" in "azure" and kind of marginal in English.) If the two consonants had just merged together, or if the /s/ had come off, there would've been no genitive distinction whatsoever for those words, and that would've been odd and problematic, the kind of break in pattern that speakers tend to intuitively fix via analogy (kind of how kids try to say I brang it on analogy with I sang it). So for words that ended with coronal fricatives, the vowel in the genitive ending hung on, to keep the two consonants on either side from encroaching on each other.
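The pattern described above is regular enough to write down as a function. Here's a rough sketch in Python, using spelling as a crude stand-in for pronunciation (the sibilant check and the spelling-based shortcut are my own simplification, not anything from the post): after a final coronal fricative (or affricate), insert the vowel and attach -es; otherwise attach bare -s.

```python
def ends_in_sibilant(word: str) -> bool:
    """Crude spelling-based check for a final coronal fricative/affricate
    (/s z S Z ch/)."""
    return word.endswith(("s", "z", "x", "sh", "ch"))

def add_s_ending(word: str) -> str:
    """Attach the -s ending (plural or genitive), keeping the epenthetic
    vowel after sibilants so the two coronal fricatives stay distinct."""
    if ends_in_sibilant(word):
        return word + "es"   # fish -> fishes, /fiSez/
    return word + "s"        # king -> kings /kiNz/, cat -> cats /kats/

for w in ["king", "cat", "fish"]:
    print(w, "->", add_s_ending(w))
```

Run on the three example words, this reproduces kings, cats, fishes; spelling is an unreliable proxy for sound, of course, so a real implementation would work from phonemic transcriptions instead.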

Or maybe your cry was: Wait a minute! Are those possessive forms or plural forms?

To which I say: Yes.

See, at this point there wasn't really any distinction between plural -s and possessive -s. Well, they were distinct in use, of course, but there was no need to disambiguate them in writing. Partly with was because most Middle Englishmen were illiterate and didn't care, and partly because there wasn't a standard spelling system anyway, and partly because...well, if they're not distinct in speech and this is somehow not a problem, why do you need to disambiguate them in writing? Seriously, this is not a problem 99% of the time; I have had more trouble making the distinction between "wight" and "white" in speech than between "students" and "student's." (Though I grant I am probably unusual in how often I talk about wights.) When was the last time you mistook a possessive for a plural or vice-versa? As opposed to any other kind of homophone? Go on, think about it for a minute.

But this does raise the question of where the killer apostrophes that everyone's so excited about came from, if they're not necessary. (And hint: they're not.) To understand, you have to remember that all the Old Englishmen were dead at this point, and the Middle Englishmen had only their own speech to work with. And what they had was the paradigm represented like this:

/kiNz/
/kats/
/fiSez/

At this point, the genitive/possessive was the only case ending left in English. And somehow, a large number of Middle Englishmen decided it wasn't an ending at all. They did a re-analysis of their language, and their conclusions looked something like this:

/kiNz/ = king his
/kats/ = cat his
/fiSez/ = fish his

That's right. They took the ending and decided it was, instead, a very oddly placed pronoun. And they wrote it this way. There are plenty of attested texts with phrases like "the king his justice" and "Moses his mercy" and even "the wife his child" even though that doesn't really make sense from a gender point of view. But if the -/ez/ in /fiSez/ is a form of the pronoun "his," then the -/z/ in /kiNz/ and the -/s/ in /kats/ must really be contractions of that pronoun.

And how do we denote contractions?

king's
cat's
fish his

And in the interests of consistency, that last eventually became "fish's" (our old friend analogy again), giving us the plural/possessive apostrophe rules we all know and freak out about.

(Where did the rule about plural possessives come from? The one that puts the apostrophe on the outside? Not sure; probably an attempt by eighteenth-century Latinistas to make English more "logical," for their own ideas of logic. Bishop Lowth, I'm looking at you.)

So the possessive apostrophe was born out of a goofy-but-pervasive orthographical glitch. And its use was not at all standardized even into relatively recent times--Language Log cites a letter by Thomas Jefferson that doesn't use it once, and Tom was a pretty clever guy in his day. I'm not sure exactly when the apostrophe was raised to the status of shibboleth for the educated classes but it is definitely not a benchmark for the progression of the apocalypse. In fact, the "misuses" that some people love to spazz about--"CD's for sale," "I like taco's," "I Tivo'd it," etc.--reflect another ongoing re-analysis of how to use the apostrophe: as the marker of a morpheme boundary. (And the fact that this "misuse" is still perfectly legible, if unusual, ought to tell us something.)

That's why I don't have any problem with omitting possessive apostrophes in English. If it represented some kind of useful information or distinction--like the contraction apostrophe--that would be one thing. (Though other languages seem to contract just fine without apostrophes.) But it's just an orthographic anomaly, somewhere between the persistence of silent "e"s on the ends of words and the elements of Shaw's "ghoti" in terms of its practicality, and I do not think the world will end if we quietly set it aside. Inertia and sentimentality are certainly not sufficient justifications to hang on to it.

(Strangely enough, the Internet cannot tell me if Birmingham actually went through with its apostrophe-elimination initiative, or if the council caved under backlash from enraged prescriptivists. Anyone from the area willing to comment?)

An erudite and very well-constructed argument about why you shouldn't care about correct apostrophe usage for possessives.

Contractions however are a different matter...