I posted a tweet that made its way into a Twitter Moment. This actually went okay, because I muted the thread after 10 retweets. I think people mostly understood what it was saying, even if it was borrowed from Tumblr.
To pass the social system off as an objective artifact determined by (quasi-) scientific processes, forecasting has to scapegoat “irresponsible” individuals for failing to live up to the terms of the forecast. Adorno writes that “the constant appeal of the column to find fault with oneself rather than with given conditions” is evidence of “the implicit but ubiquitous rule that one has to adjust oneself continuously to commands of the stars at a given time.” When forecasts end up being inaccurate, the fault lies not in the prediction methodology but the individual’s failure to adjust to the forecaster’s advice.
This piece by Robin James calls out big data through the lens of Adorno’s critique of astrology: “Big data doesn’t forecast the future but remakes the present in the image of down-to-earth stereotypes.” It gives voice to a lot of feelings I’ve had that push against not just the privacy angle but the casting of narratives about people. It reminded me of how, at one point in the history of psychology, there was a debate about whether academic psychology was concerned with individuals or populations. That ship sailed long ago: the field is nearly myopic in its concern with groups, because groups are more descriptive. Which is to say: moving from generalizations about a group to an individual is not a deductive inference. Nor is it even necessarily a transductive inference, but an inductive one. The movements of big data, modeling, and forecasting have started swinging the pendulum in the other direction. Anyway. It’s a very good article.
Also, I clapped out loud when I read this, about the self-tracking movement and how it becomes a kind of imperative to fix oneself rather than a way of highlighting problems to be addressed collectively:
Adorno explains how this can seem empowering but really isn’t: “The idea that the stars, if only one reads them correctly, offer some advice mitigates the very same fear of the inexorability of social processes the stargazer himself creates.” It reinforces the neoliberal myth of individual responsibility for social problems and misdirects our attention toward dumbed-down superficial solutions to complex social problems. For example, framing problems of political economy, class, and race as an “obesity epidemic” assumes both that obesity is a problem and that it is a problem that can be solved by modifying individual behavior (diet, exercise).
This gets at something I rarely hear talked about and never hear from people focused on the technological side of things.
I dropped my father’s cup today.
My dad died when I was eight. I would get up before him, make a breakfast of bagels or toast and chocolate milk, and watch cartoons. He would come downstairs and go for a run, come back, shower and make coffee. He’d drink it out of one of a few mugs, all of a similar size, and most of which I still have. He had good taste in art, and his mugs were handmade and hand-painted by local potters. They’re irreplaceable.
There’s an old story about a Zen monk who had a favorite cup. It was beautiful and prized, and he enjoyed showing it to temple visitors. He used it for everything, all the while saying out loud, “This cup is broken.” When at last he dropped the cup, he wasn’t surprised. He repeated, “This cup is broken.”
Today my grip faltered when I grabbed one of my father’s mugs. It fell, hit the rim of the sink, rolled, and fell to the ceramic tile floor. I knew what was going to happen. I said, in the quick voice that makes no sound and forms no words, “This cup is broken.”
It bounced, rolled, and was stopped by its handle. If anything it scratched the tile. I picked it up, and as I placed it in the cupboard, said, “This cup is broken.”
The world is full of nonsense, but here’s a slice of sanity, including something you can buy. I’m writing this note to myself because I used it to finish proofreading my dissertation, a task I find nigh impossible.
Write down all of the things you have to do that aren’t written down in a systematic way. Write them on a piece of paper. Or write them on index cards. Write until your brain feels empty. You can write down feelings or theories too.
Now do something with everything you wrote down. Evaluate it. If you can do it in less than two minutes, do it when you read it. Otherwise put it somewhere. Doesn’t matter where. Stacking up the index cards is just fine. Throw away or cross out the things that aren’t important. Save your feelings and theories somewhere. It’s important to reflect on those, if only as a map to see how far you’ve come through time.
Now you have a list of the things you have to do that are hard. Pick one. Now break it up, either on paper or in your head, into a list of actions in the world you need to take to complete it.
Download Forest. Set the timer for 25 minutes. Do what it says.
Don’t work more than three hours unless you feel possessed by flow and momentum.
From time to time, do the whole thing again. That’s it.
Which task manager or notes app you use matters so much less than the process by which you do things. Process matters. Your notes app and task manager matter only to the extent that they mesh with your process. (It is worth spending some time finding the right fit, but don’t confuse this quest with any kind of productivity. I find it relaxing and intellectually stimulating to evaluate task managers; I’m searching for the right amount of complexity, visual presentation, and ergonomics. I’m far from finding one.)
Also, it’s nice to take time and write to yourself.
In a quiet moment in Seattle, Robert Levine, a social psychologist from California, quoted the environmentalist Edward Abbey: “Growth for the sake of growth is the ideology of the cancer cell.”
I’ve written over 250,000 words since the middle of 2016 when I started journaling every morning. That doesn’t include the six months I spent writing 150 A5 pages in a physical journal, which I gave up because I wasn’t writing as much, nor accessing the kinds of truths and insights that come from writing near the speed of thought, bucking self-censorship or rewriting in my head, and charging forward to the next thought or feeling and giving it shape and constraint in language. I like writing with pens because it slows me down and forces me to focus on expressing myself well. But that’s not quite what’s needed in the morning when thoughts are still nebulous and unaligned with the day’s tasks or past.
What will I do with all these words I’ve made? I occasionally think of drawing from them to create articles or more formalized writings, but really they exist as a record of my reasonings with myself. It takes a lot of reflection to know whether something is happening because it should be happening, because you really want it to happen, or because the wagon is riding through deep ruts and the weight of habit is a blanket of unexamined comfort we hesitate to shed. Writing in the morning doesn’t always sort this out, nor does it often unveil such clarities, but when it does it is like meditation, or a funeral for a bad decision, or laying rich soil over a withered garden.
In this exercise, I write for myself, save for this that you are reading now. This feels somehow greedy, like a dragon hoarding its treasure, or a beetle scooting its ball of dung. Which is why I’m working to write more, to share more, to leave more open my shared self. When I was a child I had a terrible problem with believing everyone had the same education and knew the same things I did. This meant that when someone asked a question, especially a technical one, I thought of them as idiots, forgetting how I’d learned everything I knew, which was through patient questions and answers from other kids, adults, and books, and eventually the internet. My incredulity was cruel, because so much of our society is predicated on shame, and I could with a glance bring shame to an adult. (How do you exist without knowing how to use computers? How can you have a job without knowing how to edit the registry?) I soon came to realize that instead of invoking shame I could evoke money, and had to learn how to treat people decently and with patience while lubricated by money. My eagerness to help people spread, without the money, when I saw over and over how the delight of empowerment radiated from someone who learned how to do something to help themselves, for good. This is an important facet of education, and I’m somewhat suspicious of people involved in it without this kind of a story of revelation. But perhaps it is a fiction I tell myself, and it is the lesser of my two stories of educational revelation, the second being the jarring juxtaposition of freedoms and learning between my Montessori elementary school and my public middle school.
Anyway. It feels good to write, and the thing that holds me back from sharing my writing is half craft, and half shame, and the shame part, for lack of craft, is representative of the part of society that needs to wither, like the garden, replaced by honest words.
One of the things about that book, Authority, is that it makes the case, to me at least, that people are basically lying, or if not lying, misrepresenting themselves as experts, as people worthy of attention. But I’m coming to realize that that’s okay; it’s nearly impossible to fight against anyhow. Not everyone who shares something has to be an expert in their field. Not everyone who writes a tutorial has to do it as a retrospective; it’s much easier to write a tutorial while working on the very thing the tutorial is about, as a kind of more formalized note-taking shared with a bit of narrative framing. That’s not bad.
But coming from academia, and the particular aggressive and ultimately bad argumentative style that was inculcated in me at Indiana University’s cognitive science program, I have the instinct to say, chest forward, “Who are you to make such proclamations?” But who am I to make such proclamations? If someone wants to talk about learning, I can say, “What you are saying is not backed up by the literature.” I can even appeal to authority a little and just say, “No, that’s wrong; it’s a lot of effort for me to go through and get citations for you, but I’m saying as someone who has been in graduate school in learning science for eight years that you are wrong.”
I worry about the very authority of the source, now, after reading that book. I worry the blog post I’m reading was written by someone who doesn’t understand the best way of teaching what they’re teaching, who doesn’t have a sense of the bigger picture, who isn’t intentionally introducing concepts here and holding them back there. I worry about this doubly when I read that the blog post’s author has a book on the subject.
But that’s okay. We’re all grasping at shadows in a dark room.
The hamburger menu seems ubiquitous. I think it’s symptomatic of a kind of thinking that we need to excise from design. Designers use it to disempower users, and confusing or frustrating people is not the same thing as increasing engagement. Intentionally conflating those two things in a client’s mind in order to misrepresent them is violent.
Developers and companies typically want to increase engagement with whatever they make. It means people keep the application, the brand, or the ideas the designing organization wants to perpetuate at the front of their minds, paying in attention. This arose both from a desire for influence (the more someone uses a well-designed product with good interactions, the more they’ll use it in the future, leading to more money or engagement or user data) and from a desire to present ads to a somewhat captive audience. One of the ways this manifests for users is in an effort to isolate them in their task: they’d like to navigate away, or leave their screen or task, but find themselves without the means to do so. (Apple notoriously solved this problem with a physical, ever-present home button, so users always had a way out at their literal fingertips. On the iPhone X this is relegated to a non-intuitive swipe interaction.)
The purpose of the hamburger menu is to isolate. Ostensibly this nearly ubiquitous icon came about in an effort to hide user interface elements on smaller screens with less usable visual real estate. It has since become a way, even with all the screen area of a 27” iMac, to lower the probability that someone changes screens or moves away from what they are currently doing, what the designer wants them to do. This is usually described as a way to “reduce clutter” or “simplify the design.” It is chickenshit minimalism.
I want to drive a wedge between making it more difficult for a user to change screens and making content engaging enough that people stay. Designers use the hamburger menu to remove navigational elements from an interface, which drives up metrics like time on task, and management celebrates. Yes, smaller screens and finger input limit the number of functional touch targets, which lowers the maximum interaction density of a screen. But that is not the same thing as making content more engaging. It’s just making navigation more difficult. It makes it harder for the user to leave.
Constraining navigation is not the same thing as increasing engagement, except insofar as the engagement being measured is frustration, and frustration does increase. Many designers do not want to admit that many of their design patterns are hostile to their users, or serve to limit users in service of the commissioning company’s values.
Design is a series of tradeoffs between creating and constraining affordances. The hamburger menu is a poor and unjustified constraint; using it to isolate a user is a kind of violence against that user, and failing to draw this distinction for a client or manager is hostile and violent. The alternative is to present the user with the first level of navigational elements, the first level of a nested menu, the main verbs or nouns of the application. Touch targets don’t have to be big, and they don’t have to be foregrounded. But don’t confuse locking a user in a room with increasing that user’s engagement, and don’t infer that an increase in engagement is indicative of pleasure, or happiness, or usefulness. (This is the same reason most learning analytics are bad.)
Ever had a troublesome font in R that doesn’t want to render to ggplot in RStudio, but will render just fine in knitr? Do you use extrafont but sometimes it just doesn’t work? Trying to look like Tufte’s Visual Display of Quantitative Information but ETBembo is being uncooperative? Try setting your font (globally) to this:
myfont <- ifelse(is.null(knitr::opts_knit$get("rmarkdown.pandoc.to")), "serif", "ETBembo")
This sets the font to serif if the code is being rendered by anything but knitr. If it is being rendered by knitr, say, by pressing the “Knit” button in RStudio, then it sets it to ETBembo.
I found it on StackOverflow.
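For what it’s worth, here’s a sketch of how you might wire that value into ggplot globally. It assumes extrafont has already registered ETBembo with your graphics devices, and theme_minimal is just an illustrative choice of base theme:

```r
# Pick the font by rendering context: when knitr is driving the render,
# the "rmarkdown.pandoc.to" option is set, so we use ETBembo; otherwise
# (e.g. plotting interactively in the RStudio console) fall back to serif.
myfont <- ifelse(
  is.null(knitr::opts_knit$get("rmarkdown.pandoc.to")),
  "serif",   # interactive / console rendering
  "ETBembo"  # knitr rendering, e.g. the Knit button
)

# Apply it once, globally, so every subsequent plot inherits the font.
library(ggplot2)
theme_set(theme_minimal(base_family = myfont))
```

After this, plots pick up the right font in both contexts without any per-plot theme fiddling.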