Are We Bored Yet? The Apple Watch and New Technologies

One of the plights of modernity and postmodernity is hyperboredom. This is not the kind of boredom that comes from having nothing to do, but the kind that comes from having too many options and no way to distinguish one as better than another. We are jolted out of this boredom when we encounter disruptive technologies. These are technologies that fundamentally change a particular market and have an impact on our culture.

According to Ian Bogost, Apple is a company that is as much in the business of shaking us out of our routine with disruptive technologies as it is in the business of manufacturing them. This may explain why people flock to Apple’s announcements (either virtually or in person) with a big-tent revival fervor in hopes of seeing what groundbreaking new technology Apple has in store for us. For a brief moment, the hyperboredom is replaced with anticipation and excitement over the possibility that the multitude of options will become passé, replaced by the one technology that supersedes all of them.

Take, for example, Steve Jobs’ announcement in January 2007 of this little gadget called the iPhone. He knew the implications of the device and where it stood in the grand scheme of things (quoted from “How Apple Introduced the iPhone” in The Atlantic):

This is a day I’ve been looking forward to for two-and-a-half years. Every once in a while, a revolutionary product comes around that changes everything and Apple has been—well, first of all, one is very fortunate if you get to work on just one of these in your career—Apple has been very fortunate. It’s been able to introduce a few of these into the world. In 1984, we introduced the Macintosh. It didn’t just change Apple. It changed the whole computer industry. In 2001, we introduced the first iPod. It didn’t just change the way we all listen to music, it changed the entire music industry. Well, today, we’re introducing three revolutionary products of this class. The first one is a widescreen iPod with touch controls. The second is a revolutionary mobile phone. And the third is a breakthrough Internet communications device. An iPod, a phone, and an Internet communicator. These are not three separate devices. This is one device. And we are calling it iPhone. Today Apple is going to reinvent the phone. (emphasis added)

Since then, every time Apple unveils a new iPhone, people flock to stores in anxious anticipation, some going so far as to sleep outside the Apple Store’s doors in hopes of being the first to get the latest and best that Apple has to offer. And the fervor does not seem to be slowing down. Sales for the iPhone 6 and 6 Plus broke last year’s record, with ten million phones sold in an opening weekend that was strategically timed to ensure that there will be visions of iPhones dancing in everyone’s head by December.

So with such excitement and Apple’s track record of disruptive technology, what happened with the Apple Watch?* Apple had not released a new device in four years, and this was to be the first device after the death of Steve Jobs to show that Apple was still changing markets. However, rather than being greeted with the fanfare of groundbreaking technology, the Apple Watch was met with mixed reactions.

In his article “Future Ennui,” Ian Bogost says that the problem is not the technology itself, but the burden that comes with it. We have become bored by the constant barrage of groundbreaking technologies. He compares Apple’s innovations to Google’s:

Unlike its competitor Google, with its eyeglass wearables and delivery drones and autonomous cars, Apple’s products are reasonable and expected—prosaic even, despite their refined design. Google’s future is truly science fictional, whereas Apple’s is mostly foreseeable. You can imagine wearing Apple Watch, in no small part because you remember thinking that you could imagine carrying Apple’s iPhone—and then you did, and now you always do.

Bogost may be giving Google too much of a pass, though. Google Glass has sparked some controversy among those paranoid about being filmed by its wearers.

Perhaps the difference between Google’s innovations and Apple’s can be compared to the difference between reading Isaac Asimov and Margaret Atwood. Asimov writes about robots and artificial intelligence, and even explores some of the ways that this technology can go awry, but Asimov’s stories do not carry the sense of prophetic inevitability that Atwood’s do. Atwood writes speculative fiction, not science fiction (see Atwood’s book In Other Worlds). Atwood’s stories, like her recent MaddAddam trilogy, are disconcerting because they are a little too plausible. Rather than describing something that may be fifty years away, her books describe a near future in which technologies already in place are ratcheted up. Similarly, while people will likely not be driving autonomous cars in the next ten years, it is much more likely that within the next two years they will be wearing technology that collects data on all of their bodily processes, purchases, and locations.

While the fervor over the iPhone 6 hit record levels, perhaps the mixed response to the Apple Watch signifies that we are tempering our enthusiasm over internet-in-our-pocket technologies. Clive Thompson, quoted in an opinion piece in the New York Times, says that our attitudes toward technology follow a predictable pattern: “We are so intoxicated by it, and then there’s a fairly predictable curve of us recognizing as a society that this is untenable, and we’re acting like freaks.”

Thompson is an optimistic writer on technology who believes that there are many benefits to the kinds of community interactions the internet makes possible. Rather than focusing on the doom-and-gloom of the here-and-now, Thompson takes a broader, historical perspective, reminding us, in an interview with The New Yorker, that we have been through this before:

We have a long track record of adapting to the challenges of new technologies and new media, and of figuring out self-control…More recently, we tamed our addition [sic] to talking incessantly on mobile phones. People forget this, but when mobile phones came along, in the nineties, people were so captivated by the idea that you could talk to someone else—anywhere—on the sidewalk, on a mountaintop—that they answered them every single time they rang. It took ten years, and a ton of quite useful scrutiny—and mockery of our own poor behavior—to pull back.

Indeed, studies on cell phone addiction, reports of parents neglecting their children, and state laws addressing car accident deaths caused by drivers who cannot pull away from their cell phones are all indications that we are becoming keenly aware that we might be acting like “freaks.”

While disruptive technologies may also disrupt our postmodern malaise, there does come a point when we become weary of the constant announcements of the next-big-tech. Bogost’s article is compelling because he touches on this very notion. Once there are too many next-big-tech options available, the hyperboredom of modernity and postmodernity sets in.

*The marketing team at Apple wisely opted not to go with “iWatch.”

Cell Phone Addiction, Texting Anxiety, and Email Bankruptcy

A new study in the Journal of Behavioral Addictions by Roberts et al. looks at the incidence of cell phone addiction among college-age men and women. The study also looked at what types of programs or behaviors had a positive correlation with addiction. As it turns out, some people do seem to be addicted to their cell phone, though perhaps the more accurate statement is that people are addicted to Facebook, Twitter, Instagram, and Pinterest.

Incidentally, I am writing this as I sit in a Starbucks, the enabler par excellence of socially acceptable addictions. Men and women all around me are bent over their cell phones, doing something with their thumbs. If any of these people were to leave home without their cell phones, would they suffer from withdrawal? That is one of several questions from the study. Another is whether you find yourself using your cell phone more and more.

Withdrawal is one of several indicators of addiction. Roberts et al. use the standard definition of addiction to identify whether college students are addicts. They look for the presence of salience, euphoria, tolerance, withdrawal symptoms, conflict, and relapse, as well as continued use despite negative consequences. They found that many people have a cell phone addiction comparable to a behavioral addiction, like compulsive shopping or compulsive gambling. (This is different from a substance addiction, which can involve not only neurological changes due to the formation of a habit, but also neurological effects resulting from how the substance interacts with the body.) In an effort to determine how and why a cell phone addiction forms, the authors focused on identifying the “tipping point” at which the cell phone goes from being a tool that people like to use to being a need.

When exactly this tipping point occurs is difficult to identify. The incidence of phone addiction seems to correlate with the prevalence of smartphones, which suggests the underlying issue is what the phone is being used for. Furthermore, many of the students surveyed consider their cell phone an integral part of their identity, meaning that the cell phone is viewed as something more than a tool for business or diversion. According to Kent Dunnington in his book Addiction and Virtue, in which he looks at addiction from the perspective of Aristotle and Aquinas, addiction has an orienting nature that provides a semblance of identity and order (priorities) in a disordered, fragmented world. As the authors of the study point out, “Cell phones have become inextricably woven into our daily lives – an almost invisible driver of modern life.”

The study determined that men and women who are addicted to their cell phones use them slightly differently. Activities that positively correlated with cell phone addiction in men were number of emails sent, reading books, Facebook, Instagram, Twitter, number of phone calls, and number of texts. Activities that positively correlated with cell phone addiction in women were Pinterest, Instagram, Amazon, Facebook, number of calls made, and number of texts and emails. Women spent significantly more time on their phones than men (10 hours per day versus 8), but made the same number of calls, texts, and emails. Women spent more time on Facebook, but Facebook was a stronger predictor of addiction in men.

The authors contend that the addiction has to do with being socially connected. Gaming, for example, was not strongly correlated with cell phone addiction, while social media was. Furthermore, the mental health issues that arise from cell phone use indicate that social connection is much more important to people than entertainment. Consider two issues that have arisen since smartphones became popular: text bubble anxiety and email inbox overload.

Ben Crair has a thought-provoking piece in the New Republic on “text bubble anxiety,” the sense of tension someone feels when they know another person is typing a message that has not yet been sent. The longer someone takes to type, indicated by ellipses on iPhones or “Bob is typing…” in Google Chat, the more anxious the other person becomes, because the longer someone types, the more we tend to assume it is something bad. In reality, the other person may have been interrupted by a phone call or had to re-type the message for some other reason. When the person finally does send the text, and it happens to have trivial content, we tend to be disappointed. This roller coaster ride of assumptions takes an emotional toll. Jessica Bennett, in an op-ed in the New York Times, confesses that her therapist recommended turning off the typing awareness indicator because it was causing her mental distress.

Another mental health issue stems from an overwhelming email inbox. Some people become so burdened by a burgeoning inbox that they must declare what Sherry Turkle, a sociologist at MIT, calls “email bankruptcy.” Similar to financial bankruptcy, email bankruptcy occurs when your inbox becomes so full of unread or unaddressed emails that it is too unwieldy to manage, causing additional stress and anxiety. One solution is to archive all emails, clear the inbox, and send a message to your contacts saying that if they want to continue to do business with you, they should send a new email.

When it comes to addiction, the behavior is really a symptom of a deeper problem. This study indicates that cell phone addiction is really an addiction to mediated socializing. Dunnington says that addictive behavior, which is based on something more than mere sensory pleasure, can tell us what human beings most deeply desire. An addiction like social networking may begin as a diversion to deal with boredom, but it morphs from diversion into addiction because it provides a sense of purpose or, in this case, a sense of community that is lacking in our modern individualistic culture.

Sherry Turkle says that it is important for people in our culture to demarcate sacred spaces where they will not engage in internet-mediated socializing, because people need to interact with one another in a more substantive way. She also says that people need to learn the practice of privacy and solitude; put another way, people need to set personal boundaries and cultivate an ability to be alone without being lonely.

While this study certainly has its limitations (e.g., the test subjects were college students), it is telling that the activities that have a positive correlation to cell phone addiction are not gaming or entertainment, but social networking.

Disabilities, Super-Abilities, and “Normal”

[Image: Tim Howard (Wikipedia)]

A recent ABC News report asks whether Tourette’s syndrome can give athletes an advantage. Two examples given in the report are soccer player Tim Howard and swimmer Anthony Ervin. Goalkeeper Tim Howard made history in the recent United States versus Belgium World Cup game, in which he made a record-breaking sixteen saves. This is the latest in several career successes for Howard, who believes his Tourette’s gives him an advantage on the field. Olympic gold medalist swimmer Anthony Ervin believes that his tics, caused by Tourette’s syndrome, help him with speed by channeling his nervousness.

Studies indicate that athletes with Tourette’s do not have noticeably faster response times or movements than athletes without Tourette’s, but other factors may contribute to an advantage on the field, or in the pool. It may be that the mental and physical discipline Howard and Ervin need to manage their Tourette’s works to their advantage.

Another recent study, reported in Scientific American, looked at how people with dyslexia can identify visual cues better than those without dyslexia. Apparently, people with dyslexia can look at pictures of impossible figures, like the three-dimensional impossible figures in an Escher print, and pick out the problem more quickly than other people. This ability can translate to the real world: often people with dyslexia can look at a room and find what looks out of place.

[Image: M. C. Escher’s Waterfall, 1961 (Wikipedia)]

Scientists are not entirely sure why this happens, but they speculate that it may have to do with the brain changes that occur in people who read a lot versus people who read less often or more slowly. People who read less tend to take in a setting holistically rather than focusing on one thing and tuning out the rest of their surroundings. This coincides with studies on entrepreneurs with dyslexia. In the United States, about 35% of entrepreneurs have dyslexia, and many of them say that dealing with their dyslexia has helped them become very good at sifting information and grasping the “big picture” better than other people.

Both of these studies demonstrate perceived advantages arising from something that is labeled a disability or abnormality. It may be that while abilities in one area are diminished, abilities in another area are enhanced. David Epstein, in his book The Sports Gene, says that most of the people we celebrate as great athletes and examples of human achievement have attributes that fall outside the norm. One of his many examples is elite-level basketball players. Most people have an arm span equal to their height, but most professional basketball players have an arm span longer than their height, which serves as an advantage on the court. A longer arm span is not necessarily considered a disability, although in some cases it can be an indicator of Marfan syndrome. This is just one of many examples in Epstein’s book where an “abnormality” leads to an athletic advantage.

One article on the OCD Foundation’s website on Tourette’s syndrome points out that metaphors are everything, and a child who is told his Tourette’s is like driving a Ferrari while everyone else is driving a Toyota will grow up thinking quite differently about himself and his abilities than a child who believes he is limited due to a disability. Additionally, several of the entrepreneurs with dyslexia said that they had supportive parents and mentors who helped them see their abilities rather than their disabilities. Perhaps in our eagerness to “cure” abnormalities and diseases with advances in medical technologies and enhancement therapies, we lose sight of the gifts and creativity that these people bring.

Mortifying Reality TV

Ever since COPS aired in 1989, the profession-based niche of the reality TV genre has found a willing following. Even among television connoisseurs, reality TV shows that follow a particular profession seem a bit more high-brow than makeovers, contests, match-making, or group living situations. Viewers see what it is like to work in a particular profession and often gain a greater respect for the particular difficulties those professions face.

Notable professional reality TV shows include New York Med, a recent addition to the genre that has received positive reviews; Ace of Cakes, a Food Network success that ran for ten seasons, documenting what it is like to work in the high-pressure world of the novelty cake business; and Deadliest Catch, which documented the exciting life of crab fishermen and ran for ten seasons on the Discovery Channel.

One profession that has garnered some interest in the reality TV world is the mortuary business. Lifetime announced that it would be airing a new reality TV show called “Good Grief,” which follows twin brothers and one of their wives as they run the Johnson Family Mortuary in Ft. Worth. Other shows about the mortuary business include A&E’s “Family Plots,” about the Poway Bernardo Mortuary in California, and TLC’s “Best Funeral Ever,” about the Golden Gate Funeral Home in Dallas.

“Good Grief” was scheduled to air on July 23 with the TV description:

Take a step deep into the heart of Texas with the Johnson Family Mortuary! You’ve never seen a family funeral business like this one – full of spice and soul. Rachel runs the family business alongside her husband Dondre and his twin Derrick, together known as the “Undertaker Twins,” who bring the life to the business of death. Working with family is never easy with drama, fights and forgiveness, but with the Johnsons, death has never been so lively.

Such a description may seem a little creepily flippant about a somber subject, and it may say something about our culture that even death and funeral arrangements can become fodder for prime-time TV. However, lest we judge what our culture has come to too hastily, the reason the show was canceled is even more disturbing.

Lifetime dropped the show after the owners were evicted from the building where the mortuary operated and authorities found eight decaying bodies inside. Dondre and Rachel were charged with seven counts of abuse of a corpse. Derrick claims that he had severed ties with the mortuary and was not involved in the negligence. The details of what the police uncovered can be found here, but brace yourself because they are not for the faint of heart.

If art says something about a culture, then what does it say about ours that “Good Grief” would have been the third reality TV show about the mortuary business, and that it was canceled because the family Lifetime chose to follow was engaging in such negligent behavior?

The Social Media Experiments and You

Google, Yahoo!, Target, and Facebook all engage in marketing research, analyzing metrics in order to provide targeted ads for their customers. For example, after a summer of attending multiple baby showers and buying items on registries, I started getting free samples of baby formula in the mail. After using my preferred customer card at the grocery store, I received coupons for items that I am likely to buy. Similarly, Facebook filters the posts displayed in your News Feed based on interests, number of comments, and frequency of interaction, and it selects ads based on your activity.

Recently, the Proceedings of the National Academy of Sciences published a research paper authored by Adam Kramer of Facebook, Inc. and Jamie Guillory and Jeffrey Hancock of Cornell University (the paper was edited by Susan Fiske of Princeton). The researchers investigated the emotional responses of 689,003 randomly selected Facebook users by changing what was displayed in their News Feeds. Using word counting and analysis software, they filtered out “negative” posts for one set of users and “positive” posts for another, so that each set was looking at predominantly positive or predominantly negative posts. They also had two control groups to account for the statistical differences between negative and positive posts; one control saw neutral posts containing neither distinctly positive nor negative content. The researchers checked the experimental groups during the prior week to ensure that they did not differ in emotional expression. The experiment took place during the week of January 11-18, 2012.

Their results showed a small but significant correlation between the emotional content of the News Feed and the experimental groups’ own posts. People who viewed fewer negative messages tended to use more positive words in their status updates, and those who viewed fewer positive messages tended to use more negative words. Interestingly, people in the group that viewed emotionally neutral posts used fewer emotive words in their status updates and wrote fewer words overall.

Based on a particular interpretation of Facebook’s Terms of Use, this experiment was perfectly legal. But many people believe that even though it may be legal, it is not ethical. Others say that Facebook was just engaging in marketing research.

I talked with a marketing expert from a large digital agency to understand the business ethics perspective. Digital agencies use metrics and data to make better products for their clients, but, as I learned, they are very careful with their data and place a high priority on customer expectations.

He said that marketing research is typically done through surveys or focus groups, in which case people agree to participate. It is true that, on the technical side, companies can look at trends in user activity, but the key is not to manipulate the user in any way, because 1) it skews the data, and 2) it is deceptive. When companies like Google conduct research analyses, they do not want to change their algorithm, because that changes the kind of data they are collecting. Google indicates which search results are paid placements at the top of the list, and it filters “bad” content, such as child pornography.

From a business ethics perspective, the important point is customer (or user) expectations. Facebook users know that the News Feed is filtered and that ads are targeted based on user response, searches, and interests. Facebook crossed a line when it manipulated the end product without users knowing, because Facebook was no longer providing the expected service.

Let’s take an example from another widely used, free service. People set up a Gmail account with the expectation that Gmail functions to send and receive emails. Gmail recently started filtering inbox mail into categories such as “Primary,” “Social,” and “Promotions.” What if, for one week, Gmail decided to show only the mail in your “Primary” tab that is “positive” or “negative,” to see how that affects the emotional tone of your correspondence? Your mail would still be delivered to your Gmail account, but only certain mail would show up in the “Primary” tab. Gmail would have changed how it filters your email without your knowledge, for the purpose of seeing whether it changes your output. Since this may change the content of the emails you send, it could be considered tampering with email correspondence.

Let’s look at a second example. Customers place a certain trust in a product, whether they paid for it or not: they trust that the product will actually do what it is promoted to do. If you download a free weather application, you expect it to give you weather information. You do not expect the app to access other data on your phone and transmit it to someone else without your knowledge. The promoted use of the app was weather, but its behind-the-scenes use was something different. Usually this kind of thing is referred to as “spyware.”

As the ethical inquiries continue, an important question will be whether Facebook’s experiment is analogous to the hypothetical Gmail example, to the spyware example, or to marketing research.

Artificial Intelligence in the News

There’s been much ado about artificial intelligence lately, largely prompted by a computer convincing some people that it is a thirteen-year-old boy and by an article, written by a veritable who’s who of emerging-tech thinkers, warning of the risks of superintelligent machines.

Lol Humans

A chatbot named Eugene Goostman was able to convince 33% of the people who interacted with him for five minutes via chat that he was human. This was touted as a clear instance of a computer passing the Turing Test, but it was met with some criticism, including this piece by Gary Marcus in The New Yorker.

Ironically, rather than showcasing advances in human ingenuity, the Eugene Goostman experiment reveals some of our less noble attributes. For one, in order to make computers sufficiently human-like, programmers needed to make the machines dumber. As Joshua Batson points out in his Wired commentary, prior winners of an annual Turing Test competition incorporated mistakes and silliness to convince the judges that the computer was a person. This calls into question the value of a test for artificial intelligence that requires a machine to be “dumbed down” in order to pass.

Secondly, the Turing Test, as it was presented in the media, could easily be one of those tests that the psychology department at your local university conducts on willing participants. The results of the Eugene Goostman test say more about the judges than they do about the machine. Taken from another perspective, 33% of the people tested were simply more gullible than the rest of the participants.

You Have to Want to Want It

Contrast this with Stephen Hawking’s warning, in an Independent article co-authored by Stuart Russell, Max Tegmark, and Frank Wilczek, that superintelligence may provide many benefits but could also lead to dire consequences if it cannot be controlled. Hawking and company write that “One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand.” Yes, one can imagine technology doing this, but the question is whether the technology can imagine itself doing this.

Hawking assumes that we will have the capability to engineer creativity, or that creativity will somehow emerge from technology. However, we see from the examples of Eugene Goostman, IBM’s Watson, Google’s autonomous cars, and Siri that complex programming does not produce ingenuity. One could argue that the only way a machine would even muster up the “motivation” to do any of these things is if a human programmed that motivation into it.

An example from science fiction is Asimov’s Three Laws of Robotics, the inviolable principles programmed into the hardwiring of every robot in Isaac Asimov’s fictional world. These laws provide the motivations behind the robots’ behavior, and while they lead to some ridiculous and troubling antics, they are not the same as a robot coming up with its own fundamental motivations. In Asimov’s I, Robot stories, the impetus behind each robot’s behavior harkens back to these pre-programmed (ostensibly by humans) laws.

This is not to dismiss the field of artificial intelligence; it is to call into question some of the assumptions behind the recent hype regarding both the progress and the potential peril of AI. Technology is becoming ever more powerful, with the potential for both helping and hurting. Hawking’s warning that “the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all” is worth heeding. However, the problem is not endemic to the machine but to the humans who make and wield this technology.