Can We Make an Embryo in a Dish?

Induced pluripotent stem cells and embryonic stem cells are functionally equivalent, but should we be concerned about making embryos in a dish?

Induced pluripotent stem cells (iPSCs) have been hailed as the discovery of the decade, providing an ethical alternative to embryonic stem cells (ESCs). Both types of stem cells are pluripotent, which means they can potentially make all of the cells in the body. This contrasts with totipotent cells, which can give rise to an entire organism. The very early embryo consists of totipotent cells.

Induced pluripotent stem cells have technical advantages over ESCs: the patient’s own cells can be used rather than donor cells, and they are easier to control. However, one of the concerns with iPSCs has been whether they are truly equivalent to ESCs, given the various transcription factors that must be turned on or off to revert the cells to a pluripotent state. This debate was laid to rest by a new research report in Science demonstrating that while iPSCs are genetically distinct from ESCs, they are functionally equivalent.

Before deeming every iPSC procedure ethical and effective, consider the question several researchers from Australia, the Netherlands, and the U.K. ask in a Nature Methods commentary: “What if stem cells turn into embryos in a dish?” Their reason for asking stems from research showing that pluripotent stem cells (both iPSCs and ESCs) can form organoids, small three-dimensional clumps of cells composed of a particular organ’s cell types. The techniques that make pluripotent stem cells undergo the self-assembly and morphogenesis required to form an organoid also cause these cells to take on many of the properties of embryos at the gastrulation stage of development.

Without delving too deeply into the complexities of embryonic development, the gastrulation stage is a key point when it comes to regulations for human embryo research. (See here for a simple summary of recent research on stem cells that have been dubbed “gastruloids.”) The U.K. has a fourteen-day limit on human embryo research: human embryos are not allowed to remain intact in vitro beyond the fourteen-day point or after the formation of the primitive streak, whichever comes first. Australia has similar regulations. The pluripotent cells that appeared to reach the gastrulation stage seemed to form a primitive streak and showed signs of forming the beginnings of the central nervous system.

There are two things to consider. First, while these are hallmarks of a particular point in embryonic development, this clump of cells is not an embryo. The stem cells are self-organizing, but they lack the holistic directionality that an embryo has. So while these stem cells proliferate in a more “organized” way than, say, a tumor, they lack key embryonic features. Second, the authors pose an important question that needs to be addressed, because the technology could eventually make embryos in a dish.

Consider two situations in which it is possible to make an embryo without two genetic contributors, a mother and a father. The first is cloning, or somatic cell nuclear transfer, and the second is making gametes using iPSCs.

Somatic cell nuclear transfer has been successfully done in both animals and humans, although only cloned animal embryos have been implanted and brought to term. Cloned animals tend to be unhealthy and often die young. This continues to be an area of research, as evidenced by a recent article in Cell Stem Cell in which researchers from South Korea reported more efficient methods for cloning human embryos.*

Gametogenesis is another active area of research. If induced pluripotent stem cells could be coaxed to differentiate into gametes (egg and sperm), this would theoretically allow the creation of an embryo. Such an embryo might have only one genetic parent, if the egg and sperm were made from the same donor, or two parents of the same sex. This is not yet possible because the oocyte is particularly tricky to form, but there is ongoing research attempting to produce both types of gametes from induced pluripotent stem cells.

Whether one uses somatic cell nuclear transfer or gametogenesis via iPSCs, the creation of a human embryo is ethically problematic for many reasons. The authors of the Nature Methods commentary raise important questions that hinge on when an embryo becomes an embryo in the laboratory setting. There are valid reasons to give the embryo a special status whether it is ever implanted in a uterus or not. As technology allows us to unravel the complex operations that go into meiosis and embryogenesis, we must carefully consider where moral lines are drawn.

Because making an embryo in a dish would take the technology too far, drawing ethical lines may require a nuanced approach to which types of experiments are acceptable and where in the technical process the line must be drawn so that pluripotent stem cells remain pluripotent.

* Technically, “clones” like Dolly the sheep are really chimeras, meaning a small amount of DNA (the mitochondrial DNA) comes from the oocyte donor and differs from the nuclear DNA. The clone would come from a single genetic source only if both the original cell and the oocyte came from the same animal.

11/07/15 – This post has been changed from the original to clarify some of the scientific terms.

Synthetic Biology and Making Morphine in the Lab

Prescription pain medicine addiction has become widespread, with several areas of the U.S. calling it a public health crisis. Opioids include prescription pain medicines such as Vicodin, OxyContin, and fentanyl. The surge in opioid addiction can be traced to the increase in prescriptions for these drugs beginning in the 1990s. Now headlines tout the possibility of a “home-brewed heroin.”

If we unpack the headline, it turns out this “home-brewed” heroin is not exactly here yet. Scientists have replicated all of the metabolic steps that opium poppies use to turn glucose into morphine, though only in pieces: parts of the pathway have been reproduced in different yeast strains in an effort to make less addictive pain medicines as well as other analgesics. This engineering of cellular processes is called synthetic biology. By way of a quick review, synthetic biology involves writing DNA code in the lab to make proteins, the internal machinery of a cell. Yeast and E. coli are simple organisms often used as hosts because they come fitted with the equipment needed to replicate and express inserted DNA. Craig Venter, in his book on synthetic biology, Life at the Speed of Light, calls DNA the software, while organisms such as yeast provide the hardware. Scientists want to tweak the software to make tailor-made drugs.
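To make the “DNA is software” metaphor concrete, here is a toy sketch in Python (my illustration, not anything from Venter’s book or the morphine research; the four codon assignments are real, but the tiny DNA “program” is invented). The cell’s hardware reads three-letter DNA codons and outputs the corresponding protein building blocks:

```python
# Toy illustration of "DNA as software": the standard genetic code maps
# three-letter codons to amino acids. Only four real entries are included.
CODON_TABLE = {
    "ATG": "Met",   # start codon
    "TTT": "Phe",
    "GGC": "Gly",
    "TAA": "STOP",  # stop codon
}

def translate(dna: str) -> list[str]:
    """Read the DNA 'program' codon by codon, as the cell's hardware would."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        amino_acid = CODON_TABLE.get(dna[i:i + 3], "?")
        if amino_acid == "STOP":
            break
        protein.append(amino_acid)
    return protein

print(translate("ATGTTTGGCTAA"))  # ['Met', 'Phe', 'Gly']
```

Rewriting the string (the software) changes what the hardware builds, which is the essence of tailoring yeast to produce new molecules.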

Synthetic biology overlaps with genetic engineering, but it differs in that synthetic biology allows scientists to replicate an entire cellular pathway within an organism, such as yeast, as opposed to introducing individual mutations into a DNA strand and then inserting it into a cell.

The portion of the metabolic pathway reported in Nature (see the Nature News article) is the first part of the glucose-to-morphine route. The second part of the pathway, as well as a reaction that links the two parts, was recently reported by other research groups. All of these parts have been demonstrated separately in various yeast strains. If scientists were to combine them in one yeast strain, then theoretically they would be able to convert glucose to morphine. This has not been done yet, but it will likely occur soon.

The process for making morphine from glucose is complex (approximately eighteen steps), and because scientists do not have the full genome of the opium poppy, they have had difficulty identifying the enzymes that catalyze each step in the reaction pathway. To overcome this hurdle, scientists turned to enzymes in other organisms that catalyze similar reactions. The most recent research, which establishes the first half of the morphine pathway, used an enzyme from sugar beets that scientists mutated to ensure it produced the product they needed without unwanted byproducts.

The question remains, are we at a point where people can brew their own synthetic morphine? The short answer is no, not yet.

First, all of the steps have not yet been combined into a single yeast strain. While this may be the next step in making synthetic morphine in the lab, it will need to be tested, and it may not work at first. Once scientists succeed at creating a yeast strain that can accomplish all of the steps, the process will need to be refined and optimized.

Second, in order to brew their own morphine, people would have to acquire the yeast strain containing the synthetic DNA. This would mean obtaining the yeast from someone who not only knows the DNA code, but also has a PCR machine or some other way to produce the synthetic DNA and then incorporate it into yeast.

Finally, even if someone did acquire the yeast strain, the fermentation process would require specialized equipment and conditions that would be difficult to reproduce outside a laboratory, according to Christina Smolke of Stanford University, whose lab has made a semi-synthetic opioid using yeast. In an interview with Wired, she also noted that home fermentation would not produce enough morphine to be cost effective.

While we are not at the point of worrying about home-brewed liquid morphine, the authors of the study were concerned about future consequences of their research. One of the motivations for designing the synthetic pathway is to tweak it to make less addictive pain medicine or to make medicines for other uses. This same ability to tweak the morphine-producing pathway could also be used for nefarious purposes.

The authors of the study sought ethical guidance from biotechnology-policy specialists Kenneth Oye of MIT and Tania Bubela of the University of Alberta, who, with Chappell Lawson, also of MIT, published a commentary in Nature in tandem with the research article. Oye, Bubela, and Lawson delineate the ethical and legal considerations for such research and identify four broad areas that should be considered:

  • Engineering – The yeast strains could be engineered to make them less appealing to criminals and more difficult to cultivate outside of a laboratory setting, similar to biocontainment practices with E. coli.
  • Screening – Since the DNA sequence would need to be ordered from a lab, a screening process could be put in place to flag orders of opiate-producing yeast strains.
  • Security – Labs could employ biosecurity measures, such as watermarking yeast strains made in certain labs and running background checks on people working with the strains.
  • Regulation – Opium is a globally controlled substance. The laws that apply to opium could be extended to cover opiate-producing yeast strains.

Overall, the headlines are a little misleading in that we are not yet on the cusp of people brewing their own morphine. However, the authors should be commended for considering the consequences of publishing their research and seeking ethical guidance. It is a good example of pre-emptively considering the hazards and consequences of technological advancement rather than responding to a crisis.

For more information, see my article in Salvo 31, “Dying to Feel Good: Modern Self-Realization & the Painkiller Addiction Epidemic.”

Where Do Babies Come From? Induced Pluripotent Stem Cells…Sort of

Recent headlines tout that it may be possible for same-sex couples to have biological children thanks to stem cell technology. Scientists from the University of Cambridge and the Weizmann Institute found that it is possible to make primordial germ cells, the cells that eventually form egg and sperm (gametes), from induced pluripotent stem cells created from a donor’s skin.

The opportunity for same-sex couples to have biological children may take over the headlines, but it is not the only, or necessarily the primary, reason scientists are interested in this research. One thing scientists hope to do is use this technique to study age-related diseases. As we go through our lives, we accumulate epigenetic messages that tell genes when to turn on or off, when to make more cells, or when to stop making cells. Recent research shows that certain cancers and age-related diseases are likely due to these epigenetic factors going awry. These epigenetic factors are “reset” in germ cells, meaning scientists can start over and see how the factors develop at the cellular level. Much research remains to be done in this area, including questions as to which epigenetic factors are passed on and which are actually reset in germ cells (see Nature’s recent issue on the results of the NIH’s Roadmap Epigenomics Project here).

While the epigenetic research is interesting, the headlines emphasize the reproductive possibilities. This technique could be another option for infertile couples who want to have biological children, including same-sex couples who want children that are biologically related to both of them. However, what is not touted as loudly is that, for now, the technology only works for male same-sex couples.

The reason this technique can be used for two men but not two women has to do with how the cells are made. Skin cells are taken from each person and converted into induced pluripotent stem cells. Those stem cells are then converted into primordial germ cells by turning on or off certain genetic factors involved in directing pluripotent stem cells toward particular cell types.

Female cells carry two X chromosomes and male cells carry an X and a Y, but in order to make primordial germ cells that are precursors to sperm, researchers need a Y chromosome. Lead researcher Dr. Hanna pointed out that it is easier to remove a chromosome than to insert one. Women have no Y chromosome to contribute, meaning that making sperm precursors from women “is a long way off.”

Finally, one of the more troubling aspects of this research is that little has been said about the health and well-being of the children who would be produced by this technique. A central finding of the original research article in Cell is that SOX17 is a key factor in the process of making human primordial germ cells and likely plays an important role in gene regulation. This was surprising because SOX17 does not play a key role in mouse germ cell development. It means that even though mice have been used in prior studies on creating primordial germ cells, they may not be a good model system for creating the subsequent sperm and egg cells. Before a procedure or a drug makes it to human trials, it is typically tested first in mice and then in primates. When it comes to human development, though, things do not translate from animal models to humans as easily.

Once scientists are able to take the primordial germ cells and advance them to egg and sperm cells, they will be able to create an embryo. However, because the mouse models differ, this is a case in which we have no way of knowing whether the resulting embryos or children will be healthy until the experiment is actually done.

It is unclear from the interviews or the article whether the assumption is that “unhealthy” embryos will die off before they implant in the uterus, or how exactly researchers expect to test whether this technique works as another reproductive technology. If unhealthy embryos die, this poses an ethical problem for those who hold that embryos should be granted dignity in their own right. But even if one does not accept that embryos are accorded a certain level of dignity, what about babies and children? What do scientists plan to do if the children born from this procedure are unhealthy or deformed?

Finally, it is worth mentioning that this technique would provide a source of eggs, which are needed for various other reproductive techniques and research endeavors. For example, the U.K. just passed legislation permitting what has been dubbed “three-parent IVF,” in which scientists transfer the nucleus from one woman’s egg into another woman’s enucleated egg, which is then used in IVF. The hope is to prevent mitochondrial disease, which is passed down to offspring from the mother. Obtaining skin cells from a donor is much less invasive and poses less risk to the donor than obtaining her eggs after inducing hyperovulation.

This research is rife with ethical concerns, most notably the fact that it amounts to human experimentation on people who did not have the opportunity to choose to be the product of experimentation, but must live with the consequences, if they live at all.

NFL and Prescription Drugs

The NFL is being sued by 1,300 former players over the way it distributed prescription pain medicines so that players could get back in the game. The former players claim that they were not informed of the side effects of potent painkillers such as Percodan, Percocet, Vicodin, and Toradol. Percodan, Percocet, and Vicodin are all opioid painkillers; Toradol is a strong non-steroidal anti-inflammatory drug (NSAID).

Many of the former NFL players involved in the lawsuit played during the 1980s and 1990s, when practices for administering powerful painkillers, both opioids and NSAIDs, were cavalier. Today those practices are, in theory, more regulated. The players state that “NFL medical staffs routinely violated federal and state laws in plying them with powerful narcotics to mask injuries on game days.” They also claim that medical staffs were negligent in withholding important information about the players’ medical conditions, such as markers for kidney disease or broken bones.

At issue are 1) whether doctors and trainers violated the law by illegally administering prescription drugs, and 2) whether players were adequately informed of the side effects of the drugs, as well as of any medical issues doctors found that might affect their decision to play.

Doctors Behaving Badly?

In an attempt to investigate whether illegal practices were going on, the DEA paid unannounced visits to several professional teams in November 2014, questioning team doctors and trainers after games. The investigation was to ensure that doctors were prescribing and distributing drugs appropriately, handling controlled substances properly when crossing state lines, and licensed to practice in the state. Thus far, the DEA has not found evidence of illegal activity.

However, an investigation by Vice Sports into how and where NFL doctors acquired such large amounts of prescription drugs shows that, at least in the past, they were likely obtaining drugs illegitimately. From 2009 to 2010, several NFL teams, as well as other professional and college sports teams, acquired large amounts of opioids and NSAIDs from a company called SportPharm, an illegal drug distributor operating behind the legitimate company RSF Pharmaceuticals. RSF Pharmaceuticals eventually shut down, and SportPharm was re-branded as a subsidiary of Champion Health Services, which is still in operation. Many teams would fill prescriptions in players’ names without the players knowing, so that the actual quantities would fly under the radar.

Informed Consent

The second issue has to do with players’ rights: whether they were adequately informed of what drugs they were given, of their medical options given their condition, and of the long-term side effects.

Many of the players received opioid drugs without being told about their addictive nature, and were often told to take them for longer or in higher dosages than the FDA recommends. Furthermore, many players were given prescription pain medicine without a doctor’s evaluation or monitoring. One former player reports that while playing for one team, an assistant trainer would pass out unlabeled manila envelopes of pain medicine to any player who raised his hand and said he needed them. Another former player said that envelopes of prescription pain medicine would be waiting in the players’ seats on the airplane.

Player testimonies from the class-action lawsuit website show that many players were given powerful pain medicine instead of being told that they needed rest and recovery, or that the problem was actually much worse and required surgery. Several players said that NFL doctors knew of existing health issues but did not inform the players. Two players’ testimonies state that NFL doctors knew they had indicators of kidney problems but did not tell them. Both former players now have renal failure.

Another former player, Rex Hadnot, said in a Washington Post interview that he was given Toradol pills and/or injections once per week for nine years. He was never told that Toradol should not be administered for more than five days due to the risk of kidney damage, heart attack, and stroke. He said that sometimes he would receive both a shot and a pill on the same day, a much higher dosage than the FDA recommends.

The Mountain Climber Problem

Part of the problem with discerning the ethics of safety for football players is exemplified in what H. Tristram Engelhardt calls “the mountain climber problem.” In general, climbing a mountain is more dangerous than not climbing a mountain, but we do not consider it unethical to allow a mountain climber to scale a mountain if he so desires. Similarly, playing sports is inherently more dangerous than not playing sports. Football players take on additional risks by choosing to play the sport. Therefore, what protections, if any, are football players owed?

There is a tension between restricting someone’s freedom and allowing them to put themselves in harm’s way. Typically, with the mountain climber problem, ethicists will say that it is unethical to allow additional harm to come to the person such that he or she could not accomplish the stated goal of climbing the mountain. For example, while mountain climbing is inherently dangerous, the climber should still use a harness and ropes. Likewise, while football is an inherently dangerous sport, one can enforce safety precautions to ensure that players are not injured in such a way that they can no longer play. This is the motivation behind stricter rules to prevent concussions, improved helmet design, and better padding.

The difference between the mountain climber and the football player is that collisions are part of the sport. Pain is a given. The former players who are suing the NFL claim that their health was sacrificed in the name of sales. But other players criticize the lawsuit as nothing more than a money grab by former players who knew what they were risking by playing the sport.

Despite whatever motivations are behind the lawsuit or the NFL’s medical decisions, it is unethical to de-humanize athletes, even if they willingly chose to engage in de-humanizing activities. Take a non-football example: if a woman chooses to trade sex for money, she is willingly commodifying herself and ultimately engaging in a de-humanizing activity. While this may have been her free choice, it does not mean that if she goes to a doctor, the doctor is no longer ethically obligated to treat her with human dignity. In other words, even if she chooses to engage in activities that are de-humanizing, that does not make it acceptable for medical professionals to treat her as less than human.

In the case of football players, even if they choose short-term returns at the expense of long-term injury, they need to be given the opportunity to make an informed choice on the matter, because ultimately they are the ones who have to live with the consequences.

In the latest issue of Salvo Magazine (Winter 2014), I cover the larger issue of prescription pain medicine addiction, what opioid drugs actually do to the brain, and how one becomes addicted. The former NFL players’ claims about over-prescribing may be part of a larger national problem that saw a peak in opioid prescriptions during the years many of the former players were active in the NFL.

Creepy Critters in Our Gut

What is the microbiome?

The microbiome is the collection of bacteria that reside within and on our bodies. Often these bacteria do more than just hang out with us: some fight off disease, while others cause it; still others help us digest food or reject bad food. For this post, I am going to focus on the gut biome, the bacteria that live in our large and small intestines, because the gut has made for some interesting headlines lately.

The intestines host a plethora of bacteria that act symbiotically with us to help digest and process food. Scientists have been studying the gut biome for many years, but only recently has it garnered public attention. Several recent theories have suggested the gut biome is responsible for everything from food allergies to autoimmune diseases to autism. Furthermore, new diet fads, fecal transplants, and probiotic supplements have emerged as a result of the gut biome hype, many of them untested or making unsubstantiated claims. As is often the case with pop-science trends, the microbiome is becoming the poster child for pseudo-scientific claims and grandiose promises.

What does the research show?

Let’s start with some facts, because the gut biome does affect our health and well-being. The National Institutes of Health is currently working on the Human Microbiome Project, which seeks to identify and characterize the bacteria (and fungi) associated with the human body. Similar to the Human Genome Project, the original plan was to characterize the microbiome of healthy individuals and then compare it to that of unhealthy individuals in hopes of understanding the role the microbiome plays in disease. However, those goals may need to be adjusted.

The Human Microbiome Project studies have revealed two things: 1) no two human microbiomes are alike, and 2) the microbiome is dynamic. Because each person has a unique microbiome, there is no gold-standard “healthy” microbiome against which to compare “diseased” microbiomes. And because the gut biome changes with diet and environment, it is difficult to determine a particular signature for a person. Its composition is simply too dynamic.

Additionally, the microbiome’s composition (the types of bacteria that make up the biome) differs at different times depending on the individual’s diet and environment. This is especially true of the gut biome. There are hundreds of different species of bacteria that could potentially live in our digestive system, and those species may be present in different abundances at different times. Furthermore, studying two different parts of the same sample can sometimes give different results. This is a classic sampling problem. Imagine that you wanted to measure the amount of lead in the soil of a field. Soil collected at the surface might give you a different lead concentration than soil taken a foot underground, and samples taken 100 feet apart might differ as well. The gut biome has a similar problem: its composition differs depending on where in the digestive tract you retrieve the bacteria (e.g., from a fecal sample or from the small intestine).
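To see why small samples of a heterogeneous source disagree, here is a minimal, purely illustrative Python sketch (the field, the concentrations, and the sample sizes are all invented for the example):

```python
import random

random.seed(42)

# Hypothetical field: lead concentration (ppm) varies from spot to spot.
field = [random.gauss(100, 40) for _ in range(10_000)]

def survey(n_samples: int) -> float:
    """Estimate the field-wide concentration from a few random spots."""
    return sum(random.sample(field, n_samples)) / n_samples

# Three small surveys of the same field give three different answers,
# just as readings from different parts of the same gut sample can disagree.
for i in range(3):
    print(f"Survey {i + 1}: ~{survey(5):.1f} ppm")
```

The true average is about 100 ppm, but five-spot surveys scatter widely around it; larger or better-spread samples would narrow that spread, which is exactly the challenge microbiome studies face.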

With these caveats, scientists have still observed some trends. For one, an individual’s gut biome changes after taking antibiotics. This makes sense because antibiotics are meant to kill bacteria. What is unclear is how long the changes persist and how this affects a person’s health.

Scientists also know that the gut biome plays a role in aiding digestion of certain hard-to-digest foods, such as complex carbohydrates. Furthermore, they have found differences between the gut biomes of obese and non-obese people, and between people with and without digestive diseases such as Crohn’s disease. However, whether the altered gut biome is the cause of these conditions or the result of them is unclear.

Healthy skepticism

There are several other correlations between the microbiome and physiological effects. The difficulty is determining whether these are merely correlations or genuine causation. William Hanage has an excellent article in Nature, “Microbiology: Microbiome Science Needs a Healthy Dose of Skepticism,” in which he discusses five key questions to help discern the truth from the hype:

  1. Can experiments detect differences that matter?
  2. Does the study show causation or just correlation?
  3. What is the mechanism?
  4. How much do experiments really reflect reality?
  5. Could anything else explain the results?

Many studies show that the gut biome is very responsive to diet and environment, which means the differences we see in people with a certain disease (or condition) may be the gut responding to the disease rather than causing it.

The gut biome is a new area of research that may shed light on digestive disorders and the effects of antibiotics on the body. However, Hanage cautions us not to fall into the same kind of indiscriminate, cure-all thinking that we’ve seen in other new areas of science, such as the Human Genome Project, stem cell research, genetic engineering, or nanotechnology. He also reminds us not to blame the microbiome for all of our ills: “In pre-scientific times when something happened that people did not understand, they blamed it on spirits. We must resist the urge to transform our microbial passengers into modern-day phantoms.”

Are We Bored Yet? The Apple Watch and New Technologies

One of the plights of modernity and postmodernity is hyperboredom. This is not the kind of boredom that comes from having nothing to do, but the kind that comes from having too many options and no way to distinguish one from another. We are jolted out of this boredom when we encounter disruptive technologies, technologies that fundamentally change a particular market and have an impact on our culture.

According to Ian Bogost, Apple is as much in the business of shaking us out of our routine with disruptive technologies as it is in the business of manufacturing them. This may explain why people flock to Apple’s announcements (virtually or in person) with big-tent revival fervor, hoping to see what groundbreaking new technology Apple has in store. For a brief moment, the hyperboredom is replaced with anticipation and excitement over the possibility that the multitude of options will become passé, supplanted by the one technology that supersedes them all.

Take, for example, Steve Jobs’ announcement in January 2007 of a little gadget called the iPhone. He knew the implications of this device and where it stood in the grand scheme of things (quoted from “How Apple Introduced the iPhone” in The Atlantic):

This is a day I’ve been looking forward to for two-and-a-half years. Every once in a while, a revolutionary product comes around that changes everything and Apple has been—well, first of all, one is very fortunate if you get to work on just one of these in your career—Apple has been very fortunate. It’s been able to introduce a few of these into the world. In 1984, we introduced the Macintosh. It didn’t just change Apple. It changed the whole computer industry. In 2001, we introduced the first iPod. It didn’t just change the way we all listen to music, it changed the entire music industry. Well, today, we’re introducing three revolutionary products of this class. The first one is a widescreen iPod with touch controls. The second is a revolutionary mobile phone. And the third is a breakthrough Internet communications device. An iPod, a phone, and an Internet communicator. These are not three separate devices. This is one device. And we are calling it iPhone. Today Apple is going to reinvent the phone. (emphasis added)

Since then, every time Apple unveils a new iPhone, people flock to stores in anxious anticipation, some going so far as to sleep outside the Apple Store’s doors in hopes of being the first to get the latest and best Apple has to offer. And it does not seem to be slowing down. Sales of the iPhone 6 and 6 Plus broke last year’s record, with ten million phones sold in the opening weekend, a launch strategically timed to ensure that there will be visions of iPhones dancing in everyone’s heads by December.

So with such excitement and Apple’s track record of disruptive technology, what happened with the Apple Watch?* Apple had not released a new category of device in four years, and this was to be the first since the death of Steve Jobs, the one that would show Apple is still changing markets. However, rather than being met with the fanfare of groundbreaking technology, the Apple Watch received mixed reactions.

In his article “Future Ennui,” Ian Bogost says the problem is not the technology itself, but the burden that comes with it. We have become bored with the constant barrage of groundbreaking technologies. He contrasts Apple’s innovations with Google’s:

Unlike its competitor Google, with its eyeglass wearables and delivery drones and autonomous cars, Apple’s products are reasonable and expected—prosaic even, despite their refined design. Google’s future is truly science fictional, whereas Apple’s is mostly foreseeable. You can imagine wearing Apple Watch, in no small part because you remember thinking that you could imagine carrying Apple’s iPhone—and then you did, and now you always do.

Bogost may be giving Google too much of a pass, though. Google Glass has sparked controversy among those paranoid about being filmed by its wearers.

Perhaps the difference between Google’s innovations and Apple’s can be compared to the difference between reading Isaac Asimov and Margaret Atwood. Asimov writes about robots and artificial intelligence, and even explores some of the ways this technology can go awry, but his stories do not carry the sense of prophetic inevitability that Atwood’s do. Atwood writes speculative fiction, not science fiction (see her book In Other Worlds). Her stories, like the recent MaddAddam trilogy, are disconcerting because they are a little too plausible. Rather than something fifty years away, her books describe a near future in which technologies already in place are ratcheted up. Similarly, while people will likely not be driving autonomous cars in the next ten years, it is much more likely that within two years they will be wearing technology that collects data on all of their bodily processes, purchases, and locations.

While the fervor over the iPhone 6 hit record levels, perhaps the mixed response to the Apple Watch signifies that we are tempering our enthusiasm for internet-in-our-pocket technologies. Clive Thompson, quoted in an opinion piece in the New York Times, says that our attitudes toward technology follow a predictable pattern: “We are so intoxicated by it, and then there’s a fairly predictable curve of us recognizing as a society that this is untenable, and we’re acting like freaks.”

Thompson is an optimistic writer on technology who believes there are many benefits to the kind of community interactions the internet makes possible. Rather than focusing on the doom and gloom of the here and now, Thompson takes a broader, historical perspective, reminding us in an interview with The New Yorker that we have been through this before:

We have a long track record of adapting to the challenges of new technologies and new media, and of figuring out self-control…More recently, we tamed our addition [sic] to talking incessantly on mobile phones. People forget this, but when mobile phones came along, in the nineties, people were so captivated by the idea that you could talk to someone else—anywhere—on the sidewalk, on a mountaintop—that they answered them every single time they rang. It took ten years, and a ton of quite useful scrutiny—and mockery of our own poor behavior—to pull back.

Indeed, studies on cell phone addiction, reports of parents neglecting their children, and state laws addressing car accident deaths caused by drivers who cannot pull away from their phones are all indications that we are becoming keenly aware that we might be acting like “freaks.”

While disruptive technologies may also disrupt our postmodern malaise, there comes a point when we grow weary of the constant announcements of the next big tech. Bogost’s article is compelling because he touches on this very notion: once there are too many next-big-tech options available, the hyperboredom of modernity and postmodernity sets in.

*The marketing team at Apple wisely opted not to go with “iWatch.”