Review: Three-Body Problem by Justin Tung



Status: Finished  |  Genre: Science Fiction  |  House: Competent But Unfitted

Read Justin Tung's explanation of some scientific terms from Liu Cixin's sci-fi novel "Three-Body Problem"

Book Review: Three-Body Problem by Liu Cixin

Author: Justin Tung



One series that I’ve totally fallen in love with is the Three-Body Problem by Liu Cixin. It’s a masterfully written narrative that interweaves baffling technological and scientific innovation on an overwhelming scale with beautifully drawn, compelling human characters and stories. However, as someone without much formal education in the hard sciences, there were a few instances when I found myself having to stop and look things up. Many of these were what I came to call “proper-noun terms,” or eponymous terms: a specific scientific idea is referenced, and unless I understood what it was, I couldn’t really follow that part of the narrative. These terms span such a survey of the hard sciences that I thought it would be interesting to describe three of them in my own words.

Astronomy- Schwarzschild Radius:

I took an astronomy course in college, so I was already familiar with the concept of an event horizon, and the Schwarzschild Radius is a formulaic representation of the same idea. An event horizon is the boundary beyond which nothing, not even light, can escape the gravitational pull of a given mass, normally a black hole. The Schwarzschild Radius is a measurement of that boundary as it appears in a specific solution to Einstein’s field equations, developed by Karl Schwarzschild in 1916.

Imagine a hole where the sides gradually get steeper and steeper. Initially, the slope is de minimis and it’s easy to wander toward or away from the hole. As you get closer, though, it becomes much easier to fall in toward the center and much harder to climb away from it. Eventually there is a point past which it’s impossible to climb out, and that point is the analogue of the Schwarzschild Radius.
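The Schwarzschild Radius also has a compact formula, r_s = 2GM/c². As a quick sketch (the constants below are standard rounded values, and this is my own illustration, not anything from the book), here is how you might compute it for the Sun:

```python
# Schwarzschild radius: r_s = 2 * G * M / c^2
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8     # speed of light, m/s

def schwarzschild_radius(mass_kg):
    """Radius (in meters) of the event horizon for a given mass."""
    return 2 * G * mass_kg / C**2

M_SUN = 1.989e30  # mass of the Sun, kg
print(schwarzschild_radius(M_SUN))  # roughly 2950 m, i.e. about 3 km
```

In other words, if the entire Sun were squeezed inside a sphere about 3 kilometers across, it would become a black hole.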

Geology- Mohorovičić Discontinuity:

Simplistically speaking, the Mohorovičić discontinuity is the transition between the Earth’s crust and mantle. It lies about 8 kilometers below oceanic crust, and about 32 kilometers below continental crust. The transition was discovered by Andrija Mohorovičić in 1909 through its effect on the behavior of seismic waves.

Seismic activity can produce two types of body waves: P waves (compressional, or longitudinal) and S waves (shear, or transverse). The Mohorovičić discontinuity is called a “discontinuity” because P waves travel at a markedly different speed through the mantle than through the crust. This difference in speed is due to the mantle’s chemical composition, as well as the conditions created by increased heat and pressure. As a result, the compressibility, or “squishiness,” of the mantle is different from that of the crust, leading to faster P wave transmission.
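For the curious, the P-wave speed in a material is √((K + 4/3·μ)/ρ), where K is the bulk modulus (resistance to “squishing”), μ is the shear modulus, and ρ is the density. Here is a rough sketch with ballpark values; the exact numbers vary widely with depth and rock type, so treat them as illustrative assumptions, not measurements:

```python
import math

def p_wave_speed(bulk_modulus, shear_modulus, density):
    """P-wave speed in m/s: v_p = sqrt((K + 4/3 * mu) / rho)."""
    return math.sqrt((bulk_modulus + 4/3 * shear_modulus) / density)

# Ballpark values in Pa and kg/m^3; real rocks vary considerably.
crust_vp  = p_wave_speed(50e9, 30e9, 2800)   # roughly 5-6 km/s
mantle_vp = p_wave_speed(130e9, 70e9, 3300)  # roughly 8 km/s
print(crust_vp, mantle_vp)
```

Even with these rough inputs, the mantle value comes out noticeably faster, which is exactly the jump Mohorovičić inferred from seismograph records.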

One fun thing about the Mohorovičić discontinuity is that no one has ever seen it; no drilling for industrial or scientific reasons has ever reached that depth. No one has observed the other side of a Schwarzschild Radius either, but the reason for that is at least in part theoretical physics: if light, the fastest thing in the universe, cannot cross from the inside of the radius to the outside, then even if a device or person were to cross the radius, there would be no way to report back. The problem with observing the Mohorovičić discontinuity is mostly a technological one. It is only a matter of time before we develop the technology necessary to deal with the massive heat and pressure at those depths, as well as a compelling reason to do so.

Computer Science- Von Neumann Architecture:

This term describes a type of computer architecture used by most modern computers. Published by John Von Neumann in 1945, this organization separates computing into three main functions: central processing, main memory, and input/output. It also leads to a phenomenon referred to as the Von Neumann bottleneck. The bottleneck occurs because a computer built with Von Neumann architecture cannot fetch instructions and process data at the same time: both functions depend on the same shared communication pathway, called a “bus.” Because the computer uses that one bus both for the data it is processing and for retrieving instructions from stored memory, the processor sits idle while memory is being accessed, and vice versa.
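To make the bottleneck concrete, here is a toy model of my own (a deliberate simplification, not how real hardware is analyzed): a single shared bus can carry either one instruction fetch or one data transfer per cycle, so a program that needs both takes twice as many cycles as it would on a machine with two buses.

```python
def cycles_single_bus(num_instructions):
    """Von Neumann-style: the one bus alternates between fetching an
    instruction and moving its data, so each instruction costs 2 cycles."""
    return num_instructions * 2

def cycles_dual_bus(num_instructions):
    """Harvard-style: separate instruction and data buses let the fetch
    and the data transfer happen in the same cycle."""
    return num_instructions * 1

print(cycles_single_bus(1000))  # 2000 cycles
print(cycles_dual_bus(1000))    # 1000 cycles
```

Real processors blunt this effect with caches and pipelining, so the gap is nowhere near a clean factor of two, but the toy model captures why sharing one bus costs time.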

There are several ways people have tried to resolve this bottleneck. One proposition is to go with a different architecture. Von Neumann architecture is also called “Princeton architecture,” and is rivaled by “Harvard architecture,” which uses separate buses (and separate memories) for instructions and data, so both can be transferred simultaneously without interference. The reason Harvard architecture isn’t more common seems to be cost and complexity. Because most computers rarely need to maximize throughput in this way, cycling between fetching and processing is acceptable for the majority of users and use cases, and the added complexity and expense of a Harvard design is avoided while the computer still has relatively competitive processing capacity.

The analogy I like to use is this: imagine that you have a house with a garage, and there’s a single driveway leading to the garage. The problem with Von Neumann is that your data delivery and your instruction delivery can’t drive up to your house at the same time. The Harvard solution is to build a second driveway, but then you have to re-landscape, install another garage door, and unless both deliveries often need to arrive simultaneously, it’s not worth the hassle.

People have tried to retain the advantages of Harvard architecture without its drawbacks by modifying it in a few critical ways. I won’t take the time to explain all of these because I haven’t taken the time to really understand them. In a big-picture way, though, the main principle is to create a hybrid between a Von Neumann system and a Harvard system. Some modifications maintain the dual address spaces but allow instruction memory to be accessed as data, or data to be executed as instructions. Other modifications start with a single address space but retrieve instructions and data from separate places.


Justin Tung is a student at the University of Texas School of Law. He has won no awards, made no notable publications, sold no work, and garnered no critical acclaim. He reads and writes the things he enjoys because he enjoys them, and that’s the way he believes it should be. When he’s not writing, Justin can be found in his student housing trying to achieve a state of wu wei through slothful procrastination.


Submitted: May 20, 2021

© Copyright 2021 Competent But Unfitted. All rights reserved.
