What Lt. Cmdr. Data Tells us About Star Trek’s Interpretation of Moore’s Law

It’s no secret that I’m a big fan of Star Trek. As someone who was largely a social outcast growing up, it was in some ways a default — Gene Roddenberry’s vision represents a future that is largely hopeful and where differences are accepted and encouraged within Federation’s society rather than intentionally muted, and that was a source of comfort.

My real entrée into that universe (so to speak) was through Star Trek: The Next Generation, the sequel series that in many ways exceeded the original, and which was commonly on television through my formative years. Indeed, I’ve always preferred the command of the measured Picard to the often reckless Kirk, and the android Data’s discovery of humanity over the Vulcan Spock. But still, after high school I ventured far from science fiction in the pursuit of other interests.

With the availability of the entire series of TNG on various streaming services for the past several years, I’ve reacquainted myself with the series, several times. As I am at a different stage of my own “discovery of humanity” than when I originally viewed the series, other questions and thoughts have emerged. One that I’m tackling today involves the android Data, and what his technical specifications suggest about one particular part of the future: the future of technology.


Who is Data?

Data, played by Brent Spiner, is an android constructed by the eccentric “mad scientist” Dr. Noonian Soong. Of the prototypes and models Soong produced, only Data, his (evil) “brother” Lore, and a later approximation of Soong’s fatally ill wife, Juliana Tainer, survived activation and were able to truly approximate human characteristics. The reason cited for Soong’s repeated failures and few successes was the complexity of the “positronic brain,” the technology that made such a human-like android possible. Data, Lore and Tainer were ultimately Soong’s only successes; a fourth activated android, named B-4, was unable to process the complex thoughts needed for true social interaction.

Data was arguably the most successful of Soong’s creations, and certainly the most visible in the series. He joined Starfleet and achieved the rank of lieutenant commander through a decorated career, with his service on the Enterprise making him a regular character.[1. “Data,” Memory Alpha, the Star Trek Wiki. http://en.memory-alpha.org/wiki/Data.] Lore, Soong’s first successful android, wandered the universe after re-activation and generally caused problems — his misadventures providing an interesting recurring character to the TNG series. Tainer only appeared in a fairly lame season 7 episode of TNG, the storyline of which was more exploratory of Soong’s madness than anything else. B-4 had it even worse, appearing only in the abysmal Star Trek: Nemesis film, which remains the worst performing and most poorly reviewed Trek film to-date.[2. “Star Trek Nemesis,” Memory Alpha, the Star Trek Wiki. http://en.memory-alpha.org/wiki/Star_Trek_Nemesis.]

Many of the storylines about Data during TNG (and the mostly disappointing TNG-era feature films that followed) involved his search for humanity, as this was handicapped by his being built and programmed without the capability of experiencing emotion. (Of course, part of Lore’s problem was that he was provided with this ability, inadequately controlled.[3. “Descent (episode),” Memory Alpha, the Star Trek Wiki. http://en.memory-alpha.org/wiki/Descent_(episode)]) Data’s quest for humanity, which invited comparisons to both Spock from the original series, and Pinocchio of literature[4. Yes, even in the show itself — first, in the pilot episode “Encounter at Farpoint,” and again in the second-season finale’s poorly disguised clip episode “Shades of Gray.”] is ultimately a compelling storyline throughout the TNG fictional universe.


Data’s Specifications

In one of the best episodes of the series, “The Measure of a Man” (which admittedly draws heavily from To Kill a Mockingbird, but I digress…), Data’s legal status within Starfleet is the subject of a JAG hearing, which sought to determine whether he was an independent sentient being, or property of the organization. As a part of this hearing, Data is compelled to testify by Commander William Riker (who himself is compelled by the JAG to serve as prosecutor for the hearing) and is required to list his technical specifications:

Riker makes a devastating, if ultimately unsuccessful, case against Data's independence (Memory Alpha)

Riker: Commander, what is the capacity of your memory, and how fast can you access information?

Data: I have an ultimate storage capacity of eight hundred quadrillion bits. My total linear computational speed has been rated at sixty trillion operations per second.[5. “Next Generation Transcripts – The Measure of a Man,” http://www.chakoteya.net/nextgen/135.htm]

With a little calculation, we can put this quote into both contemporary and current contexts. His memory storage of “eight hundred quadrillion bits” certainly does not sound as impressive today, in late 2014, as it likely did in February 1989 when the episode aired.

8×10^17 bits = 10^17 bytes = 100 Petabytes = 100,000 Terabytes[6. “Megabytes, Gigabytes, Terabytes… What are They?” http://www.whatsabyte.com/]
While 100,000 Terabytes is still a lot of storage in today’s terms (it’d take a stack of 20,000 of today’s largest consumer hard drives), it’s not nearly as impossible to imagine as it was in 1989, when the first consumer gigabyte (1/1000 TB) hard drive was still two years away from being introduced at the budget-conscious price of $2,699.[7. “Terabyte,” Wikipedia. http://en.wikipedia.org/wiki/Terabyte]
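The conversion is easy to check; here's a quick sketch using decimal (SI) units throughout (1 TB = 10^12 bytes), as the arithmetic above does:

```python
# Verify the storage conversion from Data's "eight hundred quadrillion bits."
data_bits = 800 * 10**15           # 8 x 10^17 bits
data_bytes = data_bits // 8        # 10^17 bytes
terabytes = data_bytes / 10**12    # 100,000 TB
petabytes = data_bytes / 10**15    # 100 PB
print(f"{terabytes:,.0f} TB = {petabytes:,.0f} PB")  # 100,000 TB = 100 PB
```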

While memory (storage) is an interesting spec to put into context, it’s not nearly as crucial for understanding Data’s technology as his processing power.

First, an important note: Data’s reported specification of “sixty trillion operations per second” uses a measure that may be related to the now-antiquated Instructions per Second[8. “Instructions per Second,” Wikipedia. http://en.wikipedia.org/wiki/Instructions_per_second.], obsolete because it measures raw processing power without regard to system architecture or realistic workloads. Though this measure was common in earlier parts of computing history (hello, 1980s!), it has largely been abandoned in favor of measures such as Floating Point Operations Per Second (FLOPS). Equally important: the operations used to measure FLOPS are more difficult for a processor than simple instructions[9. “FLOPS,” Wikipedia. http://en.wikipedia.org/wiki/FLOPS.], which makes a given figure in FLOPS more impressive than the same number in IPS. Though Data didn’t specify that he was citing his processing power in FLOPS, we’re going to assume that he was, for sake of simplicity.

60×10^12 FLOPS = 60 teraflops
Okay, so what about context for this? The fastest supercomputer in the world in 1989 was the SX-3/44R from the NEC Corporation, which operated at 1.7 gigaflops,[10. “History of Supercomputing,” Wikipedia. http://en.wikipedia.org/wiki/History_of_supercomputing] meaning Data was roughly 35,294 times the speed of the fastest computing system in the world to that point. No wonder that number seemed like a science fiction dream!
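The ratio is a one-liner, under the assumption above that Data's "operations per second" can be read as FLOPS:

```python
# Data's rated speed vs. the 1989 record-holder, the NEC SX-3/44R.
data_flops = 60e12    # 60 teraflops
sx3_flops = 1.7e9     # ~1.7 gigaflops
ratio = data_flops / sx3_flops
print(round(ratio))   # 35294
```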

Not so fast. Let’s put this in context for today: the fastest supercomputer in the world currently (December 2014) is the Tianhe-2 housed at China’s National University of Defense Technology, which operates as fast as 30.7 petaflops (1 petaflop = 1,000 teraflops) and may well be left in the dust if Google gets its quantum supercomputer off the ground.[11. “Google is Building the World’s Fastest Supercomputer,” CNN Money. http://money.cnn.com/2014/09/03/technology/google-quantum/] We’ve built supercomputers that’ve already left Data hopelessly obsolete by today’s standards.

Admittedly, supercomputers are generally not as mobile or contained as Data is, often taking up entire rooms, floors or buildings to assemble the processors necessary to achieve this kind of speed. What about desktops or something within reach of an average consumer? Well, the newest high-end Mac Pro achieves approximately 7 teraflops in its main processor, and up to 3.5 more in its graphics processor.[12. “Mac Pro – Technical Specifications,” Apple. http://www.apple.com/mac-pro/specs/] The Mac Pro isn’t up to Data’s standards… yet. But it’s coming remarkably close, considering how far off it seemed in that episode’s contemporary context.
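Under a flat two-year performance-doubling assumption (the subject of the next section), a quick back-of-the-envelope sketch suggests just how close "remarkably close" is:

```python
import math

# How many doublings separate a 2014 Mac Pro's CPU from Data's 60
# teraflops, and how long would that take at one doubling per two years?
mac_pro_tf = 7.0    # CPU alone; the GPU adds up to ~3.5 more
data_tf = 60.0
doublings = math.log2(data_tf / mac_pro_tf)
print(f"{doublings:.1f} doublings, roughly {doublings * 2:.0f} years")
```

At that pace, a consumer machine catches Data's raw speed around the early 2020s.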


Moore’s Law

Predicted by Intel co-founder Gordon Moore in 1965, “Moore’s Law” is the conjecture that the number of transistors in a circuit doubles approximately every two years.[13. “Moore’s Law,” Wikipedia. http://en.wikipedia.org/wiki/Moore%27s_law] While based on the history of computing to that point, it was also predictive, and it became a planning target for technology firms, making it something of a self-fulfilling prophecy beyond Moore’s expectations: in the nearly 50 years since his initial paper was released, the “Law” has been upheld by continued exponential growth in transistor counts, as recently as this year. The Wikipedia-sourced chart below illustrates this, though it only comes to 2011:

The growth is expected to continue for at least a few more years with the development of graphene-based transistors,[14. “Moore’s Law: How Long Will It Last?” TechRadar.pro. http://www.techradar.com/news/computing/moore-s-law-how-long-will-it-last–1226772] and might continue longer with unforeseen technological innovation.
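As a rough sanity check of the trend (the 1971 Intel 4004's transistor count and the flat two-year doubling here are my illustrative assumptions, not figures taken from the chart):

```python
# Project transistor counts forward from the Intel 4004 (~2,300
# transistors, 1971) with a doubling every two years.
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

print(f"{transistors(2011):.2e}")  # ~2.4e9: the right ballpark for 2011-era chips
```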

Unfortunately, Moore’s Law is technically somewhat worthless in this context, because Data’s brain is a “positronic matrix” that is, presumably, far beyond silicon transistors and semiconductors. Plus, importantly, we have no idea from the evidence in Star Trek canon how many circuits Data has.

What we can do is take what I call “House’s Corollary” to Moore’s Law and examine it a bit. What is House’s Corollary? You’ve probably heard of it, and may have even heard it called Moore’s Law. It’s a 1975 statement by David House, then another Intel executive, who suggested that all changes combined (Moore’s proposed transistor trajectory, but also other architectural considerations) would result in computer performance doubling approximately every 18 months.[15. “Moore’s Law to Roll On for Another Decade,” cnet.com. http://news.cnet.com/2100-1001-984051.html.] Both Moore and House were close — Moore’s been pretty much dead-on, while House’s prognostication was only off slightly, with performance doubling every 20 months since 1975. Several have suggested this alteration to Moore’s Law be used instead.[16. “Moore’s Law and Computer Processing Power,” datascience@Berkeley Blog. http://datascience.berkeley.edu/moores-law-processing-power/.]
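The difference between an 18-, 20- and 24-month doubling compounds dramatically. A quick sketch over the roughly 39 years between House's 1975 statement and this post:

```python
# Cumulative growth factor implied by different doubling periods.
def growth_factor(years, doubling_months):
    return 2 ** (years * 12 / doubling_months)

for months, label in [(18, "House"), (20, "observed"), (24, "Moore")]:
    print(f"{label:>8}: {growth_factor(39, months):.3g}x")
```

At 18 months, performance grows about 67-million-fold over that span; at 24 months, "only" about 740,000-fold, which is why the choice of doubling period matters so much for any long-range projection.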


Some Analysis

What does this mean for Data and the Star Trek universe? To dig into that question, let’s start with some basic assumptions:

  1. Given the inability of Soong, Lore, Data himself[17. See Lal, Data’s short-lived creation (and one of the more terrifying TNG faces, pre-completion) in the third season episode “The Offspring.”] and other scientists to replicate the success of the positronic matrix that makes up the brain of a Soong-type android, Data was likely far and away the most advanced piece of human-made computing technology at his activation.
  2. Data was first activated upon discovery in the year 2338, but was initially constructed in 2336.[18. This is in spite of his claim in the pilot episode as being a graduate of “Starfleet Class of ’78,” which isn’t possible given the timeline. “Data,” Memory Alpha – The Star Trek Wiki. http://en.memory-alpha.org/wiki/Data.]
  3. “The Measure of a Man” episode took place in 2365,[19. “The Measure of a Man (episode),” Memory Alpha – The Star Trek Wiki. http://en.memory-alpha.org/wiki/The_Measure_Of_A_Man_%28episode%29.] another point of reference to account for possible upgrades to Data’s hardware in the interim years.
  4. The nuclear holocaust that ends World War III on Earth in 2053[20. “World War III,” Memory Alpha – The Star Trek Wiki. http://en.memory-alpha.org/wiki/World_War_III] has a detrimental effect on human computer technology.
  5. However, that impact cannot be total destruction, because within 10 years of the end of that conflict in 2063, Zephram Cochrane achieves warp speed, resulting in first contact with the Vulcans.[21. The storyline of the best film to come from the TNG crew. “Star Trek: First Contact,” Memory Alpha – The Star Trek Wiki. http://en.memory-alpha.org/wiki/Star_Trek:_First_Contact.]
  6. Moore’s Law, in line with current predictions of petering out in 2030 or earlier, remained in effect until the beginning of World War III in 2026,[22. Yes, despite the fact that the Eugenics Wars of the mid-1990s never materialized in real life.] and continued after the end of the war, inspired by Cochrane’s warp flight, contact with the Vulcans and its resulting excitement for a new era of space exploration.
  7. We approximate the impact of Moore’s Law with a simplified and conservative modification of House’s Corollary — that computing speed will double every 24 months from the present — to account for the likelihood that circuits on silicon-based semiconductors are not the final advancement of computing technology.
  8. To curb projections and comparisons even more conservatively, we’ll use the consumer-available Mac Pro’s speed measurements of 7 teraflops as a starting point, instead of the Tianhe-2 supercomputer’s 30.7 petaflops.


With these assumptions in place, how do Data’s specifications fit into a continuance of Moore’s Law?

At the beginning of World War III in 2026 (only 11 years away, yikes!), consumer-available computers will be performing at roughly 448 teraflops (six doublings of the Mac Pro’s 7 teraflops), far in excess of Data’s 60 teraflop capabilities.

Let’s assume that the war puts computing back to 1989 levels (the NEC SX-3/44R’s 1.7 gigaflops) by its end in 2053, making the war truly destructive of human technology, perhaps implausibly so. Then, let’s assume that Moore’s Law is re-established and runs through Data’s construction in 2336, up to the report of his specs in 2365.

Where does that put us? In 2336, the most powerful computer would be clocking speeds of roughly 6.7 × 10^39 teraflops (141.5 doublings of 1.7 gigaflops over 283 years), just a touch faster than Data’s 60 teraflops. Of course, by 2365 the difference is even more pronounced, with the top computer clocking roughly 1.6 × 10^44 teraflops.
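These figures follow directly from the assumptions above; here's a minimal sketch of the projection (the 1.7-gigaflop restart in 2053 and the flat 24-month doubling are the stated simplifications, not canon):

```python
# Post-war projection: performance doubles every 24 months, restarting
# from the NEC SX-3/44R's 1.7 gigaflops (0.0017 teraflops) in 2053.
def projected_teraflops(year, base_year=2053, base_tf=1.7e-3):
    return base_tf * 2 ** ((year - base_year) / 2)

for year in (2336, 2365):   # Data's construction; "The Measure of a Man"
    print(year, f"{projected_teraflops(year):.1e} teraflops")
# 2336 -> ~6.7e39 teraflops; 2365 -> ~1.6e44 teraflops
```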

How far under expectations was Data? Check out this chart, where the blue line is the expected growth and Data’s specs are marked (click for full-size):

Assuming 300+ years of exponential growth unsurprisingly gets us some pretty crazy numbers, with which Data’s specs simply don’t compare. Of course, no current analysts expect Moore’s Law to continue for this long, but then, Moore didn’t expect it to continue for 50 years after 1965, either.



The obvious conclusion is that Star Trek’s timeline does not foresee Moore’s Law continuing until the creation of Data in 2336. Fair enough, and obviously today’s brightest minds agree that Moore’s Law has a limited shelf-life for the coming years.

What’s more interesting is that, despite Data’s apparent failure to meet the prediction of Moore’s Law, and despite the fact that our top computing power today exceeds Data’s abilities, Data’s artificial intelligence is far more advanced than anything even the greatest supercomputers can currently achieve. I’ve touched on this in some extremely superficial looks at AI on this blog, but I’ve honestly been entirely underwhelmed with the development of AI technology over the past 20 years. To me, it’s been even more disappointing than our culture’s failure to produce the flying cars, hoverboards and automatically fitting jackets that Back to the Future Part II predicted for 2015.

Then again, and this might be the most important conclusion of all: I’ve probably thought way too much about how the dream represented in TNG by Data’s existence seems entirely unachievable, even given the current trajectory of technology. It’s science fiction — a hopeful story of the future with some limited basis in reality, which is supposed to be an escape. And ultimately, thinking too much about all this served as a nice mental escape for a lazy Sunday afternoon.



Author: Andrew Shears

Andrew Shears is an Assistant Professor of Geography at Mansfield University in Mansfield, Pennsylvania. His research interests lie at the human-environment nexus and include branches of mapping, technological, memorialization and urban geographies. He lives in Wellsboro, Pennsylvania with his wife Amy, a professional photographer.

3 thoughts on “What Lt. Cmdr. Data Tells us About Star Trek’s Interpretation of Moore’s Law”

  1. That depends on what “operations” meant in the 24th century. For years this has been an ambiguous definition. Are we talking about simple addition or subtraction as one op? Not necessarily, as it could be multiple. So the ops may have had a different defined meaning in Data’s time. Perhaps one op is to calculate a year’s worth of computations on a modern supercomputer today.

  2. It’s possible that the key breakthrough will be to get rid of the heat in semiconductors; switching from copper to silver is problematic as AgO is also conductive. I’ve found a workaround, which is to use hyper-conducting pathways formed and destroyed in microseconds within a critically under-doped cuprate, but it has not been tested as of yet.
    It’s based on using silicon carbide for the actual computational elements, but the magic is in those pathways, where the electron/hole balance yields a “positronic” system, with holes in this context being the “anti-electron” analogs.
