
Ripple’s CTO invented a distributed computer system 20 years before blockchain


Manahel Thabet

The mysterious Satoshi Nakamoto is often credited with inventing blockchain – the tech behind the recent cryptocurrency and decentralization boom. But long before Nakamoto published his seminal paper that shaped Bitcoin as we know it, Ripple chief technology officer David Schwartz had already come up with a similar concept.

Almost 30 years ago, on August 25, 1988, Schwartz filed a patent for a “multilevel distributed computer system” that would “preferably” run on “personal computers.” The technology was designed to leverage the combined processing power of numerous devices to accomplish singular tasks.

Three years later, Schwartz was granted the patent. While the undertaking ultimately didn’t pan out, we spoke with the Ripple CTO about what his vision for the distributed system entailed – and how it overlaps with today’s blockchain tech.

The origin story

One of the main problems Schwartz, whose background is in cryptography, was trying to solve was how to distribute computing-intensive tasks – ones that would have been impossible for a single machine to process on its own – across a network of devices.

“A distributed computer system is a network of computers each of which function independently of but in a cooperative manner with each other. Versatility of a computer system can be increased by using a plurality of small computers, such as personal computers, to perform simple tasks and a central computer for longer more complex tasks,” the patent documentation reads. “Such an arrangement lessens the load on the control computer and reduces both the volume and cost of data transmission.”

“I was working on graphics rendering problems that require significant amounts of CPU power,” Schwartz told Hard Fork. This is how the idea for his invention was born – and ironically, how it came to a halt.

“CPUs improved in performance much more quickly than expected and there didn’t seem to be much need for distributing tasks dynamically to CPUs with available processing power,” Schwartz explained.

But before the distributed computer system was shelved, the cryptographer and his team were able to run some experiments with the technology.

“We had a working implementation that generated images of fractals,” Schwartz revealed to Hard Fork. “You could add more CPUs to the cluster and workloads would dynamically distribute to them.”
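Schwartz didn’t share the original code, but the pattern he describes – carving a rendering job into independent work units and letting however many CPUs are available pick them up – is easy to sketch in modern terms. Below is a minimal, hypothetical Python illustration on a single machine, not Schwartz’s implementation; the image dimensions, iteration cap, and the mandelbrot_row helper are assumptions made purely for the example:

# A modern sketch of the idea: split a fractal render into small,
# independent work units and distribute them across worker processes.
from multiprocessing import Pool, cpu_count

WIDTH, HEIGHT, MAX_ITER = 800, 600, 100  # hypothetical image and iteration settings

def mandelbrot_row(y):
    """Compute one row of a Mandelbrot image; each row is an independent work unit."""
    row = []
    for x in range(WIDTH):
        c = complex(-2.0 + 3.0 * x / WIDTH, -1.2 + 2.4 * y / HEIGHT)
        z, n = 0j, 0
        while abs(z) <= 2 and n < MAX_ITER:
            z = z * z + c
            n += 1
        row.append(n)
    return row

if __name__ == "__main__":
    # The pool hands rows to whichever worker is free, so adding CPUs
    # (a larger pool) speeds the job up without changing the task code.
    with Pool(processes=cpu_count()) as pool:
        image = pool.map(mandelbrot_row, range(HEIGHT))
    print(f"Rendered {len(image)} rows across {cpu_count()} worker processes")

Because each row is independent, work can be handed to whichever process is idle – the same dynamic load balancing Schwartz describes, only within one machine rather than across a cluster of personal computers.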

Distributed computing and blockchain tech

That said, Schwartz’s system was far from perfect. For one, establishing a connection between various computers was much more complicated back then than it is now. Another challenge was breaking the intended tasks down into smaller portions that could be processed and transferred from one computer to the next.

While the rise of the internet has mostly solved the interconnection issue, some of the problems Schwartz encountered back in the 1990s persist today. Indeed, breaking a large workload down into smaller pieces a network can process in parallel is a challenge Ethereum is still trying to solve.

As for Schwartz’s foray into distributed computing 30 years ago, he says the experience still comes in handy in his work today.

“It does seem that the things I worked on in the past keep coming up in the things I’m working on now,” Schwartz added. “I think that’s more just due to most of my work being in the same general area of distributed computing and cryptography.”
