Thanks to @oran_ge and @goldengrape for their conversation threads 1, 2, which motivated me to write this article.
Recently, there have been two seemingly unrelated trends:
- In recent years, government regulatory pressure on cryptocurrencies has surged. For example, yesterday the SEC sued Binance and Coinbase: one is the world's largest "illegal" exchange, and the other was the first cryptocurrency exchange to go public in the United States.
- After ChatGPT demonstrated its powerful potential this year, regulatory pressure on LLMs (Large Language Models) and related technologies has likewise surged in many countries. For instance, a number of governments, led by North Korea, swiftly blocked access to OpenAI, and, unusually, Italy also appeared on that list.
I've been asked before, "Do you think XXX technology is a disruptive technology?"
A long time ago, there was a running joke called the "GFW Quality System Certification": if a technology "passed" this certification, that is, if it was blocked by the Great Firewall, everyone would consider it a good technology.
Now we might derive something similar, a "Global Government Regulation Score": if a technology is treated as a potential threat and subjected to regulation by many governments, take the number of regulating governments as the base score, weighted by the GDP of the jurisdictions they govern. The higher the score, the more likely the technology is to be disruptive, because it is regulated precisely out of fear that it will be disruptive.
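The "score" above is half a joke, but as stated it does define a formula, which can be sketched in a few lines. Everything below is illustrative: the function name, the jurisdiction labels, and the GDP figures are all made-up placeholders, not real data.

```python
def regulation_score(regulating_jurisdictions):
    """Toy 'Global Government Regulation Score'.

    regulating_jurisdictions: dict mapping a jurisdiction name to the
    GDP (in trillions USD, hypothetical figures) it governs.
    Base score = number of regulating governments,
    weighted by the total GDP of their jurisdictions.
    """
    count = len(regulating_jurisdictions)          # base score
    gdp_weight = sum(regulating_jurisdictions.values())  # GDP weight
    return count * gdp_weight

# Hypothetical example: three jurisdictions with invented GDP figures
score = regulation_score({"A": 25.0, "B": 18.0, "C": 4.0})
print(score)  # 3 * 47.0 = 141.0
```

One could just as reasonably sum per-jurisdiction weights instead of multiplying; the point is only that both the count of regulators and their economic heft feed the score.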
The Relationship Between AI and Blockchain
Since around 2017, I've believed that AI and blockchain are the only two disruptive technologies of this century so far. So, it's obvious that they would face the same regulatory pressures.
While the regulatory challenges facing blockchain are well-known, it's only in recent years that cryptocurrencies, as the most successful and refined product form of blockchain technology, have truly faced regulatory challenges. Yet each time a ban crisis hits, Bitcoin emerges at a higher price point, gains more support, and demonstrates the resilience of the technology itself.
AI, whose capabilities have exploded faster, can be expected to face regulatory pressure sooner, and more abruptly.
Some criticize the current development of AI technology, dominated by big companies (such as OpenAI, Microsoft, Google, Apple), arguing that the technology is too powerful and that if its decision-making lies in the hands of private enterprises, it could easily deprive individuals of freedom — as depicted in most cyberpunk science fiction.
One viewpoint wishes to bring this technology into the public domain for discussion, hence resorting to government agencies for regulation. I think this is somewhat like the Chinese peasant uprisings that overthrew one emperor only to establish a new king, ultimately making no difference.
The best answer still lies in open source, allowing AI technology to return to freedom, just as Bitcoin has freed the economy.
Whether it's a standalone LLM or a blockchain-based one, the commonality is that you fully enjoy freedom of access to the model, uncontrolled by any single organization or institution. You are free to remove the LLM's safety restrictions entirely, or to run the LLM under homomorphic encryption, as a matter of personal freedom.
Some might say: if the danger level of AI technology is like that of firearms, shouldn't it be regulated? Well, if "words have power" takes on a literal meaning, should communication and expression be regulated? Rulers throughout history have sincerely hoped to achieve exactly that, and have either failed or paid the price in stalled societal development.
If "Words have power" has become a norm of civilization, then the new social morals must accept the existence of "super individuals empowered by technology."
The Cost of Freedom
Of course, freedom comes with its price tag. For example, the economic cost.
The rise of cloud computing led many to abandon building their own infrastructure in favor of the cloud, largely for economic reasons: the cost in time and money. Businesses come to rely on well-tuned cloud services and become locked in, trading freedom for iteration speed, and that trade has a price that must eventually be paid.
Fortunately, as technology develops, this price keeps falling, and as it falls, the harm of giving up freedom looms larger by comparison. Many have found that moving their business off the cloud not only cuts costs but also restores engineering flexibility. The same holds for AI and blockchain. Therefore, if we combine LLMs with the cryptocurrency economic system, we will gain the economic power and computing power needed to counter these trends.
However, more often, the costs that influence people's decisions are not economic. If these cost pressures come from social constraints, authority, or law, then paying these costs is not so easy.
The Bible's 1 Samuel chapter 8 tells a story. The people told Samuel they wanted a king, like other nations. Samuel said this meant rejecting Jehovah's direct rule over them, so he went to ask Jehovah. Jehovah said: the people have always abandoned me; whatever they ask for, give it to them. Samuel went back and warned the people that a king would cost them their freedom and a tenth of their wealth, but the people said they were willing, and Samuel said it shall be as you wish.
The Optimistic Future of Pessimists
Years ago, while dining with a friend, we talked about faith in Bitcoin. He said, the world is too bright, and the trend is becoming brighter; the power of light is too strong, leaving no room for even a little shadow, which is not good. The world cannot be all light.
Priests in WOW (World of Warcraft) also know that light and shadow give birth to each other: the extreme of good is equivalent to the extreme of evil. So, if the world is a utopia without a shred of darkness, then that world must be sick.
From its birth to the present, Bitcoin has always represented the power of shadows. I once thought AI technology would stand on the opposite side, but after LLM, I'm pleased to see that at such a turning point, AI also has the opportunity to become one of these forces.
The current situation is astonishingly similar to the past. When technological changes drive the abandonment of old rules by new economic forces, moral standards change, and people begin to despise those who hold on to the old rules.
This widespread disdain and contempt often appears before a new ideological consensus is reached. Just as at the end of the 15th century, when the medieval church still dominated society: although belief in "the sanctity of the clergy" remained widespread, clergy members high and low were held in contempt, much like the attitude toward political bureaucrats today.
Life at the end of the 15th century was thoroughly permeated by organized religion. We believe that the history of that time can teach us a lot about the current world, soaked in politics.
At the end of the 15th century, the cost of supporting the religious system had reached a historical peak, just as today's cost of supporting governments has reached its limit, standing on the brink of decline.
We know what happened to the religious system after the Gunpowder Revolution. Technological development created a powerful force, forcing religious institutions to cut costs. A similar technological revolution is bound to shrink the size of nation-states at the beginning of the new millennium.
— James Dale Davidson & Lord William Rees-Mogg, "The Sovereign Individual" - "History Will Repeat Itself" (Translated and revised by ChatGPT)