Unforgettable: How Blockchain Will Fundamentally Change the Human Experience
From the invention of the wheel to the printing press, new technology has changed the human experience. Our comprehension of the world is no longer limited to a village. Our collective knowledge grows by inconceivable exabytes of data every day. And our memories, our very recollections of the events that shape our lives, are changing too.
In fact, according to neurobiologist Dr. James L. McGaugh, a researcher specializing in learning and memory, technological advancements right up to the advent of the internet have made it less necessary for humans to construct lasting records of our own memories.
Dr. McGaugh found that the presence of “emotional arousal” appears to enhance the storage of memories, helping us to hold on to our most important experiences and let go of the mundane daily clutter. He wrote:
“It is said that, before writing was available to keep records of important events, such as a wedding or granting of land, a child was selected to observe an event and then thrown into a river so that the child would subsequently have a lifelong memory of the event.”
Thanks to new inventions (and common decency), children are no longer subject to the traumatic possibility of death by drowning.
Yet the questions of who is recording the events, how they are being recorded, and whether any information is being omitted, distorted, destroyed or removed, continue to command society’s attention.
We’ve long been living in a world in which history is documented and human brains are wired for selective memory. With the advent of blockchain technology, however, we now have a tool for recording data that (ideally) cannot be edited, tampered with, or removed. Unlike the pages of a book or an entry in a conventional database, records stored on a blockchain are designed to be immutable: once written, they live forever.
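The property described above comes from hash-linking: each record carries a cryptographic fingerprint of the record before it, so editing any entry breaks every link after it. Here is a minimal sketch of that idea in Python (a toy illustration only, not any real blockchain's data format):

```python
import hashlib
import json

def block_hash(index, data, prev_hash):
    """Fingerprint a block's contents together with the previous block's hash."""
    payload = json.dumps({"index": index, "data": data, "prev": prev_hash},
                         sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Link each record to its predecessor via its hash."""
    chain, prev = [], "0" * 64  # placeholder predecessor for the first block
    for i, data in enumerate(records):
        h = block_hash(i, data, prev)
        chain.append({"index": i, "data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    """Recompute every hash; any edit to earlier data breaks the links."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if block_hash(block["index"], block["data"], block["prev"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain(["wedding recorded", "land granted", "deed transferred"])
print(verify(chain))               # True: the history is intact
chain[1]["data"] = "land revoked"  # tamper with the middle block
print(verify(chain))               # False: the alteration is detectable
```

Note that, as Dr. Stark observes below, this mechanism detects alteration; it does not by itself make the data permanent.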
For many, though, data permanence isn’t blockchain’s most salient feature. Fellow neurobiologist Dr. Craig Stark of the University of California argues, “Blockchain lets us detect if data has been changed, but we’ve had data permanence for a long time. Vellum is good for thousands of years. I’ve seen examples of coding information in DNA that would let it last millions of years.
There’s a real difference between forgetting and altering or distorting. I may forget the name of a childhood teacher and simply not be able to retrieve the information. Or, I might misremember it as ‘Ms. Fiddlesticks,’ with that name most likely coming from other sources in my memory. Blockchain will, of course, help with this misinformation or alteration of the information.”
Yet blockchain is still in its infancy. As more use cases evolve and the technology’s capabilities expand beyond recording simple transactions to documenting entire cultures and societies, how cautious should we be? How much information do we actually want stored forever? And what happens if the information that finds its way onto a blockchain is false, slanderous, or entered in error or with malice?
Blockchain’s immutability could be problematic in a world in which we have (in theory, at least) “the right to be forgotten.” An immutable record of events could, in fact, change the human experience in ways that are unfathomable today.
The case for ‘Progressive Decentralization’
When CryptoKitties developer Arthur Camara detailed his team’s foray into blockchain coding, he described how the CryptoKitties revenue model was determined not through exact science or advanced prediction models, but by an educated guess. He admitted:
“Immutability is awesome and scary. We easily could have chosen wrong, and since you can’t change something once you add it to the blockchain, that would have been cat-astrophic.”
As he argues the case for ‘progressive decentralization’ (essentially, transitioning gently into decentralization rather than diving in headfirst), he explains that immutability is deeply frightening at a technical level.
“Immutability, the inability to be edited, is at once the blockchain’s greatest strength and its largest barrier to meaningful adoption. The pressures of immortal code paralyze developers: you can tinker in a test environment forever, but there will always be real-world variables you can’t anticipate. Covering your eyes and hitting launch is no way to make breakthroughs. It’s more likely to produce breakdowns.”
According to Paul Salisbury, acting CTO of Brave New Coin, “best practices” have evolved over the last five years and knowledge sharing has “lightened the load on individual developers.” Yet we’ve all seen what happens when blockchain’s immutability backfires, and how, in effect, it can be rendered ‘mutable’ again.
The most obvious case is the birth of Ethereum Classic. The DAO hack and the $50 million in stolen ether opened many people’s eyes to the fact that blockchain wasn’t as immutable as they had thought; at least, not when one faction could simply choose to rewrite history.
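The dynamic behind that fork can be reduced to a simple observation: the “canonical” chain is whichever version of history most node operators choose to run. As a toy sketch (this is not Ethereum’s actual fork-choice or governance mechanism, and the block labels are invented for illustration):

```python
# Toy model: each node holds a tuple representing its view of history,
# and the chain adopted by the majority of nodes becomes "the" history.
from collections import Counter

original = ("block_n", "dao_hack_tx", "block_n+2")   # history including the hack
forked   = ("block_n", "refund_tx",   "block_n+2")   # rewritten history

# Hypothetical node population in which most operators adopt the fork.
nodes = [forked] * 7 + [original] * 3

canonical, support = Counter(nodes).most_common(1)[0]
print(canonical == forked)  # True: the edited history wins
print(support)              # 7 of 10 nodes back it
```

In this framing, immutability is a social guarantee as much as a cryptographic one: the records of the hack survive on Ethereum Classic, but the majority’s chain defines the working “truth.”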
Does blockchain tell the “real truth”?
Joshua Ellul is Chairman of the Malta Digital Innovation Authority (MDIA) and Director of the Centre for Distributed Ledger Technologies at the University of Malta. Reflecting on the DAO hack, he asks:
“When Ethereum and Ethereum Classic forked, which fork is the real truth, the real Ethereum? The records of that hack are still there, it’s more of a correction of history that took place. This raised serious concerns. Really, it’s not the end users that get to decide (at least in this case). Ultimately the decision is dependent upon the node operators. Are they swayed by popular voices? So, it could well be the person with the most popular voice who decides which version of truth is written.”
He further ponders, “Centralized voices — even if it was seen as democratic, is it the popular vote that should be defining truth? Is that the right path to be going down?” Viewed through this lens, blockchain’s “truth” could be little more reliable than that of any other record-keeping tool we’ve had to date.
This article originally appeared on cointelegraph.com.
Nastel Technologies uses machine learning to detect anomalies, behavior, and sentiment; accelerate decisions; satisfy customers; and innovate continuously. To answer business-centric questions and provide actionable guidance for decision-makers, Nastel’s AutoPilot® for Analytics fuses:
- Advanced predictive anomaly detection, Bayesian Classification and other machine learning algorithms
- Raw information handling and analytics speed
- End-to-end business transaction tracking that spans technologies, tiers, and organizations
- Intuitive, easy-to-use data visualizations and dashboards
Nastel Technologies is the global leader in Integration Infrastructure Management (i2M). It helps companies achieve flawless delivery of digital services powered by integration infrastructure, providing tools for middleware management, monitoring, tracking, and analytics that detect anomalies, accelerate decisions, answer business-centric questions, and give decision-makers actionable guidance. It is particularly focused on IBM MQ, Apache Kafka, Solace, TIBCO EMS, and ACE/IIB, and also supports RabbitMQ, ActiveMQ, Blockchain, IoT, DataPower, MFT, IBM Cloud Pak for Integration, and many more.
The Nastel i2M Platform provides:
- Secure self-service configuration management with auditing for governance & compliance
- Message management for Application Development, Test, & Support
- Real-time performance monitoring, alerting, and remediation
- Business transaction tracking and IT message tracing
- AIOps and APM
- Automation for CI/CD DevOps
- Analytics for root cause analysis & Management Information (MI)
- Integration with ITSM/SIEM solutions including ServiceNow, Splunk, & AppDynamics