
Considerations

Issues Developers Must Consider:

Developers need to think about the stakeholder protection services provided by third parties and design decentralized protection methods. If this is not possible, developers must inform stakeholders that this technology lacks the kind of protection they are accustomed to. Developers may even decide to abandon application development due to high user risks.

Issues Users Must Consider:

Users must understand the risks that lack of protection poses to themselves and the parties they represent (consulting clients, cared-for patients, citizens whose rights need protection). This risk must be openly acknowledged, and effective informed consent must be obtained from the service recipients. They should also seek non-blockchain solutions that can fill the gaps.

  • Lack of Privacy

The two most popular blockchains, Bitcoin and Ethereum, are public and known for their transparency and accessibility, allowing anyone to access, add to, and audit the entire blockchain. However, if transparency poses a serious threat to user privacy, private blockchains may be necessary. For example, Nebula Genomics uses private blockchain technology to give patients "full control" over their genomic data.

Blockchains may contain information that should only be visible to certain users, which may require a hybrid approach that combines private and public blockchains. For instance, electronic health records contain highly sensitive data that must remain confidential, as well as information that should be shared with institutions like disease control centers and health insurance providers. Comprehensive blockchains like Hashed Health, Equideum Health, and BurstIQ can allow patients greater control over their data while sharing biometric information.

  • Issues Developers Must Consider:

Developers must carefully consider their ethical obligations to balance transparency and privacy, then decide whether the application at hand is best suited for a public blockchain, private blockchain, or hybrid model. One important factor to consider is how blockchain members may be identifiable and the potential ethical consequences of this. Other important decisions include determining who can access what data under what conditions and the time frame for access.

  • Issues Users Must Consider:

Users need to understand the impact of transparency on their business and service recipients. They must be aware of and address the risks that wallet holders may be found (including cases where wallet holders inadvertently disclose their identity).

For example, a customer of a financial services company may wish to make an anonymous donation to a charity or political party because they do not want to disclose the amount donated, political leanings, or affiliations. The financial services company may recommend using blockchain for the transaction, as it would anonymize the customer's identity. However, the company also has an obligation to inform the customer that this anonymous transaction record is public and discuss the best ways to avoid identity disclosure.

Zero State Issues

The accuracy of the data contained in the first block, or "genesis block," is often questioned, leading to zero state issues. If due diligence on the data is not properly executed, errors or malicious fraud can occur. For example, in a blockchain used to track goods in a supply chain, the first block incorrectly shows that a truck is loaded with copper from a certain mine, when in fact the copper comes from another source. Personnel associated with the truck's cargo may have been deceived or bribed, and the creator of the genesis block may be unaware.

If we replace the goods here with blood diamonds (diamonds mined in conflict zones and sold to finance armed conflict), the ethical issues become apparent. If a government establishes a blockchain as a database for maintaining land registry records, and the person entering information into the first block records the wrong landowner, it could lead to serious injustices, effectively amounting to theft of land. Organizations like Zcash, which create privacy-protecting cryptocurrencies, also strive to ensure the accuracy of the genesis block.

  • Issues Developers Must Consider:

Developers must carefully verify all information that the genesis block is to contain, making every effort to ensure this information is accurately entered. They must also alert users to the zero state issue and disclose the fact that the blockchain may contain erroneous information, allowing users to assess potential risks and conduct due diligence.

  • Issues Users Must Consider:

Blockchain users should examine how the genesis block was created and where the data originated. They should be particularly vigilant about whether the information recorded on the blockchain has ever been the target of fraud, bribery, or theft. They should consider whether the organization that created the genesis block is trustworthy and whether the block has undergone reliable third-party review.

Users also need to understand that even if the data in the genesis block and subsequent blocks is accurate and legitimate, problems can still arise. For instance, if the truck is loaded with legitimately sourced diamonds, and the route has been accurately recorded on the blockchain multiple times, clever thieves may still be able to switch the diamonds for fake ones along the way. Users must inform service recipients of the zero state issue, disclose the due diligence conducted on the genesis block, and find protective measures to prevent fraud (if any exist).

  • Blockchain Governance

There are many terms used to describe blockchain technology: decentralized, permissionless, autonomous—these terms may lead users to make assumptions about governance, such as thinking it is a paradise for libertarians and anarchists, or that all members have equal say in how the blockchain operates. However, blockchain governance is very complex, involving significant ethical, reputational, legal, and financial implications. Who holds power on the blockchain, how power is obtained, what oversight exists or does not exist, and how decisions are made are all determined by the creators of the blockchain. The following two examples illustrate the issue—one notorious, the other ongoing.

  • The first decentralized autonomous organization (DAO) operated on the Ethereum network. It was a hedge fund originally called "The DAO." Members had voting rights proportional to how much they invested in the fund (in Ether). In 2016, The DAO was hacked, and approximately $60 million worth of Ether was stolen. Members had vastly different opinions on how to respond; there was not even consensus on whether the hacker's actions constituted theft. One faction believed that the funds obtained illegally through a software vulnerability should be recovered and returned to their rightful owners. The other faction believed that The DAO should not consider rewriting the fraudulent transactions, but simply fix the vulnerability and allow the blockchain to continue operating. The latter insisted that "code is law" and "blockchains are immutable," arguing that the hacker's actions adhered to the rules and were therefore not unethical. Ultimately, the former faction prevailed, and Ethereum implemented a "hard fork" to move the funds to a recovery address, allowing users to recoup their investments, effectively rewriting the blockchain's record.

  • The second example involves the controversy surrounding Juno governance. Juno is another decentralized autonomous organization. In February 2021, Juno conducted an "airdrop," sending free tokens to community members to boost participation. A cryptocurrency wallet holder exploited this system, acquiring a large number of tokens worth over $117 million at the time. In March 2022, the community proposed to remove most of this "whale" user's tokens, reducing their holdings to a normal range for airdrop eligibility. A month later, the proposal passed with a 72% vote, leaving the user with only 50,000 tokens, while the rest were nullified. Now, this user, who claims to be investing with other people's money, threatens to sue Juno.

These incidents illustrate that careful governance structures must be built for blockchains and the applications running on them, with due diligence conducted.

Issues Developers Must Consider:

Developers must determine appropriate governance methods, particularly noting that governance structures may leave openings for hackers and wrongdoers. This is not a mechanical issue. Developers' values must be clearly expressed and reflected in the blockchain. For instance, how Ethereum developers weighed the decision of whether to modify the blockchain or simply fix the vulnerabilities in the DAO incident illustrates the difference in perspectives. The divide between those who voted in favor of confiscating the whale user's tokens and those who opposed it in the Juno incident is similarly related. To avoid such ethical issues, developers should establish governance guiding principles from the outset.

If the distribution and acquisition of power and money within the system are not carefully considered, divisions may arise. In the DAO incident, the hacker exploited a software vulnerability, causing chaos within the community: can flawed rules be considered law? In the Juno incident, part of the turmoil stemmed from the lack of thorough consideration regarding the initial distribution of tokens. Developers must understand that those with voting rights have vastly different beliefs, values, ideals, and desires. Robust governance is one of the most important tools for managing these differences, and if developers' values are reflected in the regulatory infrastructure, policies, and processes governing the blockchain, significant ethical and financial risks should be avoidable.

Issues Users Must Consider:

Users must consider whether the values of the blockchain creators align with those of their organization and their clients. They must determine how much volatility, risk, and loss of control they and their clients can tolerate. They must clearly express what they believe to be good and responsible governance standards and only use blockchains that meet those standards. Users may be using a decentralized network without a single authority, but they are certainly dealing with some political entity.

Establishing an Ethical Risk Framework for Blockchain

The ethical risks of any technology are as numerous as its uses. For example, AI-controlled autonomous vehicles may pose life-threatening risks to pedestrians, and social media applications carry the risk of spreading misinformation. Ethical and reputational risks accompany nearly all data-driven technologies, and blockchains are no exception. When applying blockchain, senior leaders must establish a framework for mitigating these risks, anticipating ethical issues and carefully considering various scenarios: What major ethical issues must our organization strive to avoid? How should edge cases be handled? What governance structure do we have, and what regulation is needed? Does blockchain technology undermine our organizational values and ethical principles, and if so, how can we mitigate that damage? What protective measures should we put in place to safeguard our stakeholders and brand? Fortunately, many of these questions have already been addressed in the adjacent literature on AI ethical risk, and I have also written a guide on implementing AI ethics programs. Any blockchain project can start with such materials.

Account Abstraction (AA) has been discussed since 2015, with several different versions of EIPs proposed (EIP-101, EIP-86, EIP-859, EIP-2938, EIP-4337). Recently, the development of account abstraction seems to have become a hot topic of discussion in the community, with various solutions for account abstraction being rolled out at the application level. So what exactly is account abstraction? What do people want to solve through AA? This article discusses Ethereum account types, the EIPs for account abstraction, and the potential use cases for account abstraction.

The term "smart contract" was first proposed by American computer scientist Nick Szabo in 1994, meaning obligations of the parties to a contract recorded in the form of computer code, enforced by the code under agreed conditions. However, Szabo only proposed the concept without detailing how to implement it. In 1996, Ian Grigg proposed the "Ricardian contract," which could be read by humans and parsed by programs, endowing smart contracts with legal attributes and becoming the main route for subsequent smart contract exploration. The effective implementation of smart contracts requires the following characteristics.

  • Consistency: Smart contracts need to be consistent with the contract text and not conflict with existing laws.

  • Observability: The contract content and its execution process should be observable and transparent, allowing all parties to the contract to observe, record, and verify the contract status through a user interface. Once established, the contract cannot be altered.

  • Verifiability: The results generated by smart contracts should be verifiable, possessing a certain degree of fault tolerance, with code execution conforming to the contract and repeated execution yielding the same results, meeting the conditions to serve as judicial evidence.

  • Privacy: The operation of smart contracts should ensure that the identity information of the parties and the contract content is controlled within the "minimum necessary" scope of knowledge, meeting the needs of commercial confidentiality and personal privacy protection.

  • Self-enforcement: This characteristic is both the core connotation of smart contracts and their main value, meaning that once the conditions stipulated in the contract are met, the smart contract should have the ability to fulfill obligations without relying on legal coercion, being unimpeded and non-repudiable.
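Self-enforcement can be illustrated with a toy escrow written in Python. This is only a sketch of the idea, not a real smart contract platform; all names (`Escrow`, `confirm_delivery`) are invented for the example. The point is that payout follows mechanically from the agreed condition, with no discretionary step in between:

```python
class Escrow:
    """Toy self-enforcing contract: funds release automatically once the
    agreed condition is met, with no external enforcement needed."""

    def __init__(self, buyer: str, seller: str, amount: int):
        self.buyer, self.seller = buyer, seller
        self.delivered = False
        self.escrowed = amount                  # funds locked at creation
        self.balances = {buyer: 0, seller: 0}

    def confirm_delivery(self) -> None:
        """Recording the condition immediately triggers settlement."""
        self.delivered = True
        self._settle()

    def _settle(self) -> None:
        # Self-enforcement: payment is a direct consequence of the condition.
        if self.delivered and self.escrowed:
            self.balances[self.seller] += self.escrowed
            self.escrowed = 0

deal = Escrow("buyer", "seller", 100)
deal.confirm_delivery()  # seller is paid the moment delivery is confirmed
```

Once `confirm_delivery` runs, neither party can prevent the transfer: the settlement logic is the contract.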

Meeting all of the above conditions is relatively difficult, so practical applications of smart contracts were very limited in the decade after the concept was conceived. Later, Ethereum utilized the characteristics of blockchain to achieve the operation of smart contracts through decentralization and immutability. Smart contracts have become so tightly "bound" to blockchain that people believe only blockchain and DeFi can realize their value. To date, Ethereum's smart contracts have only been applied in a few areas such as crypto assets, NFTs, gambling, and gaming, without reaching the real economy; the lack of a scalable application ecosystem and the speculative nature of virtual currencies have further limited adoption. In fact, the concept of smart contracts predates blockchain, and blockchain is not the only environment that can satisfy their operating conditions.

Since the advent of Bitcoin, the private sector has launched various so-called cryptocurrencies. According to incomplete statistics, there are currently over 10,000 influential cryptocurrencies, with a total market value exceeding $1.3 trillion. Cryptocurrencies like Bitcoin use blockchain and cryptographic technology, claiming to be "decentralized" and "completely anonymous," but limitations such as lack of value support, extreme price volatility, low transaction efficiency, and huge energy consumption make it difficult for them to perform monetary functions in daily economic activities. At the same time, cryptocurrencies are often used for speculation, posing potential risks to financial security and social stability and serving as payment tools for illegal economic activities such as money laundering.

To address the large price fluctuations of cryptocurrencies, some commercial institutions have launched so-called "stablecoins," attempting to maintain stable value by pegging them to sovereign currencies or related assets. Some institutions plan to launch global stablecoins, which would affect the international monetary system, payment and settlement systems, monetary policy, and cross-border capital flow management.

Recommended reading of the following articles for a better understanding of account abstraction:

✦ "Motivations, History, and Analysis of Account Abstraction" by Sandglass:

https://mp.weixin.qq.com/s/ZGzw3VE-8KEQE5xu7Jw_8A

✦ "Introduction | Overview of Ethereum Account Abstraction" by EthFans:

https://mp.weixin.qq.com/s/3VvjB2GXcH95j2Hr3zcsVg

✦ "Introduction | Account Abstraction (EIP-2938): Why & What It Does" by EthFans:

https://mp.weixin.qq.com/s/CKtk6xKcXFVjyPKDxHBnhw

✦ "On Account Abstraction (2022)" by Sandglass:

https://mirror.xyz/0xbeC73ba0817403cd11C11bE891D671EA30443562/95LlE7sLCL4UTvL7rU3ZAXnBvlDbh7X-rm0QWkc43Us

✦ "EIP-4337" by Plancker DAO:

https://www.notion.so/EIP-4337-0baad80755eb498c81d4651ccb527eb2

In addition, the first "His Name is Little V" event, co-hosted by the Plancker community and the ECN community, shared information about EIP-4337 contract wallets.

Abstraction in computer programming refers to the process of hiding all data except that which is relevant to the "object," with the goal of reducing complexity and improving efficiency. It represents objects by omitting unnecessary details. Abstraction is one of the main principles of object-oriented programming and is related to encapsulation and data hiding. This article will provide an overview of the following aspects:

➤ Ethereum Account Abstraction

External Accounts/User Accounts

Contracts

➤ Proposed EIPs for Account Abstraction

EIP-86: Abstraction of Transaction Origin and Signature

EIP-2938: Account Abstraction

EIP-4337: Account Abstraction Implemented via Entry Point Contract

➤ Use Cases

Wallets

Sponsored Transactions

Mixing

DeFi Protocols

Account Abstraction

Ethereum's account abstraction aims to create a single account type that encompasses all relevant aspects without any irrelevant aspects, making developers' work easier.

Ethereum Account Types

Currently, there are two types of accounts on the Ethereum blockchain:


User Accounts (EOA)

User accounts are for general use (by humans).

These accounts are controlled by private keys corresponding to public addresses, such as the user's wallet account.

These accounts are also known as externally owned accounts (EOAs) and can be created on the blockchain without an ETH balance. However, transactions can be made between two external accounts using ETH or other ERC-20 tokens.

External accounts (wallets) are used for sending and receiving cryptocurrency and exist outside the Ethereum Virtual Machine (EVM).

Contracts

Contracts are a set of instructions controlled by code.

Creating a contract usually incurs associated costs due to network storage.

Contract accounts can perform various functions, such as receiving transactions from external accounts and other contract accounts, as well as sending transactions to them.

They can also execute code that performs various activities, including token swaps or creating a new contract.

Contract accounts exist as "smart contracts" within the EVM.

If you send 1 ETH to an account controlled by code, no one can control that ETH anymore. The only way to transfer that ETH is through the execution of the contract, i.e., the code itself.

Both types of accounts have the potential to receive, hold, and send ETH and tokens, as well as communicate with other smart contracts deployed on the network.
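As a toy illustration (not real Ethereum semantics), the two account types and their shared ability to hold and transfer value can be modeled in a few lines of Python. The class names, addresses, and the `transfer` helper are all invented for the example:

```python
from dataclasses import dataclass

@dataclass
class EOA:
    """Externally owned account: controlled by a private key, no on-chain code."""
    address: str
    balance: int = 0  # in wei (toy units here)

@dataclass
class ContractAccount:
    """Contract account: controlled by code deployed on-chain."""
    address: str
    balance: int = 0
    code: str = ""    # placeholder for deployed bytecode

def transfer(sender, recipient, amount: int) -> None:
    """Both account types can hold and send value."""
    if sender.balance < amount:
        raise ValueError("insufficient balance")
    sender.balance -= amount
    recipient.balance += amount

alice = EOA("0xAlice", balance=5)
vault = ContractAccount("0xVault", code="<bytecode>")
transfer(alice, vault, 2)  # an EOA funding a contract account
```

The asymmetry the article describes is that `EOA` is controlled by whoever holds the key, while `ContractAccount` can only move value through its `code`; account abstraction aims to collapse this distinction.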

Account Abstraction Proposal

Ethereum Account Abstraction (AA) enhances these two forms of accounts, making them more comparable and allowing the management logic of external accounts to be as universal as that of contract accounts.

Its goal is to reduce the two forms of accounts to one. The uses of a single account form include minting and contract transfers. Developers and users will no longer need to distinguish between account types, as transaction validation will move entirely into the EVM and out of the blockchain protocol.

Ethereum developers have been searching for ways to implement this, but no proposal has reached a Final state yet. In the following sections, we will outline the three Ethereum Improvement Proposals (EIPs) that have proposed account abstraction so far.

Timeline of Account Abstraction Proposals


2016:

Vitalik Buterin proposed the initial abstraction change idea for Metropolis.

The goal was to prepare for the security abstraction of accounts. In the traditional model, ECDSA (the Elliptic Curve Digital Signature Algorithm) and the default nonce scheme are the only means of protecting accounts. In the proposed model, all accounts are contract accounts that can pay gas, and users can freely define their own security models.

2017:

Vitalik Buterin proposed EIP-86 for the abstraction of transaction origin and signature.

The goal was to abstract the signature verification and nonce checking mechanisms, allowing users to establish account contracts to perform any required signature or nonce checks instead of relying on traditional methods.

2020:

Vitalik Buterin, Ansgar Dietrichs, Matt Garnett, Will Villanueva, and Sam Wilson proposed EIP-2938 for account abstraction.

The aim was to allow contracts to become "top-level" account types that can pay fees and execute transactions.

2021:

Vitalik Buterin, Yoav Weiss, Kristof Gazso, Namra Patel, and Dror Tirosh proposed EIP-4337 for account abstraction via entry point contract specifications.

The goal was to avoid changes to the consensus layer protocol and rely on higher-level infrastructure.

EIP-86: Abstraction of Transaction Origin and Signature

According to its "abstract," EIP-86 proposes a series of changes that serve the comprehensive purpose of "abstracting" signature verification and nonce checking, allowing users to create "account contracts" for executing any required signature/nonce checks instead of relying on the currently hard-coded mechanisms in transaction processing.

Traditional model: ECDSA and the default nonce scheme are the only means of protecting accounts.

New model: All accounts are contract accounts, which can pay gas, and users can freely define their security models.

Using a forwarding contract as an example, author Vitalik Buterin explains that this type of contract verifies signatures, and if the signature is valid, it initiates payment to miners and then sends a call instruction to the specified address using the given value and data.

➤ Advantages

The main advantages of this proposal are as follows:

Multi-signature wallets

Traditional method: Every transaction from a multi-signature wallet must be approved by all participants. We can simplify this by combining all participants' signatures into a single approved transaction, but this still adds complexity, as every participant's account must hold ETH to pay for gas.

New method: With the help of this EIP, current contracts can hold ETH and directly submit transactions containing all signatures to the contract, which will pay the fee.

Custom Cryptography

Traditional method: Users must adhere to ECDSA, which is a form of cryptography using elliptic curves.

New method: Users can upgrade to ed25519 signatures or any scheme they wish to upgrade to; users are not required to adopt ECDSA.
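To make the EIP-86 idea concrete, here is a hedged Python sketch of an account that carries its own verification logic instead of relying on a hard-coded scheme. The hash-based "signature" checks below merely stand in for real ECDSA and ed25519 verification; every name here is illustrative:

```python
import hashlib

def ecdsa_style_check(message: str, signature: str, secret: str) -> bool:
    """Stand-in for ECDSA verification (real elliptic-curve math omitted)."""
    return signature == hashlib.sha256((secret + message).encode()).hexdigest()

def ed25519_style_check(message: str, signature: str, secret: str) -> bool:
    """Stand-in for ed25519 verification (a different scheme entirely)."""
    return signature == hashlib.blake2b((secret + message).encode()).hexdigest()

class AccountContract:
    """Under EIP-86's model, each account contract defines its own
    signature/nonce checks rather than inheriting a protocol-level one."""

    def __init__(self, verifier, secret: str):
        self.verifier = verifier    # the account's chosen validation logic
        self.secret = secret

    def validate(self, message: str, signature: str) -> bool:
        return self.verifier(message, signature, self.secret)

legacy = AccountContract(ecdsa_style_check, "k1")     # keeps the old scheme
upgraded = AccountContract(ed25519_style_check, "k2") # opted into a new one
sig = hashlib.sha256(("k1" + "tx").encode()).hexdigest()
```

Swapping `verifier` is the whole point: the protocol no longer dictates which scheme protects the account.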

EIP-2938: Account Abstraction

According to the abstract of EIP-2938, "Account Abstraction (AA) allows contracts to become 'top-level' accounts that can pay fees and execute transactions."

Traditional model: The validity of transactions is directly defined by ECDSA signatures, a simple nonce value, and account balances.

New model:

  1. Account abstraction extends the validity conditions of transactions by executing arbitrary EVM bytecode.

  2. To signal validity, a new EVM opcode, PAYGAS, is introduced, which sets the gas price and gas limit the contract will pay.

  3. Account abstraction is now divided into two categories:

Single-tenant AA: This type is designed to support use cases with few wallets or other participants.

Multi-tenant AA: This type is designed to empower applications like Uniswap with many users.

Consensus Changes

NONCE opcode: Adds a NONCE opcode that pushes the nonce field of the transaction.

PAYGAS opcode: Adds a PAYGAS opcode that creates an irreversible checkpoint, ensuring that state changes prior to PAYGAS cannot be reversed.

Sam Wilson is one of the authors of this proposal, and he explains here how AA transactions differ from traditional transactions.

In AA transactions, there is no gas price or gas limit, no value, and no signature field, and the to field is replaced by target. In multi-signature contracts, these fields are instead passed in calldata and processed by the contract.

If a transaction reaches a node, the validity of the transaction will be checked. However, the way traditional transactions and AA transactions are checked differs.

In traditional transactions, the node checks that:

  • the nonce matches the next nonce of the account,

  • the account balance is sufficient to pay for the value and the highest possible gas fee,

  • the signature matches the address of the account.

In AA transactions, the node checks that:

  • the nonce matches the next nonce of the contract,

  • the contract's bytecode starts with the standard prefix,

  • the verification logic calls PAYGAS before reaching the verification gas limit,

  • no forbidden opcodes are called before PAYGAS,

  • the contract balance is sufficient to pay the gas fee set by PAYGAS.

Block broadcast time is the average time required for a new block to reach the majority of nodes in the network.

When a block containing an AA transaction arrives, all pending transactions from the same account are dropped. Traditional transactions, on the other hand, are re-validated and may be re-broadcast once the new block is received.
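The node-side validity checks for the two transaction types can be sketched as simple predicates. This is a toy model: fields such as `sig_matches_sender` and `paygas_within_limit` are placeholders for checks a real node performs cryptographically or by executing bytecode, and the prefix constant is invented:

```python
from dataclasses import dataclass

@dataclass
class Account:
    next_nonce: int
    balance: int
    bytecode: str = ""

AA_PREFIX = "AA_STANDARD_PREFIX"  # stands in for EIP-2938's standard bytecode prefix

def valid_traditional(tx: dict, account: Account) -> bool:
    """Nonce match, balance covers value plus worst-case gas, signature recovers
    to the sender (modeled here as a precomputed boolean)."""
    return (tx["nonce"] == account.next_nonce
            and account.balance >= tx["value"] + tx["gas_limit"] * tx["gas_price"]
            and tx["sig_matches_sender"])

def valid_aa(tx: dict, contract: Account) -> bool:
    """Nonce match, standard prefix, PAYGAS reached within the verification gas
    limit, no forbidden opcodes before PAYGAS, balance covers the PAYGAS fee."""
    return (tx["nonce"] == contract.next_nonce
            and contract.bytecode.startswith(AA_PREFIX)
            and tx["paygas_within_limit"]
            and not tx["forbidden_opcode_before_paygas"]
            and contract.balance >= tx["paygas_fee"])
```

The structural difference is visible at a glance: the traditional checks are fixed by the protocol, while the AA checks delegate validity to the contract's own verification code.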

EIP-4337: Account Abstraction Implemented via Entry Point Contract

This is the latest proposal, put forward by Vitalik Buterin and the community. It was introduced as an ERC and avoids changes to the consensus-layer protocol, relying instead on higher-level infrastructure.

It aims to achieve the following goals:

Account abstraction: Allows users to use smart contract wallets with arbitrary verification logic as their primary accounts instead of EOAs.

Decentralization: Allows any bundler that packages transaction bundles to participate in including account-abstracted user operations. All activity happens over a public mempool, and users do not need to know the direct communication address of any particular actor.

No consensus changes: To facilitate faster adoption, this proposal avoids consensus changes.

Transaction fee payment: Allows transaction fees to be paid with ERC-20 standard tokens, enabling developers to pay fees on behalf of their users, as well as supporting use cases similar to sponsored transaction proposals like EIP-3074.
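For orientation, here is a simplified rendering of the UserOperation structure from the proposal, with the gas-limit fields trimmed and field names adapted to Python style. Treat it as a sketch rather than the normative definition:

```python
from dataclasses import dataclass

@dataclass
class UserOperation:
    """Core fields of an EIP-4337 UserOperation (simplified)."""
    sender: str                    # the smart contract wallet making the operation
    nonce: int
    init_code: bytes               # wallet creation code, if the wallet doesn't exist yet
    call_data: bytes               # the call the wallet should execute
    max_fee_per_gas: int           # EIP-1559-style fee fields
    max_priority_fee_per_gas: int
    paymaster_and_data: bytes      # empty unless a paymaster sponsors the fee
    signature: bytes               # interpreted by the wallet's own verification logic

def is_sponsored(op: UserOperation) -> bool:
    """A non-empty paymaster field marks the operation as fee-sponsored."""
    return len(op.paymaster_and_data) > 0
```

Unlike a transaction, a UserOperation is validated by the wallet contract itself (via its `signature` field), which is what lets fees be paid in ERC-20 tokens or by a sponsor.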

How does this proposal work?


Vitalik Buterin explains here how this proposal works.

This is the latest proposal for account abstraction, currently still in draft form, awaiting merging into an EIP. Compared to the conventional Ethereum transaction memory pool, this design adds, maintains, and sacrifices some functionalities.

Key Highlights

➤ No centralized actors; removes wallet setup complexity on the user end; fully supports EIP-1559; and retains the ability to replace fees: sending a new UserOperation with a higher premium than the old one can replace the operation or get it packaged faster.

➤ There are some new advantages added:

Flexibility of verification logic

Flexibility sufficient to enable quantum-resistant security at the execution layer

Upgradability of wallets

Flexibility of execution logic

➤ However, despite the protocol's best efforts, it slightly increases the possibility of DoS attacks, increases gas overhead, and executes only one transaction at a time.

Use Cases for Account Abstraction

Wallets

EOA and Contract Wallets

EOA Wallet: A wallet protected by a private key.

Contract Wallet: A wallet implemented on-chain using smart contracts.

Security Considerations: If there are bugs in the smart contract code, contract wallets face security risks from vulnerable smart contracts. This risk can be minimized through security testing and audits conducted by wallet providers. However, in EOA wallets, the risk is entirely borne by the wallet user, just as they bear the responsibility if they accidentally lose their private key.

Argent, Dapper, Gnosis Safe, and Monolith are examples of smart contract wallets.

EOA's Meta Transactions

Users of the Ethereum blockchain need either an EOA holding ETH for gas to interact with the network, or a wallet provider that facilitates meta transactions through its own relays or a third-party relay network (e.g., the Gas Station Network). The former relies on ETH purchased from centralized exchanges (which require KYC); the latter minimizes user-experience friction by shifting gas responsibility from consumers to relayers, with fees paid on-chain by wallet providers and/or off-chain by users.

Meta transactions are transactions whose payload is signed by the user who wants them executed, while another party submits them on-chain and pays the gas.

The relay-based architecture has some drawbacks:

  1. They can be seen as centralized intermediaries with the power to censor transactions.

  2. They are technically and economically inefficient, due to the additional 21,000 base gas required for a relayed transaction and the relayer's need to profit on top of the gas fees.

  3. They rely on protocols designed specifically around relayers.

Account abstraction allows smart contract wallets to accept users' gas-free meta transactions and pay gas fees for them without relying on relay networks. This foundational capability will greatly enhance the user experience of these wallets without losing Ethereum's decentralization guarantees.

Sponsored Transactions

Sponsored Transactions are encompassed in EIP-2711 (status: withdrawn), which proposed a mechanism that allows people to transact without holding any ETH by allowing others to pay the gas fees.

Some use cases include:

  1. Allowing application developers to pay fees on behalf of users.

  2. Allowing users to pay fees with ERC-20 tokens, with the contract acting as an intermediary to collect ERC-20 tokens and pay network fees in ETH.

Operation

This proposal can support these use cases through a paymaster mechanism.

For use case 1: The paymaster verifies that the sponsor's signature is included in the paymasterData, indicating readiness to pay for the UserOperation. If the signature is valid, the paymaster accepts the instruction and deducts the UserOperation fee from the sponsor's share.

For use case 2: The paymaster checks whether the sender's wallet has enough ERC-20 balance to pay for the UserOperation. If sufficient, the paymaster accepts the instruction and pays the ETH fee before requesting the ERC-20 tokens in postOp.
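The two paymaster paths can be sketched as follows. This is a toy model: a sponsor name in `paymaster_data` stands in for a verified sponsor signature, and the dictionaries stand in for on-chain deposits and ERC-20 balances; all names are illustrative:

```python
def paymaster_validate(op: dict, sponsor_deposits: dict,
                       erc20_balances: dict, fee: int) -> bool:
    """Toy paymaster covering the two use cases above.

    Use case 1: a sponsor identified in paymaster_data pays the fee
                from its deposit (a real paymaster would verify a signature).
    Use case 2: otherwise, the sender must hold enough ERC-20 tokens,
                which the paymaster collects (modeled as in postOp) while
                fronting the ETH fee itself.
    """
    sponsor = op.get("paymaster_data")
    if sponsor is not None:                              # use case 1: sponsored
        if sponsor_deposits.get(sponsor, 0) >= fee:
            sponsor_deposits[sponsor] -= fee
            return True
        return False
    if erc20_balances.get(op["sender"], 0) >= fee:       # use case 2: ERC-20 fee
        erc20_balances[op["sender"]] -= fee
        return True
    return False

sponsor_deposits = {"0xSponsor": 10}
erc20_balances = {"0xUser": 3}
ok1 = paymaster_validate({"sender": "0xUser", "paymaster_data": "0xSponsor"},
                         sponsor_deposits, erc20_balances, fee=4)
ok2 = paymaster_validate({"sender": "0xUser", "paymaster_data": None},
                         sponsor_deposits, erc20_balances, fee=2)
```

Either way, the user's wallet never needs to hold ETH; only who ultimately bears the fee differs.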

Mixing

Let’s explore the example of the Tornado Cash mixing mechanism to understand how we can use AA in DeFi protocols.

Traditional Tornado Cash Contract Privacy Issues

When users withdraw, Tornado Cash provides them with privacy protection. They can prove that the funds come from a unique deposit, but no one knows where that deposit originated except the user.

Users typically do not hold ETH in their withdrawal addresses, and if they use their deposit address to pay gas, it creates an on-chain link between the deposit address and the withdrawal address.

This issue can be resolved by third-party relayers, who verify the ZK-SNARK proof and check that the nullifier is unused, publish the transaction using their own ETH to pay gas, and collect a fee from the user's withdrawal via the Tornado Cash contract.

Solution Provided by Account Abstraction: Users can submit an AA transaction targeting the TC contract, which runs the ZK-SNARK verification and nullifier check and then invokes PAYGAS directly. This allows withdrawers to pay gas out of the tokens sent to their withdrawal address, without needing a relayer or creating an on-chain link to their deposit address.

DeFi Protocols

Let’s explore the case of the DeFi protocol Uniswap to understand how we can use AA in DeFi protocols.

A new version of Uniswap can be created that allows direct transactions targeting the Uniswap contract.

Currently, users can deposit tokens into Uniswap in advance; Uniswap can store users' balances and public keys to verify transactions that spend those balances.

The goal of AA here is to improve the gas efficiency of DeFi protocols by keeping transactions that fail a validity check (e.g., no matching order exists) from ever being packaged on-chain.

In the traditional model: Normal traders store their tokens outside of the Uniswap contract.

In the new model: Arbitrage traders store their tokens on Uniswap, and when external market prices move they can submit arbitrage transactions directly. If another arbitrageur executes the trade first, the now-unprofitable transaction fails validation and is never packaged on-chain. Arbitrageurs thus avoid paying gas on failed attempts, and fewer junk transactions reach the chain. This improves both blockchain scalability and market efficiency, since arbitrageurs can more cheaply correct price discrepancies across markets.

Arbitrage traders refer to traders who buy low and sell high by exploiting price differences between two or more markets.
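The validity gate described above can be sketched as a mempool-level check; the operation format and the block-builder loop are invented here for illustration, not part of any actual protocol:

```python
# Each arbitrage "operation" declares a condition that must hold at
# inclusion time; the block producer drops operations whose condition
# fails, so failed arbitrage attempts never consume gas on-chain.

def make_arb_op(trader, min_price, max_price):
    """Hypothetical AA operation: only valid while the pool price
    sits inside the profitable window [min_price, max_price]."""
    return {"trader": trader,
            "check": lambda pool_price: min_price <= pool_price <= max_price}

def build_block(pending_ops, pool_price):
    """Simplified block builder: include only ops that still validate."""
    included, dropped = [], []
    for op in pending_ops:
        (included if op["check"](pool_price) else dropped).append(op)
    return included, dropped
```

A dropped operation costs its sender nothing, which is exactly why junk transactions stop reaching the chain.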

Original link: https://etherworld.co/2021/10/06/an-overview-of-account-abstraction-in-ethereum-blockchain/

Six, Ethereum's Future PoS Protocol

Casper PoS is a security-deposit-based economic consensus protocol. In the protocol, nodes acting as "bonded validators" must first deposit collateral (this step is called bonding) to participate in block production and consensus formation. The Casper consensus protocol constrains validator behavior through direct control of these deposits. Specifically, if a validator does anything Casper deems "illegal," the deposit is forfeited and the validator's rights to produce blocks and participate in consensus are revoked. The introduction of deposits solves the "nothing at stake" problem, i.e., the low cost of misbehavior in classic PoS protocols. Now there is a cost, it can be objectively proven, and validators who misbehave will demonstrably pay it.
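The bond-then-slash mechanics can be sketched as follows; the vote format and the equivocation rule are simplified toy versions, not Casper's actual slashing conditions:

```python
validators = {}  # address -> bonded deposit (in ETH)

def bond(addr, deposit):
    """A node joins consensus by posting a deposit ('bonding')."""
    validators[addr] = deposit

def detect_equivocation(vote_a, vote_b):
    """Toy slashing condition: the same validator casts two distinct
    votes for the same height. Real Casper has a precise set of
    slashing conditions; this illustrates only the shape of one."""
    return (vote_a["validator"] == vote_b["validator"]
            and vote_a["height"] == vote_b["height"]
            and vote_a["block"] != vote_b["block"])

def slash(addr):
    """Forfeit the entire deposit and eject the validator.
    Returns the amount forfeited."""
    return validators.pop(addr, 0)
```

Because misbehavior is provable from the two signed votes alone, anyone can submit the evidence and the penalty is objective.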

In the future, Ethereum will use the Casper PoS protocol, which secures the chain not with computational power but with staked digital assets. That is, there is no need to spend money on mining machines; instead, one buys ETH and stakes it to become a validator.


Seven, Challenges of Ethereum Programming

Coding is not that difficult, especially if you have a background in other software programming. However, if programmers want to become core Ethereum developers studying technical issues such as security and scalability, it is relatively more challenging: this is a very new technology, and only a small portion of people understand these problems. But it is not impossible. If anyone here wants to join the Ethereum research group, we are recruiting, and of course we welcome volunteers on other Ethereum projects. In summary, the Ethereum community has many different facets: some people come in because they are interested in the technology, others want to build software on top of it, and some are studying the platform to understand what can be developed on it. Even though blockchain technology has been around for 9 years, I still feel it is very young, developing rapidly, with many different ways to get involved.


Eight, Vitalik's Recommended Learning Methods

For those interested in becoming Ethereum programmers, I recommend starting to follow these two websites:

http://ethereum.org (This website has tutorials and guidance on how to write smart contracts, how to upload them, and how to write apps);

http://github.com (This website has a lot of technical information about how Ethereum models work, etc.).

The first thing: What is Ethereum?

Ethereum is a smart contract platform built on a decentralized internet application foundation that can program payments in any Bitcoin/Ether. This means the platform can compare Bitcoin and Ether prices across trading platforms in different countries through the blockchain. Once price differences are found, it buys Bitcoin and Ether from the lower-priced platform and sells them on the higher-priced platform to profit from the difference. This is a new profession that has gradually emerged in the industry, commonly referred to as "arbitrage."

Using large-scale high-end cloud computing, it shorts or goes long on currency trading platforms around the world (buying low, selling high), completing transactions within 0.28 milliseconds and using unlimited floating trading points to add value, guaranteeing each investor's dividend growth. Our members only need to entrust their Bitcoin and Ether to the platform for trading, without any operation. The Ethereum platform guarantees each member a monthly return of up to 15%-25%. As long as digital currencies exist and prices fluctuate, the profit margin will always exist and there will always be profits to be made. As everyone knows, this year is the first year of digital currency development, so the platform's capacity to generate returns and its stability are self-evident.

The second thing: Company Introduction?

Company Introduction

The company began preparations in 2015, and the project launched globally in 2016. As of now, it has been successfully promoted in over 45 countries, and our platform has 18 language versions worldwide. Our customer service team numbers in the hundreds, and our trading team in the thousands. In a short period we have shown our best side; our team consists of young, vibrant professionals who are always eager to achieve goals. We are now gradually mastering compound interest, and through high-quality asset management services we have achieved great success.

Our platform trades using your investment and compound interest. The average monthly income is 30%-50%, with half of it (15%-25%) allocated to investors. The cryptocurrency market allows people to gain more profits, but our strategy mainly focuses on the safety of funds.

Here, you are investing in the future of the internet blockchain! We are committed to the global popularization of the Ethereum platform, promoting its excellent capabilities and prospects, as well as the implementation of asset management plans, and obtaining economic benefits through cryptocurrency exchange trading activities.

The third thing: Who is the founder?

Ethereum Founder: Vitalik Buterin

A Russian kid who vowed to disrupt the real economic system with blockchain, his newly created blockchain platform has attracted the attention of tech giants like IBM and Samsung, as well as investment banks like Barclays and Credit Suisse.

Born: 1994

Current Position: Founder and Chief Scientist of the blockchain platform "Ethereum"

Education: University of Waterloo, Canada

Awards: Bronze Medal in the Olympiad in Informatics, Thiel Fellowship, 2014 World Technology Award

Blockchain Definition: Blockchain is a decentralized ledger composed of cryptographic algorithms and economic models.

The cryptocurrency he developed is rivaling Bitcoin, and he defeated Facebook founder Mark Zuckerberg to win the 2014 IT Software World Technology Award. This award recognizes Buterin's outstanding achievements in designing and developing the Bitcoin 2.0 platform, Ethereum.

Vitalik Buterin's Legendary Story

Born in 1994 in Russia, he began studying Bitcoin at 17 and founded "Bitcoin Magazine."

At 18, he won a bronze medal in the Olympiad in Informatics.

At 19, he dropped out of the University of Waterloo in Canada; that November, he published the first version of the "Ethereum White Paper" and began recruiting developers.

At 20, he received the Thiel Fellowship and established the nonprofit Ethereum Foundation, publicly presenting the Ethereum project at a Bitcoin conference in Miami. That July, he launched the Ethereum project crowdfunding, raising 31,000 Bitcoins (approximately $18.4 million at the time).

At 21, the initial version of Ethereum, Frontier, was released, and Ether began trading publicly on exchanges worldwide.

At 22, he was named one of Fortune magazine's 40 Under 40 in 2016.

Vitalik Buterin, the 22-year-old hacker disrupting the real economic system

This September, Fortune magazine's bold headline was also a hot topic of debate among blockchain experts worldwide.

The protagonist of the topic is the 22-year-old Vitalik Buterin.

He is the founder of the hot blockchain platform Ethereum. At an age when most people have just stepped out of campus, he already harbors ambitions to change the world: to disrupt the real economic system with blockchain.

Today, there are over 700 cryptocurrencies based on blockchain technology, and since Ethereum's launch in late July 2015, it has swept the virtual currency market. As of October 31, 2016, the total market value of Ether (the cryptocurrency that maintains the operation of the Ethereum platform) was approximately $944 million.

Although it is still far from Bitcoin's $11.18 billion, its rapid rise is still seen as Bitcoin's number one competitor.

Vitalik Buterin has attracted over $100 million in investment

—— Ethereum and related applications and their fundraising amounts

Ethereum public blockchain platform------raised $18.4 million

DigixDAO ------established a gold-backed financial platform on Ethereum, raised $5.5 million

Augur ----- a decentralized market prediction platform, raised $5.32 million based on Ethereum smart contracts

The DAO ----- a venture organization based on the Ethereum platform, raised $132 million for shared economy projects using blockchain technology.

The fourth thing: How is the project developing?

Looking back at major events in the Ethereum platform:

On January 1, 2016, the Ethereum project developed abroad, launching in over 40 countries globally, with the platform available in 8 languages.

On August 1, 2016, Ethereum entered China.

On October 22, 2016, a meeting was held in the Philippines.

On October 23, 2016, the first club in China was established.

On November 4, 2016, a meeting was held in Vietnam.

On November 7, 2016, a meeting was held in Russia.

On November 13, 2016, a charity event was held in the Philippines.

On November 18, 2016, the Russian club opened.

On November 24, 2016, the second meeting in Vietnam was held.

On November 28, 2016, a meeting was held in Malaysia.

On December 4, 2016, a charity event was held in the Philippines.

On December 7, 2016, the Russian club opened in Yekaterinburg.

On December 17, 2016, a major meeting was held in Moscow.

On December 17, 2016, a football match was held in Pakistan.

On December 18, 2016, an investment-promotion meeting was held in Shenzhen.

On January 8, 2017, the second meeting and club in Malaysia were held.

On January 8, 2017, the second club in China opened in Ningbo.

On January 15, 2017, an investment-promotion meeting was held in Yunnan, China.

On January 20, 2017, a photo was taken of the winter promotion award winners.

On April 10, 2017, Ethereum rapidly developed, with the number of global members exceeding 230,000.

On May 21, 2017, the number of global members exceeded 300,000, and the price of Ether surpassed 1,000 yuan. Ethereum became the first digital currency to reach the 1,000 yuan mark after Bitcoin. The history of cryptocurrency has been rewritten!

On May 21, 2017, the number of global members in Ethereum trading exceeded 427,000, doubling every day! The price of Ethereum soared to 3,000 yuan, and the Enterprise Ethereum Alliance (EEA) added 86 new member organizations! These include Deloitte, DTCC, Infosys, MUFG, National Bank of Canada, State Street, Toyota, Samsung SDS, San Francisco Stock Exchange, Wall Street Emerging Technology Center, Wall Street Blockchain Alliance, Jiangsu Huaxin Blockchain Research Institute, etc. The complete member list can be accessed at: https://entethalliance.org/enterprise-ethereum-alliance-release-05-19-2017.pdf

The Ethereum team developed Ethereum Trade, with Vitalik Buterin as one of the shareholders. The development of Ethereum laid a sustainable foundation for Ethereum Trade, and the Ethereum Trade team promoted Ethereum worldwide, causing ETC and ETH to surge several times in just a few months.

Birds of a feather flock together; a certain group of people must share common values, goals, hobbies, etc.; this is consensus. Simply put, it is the power of the masses! When consensus reaches a certain level of breadth and height, it forms a brand; the brand's awareness and influence are also a reflection of its consensus. The future belongs to unicorns and is also an era of brand competition.

The entire blockchain circle has also been caught up in a bubble. The most interesting phenomenon I have observed is that those teams with real technology are not in a hurry at all, possibly due to their smooth financing, while those whose resumes are incredibly impressive but whose concepts are pieced together or even logically inconsistent are the ones who are in a rush.

From the level of code submissions, Ethereum is undoubtedly the most active blockchain in development. Whether it is the number of submissions on GitHub, the number of stars and forks in repositories, or the number of developers, it far exceeds Bitcoin, Ripple, Bitcoin Cash, EOS, and Litecoin, as well as all other cryptocurrencies.

Ethereum is an open-source blockchain underlying system, somewhat like the blockchain version of Android, providing APIs and interfaces for everyone to quickly develop various decentralized applications (Dapps) on it. Although blockchain currently cannot match traditional internet speed and efficiency, according to statistics from Chain Tower Research, as of September 30, 2018, the Ethereum platform has recorded 940 DApps, with 352 of them being game-related DApps, accounting for 37.5%, prediction-related DApps accounting for 20%, trading market-related DApps accounting for 5%, and others accounting for 37.5%.

In simple terms, the Ethereum development community can be roughly divided into three levels of developers, from the outside in, from upper-level application projects to the underlying architecture.

The outer layer consists of various upper-level application project developers built on the underlying architecture of Ethereum. These developers may not directly participate in the technical advancement of the underlying architecture but still contribute to the prosperity of the entire community ecosystem. From the once-popular CryptoKitties to the "funding plate" game Fomo 3D, which once made over 100 million yuan in a single day, all are DApps built on Ethereum.

The second layer consists of peripheral developers from outside the foundation who are also invested in the underlying architecture. Since Ethereum is a completely open-source ecosystem, developers from all over the world can participate in the underlying development work in various forms as long as they are interested. Ethereum's off-chain scaling solution, Raiden Network, is one example.

At the core is the "Ethereum Foundation," led by founder Vitalik Buterin, headquartered in Singapore, currently with about 30 researchers dispersed around the world, dedicated to researching and developing the core underlying architecture. Many of them, like Vitalik, are developers born in the 1990s.

To solve the problem of cryptocurrencies relying on proof-of-work (PoW), which leads to massive energy consumption, the Ethereum community has been actively researching how to transition to a proof-of-stake (PoS) mechanism in recent years. Sharding technology is the key technology for Ethereum's transition from PoW to PoS.

One major difficulty of the PoS mechanism is generating good random numbers, so that attackers cannot effectively try many random numbers in parallel to achieve their attack goals. This part relies on cryptographic research led by Justin Drake, such as verifiable delay functions (VDFs), together with specialized ASIC (Application-Specific Integrated Circuit) hardware research.

In the past, the ASIC mining machines discussed in the cryptocurrency circle were mainly aimed at PoW calculations, with the goal of computing PoW quickly to gain an advantage in block generation. The ASICs now discussed in Ethereum are instead designed for VDF calculations. Justin Drake aims to design an ASIC whose performance attackers cannot easily surpass, so they cannot use raw computing power to crack its random numbers. In other words, the likelihood of the entire network being compromised can be reduced to a nearly negligible level.
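The core property a VDF provides is forced sequential work. A plain hash chain illustrates the sequentiality idea, though, as noted in the comments, it lacks the fast verification that makes real VDFs practical; this is an illustrative sketch, not an actual VDF construction:

```python
import hashlib

def sequential_delay(seed: bytes, steps: int) -> bytes:
    """Chain of hashes: step i depends on step i-1, so even an attacker
    with many parallel machines must perform `steps` hashes one after
    another. (A real VDF additionally offers *fast* verification,
    which a plain hash chain lacks; this only shows sequentiality.)"""
    h = seed
    for _ in range(steps):
        h = hashlib.sha256(h).digest()
    return h

def verify(seed: bytes, steps: int, claimed: bytes) -> bool:
    # Naive verification: recompute the whole chain.
    return sequential_delay(seed, steps) == claimed
```

Because the output cannot be reached faster than the chain length allows, an attacker cannot grind through many candidate random numbers within one round, which is exactly the guarantee the randomness beacon needs.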

Another key direction for Ethereum is Casper, with Danny Ryan as the main developer. Casper is the key to Ethereum's transition from proof-of-work (PoW) to proof-of-stake (PoS) mechanism, expected to solve inherent drawbacks of the PoS mechanism, such as collusion among nodes, and smoothly replace the PoW mechanism.


Currently, Ethereum has over 14,000 nodes distributed around the world. Most are in the United States and Europe, with U.S. nodes accounting for 43% of the total. Within Asia, China has the most nodes, accounting for 13% of global nodes.

In fact, the Ethereum community in China has been developing for a long time, and many early core Ethereum developers came from China. However, during the recent blockchain entrepreneurial boom, many early members of Ethereum in mainland China have shifted to various other projects, while the core Ethereum development community in Taiwan and Hong Kong continues to thrive.

In Taiwan, the Taipei Ethereum community, established for over two years, has hosted Vitalik himself and many blockchain luminaries, such as Litecoin founder Charlie Lee. Moreover, of the roughly 30 people on the Ethereum Foundation's global developer team, 4 or 5 are from Taiwan.

The development work of Ethereum can be simply divided into four processes: researching theories, writing specifications, implementing prototypes, and implementing clients. In reality, the production of software programs is certainly not that simple; the actual operation is more complex, but the sequence still follows the order of research, writing specifications, and then development and implementation.

The Ethereum Foundation not only welcomes peripheral developers to contribute voluntarily but also offers rewards to encourage more programmers to tackle more challenging issues. Since early 2018, it has issued a total of $11 million in rewards for 52 projects.

Among them, projects addressing Ethereum's scalability and security have received the most rewards and project support. In terms of amounts, 61.3% of the rewards were allocated to scalability projects, and 16.8% to security projects. In terms of project numbers, 29.1% were scalability projects, and 18.8% were security projects.

The application process for development rewards consists of several steps. The first step is to submit a project application. Applicants must clearly demonstrate their commitment to the Ethereum ecosystem, development capabilities, development focus, and progress planning. They must also present the differentiation from other projects. Of course, the project must support open-source.


Gavin Wood provided a very clear definition in the Ethereum yellow paper. Although some of the content is dense and perplexing, the overall structure is clear. After reading the yellow paper, you will roughly understand that Ethereum not only has many revolutionary innovations but also a self-reinforcing technical barrier. If you then look at Vitalik's purple paper, you will understand that he had already thought deeply about future scalability long before you.

I will briefly introduce the technical highlights of Ethereum, and you can judge for yourself whether those projects claiming to disrupt Ethereum can achieve that.

Invincible Usability

The interface design of Ethereum is not elegant but very simple, and the protocol is easy to understand. Although there are many technical flaws, these are not the primary concerns for developers aiming to implement DApps. If you want to create a better chain, you must first achieve: a usable VM, a language that supports development, parameter serialization and deserialization scripts, storage data structure models, leveldb storage interfaces, wire protocols, and so on.

Due to the head effect, a large number of toolchains will be based on Ethereum in the future, making disruption on public chains more difficult. The cost of switching main chains is very high.

Powerful EVM

Much lighter than existing general-purpose VMs on the market, simple and easy to use, with no cumbersome external dependencies. It can store data temporarily on either the stack or in memory; memory is bounded only by gas costs (the stack itself is capped at 1024 items).

I can predict that many advanced language toolchains will be introduced on top of the EVM, extending the boundaries of applications with capabilities like types and abstractions, but for the role of the EVM, it is already sufficient.

From UTXO to Accounts Implementation

Let me first mention a separate example. Baidu, thanks to its particularly strong search capabilities, built its Tieba forums on a keyword-indexing model instead of a traditional forum model: when you enter a "forum," the series of posts you see is a search result rather than a curated section. Sometimes when problems arise, the first post disappears while the second and third remain, hence the saying "the first floor feeds the bear."

UTXO was once seen as a masterpiece of Bitcoin. Contrary to our habitual thinking, it constructs a user's account balance by finding all unspent transaction outputs the user's keys can sign. At the same time, because of the UTXO set, Bitcoin easily prevents double spending by packaging transactions and comparing longer and shorter chains.

Ethereum instead uses an account system, simplifying implementation and saving space through the concept of state. This trade-off is genuinely hard to judge: each transaction requires an extra nonce to prevent replay attacks, and there are also drawbacks for scalability and privacy protection.
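The role of the nonce can be shown in a few lines; the in-memory account table below is a simplified model of the account/state approach, not Ethereum's actual state machine:

```python
# Minimal account model: balances plus a per-account nonce.
accounts = {"alice": {"balance": 100, "nonce": 0},
            "bob":   {"balance": 0,   "nonce": 0}}

def apply_tx(sender, to, value, nonce):
    """Account-model transfer: the transaction's nonce must equal the
    sender's current nonce, so a captured (signed) transaction cannot
    simply be rebroadcast and applied twice."""
    acct = accounts[sender]
    if nonce != acct["nonce"] or acct["balance"] < value:
        return False
    acct["balance"] -= value
    acct["nonce"] += 1       # replaying the same tx now fails
    accounts[to]["balance"] += value
    return True
```

In the UTXO model this protection is implicit, since an output can only be spent once; the account model has to pay for the same guarantee with an explicit counter.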

But in any case, this system has run hard for a long time, up until today. In my view, it has made the implementation of light clients very simple, which is very meaningful for developers.

Trie Design, Friendly to Light Clients

The three pointers in each Ethereum block header represent three core trees: state, transactions, and receipts.

The transaction tree is relatively simple, and receipts are an RLP-encoded data structure, which simplifies indexing and makes the logs Bloom filter (logsBloom) very convenient for light clients.

The design of the block header is closely related to light clients. Most nodes do not need to fully synchronize but require convenient access to data.

The main data structure within blocks is the MPT (Merkle Patricia Trie), with KV nodes used in sparse areas. In the state and account trees the maximum branch depth is 64, because the key is sha3(k): a 32-byte hash read as 64 nibbles. Hashing the keys also makes DoS attacks via crafted keys very difficult.
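The key-hashing idea can be demonstrated directly. Note one loud caveat: real Ethereum uses Keccak-256, the pre-standard variant, while Python's `hashlib.sha3_256` is the later NIST SHA3-256, so the digests below differ from mainnet values; the structural point (attacker-chosen keys scatter uniformly) is the same:

```python
import hashlib

def trie_key(address_hex: str) -> str:
    """State-trie lookups walk hash(address) rather than the raw
    address, so an attacker cannot pick addresses that cluster under
    one deep branch to unbalance the trie (a DoS vector).
    CAVEAT: Ethereum uses Keccak-256; hashlib's sha3_256 is the NIST
    variant and produces different digests."""
    return hashlib.sha3_256(bytes.fromhex(address_hex)).hexdigest()

# Two adjacent addresses map to unrelated 64-nibble trie paths:
k1 = trie_key("00" * 20)
k2 = trie_key("00" * 19 + "01")
```

Because every path is a 64-nibble hash, trie depth is bounded at 64 and lookups stay balanced regardless of which addresses exist.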

Currently, the entire state access query can become faster, and full synchronization of full nodes can also become faster, but this is not within the scope of our discussion today.

Future Opportunities

Because the purple paper has brought new directions, I will not introduce compression algorithms or uncle block implementations, etc. The purple paper introduces the PoS mechanism, friendlier light client synchronization implementations, scoring and measurement implementations, sharding, and cross-shard communication, indicating that Ethereum is not only growing but also has a wealth of practical business to continuously validate its design and obtain feedback. Just this point alone makes it impossible for other general public chains to criticize Ethereum's scalability and think they can handle it.

Suppose you attend a party with 23 people; what is the probability that two of them share a birthday?

The correct answer is about 50.7%. Isn't that different from your intuition?

This is the famous birthday paradox; we can easily be blinded by what is easily visible and fall into a trap of our own logical correctness.

To verify this probability, I wrote two programs. The first is a loop that directly computes 365/365 * 364/365 * 363/365 * ... * 343/365, and the result matches the correct answer. The second exhaustively tests 100,000 random groups of 23 birthdays; the observed collision rate is 49.8%, close to the correct answer.
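The two programs can be sketched as follows (a minimal reconstruction, not the author's original code):

```python
import random

def analytic(n: int = 23) -> float:
    """Exact probability that at least two of n people share a
    birthday: 1 - 365/365 * 364/365 * ... * (365-n+1)/365."""
    p_distinct = 1.0
    for i in range(n):
        p_distinct *= (365 - i) / 365
    return 1 - p_distinct

def monte_carlo(n: int = 23, trials: int = 100_000) -> float:
    """Draw `trials` random groups of n birthdays and count how often
    at least one birthday repeats within a group."""
    hits = 0
    for _ in range(trials):
        days = [random.randrange(365) for _ in range(n)]
        if len(set(days)) < n:   # a duplicate collapses the set
            hits += 1
    return hits / trials
```

`analytic(23)` gives about 0.507, and the simulation hovers around the same value, matching the 50.7% answer above.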

Still not satisfied, I ran it for a while longer, and as the number of people increased, the probability of sharing a birthday approached 1. In other words, in a standard domestic primary or secondary school class, the probability of two people sharing a birthday is very high.

For potential cognitive biases, my usual approach is like these two pieces of code: first conduct theoretical analysis, then validate through practical data.

There are also some things that not only have intuitive biases but cannot be validated at the moment. For instance, my first interaction on Weibo about Bitcoin happened when Bitcoin was $20, and my article criticizing Ethereum's memory hard was published when ETH was $0.8. At that time, not only did intuition tell you that this numerical game might go to zero, but no one could foresee or deduce the later uncontrollable situation.

Ruffchain was controversial from day one. Even the track itself is filled with various extreme interpretations. We do not have time to explain, nor is it necessary to explain. If I were a critic, I could find all sorts of information interpretations every day and make some irresponsible predictions. However, my vision for Ruffchain is not complicated; my energy and limited ability only allow me to do the following:

  1. Build a user-friendly public chain that separates transactions and contracts, rather than implementing transactions through contracts like ETH or packaging a contract with a transaction like BCH.

  2. Form a one-stop solution based on the business data of some suitable industry clients. This way, anyone can use blockchain services at an absolutely low cost in the future.

I have often criticized Ethereum's technology, but its low-cost, low-threshold contracts have maintained the top position in smart contracts until now.

This development has taken nearly two years, with countless iterations, projections, and optimizations for potential block size growth, and repeated adjustments to the p2p network protocol (which will require another major upgrade in the future). The workload has been much larger than I expected. In this process, I found that the entire industry is much earlier than I anticipated; many foundational technologies have not been well established. We still do not have a high-performance distributed database for random read and write, nor do we have sufficiently usable p2p networks to support various blockchain businesses in the future. Apart from a few major chains' technological iterations and upgrades, there are no decent engineering solutions on the market, and many technological innovations in white papers are completely unusable under the current infrastructure.

But this also presents an opportunity. Once these technologies mature, the visible application scenarios will become richer. Just like artificial intelligence was most popular in the 1950s, it wasn't until later, with the rise of GPU arrays and other hardware acceleration capabilities, and then FPGA to ASIC, that we had the ability to truly apply artificial intelligence to everyday scenarios.

Regarding industry access, fortunately, our target clients have given us a lot of support, willingly acting as our guinea pigs, providing rich testing business data, and encouraging us to improve our technology. This represents that blockchain is not a pseudo-demand; these clients need to solve the trust relationship of multi-party operational data and acknowledge that this is a part of their business flow with extremely high costs.

However, this part also presents other challenges: we need to put some potential clients' applications on-chain and provide technical support services. Like companies such as BAT, this requires proactive regulatory filing, and the filing requirements are extensive. Take content regulation: since blockchain data is immutable, once content that violates the laws and regulations of the application's jurisdiction is generated and packaged, rolling it back and cleaning it up consumes more energy than the technical development itself, and is work the team did not anticipate.

After the mainnet goes live, it will enter a regulatory trial operation phase, during which the block data will be completely cleared after the trial operation, and any unfiltered block information that violates local laws and regulations will be marked by nodes and forcibly rolled back.

  1. A new p2p network protocol, better storage and retrieval optimizations, stronger performance, and can support some business scenarios with higher network requirements.

  2. We need the support of community power to improve a rich open technology ecosystem, hoping to have a richer application library, middleware library, business modules, and various optimization and security modules in the coming year. To this end, the Ruffchain Foundation will provide generous rewards.

  3. More business cooperation; applications will bring more technical feedback, and the foundation of business landing is the main direction for the future technical evolution of the mainnet. Compliance remains a challenge; once asset data is on-chain, the workload for security and privacy is essential. If financial data is involved, the workload will increase by an order of magnitude.

  4. The cold start of the mainnet and subsequent development rely on the support of community users. We were born at the end of the bull market and the beginning of the bear market. In a bull market, everything is "faith," while in a bear market, everything is "scams." The pressure of operation has always been high, and in the future, I will continue to focus more on evangelism and business development, with recent activities such as AMAs and node plans being launched successively.

Forks are completed. Without an EVM and large state trees, to maintain BCH's centralization while supporting contracts, the omni protocol can only utilize the op-return field in the existing data structure. Although USDT has been implemented on the BTC mainnet, it is still far from being a contract.

First, the op-return is still part of a transaction, meaning that executing a contract requires a real transaction to carry it. Since every transaction must be signed, changing the field would invalidate the signature, so normal BCH transactions cannot load external information into the op-return field. This leads to a large number of new BCH transactions serving purely as carriers, which may be meaningless and create significant network pressure. Additionally, besides paying WHC (Wormhole's native fuel), BCH fees must also be paid.

Second, without an EVM, the cost of interpreting a pure-text ledger is significant, and even erroneous input must be interpreted: anything written into an op-return field is considered valid by default, and once a large number of tokens are supported, the situation becomes very worrying.

According to Wormhole's development plan, the first phase is to port omni over; the second phase is a decentralized exchange protocol similar to 0x; the third phase supports non-fungible tokens like ERC-721; and the fourth phase introduces off-chain settlement mechanisms like Plasma.

However, in my view, this path is strangely designed. Following my earlier analysis, it should look like this: the main chain can support extending a ledger through a single field, and it is best not to create a large number of transactions. So once Wormhole plans to support dapps, it should have a layer-2 system that interprets and executes a batch of contracts and ultimately creates one transaction on BCH to package and submit the results. The unit of measurement for WHC should be gas, and this gas can be priced by evaluating network resources to determine a reasonable amount of contract execution per block (a quantity that tends to stabilize).

The correct order should be: 1. Port omni. 2. Achieve decentralized settlement between BCH and tokens within a single transaction. (There is a pitfall here: since ETH itself is not an ERC-20 token, protocols like 0x still need a wrapped-ETH token to assist, and BCH faces the same issue.) 3. Establish an independent extension layer responsible for packaging WHC contracts and submitting them in batches. 4. Catch up as new technologies emerge.

When I advised the Huobi public chain, the first paragraph of the plan was its strategic positioning: what the motivation is and why to do this, and only then did we set technical and operational goals.

What is the purpose of supporting contracts on the BCH network? What is BCH's fate? The Bitcoin white paper describes a peer-to-peer payment system, not a super ledger and operating system supporting numerous distributed applications.

The historical mission of Wormhole is not to kill Ethereum or fight EOS. Alipay is not a speculation system; Alipay is a bank.

In terms of system performance, CDNs play an important role in optimizing network delivery. I originally wanted to learn about the principles and technologies behind P2P CDNs, but I didn't expect that learning about P2P itself would take so long, and P2P turned out to differ in many ways from my original understanding.

Although P2P systems have only recently become popular, the technical forerunners of P2P have existed for a long time. Early examples include NNTP and SMTP, as well as the Internet routing system, all of which are largely decentralized and rely on participants' resource contributions. However, the nodes in those systems were centrally administered, and the protocols were not self-organizing.

What were those inspiring P2P systems like?

  1. The Past of P2P Is Like Smoke

Peer-to-peer (P2P) computing is not a new technology. Travel back in time to 1999, before the Internet bubble burst: the release of three systems triggered the first wave of P2P enthusiasm:

Napster music sharing system

Freenet anonymous data storage

SETI@home volunteer-based scientific computing project.

Napster allowed users to download music directly from each other's computers over the Internet. Because bandwidth-intensive music downloads occurred directly between users' computers, Napster saved enormous operational costs and could provide free services to millions of users. Although legal issues ultimately determined Napster's fate, the idea of cooperative resource sharing inspired many other applications.

Freenet aims to combine distributed storage with content distribution, censorship resistance, and anonymity. It is still alive today, though it is associated with the "dark web," which we won't dwell on here. SETI@home is the largest and most widely influential distributed computing project in the world. SETI (Search for Extraterrestrial Intelligence) is a scientific experiment searching for intelligent life beyond Earth: radio telescopes listen for narrowband radio signals from space, which are not produced naturally, so detecting such a signal could prove the existence of extraterrestrial civilizations.

On May 17, 1999, SETI@home officially began operation, attracting a massive number of volunteers worldwide. The SETI@home project is headquartered at the Space Sciences Laboratory at the University of California, Berkeley, and the signal data it records and analyzes came from the Arecibo Observatory in Puerto Rico, which once housed the world's largest single-dish radio telescope, 305 meters in diameter (a record that now belongs to China's FAST telescope). On March 31, 2020, SETI@home officially announced it would enter hibernation: the volunteer computing resources distributed around the globe would no longer receive new data packets. The data required for analysis had been fully processed, and researchers would focus on back-end analysis of the results and publish the project's findings. To some extent, SETI@home may be the most successful P2P project of all.

More than twenty years have passed, and P2P technology has long outgrown music sharing, anonymous data storage, and scientific computing; it is increasingly applied in open-source communities and industry. With the success of Skype, and the prominence of P2P in blockchain projects such as IPFS and Filecoin, P2P technology has once again attracted significant research attention.

  2. What Is P2P?

What exactly does "P2P system" mean?

P2P networks, or peer-to-peer computer networks, are a distributed application architecture that distributes tasks and workloads among peers, forming a networking or network form of the peer-to-peer computing model at the application layer. — Baidu Encyclopedia

However, in reality, P2P does not have a unified and complete definition.

2.1 Characteristics of P2P

Generally, P2P systems are distributed systems with three typical characteristics: decentralization, self-organization, and multiple management domains.

2.1.1 Highly Decentralized

Peer nodes implement both client and server functionality, and most of the system's state and tasks are dynamically distributed among peer nodes, with almost no dedicated nodes holding centralized state. Therefore, most of the computation, bandwidth, and storage the system's operation requires is provided by participating nodes.

2.1.2 Self-Organization

Once a node is introduced into the system (by providing the IP address of a participating node and some necessary key data), maintaining the system requires almost no manual configuration.

2.1.3 Multiple Management Domains

Participating nodes are not owned and controlled by a single organization or institution. Generally, each node is owned and operated by an independent individual who voluntarily joins the system.

2.2 Classification of P2P Systems

P2P systems can be roughly classified by the presence or absence of centralized components.

2.2.1 Semi-Centralized P2P Systems

In such systems, a dedicated controller node maintains the set of member nodes and controls the system. Semi-centralized P2P systems are relatively simple and can be managed by a single organization through the controller. For example, early versions of BitTorrent had a "tracker" that kept track of which nodes were uploading and downloading the same content and periodically supplied each node with a set of peers it could connect to; the BOINC volunteer computing platform has a website that maintains membership and allocates computing tasks; Skype had a central website providing login, account management, and payment services.

Semi-centralized P2P systems can also provide organic growth and rich resources. However, they do not necessarily provide scalability and resilience, as the controller forms a potential bottleneck and single point of failure.
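To make the controller's role concrete, here is a toy sketch of a BitTorrent-style tracker. The class and method names are hypothetical illustrations, not BitTorrent's actual wire protocol: the tracker simply maps a content ID to the peers currently in its swarm and replies to each announce with a sample of other peers.

```python
import random

class Tracker:
    """Toy sketch of a BitTorrent-style tracker: the controller node
    that maps content IDs to the peers currently sharing them."""

    def __init__(self, sample_size=3):
        self.swarms = {}              # content_id -> set of peer addresses
        self.sample_size = sample_size

    def announce(self, content_id, peer_addr):
        # A peer announces that it is uploading/downloading this content.
        self.swarms.setdefault(content_id, set()).add(peer_addr)
        # Reply with a small random sample of other peers to connect to.
        others = list(self.swarms[content_id] - {peer_addr})
        return random.sample(others, min(self.sample_size, len(others)))

tracker = Tracker()
tracker.announce("ubuntu.iso", "10.0.0.1:6881")
tracker.announce("ubuntu.iso", "10.0.0.2:6881")
peers = tracker.announce("ubuntu.iso", "10.0.0.3:6881")
print(peers)  # a sample of the other two peers
```

Note how the controller is the single place where membership lives: lose the tracker, and peers can no longer find each other, which is exactly the bottleneck discussed above.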

2.2.2 Decentralized P2P Systems

In decentralized P2P systems, no dedicated node is critical to the operation of the system and there are no inherent bottlenecks, allowing good scalability and potential resilience against failures, attacks, and legal challenges.

In some decentralized P2P systems, nodes with abundant resources, high availability, and publicly routable IP addresses act as supernodes. These supernodes take on additional responsibilities, such as serving as rendezvous points for nodes behind firewalls, storing state, or maintaining indexes of available content. Supernodes can improve the efficiency of P2P systems but may also increase their vulnerability to node failures.


  3. Advantages and Disadvantages of P2P

These technical characteristics give P2P systems inherent advantages:

3.1 Low Deployment Threshold

Since P2P systems rarely or never require dedicated infrastructure, the upfront investment required to deploy a P2P service is often much lower than that of a client/server (CS) system.

3.2 Organic Growth

Since resources are provided by participating nodes, P2P systems can grow almost indefinitely without needing to "upgrade" existing infrastructure, such as replacing servers with more powerful hardware.

3.3 Adaptability to Failures and Attacks

P2P systems are highly adaptable to failures because very few nodes are critical to the operation of the system. To attack or shut down a P2P system, an attacker must attack a majority of the nodes simultaneously.

3.4 Richness and Diversity of Resources

Popular P2P systems have resources so rich that few organizations could provide them alone. These resources span different hardware, software architectures, networks, power supplies, geographical locations, and jurisdictions. This diversity reduces vulnerability to cascading failures, attacks, and even censorship.

Everything has two sides, and P2P is no exception. For example, the decentralization of P2P can help citizens avoid censorship; at the same time, it can also be abused to evade law enforcement and support criminal activities. P2P systems face many challenges, raising concerns about their manageability, security, and enforceability. Additionally, P2P applications are reshaping the traffic that Internet service providers must carry and may disrupt the current Internet economy.

  4. How Do P2P Systems Work?

The most important technology in a P2P system is the overlay network it builds, whose routing must keep working under high churn (frequent node arrivals and departures). More specific issues in P2P scenarios include maintaining application state, coordinating application-level nodes, and distributing content.

4.1 Maintenance of the Overlay Network

P2P systems maintain an overlay network, which can be thought of as a directed graph G = (N, E), where N is the set of participating computers and E is the set of overlay links. A pair of nodes connected by a link in E know each other's IP addresses and communicate directly over the Internet.

In semi-centralized P2P systems, new nodes join the overlay network by connecting to a controller located at a known domain name or IP address. Therefore, the overlay layer initially has a star topology, with the controller at the center. Dynamic overlay links can form between participants introduced by the controller.

In decentralized P2P systems, new nodes need to obtain the network address (e.g., IP address and port number) of a node that has already participated in the system through an external channel. For example, such addresses of bootstrap nodes can be obtained from a web site, and new nodes contact bootstrap nodes to join the overlay network.

4.1.1 Unstructured Overlay Networks

In unstructured P2P systems, there are no constraints on which nodes may link to which, so the overlay graph has no particular structure. In a typical unstructured P2P system, a newly joined node forms its initial links by repeatedly executing a random walk starting from a bootstrap node and requesting a link to the node where the walk terminates. By executing more random walks, a node gains additional links.

Typically, a minimum node degree is maintained to keep the overlay graph connected, while a maximum node degree is enforced to bound the overhead of maintaining overlay links.
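The join-by-random-walk procedure can be sketched in a few lines. This is a minimal illustration under simplifying assumptions (bidirectional links, a fixed walk length, and a small hard-coded bootstrap overlay); real systems would also enforce a maximum degree and handle churn:

```python
import random

def random_walk(adj, start, length):
    """Walk `length` random hops over the overlay graph `adj` and
    return the terminating node."""
    node = start
    for _ in range(length):
        node = random.choice(sorted(adj[node]))
    return node

def join(adj, new_node, bootstrap, min_degree=2, walk_length=5):
    """Sketch of unstructured-overlay join: repeat random walks from the
    bootstrap node and link to each walk's endpoint until the newcomer
    reaches the minimum degree."""
    adj[new_node] = set()
    while len(adj[new_node]) < min_degree:
        peer = random_walk(adj, bootstrap, walk_length)
        if peer != new_node:
            adj[new_node].add(peer)   # links are bidirectional here
            adj[peer].add(new_node)

# Tiny existing overlay: a triangle of nodes 0, 1, 2.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
join(adj, 3, bootstrap=0)
print(len(adj[3]))  # → 2
```

Each extra walk gives the newcomer another roughly random link, which is what keeps the overlay graph well mixed.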

4.1.2 Structured Overlay Networks

In structured overlay networks, each node has a unique identifier in a numeric space, which allows nodes to be evenly distributed in that space. The overlay graph has a specific structure: a node's identifier determines its position in the topology and constrains its set of overlay links.

Keys are drawn from the same space used to assign node identifiers. The key space is partitioned among the nodes, and each key is mapped to exactly one of the current overlay nodes through a simple function. For example, a key can be mapped to the node whose identifier is closest to the key in the counterclockwise direction; in this scheme, the key space is treated as circular.

The structure of the overlay exists to make key-based routing (KBR) efficient. A key-based routing scheme implements KBR(n0, k): given a starting node n0 and a key k, it produces a path, i.e., a sequence of overlay nodes, ending at the node responsible for key k. In general, such schemes balance the amount of routing state kept at each node against the number of forwarding hops needed to deliver a message.
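The circular key-to-node mapping can be illustrated with a toy ring. This is a minimal sketch under assumed conventions: a small 16-bit identifier space, and keys mapped to the first node identifier at or after the key (one common variant; the "closest counterclockwise" rule above is an equivalent convention). A full KBR scheme would route toward this node hop by hop rather than compute it locally:

```python
import hashlib

RING_BITS = 16  # a small identifier space, just for the sketch

def node_id(name):
    """Hash a node name into the circular identifier space."""
    return int(hashlib.sha1(name.encode()).hexdigest(), 16) % (2 ** RING_BITS)

def responsible_node(node_ids, key):
    """Map a key to the first node identifier at or after it on the ring,
    wrapping around at the top of the circular key space."""
    for nid in sorted(node_ids):
        if nid >= key:
            return nid
    return min(node_ids)  # wrap around

nodes = [10, 100, 1000]                # identifiers already on the ring
print(responsible_node(nodes, 50))     # → 100
print(responsible_node(nodes, 2000))   # → 10 (wraps around)
```

Because the mapping is a pure function of the current node set, any node that knows the ring can compute which peer owns a key, which is the property KBR exploits.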

Thus, in semi-centralized P2P systems, the controller facilitates the formation of the overlay layer. In other P2P systems, the maintenance of the overlay network is entirely decentralized. Compared to unstructured overlay networks, structured overlay networks require more resources to maintain a specific graph structure. In return, structured overlays can effectively execute key-based forwarding methods.

The choice between unstructured and structured overlays depends on whether the application benefits from key-based routing and on how frequently the overlay membership changes. Key-based routing can reliably and efficiently locate uniquely identified data items and maintain a spanning tree among member nodes. However, maintaining a structured overlay under high churn has real costs, and if the application does not need what key-based routing provides, these costs should be avoided.

Of course, some P2P systems use both structured and unstructured overlays simultaneously; for example, the controller uses a structured overlay while content distribution can use an unstructured overlay.

4.2 Distributed Network State

Most P2P systems maintain some application-specific distributed state. Generally, this state is treated as a collection of objects with unique keys. Maintaining distributed state involves mechanisms for both storing and locating these objects.

4.2.1 State in Semi-Centralized Systems

In semi-centralized P2P systems, state objects are typically stored on the nodes that insert them and on any nodes that subsequently download them. The controller node maintains information about which objects exist in the system (their keys, names, and other attributes) as well as which nodes currently store them. A query for a given key, or for keywords matching object names or attributes, is directed to the controller, which responds with a set of nodes from which the corresponding state objects can be downloaded.

4.2.2 State in Unstructured Systems

As in semi-centralized systems, content is typically stored on the nodes that introduce it into the system and replicated on downloaders. To make content easier to find, some systems place copies of (or pointers to) inserted objects on other nodes, for example along a random walk through the overlay graph.

To locate an object, a querying node typically floods request messages through the overlay. A query can specify the desired object by key, metadata, or keywords. Nodes that receive the query and hold matching objects (or pointers to them) respond to the querying node. As an example, suppose node i inserts an object and keeps its only copy, but places pointers to the object on every node along a random walk ending at node r. When node s tries to locate the object, it floods queries outward: first to all nodes one hop away, then to all nodes two hops away, and so on. In the final step, the query reaches node r, which returns node i's address.

Typically, the scope of the flood is limited so as to balance the probability of finding an object that exists in the system against the number of messages required. An alternative to flooding is for the querying node to send its request along a random walk through the overlay.
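The bounded flood described above amounts to a breadth-first search with a hop limit. A minimal sketch, where the topology and the `have_object` predicate are illustrative assumptions:

```python
from collections import deque

def flooded_search(adj, start, have_object, max_hops):
    """Bounded flood: forward the query to all nodes one hop away,
    then two hops away, and so on, up to `max_hops`; return the first
    node found that holds (a pointer to) the object, else None."""
    seen = {start}
    frontier = deque([(start, 0)])
    while frontier:
        node, hops = frontier.popleft()
        if have_object(node):
            return node
        if hops < max_hops:
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    frontier.append((nbr, hops + 1))
    return None  # flooding scope exhausted without a hit

# Line topology 0-1-2-3; only node 3 stores the object.
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(flooded_search(adj, 0, lambda n: n == 3, max_hops=3))  # → 3
print(flooded_search(adj, 0, lambda n: n == 3, max_hops=2))  # → None
```

The second call shows the trade-off in action: a smaller hop limit saves messages but can miss objects that do exist in the system.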

4.2.3 State in Structured Overlay Networks

In structured overlay networks, distributed state is maintained in a Distributed Hash Table (DHT). A DHT exposes the same put/get interface as a conventional hash table. Through a simple mapping function, inserted key/value pairs are distributed among the nodes of the structured overlay.

Given this placement strategy, the DHT's put and get operations can be implemented directly on top of the KBR primitive. To insert (put) a key/value pair, we use KBR to find the node responsible for key k and store the pair there; that node then propagates it to the replica set for k. The responsible node can answer retrieval (get) requests itself or forward them to a node in the replica set.

When DHT membership churns, the mapping of keys to nodes changes, and key/value pairs must migrate between nodes. To minimize the network traffic this requires, large data objects are typically not inserted into the DHT directly; instead, the value stored under the key is an indirection pointer to the node that holds the actual data.
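Putting the pieces together, the put/get interface over a ring-partitioned key space can be sketched as follows. `ToyDHT` is a hypothetical stand-in that resolves the responsible node locally instead of routing via KBR, and it omits replication and churn handling:

```python
class ToyDHT:
    """Sketch of a DHT over a structured overlay: each key/value pair
    is stored on the node responsible for the key (here, the first node
    identifier at or after the key on a circular space)."""

    def __init__(self, node_ids, space=2 ** 16):
        self.space = space
        self.ring = sorted(node_ids)
        self.store = {nid: {} for nid in self.ring}  # per-node local storage

    def _responsible(self, key):
        # Stands in for the KBR primitive described in the text.
        for nid in self.ring:
            if nid >= key % self.space:
                return nid
        return self.ring[0]  # wrap around the circular key space

    def put(self, key, value):
        self.store[self._responsible(key)][key] = value

    def get(self, key):
        return self.store[self._responsible(key)].get(key)

dht = ToyDHT([100, 2000, 40000])
# Following the indirection idea above, we store a small pointer,
# not the large object itself.
dht.put(1500, "pointer -> node that holds the actual data")
print(dht.get(1500))
```

Because both put and get compute the same responsible node, a lookup always lands where the pair was stored, as long as membership has not changed in between; handling that change is exactly the migration problem described above.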

Thus, unstructured overlays are very effective at locating widely replicated objects, while KBR-based systems can reliably and efficiently locate any object present in the system, no matter how rare. In other words, unstructured overlays excel at finding hay, while structured overlays excel at finding needles in the haystack. Unstructured networks support arbitrary keyword-based queries, while KBR-based systems directly support only exact-key lookups, not keyword queries.

4.3 Distributed Coordination

Typically, a set of nodes in a P2P application must coordinate their actions without centralized control. For example, the set of nodes replicating a particular object must notify each other of updates to that object. Similarly, nodes interested in receiving a particular streaming channel may wish to find, among the nodes currently receiving that channel, nearby peers with spare upstream bandwidth.

Generally, there are two approaches to this problem: epidemic (gossip) techniques, in which information spreads through the system like a virus, and tree-based techniques, in which a distribution tree is formed to propagate information.

Here, we will focus only on decentralized P2P systems, as in semi-centralized systems, the controller node can complete coordination.

4.3.1 Coordination in Unstructured Overlay Networks

In unstructured overlay networks, coordination typically relies on epidemic (gossip) techniques. In these protocols, information spreads much as an infection spreads among people: the node that generates a piece of information sends it to its neighbors in the overlay, those neighbors send it to their neighbors, and so on. This propagation method is simple and robust, but there is a trade-off between the speed of dissemination and the messaging overhead. Additionally, if a given piece of information interests only a subset of nodes, and those nodes are widely dispersed across the overlay, the information may end up being transmitted unnecessarily to all nodes.
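This infection-style spread is easy to simulate. A minimal sketch, assuming a ring topology and a fixed per-round fanout (both illustrative choices):

```python
import random

def gossip_rounds(adj, origin, fanout=2):
    """Simulate epidemic dissemination: each round, every informed node
    forwards the message to up to `fanout` random neighbours; count the
    rounds until every node in the (connected) overlay is informed."""
    informed = {origin}
    rounds = 0
    while len(informed) < len(adj):
        newly = set()
        for node in informed:
            nbrs = sorted(adj[node])
            for peer in random.sample(nbrs, min(fanout, len(nbrs))):
                if peer not in informed:
                    newly.add(peer)
        informed |= newly
        rounds += 1
    return rounds

# A small ring of 8 nodes; the message spreads one hop in each
# direction per round, so it takes 4 rounds to cover the ring.
n = 8
adj = {i: {(i - 1) % n, (i + 1) % n} for i in range(n)}
print(gossip_rounds(adj, origin=0))  # → 4
```

Raising the fanout (or using a denser topology) speeds dissemination at the cost of more duplicate messages, which is the speed-versus-overhead trade-off described above.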

A more efficient way to coordinate actions among a group of nodes is to form a spanning tree over them. The spanning tree is embedded in the overlay graph and built using a distributed algorithm. The tree can then be used to multicast messages to all members or to compute aggregates of state variables within the group (e.g., sum, average, minimum, or maximum). However, this increased coordination efficiency comes at the cost of building and maintaining the tree as nodes join and leave.
