Pieter Wuille on Bitcoin Scaling, Sharding, Sidechains, ETH and More

Pieter Wuille, a man easy to acclaim and someone who generally has the respect of bitcoiners of all flavors, was kind enough to take the time to go through some topics on a most important matter: the scalability of decentralized computer networks.

The Bitcoin Core developer and Blockstream co-founder might not be familiar to newcomers, but he is one of the key developers working on extending bitcoin’s scripting language in addition to maintaining and developing the core software.

It is software he has been working on for close to a decade now. There at nearly the beginning, and still there, he is one of perhaps fewer than a handful of people with his level of experience.

So we present below a pretty much full transcript with no commentary and only small cosmetic modifications, as we’ve tried to maintain the ‘feel’ of the real-time interaction, provide the full context, and thus give you the opportunity to form your own opinion. It starts off as something of a warm-up; the better parts come nearer the end of the discussion:

Hi, I wanted to know about this decentralized two-way sidechain peg, have people given up on it?

Pieter Wuille: I don’t know about people in general (at least drivechain and some others are still pursuing it), but I personally have come to the belief that they’re a bad idea.

Why?

In short, because mainchain miners can steal sidechain funds, and the same incentives against that which exist on the mainchain don’t exist (full nodes can’t reject the theft).

So you’re saying sharding is impossible?

That depends on what you call sharding, and I didn’t say impossible – just a bad idea.

Sidechains which are not majority-hashrate protected but full node protected don’t have that problem.

But they also don’t have any scaling advantages over just larger mainchain capacity.

Ok, well, at its most basic I’m thinking of a way to connect, say, two sets of nodes while still under the same rules.

Not sure if you’re familiar with what ethereum is trying to do or plenty of other projects, but interoperability between blockchains basically, you think that’s not possible?

I think they need this for very different reasons. Bitcoin transactions can already be validated in parallel and always have been.

In ethereum, transaction validity depends on the state of the chain, which limits parallelism.

Sharding may make sense as a way to bypass that (though I’m skeptical about the logistics around it).

I don’t think sharding makes sense as a scaling mechanism beyond that.

So you’re saying bitcoin has sharding? You can have 100 nodes hold x data on one ‘network’ and another 100 nodes hold x other data, and somehow combine them to have one network that can now process 2x more?

Sure. That’s not implemented to be clear, but it doesn’t require consensus changes to accomplish.

Run two nodes each of which keeps 50% of the UTXO set, and only treat a block as valid if both say it’s ok.
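
To make this concrete, here is a minimal sketch of the split-validation idea under a deliberately simplified data model (the names, such as `ShardValidator` and `tx["inputs"]`, are hypothetical illustrations, not Bitcoin Core code): each local validator keeps only its half of the UTXO set, both still see every block, and a block is treated as valid only if every half reports no missing or double-spent coins.

```python
# Illustrative sketch only: two local validators each store half of the UTXO
# set (outpoints such as (txid, vout) tuples); a block is accepted only if
# both halves approve. All names here are hypothetical.

def shard_of(outpoint, num_shards=2):
    # Assign each unspent output to a shard deterministically.
    return hash(outpoint) % num_shards

class ShardValidator:
    def __init__(self, shard_id, num_shards=2):
        self.shard_id = shard_id
        self.num_shards = num_shards
        self.utxos = set()  # only this shard's slice of the UTXO set

    def validate_block(self, block):
        """Check the inputs in this shard for missing coins or double spends."""
        spent = set()
        for tx in block:
            for outpoint in tx["inputs"]:
                if shard_of(outpoint, self.num_shards) != self.shard_id:
                    continue  # another shard is responsible for this input
                if outpoint not in self.utxos or outpoint in spent:
                    return False
                spent.add(outpoint)
        return True

    def apply_block(self, block):
        # After acceptance, update this shard's slice of the UTXO set.
        for tx in block:
            self.utxos.difference_update(
                o for o in tx["inputs"] if shard_of(o, self.num_shards) == self.shard_id)
            self.utxos.update(
                o for o in tx["outputs"] if shard_of(o, self.num_shards) == self.shard_id)

def block_is_valid(block, validators):
    # The node still downloads and sees the whole block; only UTXO storage is
    # split. The block counts as valid only if every shard says it is ok.
    return all(v.validate_block(block) for v in validators)
```

As Wuille notes just below, this splits storage and validation work across more hardware, but it does not reduce the total bandwidth, CPU, or I/O the node as a whole performs.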

But that’s incoming transactions, what about the ones that have to be stored?

You don’t need to store transactions.

And then synching new nodes etc.

That’s an easy problem. Bulk data storage and transfer is something the internet solves really well already. Validation is the hard part.

So validation is connection, isn’t it, basically getting these two node networks to talk, like getting two computers to talk and so you have the world wide web.

In what I suggest every node would still see and download and verify every transaction.

This does not reduce bandwidth or CPU or I/O costs, it just lets you do 2x the work with 2x the hardware.

But that’s the current… ohh you mean in what you’re saying more nodes can mean more capacity?

No, I think it’s all a bad idea. Just saying it’s possible… if individual hardware nodes were too weak to keep up with the full chain, this idea would let you split it in N.

So you’ve given up on scaling?

That’s a very unnuanced statement. I think the hardest problems bitcoin is facing in scaling aren’t about on-chain capacity.

Or more bluntly: on-chain capacity does not scale, period – whether you put it on a side-chain, or on a shard, it’s still effectively an everyone-eventually-needs-to-verify-everything problem.

That doesn’t mean there are no improvements possible in that regard, but they’re small constant factors at best. Real ecosystem scaling needs other solutions.

Why does everyone have to verify everything? If you take current bitcoin, for example, in a way 51% of miners is enough isn’t it? Or you can say nodes.

For full validation, everyone needs to validate everything. I don’t think everyone needs to be a full node, but I think everyone should be able to be one if they want to.

I’m not sure what you mean by validation? In current bitcoin, there’s a 51% rule no?

No. A block is valid when it is valid, period. And every node verifies this independently.

How do they verify it? With proof of work right.

Lol, no.

So how do they verify it?

By verifying every signature, checking that no transaction double spends, that no miner is printing more than they’re allowed to, … and dozens of other rules. Those are the consensus rules.

Proof of work is there to create an economic incentive for miners to converge towards a single chain, rather than many alternative chains which are equally valid.

Well exactly, something we all agree on as truth.

So full nodes will treat the most-work *valid* chain as the active one, but they will never accept a chain with an invalid transaction in it, no matter how much work it has.
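
A rough sketch of the distinction Wuille is drawing, using a deliberately simplified block model (fields like `signatures_ok`, `fees`, and `work` are hypothetical placeholders, not Bitcoin Core’s data structures): each node checks all consensus rules itself, and proof of work only ranks chains that have already passed those checks.

```python
# Illustrative sketch only: consensus validity is checked locally by every
# node; proof of work merely selects among chains that are already valid.
MAX_SUBSIDY = 6.25  # simplified: a fixed block subsidy, ignoring halvings

def block_is_consensus_valid(block, utxo_set):
    spent = set()
    for tx in block["txs"]:
        if not tx["signatures_ok"]:                      # every signature verified
            return False
        for outpoint in tx["inputs"]:
            if outpoint not in utxo_set or outpoint in spent:
                return False                             # missing coin or double spend
            spent.add(outpoint)
    if block["coinbase_value"] > MAX_SUBSIDY + block["fees"]:
        return False                                     # no printing extra coins
    # ... real nodes enforce dozens more consensus rules here ...
    return True

def apply_block(block, utxo_set):
    for tx in block["txs"]:
        utxo_set.difference_update(tx["inputs"])
        utxo_set.update(tx["outputs"])

def chain_is_valid(chain):
    utxo_set = set()
    for block in chain:
        if not block_is_consensus_valid(block, utxo_set):
            return False
        apply_block(block, utxo_set)
    return True

def select_active_chain(candidate_chains):
    # A chain with any invalid block is rejected outright, no matter how much
    # work it has; among valid chains, the most accumulated work wins.
    valid = [c for c in candidate_chains if chain_is_valid(c)]
    return max(valid, key=lambda c: sum(b["work"] for b in c), default=None)
```

The point of the sketch is the order of operations: `chain_is_valid` runs before any work comparison, so no amount of proof of work can rescue a chain that breaks a consensus rule.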

They can accept what they want; within a network, what we care about is that chain, isn’t it? Otherwise anyone can code their node as they please.

Anyone can code their node as they please. But if they want to stay on the same currency, their rules have to match others’ nodes.

Exactly, and that’s the problem PoW solves.

No, it does not. PoW is a tiebreaker.

A tiebreaker based on 51%. So how do we get from that to everyone?

It’s a solution for the problem that two independent valid chains can exist, where the same money goes to person A in one, and to person B in another.

Which is THE problem.

Someone has to make the decision about which of the two gets accepted. That is what miners solve, using PoW, in a decentralized way.

Yes and they solve it by 51%.

Everything else is decided by everyone individually.

So how do we get to everyone? But everyone can decide all other things anyway.

Yes. But it’s expensive.

Obviously what we care about is the double spending. That you can’t print coins etc.

Yes.

So if in the current network this is solved by 51%, why can’t we have two networks where 51% of each solves it?

Because if you (as a network) receive two transfers from the other network, you have to verify that those are not double-spends either.

Which means you need to know that everything that happened in that network was valid.

Which is verified in the other network.

How do you know if you don’t verify it?

I don’t verify bitcoin, but I use it. Others do it.

That’s ok, others do.

I can if I want to verify it, but obviously, I don’t verify what medicine I get. Specialization of labour, one can’t verify everything. The important thing is obviously that you can verify it if you want to.

There is no specialization. The problem in a consensus system is that everyone has to agree on everything. Otherwise the network splits along disagreement lines.

So why should we trust the current verifiers and not the network 2 verifiers?

So you have the network 1 verifiers, and the network 2 verifiers and some people who verify both.

And the code. So what we are verifying is sybil resistance. You have the code, validity.

When a transfer from network 2 comes to network 1, then only-network-1 verifiers will blindly accept it.

We’re all on the same code. What we are verifying is proof of work basically.

NO. Please.

Sorry I’m not arguing or even disagreeing, just conversing I guess.

It’s a common misunderstanding that PoW validity implies full validity, but the only reason that this assumption holds is because people are doing the full validation, and would reject work that miners create on an invalid block.

They do full validation how? The code does it.

Yes.

So you have two networks running the same code, but in two PoW networks, storage, etc., maybe?

It’s really a lot more complicated than that to make sure things keep converging under adversarial circumstances.

And in the end, the result is that in order to have the same level of confidence in validity, you need to do 2x as much work.

Well that’s what I wanted to understand, what are the complications and how you think they could be overcome or do you think it’s not worth even trying?

I think the best you can gain by doing so is a small constant factor and always with trade-offs.

My original interest in sidechains wasn’t because of scaling by the way, but to enable experimentation with *different* rules without needing to introduce a new currency first.

Somehow lots of people instead took it as a mechanism for scaling, which I think is flawed – if that’s desirable, increasing mainchain capacity is a far more trust-minimized solution to achieve the same thing, with the same costs.

But you can’t increase mainchain capacity can you?

If everyone agreed that it should be increased, sure you can.

Well do you think it should be increased?

Yes, at some point. I don’t think there is any urgency.

Wouldn’t the problem with that be you have to download a huge amount of data?

Technology improves. No reason why our capacity to distribute and verify can’t increase as well.

But it’s ever-growing data.

I think there are solutions to that.

Like what?

At some timescales, human consensus works better than technology.

If you’re able to, say, get a good way to verify what the state of the network was 5 years ago (have its hash printed in every newspaper, whatever…), you can bootstrap from that point, without needing to download all history before it.

So how would you increase it, like would it be some percentage, maybe the old soft and hard limit, or something else?

Yeah, couple % per year maybe, but that’s still only a small constant factor over anything but huge time scales.
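
To make “small constant factor” concrete, here is a quick back-of-the-envelope calculation; the 2% rate is just an assumed example, not a figure from the interview.

```python
# Capacity growing a couple of percent per year compounds slowly; even over
# decades it only amounts to a small constant factor (2% is an assumed rate).
for years in (10, 20, 50):
    print(f"{years} years -> {1.02 ** years:.2f}x capacity")
# 10 years -> 1.22x capacity
# 20 years -> 1.49x capacity
# 50 years -> 2.69x capacity
```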

So if you could set it right now, how much do you think it should be and growing by how much?

I spent a few years of my life arguing about that topic; I don’t plan to do that again. You can find proposals with my name on them.

Right, are you familiar with what eth is trying to do in regard to sharding? What do you think of their plans?

I think their need for sharding is because of a different problem than bitcoin has, which may justify it. Beyond that, I think it’s a Rube Goldberg machine.

How do you think they could get these different networks to talk? Do you think they can even do it?

I don’t really care.

No, I’m just wondering: if they can, why can’t bitcoin?

Bitcoin has no need for it; I believe the same can be accomplished by increasing capacity – the same is not entirely true for eth because transaction validation isn’t as parallelizable. Beyond that, I think it’s a bad idea as said.

So I’m not hugely technical; what makes UTXO more parallelizable than accounts?

It isn’t so much due to UTXO vs accounts. It’s due to the script system having access to the blockchain state. This isn’t the case in bitcoin.

If I give you a transaction (and the potential unconfirmed data it depends on), you can unambiguously validate once and for all whether that transaction is valid or not in a way that will never change.

Of course, it can only be included in the chain once all the outputs it spends exist (double spending protection), but validation is something you can do in parallel with other transactions.

In eth, transactions modify a global state, and the validity of a future transaction (as well as its effects) depends on that state, which inherently introduces a serialization requirement… transactions may be valid in some order, but not in another order.

Bitcoin of course also has a global state (otherwise you can’t protect against double spending) but it’s extremely restricted; it’s independent from its script system, and only accounts for ‘is this coin available or not.’

So just to clarify here, do you mean this isn’t the case in eth or in bitcoin [quoting]: ‘It’s due to the script system having access to the blockchain state. This isn’t the case in bitcoin.’

In eth, the VM has access to global state. In bitcoin, script does not.
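
A toy contrast of the two models being described, under a heavily simplified representation (the transaction fields and functions below are hypothetical, not real Bitcoin or Ethereum code): UTXO-style transactions can be signature-checked independently and in any order, while account-style transactions that read and write shared state must be applied one after another, since each transaction’s validity depends on everything applied before it.

```python
# Toy model only: contrasts order-independent (UTXO-style) validation with
# order-dependent (account/state-style) execution.
from concurrent.futures import ThreadPoolExecutor

def validate_utxo_tx(tx):
    # Script and signature checks depend only on the transaction itself and
    # the coins it spends, so each tx can be checked once, in any order, and
    # the answer never changes. Only "is this coin still unspent" needs
    # ordering, and that check sits outside the script system.
    return tx["signatures_ok"]

def validate_all_parallel(txs):
    with ThreadPoolExecutor() as pool:
        return all(pool.map(validate_utxo_tx, txs))

def execute_account_txs(txs, balances):
    # Account-style execution: every tx reads and writes shared global state,
    # so validity (and effects) depend on what ran before it. A set of txs can
    # be valid in one order and invalid in another, forcing serialization.
    for tx in txs:
        if balances.get(tx["from"], 0) < tx["amount"]:
            return False
        balances[tx["from"]] -= tx["amount"]
        balances[tx["to"]] = balances.get(tx["to"], 0) + tx["amount"]
    return True
```

In the UTXO model the only shared state is the ‘is this coin still available’ set, which is exactly the restriction Wuille describes above.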

So basically your solution to scalability is to print the latest block in a paper?

That’s again a very unnuanced statement. Scalability means dozens of different things.

If you’re specifically talking about the problem of needing to download and verify an ever-increasing dataset when initially joining the network, I think that one of the solutions is having publicly verifiable commitments to history, and bootstrapping off those.

Those commitments can take many forms.

So how would you do that bootstrapping, technically and practically speaking? Let’s say the paper of record has the latest block, how do nodes now even access this paper?

Your software shows you the hash it downloaded off the network, and asks you to compare it with what you found elsewhere. Or it’s even embedded in the software. Look at the assumeutxo project.
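
A minimal sketch of the bootstrapping pattern being described (the hash constant and helper names below are made-up placeholders; Bitcoin Core’s assumeutxo project is the real design to look at): the software carries, or shows the user, a commitment to a past UTXO snapshot, accepts a downloaded snapshot only if its hash matches that commitment, and then fully validates everything after the snapshot point.

```python
import hashlib

# Hypothetical sketch: the expected snapshot hash is obtained out of band
# (embedded in the software, printed somewhere public, compared by the user),
# and the downloaded snapshot is used only if it matches. Placeholder values.
EXPECTED_SNAPSHOT_HASH = "0" * 64  # stand-in for a published commitment

def load_snapshot(snapshot_bytes):
    actual = hashlib.sha256(snapshot_bytes).hexdigest()
    if actual != EXPECTED_SNAPSHOT_HASH:
        raise ValueError("snapshot does not match the published commitment")
    # From here the node syncs and fully validates only the blocks *after*
    # the snapshot point, instead of re-verifying all history before it.
    return parse_utxo_set(snapshot_bytes)

def parse_utxo_set(snapshot_bytes):
    # Placeholder parser for this illustrative snapshot format.
    return set(snapshot_bytes.split(b"\n"))
```

The trust question then shifts to how the commitment reaches you in the first place, which is the point made next about reproducible builds and auditable review.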

Meaning you need an oracle?

I don’t know what you mean by that. You get the software from somewhere right?

Doesn’t that make the somewhere the new committee?

Not more than the process that results in you finding the software already is (which includes reproducible builds from open source code, auditable review, …).

So you’re saying you merge it in GitHub basically? And then maybe get someone to archive the old history.

I feel you’re trying to put words in my mouth.

No I’m asking, I have no view. Just clarifying and trying to see how it can actually work.

My point is that at some time scale, humans already have processes in place to get correct information – otherwise bitcoin wouldn’t be working in the first place.

I don’t know what that timescale is, but at some point, inherently, there is not much gained anymore by validating infinitely far back.

I also think this is only a minor issue – the question of how to scale an economy w.r.t. on-chain capacity is a much more interesting one.

FIN

Wuille said he had other matters to attend to, so we couldn’t get to that economy-level scaling; maybe another time.

Copyright Trustnodes.com
