Will the size of a Bitcoin Core full node be too big to run on a normal computer?
I am a Bitcoin enthusiast with no background in computer science or cryptography. I once ran Bitcoin Core on my laptop, but found that it occupied too much space on my computer.
I understand the idea that the Bitcoin network is secure as long as any individual can run a full Bitcoin Core node. What concerns me, however, is that one day a full node may become too large to run on a normal computer. Do Bitcoin developers have any solutions to this problem? Or is it really a problem?
I would like to hear responses from cryptocurrency developers or specialists in computer science.
bitcoin-core full-node block-size-increase storage-footprint cost-of-node-option
asked 8 hours ago by Libertarian Monarchist Bot, edited 16 mins ago by Raghav Sood
1 Answer
Bitcoin has to strike a balance in order to remain decentralized. As you've correctly established, this comes down in part to making sure that the resource requirements of fully validating the block chain stay reasonable. There is a constant push from the consumer side to increase Bitcoin's resource usage for convenience or lower transaction fees, but this comes at the cost of decentralization. Where exactly the limits lie is debatable to some extent, but there are clearly hard limits to avoid.
Most notably, mistakes in this area of the system design are difficult to undo: going too far with resource usage is effectively a permanent decision, so you will find that the choices made have been as conservative as reasonable.
We have the following limitations to contend with:
- Size of the block chain. It is often advantageous to store previously validated information, even if it is not immediately required for the operation of the system: it makes wallet management easier and allows easy re-synchronization if required. The cost of this storage cannot exceed what is reasonable, nor can it exceed the size of volumes available on the mass market.
- Growth of the block chain on disk. This is currently bounded by the block size limit. Given that 6 blocks occur per hour on average, and that each can be at most 4 MiB when fully saturated with SegWit transactions, growth is bounded at roughly 210 GiB per year (a back-of-the-envelope calculation follows this list).
- Nodes can discard these blocks once validated to avoid storing them; this is what Bitcoin Core's prune option does (the RPC sketch after this list shows how to check a node's disk footprint and prune status). But the entire set must still be transmitted to them in some form, typically over the internet, though it could be done via Blu-ray, FedEx, or carrier pigeon if that were cheaper or necessary. This bounds the rate of growth, which must not exceed the ability of users to obtain the data needed to validate.
- Size of the UTXO set. The Unspent Transaction Output database stores the units of Bitcoin that have not yet been spent. It is consensus-critical and must be stored by every fully validating node in the network. This database has looser restrictions on its growth, and is effectively bounded by the block size as well, since the block size implicitly limits the number of entries that can be added per block. Unfortunately, there is not much to be done to reduce the impact of its roughly 4 GiB of on-disk storage without much larger economic changes to Bitcoin that would allow entries to be "archived".
- Validation complexity. The transactions within the chain carry a cost to validate. Bitcoin uses ECDSA because it is extremely space-efficient and suitable for the task, but it is not particularly fast even in fully optimized implementations. Synchronizing the Bitcoin chain involves billions of individual SHA256 and ECDSA operations, which puts a hard limit on how many a consumer processor can perform in a reasonable amount of time (a rough throughput sketch closes the examples after this list). The growth of the chain should not exceed the ability of reasonable consumer hardware to complete validation.
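To make the growth bound concrete, here is the arithmetic from the second point as a small Python sketch. The 6-blocks-per-hour average and the 4 MiB ceiling are the figures given above; the rest is unit conversion.

```python
# Worst-case block chain growth, using the figures from the answer:
# ~6 blocks per hour, at most 4 MiB per block under full SegWit load.
BLOCKS_PER_HOUR = 6
MAX_BLOCK_MIB = 4
HOURS_PER_YEAR = 24 * 365

blocks_per_year = BLOCKS_PER_HOUR * HOURS_PER_YEAR   # 52,560 blocks
growth_mib = blocks_per_year * MAX_BLOCK_MIB         # 210,240 MiB
growth_gib = growth_mib / 1024                       # ~205 GiB

print(f"{blocks_per_year} blocks/year, worst case ~{growth_gib:.0f} GiB/year")
```

This lands at about 205 GiB per year, in line with the roughly 210 GiB figure quoted above; actual growth is lower because blocks are rarely at the weight ceiling.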
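If you already run a node, you can check these storage numbers directly over Bitcoin Core's JSON-RPC interface. A minimal sketch, assuming a local mainnet node with rpcuser/rpcpassword configured in bitcoin.conf (the credentials below are placeholders, and exact field names can vary between Bitcoin Core versions):

```python
import json
import requests  # third-party: pip install requests

RPC_URL = "http://127.0.0.1:8332"   # default mainnet RPC port
AUTH = ("rpcuser", "rpcpassword")   # placeholders: use your own

def rpc(method, params=None):
    """Call a Bitcoin Core JSON-RPC method and return its result."""
    payload = {"jsonrpc": "1.0", "id": "probe",
               "method": method, "params": params or []}
    resp = requests.post(RPC_URL, auth=AUTH, data=json.dumps(payload))
    resp.raise_for_status()
    return resp.json()["result"]

chain = rpc("getblockchaininfo")
print("chain size on disk:", chain["size_on_disk"], "bytes")
print("pruned node:", chain["pruned"])

# gettxoutsetinfo scans the whole UTXO database, so it can take a while.
utxo = rpc("gettxoutsetinfo")
print("unspent outputs:", utxo["txouts"])
print("UTXO database size:", utxo["disk_size"], "bytes")
```

Setting prune=550 (or higher, in MiB) in bitcoin.conf enables block-file pruning, which is the discard-after-validation behaviour described in the third point.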
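The validation-cost point can be sanity-checked as well. The sketch below times raw double-SHA256 throughput using only Python's standard library; it is purely illustrative, since real synchronization cost is dominated by ECDSA verification and Bitcoin Core's optimized C++ is far faster than Python.

```python
import hashlib
import time

# Time double-SHA256 (the hash Bitcoin applies pervasively) over
# 80 bytes, the size of a block header. Illustration only.
data = b"\x00" * 80
N = 1_000_000

start = time.perf_counter()
for _ in range(N):
    hashlib.sha256(hashlib.sha256(data).digest()).digest()
elapsed = time.perf_counter() - start

rate = N / elapsed
print(f"~{rate:,.0f} double-SHA256/sec in pure Python")
# Extrapolate to an illustrative (not measured) count of 2 billion
# operations, the order of magnitude the answer mentions:
print(f"~{2e9 / rate / 3600:.1f} hours for 2e9 operations at this rate")
```

Even allowing for the large speedup of native code, the point stands: validation work scales with chain history, so it has to stay within what consumer CPUs can finish in a reasonable time.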
answered 8 hours ago by Anonymous