Scalability And Efficiency

Discussion in 'Technical Development' started by CraigM, Dec 31, 2015.

  1. CraigM

    CraigM New Member

    Dec 31, 2015
    How does the scalability of Decred compare to Bitcoin's? If we just get the same design with a few constant-factor improvements, that's nice, but it's no real long-term solution. Is the assumption that these issues will be addressed in the future, and that Decred will handle them better than Bitcoin thanks to its better voting and funding system? I know these won't be issues at launch, but we have all seen what happened to Bitcoin... I'm probably never going to bother syncing my Bitcoin node again.

    1: Blockchain size vs. total transactions in history: If I understand correctly, all validating nodes still store some data for every transaction that has ever been made. The "Segregated Witness" approach cuts this roughly in half if you don't bother to store the input data, though I don't see how you can verify an output is unspent without that, so I'm not sure who could actually discard it. Are there any other optimizations in the disk/memory consumption area for Decred (e.g. a way to keep the full collection of unspent outputs in a bounded size that does not grow linearly with the number of transactions that have occurred)?

    2: Transaction latency: What's the block rate? Any differences from Bitcoin as far as how many blocks give how much confidence?

    3: Bandwidth to run a full validating node: Any improvements over Bitcoin, or does every node still have to carry a copy of every single transaction (i.e. inherently not cheap to operate or scalable, unlike most commercial trust-based systems)?

    4: Computation costs for running a full validating node: Any improvements here compared to Bitcoin? The more flexible scripting options seem like they could significantly raise the costs.

    5: What is the expected maximum transaction rate? Will it be hard-limited? How will the resources needed compare to running a Bitcoin node at its current maximum rate?

    6: What are the options for running less than a full validating node? Some details on dropping the input data (aka "Segregated Witness") would be good, but also any other options. If this currency is going to last, something with bounded space and computation use would be good.
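    The unspent-output question in point 1 can be made concrete with a toy sketch (hypothetical Python, not Decred's actual storage code): validation only ever consults the set of currently unspent outputs, which is why data for spent history is in principle prunable while the UTXO set itself is not, and why the set's size tracks unspent outputs rather than total transactions.

    ```python
    # Toy UTXO-set bookkeeping (illustrative only): each transaction is
    # validated purely against the set of unspent outputs, and spending an
    # output deletes it, so old history never needs to be consulted again.

    utxo_set = {}  # (txid, output_index) -> amount

    def apply_transaction(txid, inputs, outputs):
        """inputs: list of (prev_txid, index) refs; outputs: list of amounts."""
        # Every input must reference an existing unspent output...
        if any(ref not in utxo_set for ref in inputs):
            raise ValueError("spends a missing or already-spent output")
        # ...and spending removes it from the set.
        for ref in inputs:
            del utxo_set[ref]
        for i, amount in enumerate(outputs):
            utxo_set[(txid, i)] = amount

    apply_transaction("coinbase1", [], [50])
    apply_transaction("tx2", [("coinbase1", 0)], [30, 20])
    print(len(utxo_set))  # 2: only tx2's outputs remain unspent
    ```

    Note the set still grows with the number of *unspent* outputs, so this doesn't by itself give the constant-bounded storage asked about in point 1.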
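    On point 2, confirmation confidence: Nakamoto's catch-up calculation from the Bitcoin whitepaper applies to any proof-of-work chain, so a shorter block time mainly changes how quickly you reach z confirmations, not the confidence at a given z. A sketch of that calculation (the whitepaper's formula, nothing Decred-specific):

    ```python
    # Probability that an attacker with hashrate share q eventually
    # overtakes an honest chain that is z blocks ahead (Bitcoin
    # whitepaper, section 11).
    from math import exp, factorial

    def attacker_success(q, z):
        p = 1.0 - q               # honest hashrate share
        lam = z * (q / p)         # expected attacker progress while we mine z
        s = 1.0
        for k in range(z + 1):
            poisson = exp(-lam) * lam**k / factorial(k)
            s -= poisson * (1.0 - (q / p) ** (z - k))
        return s

    print(attacker_success(0.1, 6))  # ~0.0002 against a 10% attacker
    ```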

  2. sambiohazard

    sambiohazard Sr. Member

    Jan 21, 2016
    1. SegWit is an accounting trick: it doesn't make blocks smaller by excluding any data, but rather stores the signatures in a separate part of the block, so there is more room in the space allocated for transaction data. It is also a soft fork, since older nodes just ignore the separately stored signatures. I also don't understand how old nodes verify transactions if they just ignore the signatures; can old nodes be fooled?
    2. If you mean the block time, it is 5 minutes.
    I will leave the rest for the devs to answer, although I made a suggestion to "hire" or contact Dave Hudson so that we can run tests of statistical models of future growth & decide parameters based on that, rather than spitballing or using random numbers. I think we need to learn from Bitcoin here & try to make most parameters auto-adjusting, like the difficulty.
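    The "auto-adjusting like the difficulty" idea can be sketched with Bitcoin-style retargeting. This is a hypothetical illustration: the 144-block window and 4x clamp follow the Bitcoin convention, and only the 5-minute block time comes from this thread.

    ```python
    # Sketch of difficulty retargeting: scale the target by how long the
    # last window actually took versus how long it should have taken,
    # clamped so one window can move difficulty at most 4x either way.
    def retarget(old_target, actual_window_secs,
                 block_time_secs=300, window=144):
        expected = block_time_secs * window
        ratio = max(0.25, min(4.0, actual_window_secs / expected))
        return old_target * ratio  # larger target = lower difficulty

    # Blocks arrived at 150s instead of 300s (twice too fast),
    # so the target halves and difficulty doubles.
    print(retarget(1000.0, 144 * 150))  # 500.0
    ```

    Any consensus parameter driven by an on-chain observable could in principle be adjusted by a feedback rule of this shape, which is the appeal over hand-picked constants.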
