
AMA: Ask Mike Anything

Hello again. It's been a while.
People have been emailing me about once a week for the last year to ask if I'm coming back to Bitcoin now that Bitcoin Cash exists. And a couple of weeks ago I was summoned on a thread called "Ask Mike Hearn Anything", but that had nothing to do with me and I was on holiday in Japan at the time. So I figured I should just answer all the different questions in one place rather than keep doing it individually over email.
Firstly, thanks for the kind words on this sub. I don't take part anymore but I still visit occasionally to see what people are talking about, and seeing people post nice messages is a pleasant change from three years ago.
Secondly, who am I? Some new Bitcoiners might not know.
I am Satoshi.
Just kidding. I'm not Satoshi. I was a Bitcoin developer for about five years, from 2010-2015. I was also one of the first Bitcoin users, sending my first coins in April 2009 (to SN), about 4 months after the genesis block. I worked on various things:
You can see a trend here - I was always interested in developing peer to peer decentralised applications that used Bitcoin.
But what I'm best known for is my role in the block size debate/civil war, documented by Nathaniel Popper in the New York Times. I spent most of 2015 writing extensively about why various proposals from the small-block/Blockstream faction weren't going to work (e.g. on replace by fee, lightning network, what would occur if no hard fork happened, soft forks, scaling conferences etc). After Blockstream successfully took over Bitcoin Core and expelled anyone who opposed them, Gavin and I forked Bitcoin Core to create Bitcoin XT, the first alternative node implementation to gain any serious usage. The creation of XT led to the imposition of censorship across all Bitcoin discussion forums and news outlets, resulted in the creation of this sub, and prompted Core supporters to pay a botnet operator to force XT nodes offline with DDoS attacks. They also convinced the miners and wider community to do nothing for years, resulting in the eventual overload of the main network.
I left the project at the start of 2016, documenting my reasons and what I expected to happen in my final essay on Bitcoin in which I said I considered it a failed experiment. Along with the article in the New York Times this pierced the censorship, made the wider world aware of what was going on, and thus my last gift to the community was a 20% drop in price (it soon recovered).

The last two years

Left Bitcoin ... but not decentralisation. After all that went down I started a new project called Corda. You can think of Corda as Bitcoin++, but modified for industrial use cases where a decentralised p2p database is more immediately useful than a new coin.
Corda incorporates many ideas I had back when I was working on Bitcoin but couldn't implement due to lack of time or resources, because of ideological wars, or because they were too technically radical for the community. So even though it doesn't provide a new cryptocurrency out of the box, it might be interesting for the Bitcoin Cash community to study anyway. By resigning myself to Bitcoin's fate and joining R3 I could go back to the drawing board and design with a lot more freedom, creating something inspired by Bitcoin's protocol but incorporating all the experience we gained writing Bitcoin apps over the years.
The most common question I'm asked is whether I'd come back and work on Bitcoin again. The obvious followup question is: come back and work on what? If you want to see some of the ideas I'd have been exploring if things had worked out differently, go read the Corda tech white paper. Here are a few of the things it might be worth asking about:
I don't plan on returning to Bitcoin but if you'd like to know what sort of things I'd have been researching or doing, ask about these things.
edit: Richard pointed out some essays he wrote that might be useful, Enterprise blockchains for cryptocurrency experts and New to Corda? Start here!
submitted by mike_hearn to btc

ColossusXT Q2 AMA Ends!

Thank you for being a part of the ColossusXT Reddit AMA! Below we summarize the questions and answers. The team responded to 78 questions! If your question was not included, it may have been answered in a previous question. The ColossusXT team will do a Reddit AMA at the end of every quarter.
The winner of the Q2 AMA Contest is: Shenbatu
Q: Why does your blockchain exist and what makes it unique?
A: ColossusXT exists to provide an energy efficient method of supercomputing. ColossusXT is unique in many ways. Some coins have one layer of privacy; ColossusXT and the Colossus Grid will utilize two layers of privacy through obfuscation, the Zerocoin protocol, and I2P, and these will protect users of the Colossus Grid as they utilize grid resources. There are also masternodes and Proof of Stake, both of which contribute to reducing 51% attacks, along with instant transactions and zero-fee transactions. This protection is paramount as ColossusXT evolves into the Colossus Grid. Grid computing will have a pivotal role throughout the world, and what this means is that users will begin to experience the Internet as a seamless computational universe. Software applications, databases, sensors, video and audio streams: all will be reborn as services that live in cyberspace, assembling and reassembling themselves on the fly to meet the tasks at hand. Once plugged into the grid, a desktop machine will draw computational horsepower from all the other computers on the grid.
Q: What is the Colossus Grid?
A: ColossusXT is an anonymous blockchain through obfuscation, Zerocoin Protocol, along with utilization of I2P. These features will protect end user privacy as ColossusXT evolves into the Colossus Grid. The Colossus Grid will connect devices in a peer-to-peer network enabling users and applications to rent the cycles and storage of other users’ machines. This marketplace of computing power and storage will exclusively run on COLX currency. These resources will be used to complete tasks requiring any amount of computation time and capacity, or allow end users to store data anonymously across the COLX decentralized network. Today, such resources are supplied by entities such as centralized cloud providers which are constrained by closed networks, proprietary payment systems, and hard-coded provisioning operations. Any user ranging from a single PC owner to a large data center can share resources through Colossus Grid and get paid in COLX for their contributions. Renters of computing power or storage space, on the other hand, may do so at low prices compared to the usual market prices because they are only using resources that already exist.
Q: When will zerocoin be fully integrated?
A: A beta has been released for community testing on Test-Net. As soon as all the developers consider the code ready for Main-Net, it will be released. Testing the code on a larger test network will ensure a smooth transition.
Q: Is the end goal for the Colossus Grid to act as a decentralized cloud service, a resource pool for COLX users, or something else?
A: Colossus Grid will act as a grid computing resource pool for any user running a COLX node. How and why we apply the grid to solve world problems will be an ever evolving story.
Q: What do you think the marketing role is in COLX? When will the in-wallet shared nodes be available? I know it's been stated in the roadmap, but as you don't follow the roadmap strictly and often deliver things in advance, I hope shared masternodes will be available soon.
A: The ColossusXT (COLX) roadmap is a fluid design philosophy that changes as the project evolves and our community grows. Our goal is to deliver a working product to the market while also adding useful features for the community to thrive on; perhaps the Colossus Grid and shared masternodes will both be available by the end of Q4 2018.
Q: When will your github be open to the public?
A: The GitHub has been open to the public for a few months now.
You can view the GitHub here: https://github.com/ColossusCoinXT
The latest commits here: https://github.com/ColossusCoinXT/ColossusCoinXT/commits/master
Q: Why should I use COLX instead of Monero?
A: ColossusXT offers Proof of Stake and masternodes, both of which contribute layers of protection against the 51% attacks often associated with Proof of Work consensus, and as a Proof of Stake coin ColossusXT is environmentally friendly compared to Proof of Work (Monero). You can generate passive income from staking and masternodes while helping secure the network. What really sets ColossusXT apart from Monero, and many other privacy projects being worked on right now, is the Colossus Grid. Once plugged into the Colossus Grid, a desktop machine will draw computational horsepower from all the other computers on the grid. Blockchain was built on the core value of decentralization, and ColossusXT adheres to these standards with end-user privacy in mind in the technology sector.
Q: With so many coins out with little to no purpose let alone a definitive use case, how will COLX distinguish itself from the crowd?
A: You are right, there are thousands of other coins. Many have no purpose, and we will see others "pumping" from day to day. It is the nature of markets and crypto, as groups move from coin to coin to make a quick profit. As blockchain regulations and information are made more easily digestible, projects like ColossusXT will rise. Our goal is to produce a quality product that will be used globally to solve technical problems; in doing so, grid computing on the ColossusXT network could create markets of its own by utilizing supercomputing resources. ColossusXT is more than just a currency, and our steadfast approach to producing technical accomplishments will not go unnoticed.
Q: Tell the crowd something about the I2P integration plan in the roadmap? 🙂
A: ColossusXT will be moving the I2P network layer up in the roadmap to meet the quicker development pace of the Colossus Grid. The I2P layer will serve as an abstraction layer further obfuscating the users of ColossusXT (COLX) nodes. An abstraction layer allows two parties to communicate in an anonymous manner, and this network is optimised for anonymous file-sharing.
Q: What kind of protocols, if any, are being considered to prevent or punish misuse of Colossus Grid resources by bad actors, such as participation in a botnet/denial of service attack or the storage of stolen information across the Grid?
A: What defines bad actors? ColossusXT plans on marketing to governments and cyber security companies globally, entities and individuals who will certainly want their privacy protected. There is a grey area between good and bad, and that is something we can certainly explore as a community. Do you have any ideas to contribute to this evolving question? What we mean when we say marketing towards security companies and governments is that the grid would be utilized for some of their projects, innovating new ways of grid computing.
Security: https://wiki.ncsa.illinois.edu/display/cybersec/Projects+and+Software
Governments: https://www.techwalla.com/articles/what-are-the-uses-of-a-supercomputer
Q: The Colossus Grid is well defined, but I don't feel it is easily digestible. Has there been any talk of developing an easier to understand marketing plan to help broaden the investor/adopter base?
A: As we get closer to the release of the Colossus Grid, marketing for it will increase. It will have a user friendly UI, and we will provide guides and FAQs with the release, so that any user intending to share computing power will be able to understand it.
Q: Can you compare CollossusXT and Golem?
A: Yes. The Colossus Grid is similar to other grid computing projects. The difference is that ColossusXT is on its own blockchain and does not rely on the speed or congestion of a third-party blockchain. The Colossus Grid has a privacy focus and will market to companies and individuals who would like to be more discreet when buying or selling resources, by offering multiple levels of privacy protection.
Q: How do you plan to become one of the leaders among privacy coins?
A: Being a privacy coin leader is not our end game. Privacy features are just a small portion of our framework. The Colossus Grid will include privacy features, but a decentralized Supercomputer is what will set us apart and we intend to be leading this industry in the coming years as our vision, and development continue to grow and scale with technology.
Q: With multiple coins within this space, data storage and privacy, how do you plan to differentiate COLX from the rest? Any further partnerships planned?
A: The Colossus Grid will differentiate ColossusXT from coins within the privacy space, and the ColossusXT blockchain will differentiate us from the data storage space. We combine these two features with the ability to buy and sell computing power to complete different computational tasks through a decentralized marketplace. We intend to involve more businesses and individuals within the community, and when the beta is available we will invite many companies to join in connecting the grid, to utilize shared resources and reduce energy waste globally.
Q: Has colossus grid had the best come up out of all crypto coins?
A: Possibly. ColossusXT will continue to “come up” as we approach the launch of the Colossus Grid network.
Q: How far has Colossus gone with ATM integration?
A: ColossusXT intends to and will play an important role in the mass adoption of cryptocurrencies. We already have an ongoing partnership with PolisPay which will enable use of COLX via debit cards. Along with this established relationship, the ColossusXT team is in touch with other companies that may adopt COLX widely; these can only be disclosed upon mutual agreement.
Q: How does COLX intend to disrupt the computing industry through Grid Computing?
A: Using the Colossus Grid on the ColossusXT blockchain strengthens the network. Computers sit idle for huge portions of the day, and connecting to the Colossus Grid to contribute those idle resources can make use of all the computing power going to waste. This assists in advancing multiple technology sectors and solving issues: reducing costs and waste and increasing speed in sectors such as scientific research, machine learning, and cyber security, while making it possible for anyone with a desktop PC to contribute resources to the Colossus Grid and earn passive income.
Q: What kind of partnerships do you have planned and can you share any of them? :)
A: The ColossusXT team will announce partnerships when they are available. It’s important to finalize all information and create strong avenues of communication between partners ColossusXT works with in the future. We are currently speaking with many different exchanges, merchants, and discussing options within our technology sector for utilizing the Colossus Grid.
Q: Will shared Masternodes be offered by the COLX team? Or will there be any partnerships with something like StakingLab, StakeUnited, or SimplePosPool? StakingLab allows investors of any size to join their shared Masternodes, so any investor of any size can join. Is this a possibility in the future?
A: ColossusXT has already partnered with StakingLab. We also plan to implement shared Masternodes in the desktop wallet.
Q: How innovative is the Colossus Grid in the privacy coin space?
A: Most privacy coins are focused on being just a currency / form of payment. No other project is attempting to do what we are doing with a focus on user privacy.
Q: Hey guys, have you thought about integrating with other platforms like Bancor? I would like that!
A: ColossusXT is in touch with many exchange platforms; however, due to non-disclosure agreements, details cannot be shared until it is mutually decided with the partners. We will always be looking for new platforms to spread the use of COLX in different parts of the world and the crypto space.
Q: What is the reward system for masternode owners?
A: From block 388,800 onwards, the block reward is 1200 COLX, split between masternode owners and stakers based on a see-saw algorithm. With an increasing number of masternodes, the see-saw algorithm disincentivizes the establishment of even more masternodes because it lowers their profitability. To be precise, as soon as more than 41.5% of the total COLX coin supply is locked in masternodes, more than 50% of the block reward will be distributed to regular staking nodes. As long as the amount of locked collateral funds is below the 41.5% threshold, the see-saw algorithm ensures that running a masternode is financially more attractive than running a simple staking node, to compensate for the additional effort that a masternode requires in comparison to a simple staking node. Please refer to our whitepaper for more information.
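To make the threshold behaviour concrete, here is a minimal Python sketch of a see-saw split. The real COLX see-saw formula differs (see the whitepaper); the linear curve and the 60% starting share below are illustrative assumptions, chosen only so the masternode share crosses 50% exactly at the 41.5% threshold described above.

```python
# Minimal sketch of a see-saw reward split. Illustrative only: the actual
# COLX see-saw uses a different curve; the constants here are assumptions.
BLOCK_REWARD = 1200.0  # COLX per block from block 388,800 onwards
THRESHOLD = 0.415      # fraction of total supply locked in masternodes

def seesaw_split(locked_fraction: float):
    """Return (masternode_reward, staker_reward) for one block."""
    # Hypothetical linear curve: 60% masternode share when nothing is locked,
    # sliding through exactly 50% at the 41.5% threshold.
    mn_share = 0.6 - 0.1 * (locked_fraction / THRESHOLD)
    mn_share = max(0.0, min(1.0, mn_share))
    mn_reward = BLOCK_REWARD * mn_share
    return mn_reward, BLOCK_REWARD - mn_reward

for locked in (0.30, 0.415, 0.50):
    mn, stake = seesaw_split(locked)
    print(f"{locked:.1%} locked -> masternodes {mn:.0f}, stakers {stake:.0f}")
```

Below the threshold the masternode share stays above 50%, and above it the staker share dominates, matching the behaviour the answer describes.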
Q: What other marketplaces has the COLX team been in contact with?
Thanks guys! Love the coin and staff
A: ColossusXT gets in touch with different platforms based on community requests, and also based on partnership requests received, upon the ColossusXT business team's mutual agreement. Unfortunately, these possibilities cannot be shared until they are mutually agreed between the partners and the ColossusXT team, due to non-disclosure agreements.
Q: What do you think about the new rules that will soon govern crypto interactions in the EU? they are against anonymous payments
A: Blockchain technology is just now starting to become clear to different governments.
ColossusXT's privacy features protect the end-user from oversharing personal information, as you are probably aware from the multiple emails you've received recently from many websites whose privacy policies are being updated and expanded.
The use of privacy features with utility coins like ColossusXT should become a regular norm throughout blockchain. This movement is about decentralization as much as it is about improving technology.
While this news may have a role to play, I don't think it is THE role that will continuously be played as blockchain technology is implemented throughout the world.
Q: Any hints on the next big feature implementation you guys are working on? According to road map - really excited to hear more about the Shared MN and the scale of the marketplace!
A: Current work is focused on the privacy layer of Colossus Grid and completing the updated wallet interface.
Q: Why choose COLX? Or should I say, why should we believe COLX will become what you promise in the roadmap? How are you different from all the other privacy coins that already have an established blockchain?
A: ColossusXT is an environmentally friendly Proof of Stake coin with masternode technology, providing dual layers of protection from 51% attacks. It includes privacy features that protect users while they utilize resources from the Colossus Grid. Some of the previous questions within this AMA may also answer this question.
Q: What tradeoffs do you have using the Colossus Grid versus the more typical distribution?
A: The advantage of supercomputers is that since data can move between processors rapidly, all of the processors can work together on the same tasks. Supercomputers are suited for highly complex, real-time applications and simulations. However, supercomputers are very expensive to build and maintain, as they consist of a large array of top-of-the-line processors, fast memory, custom hardware, and expensive cooling systems. They also do not scale well, since their complexity makes it difficult to easily add more processors to such a precisely designed and finely tuned system.
By contrast, the advantage of distributed systems (like the Colossus Grid) is that relative to supercomputers they are much less expensive. Many distributed systems make use of cheap, off-the-shelf computers for processors and memory, which only require minimal cooling costs. In addition, they are simpler to scale, as adding an additional processor to the system often consists of little more than connecting it to the network. However, unlike supercomputers, which send data short distances via sophisticated and highly optimized connections, distributed systems must move data from processor to processor over slower networks, making them unsuitable for many real-time applications.
Q: Why should I choose Colossus instead of another 100,000 altcoins?
A: Many of these altcoins are very different projects. ColossusXT is the only grid computing project with a focus on user privacy. We have instant transactions and zero-fee transactions, and ColossusXT is one of the very few coins to offer live support. Check out our whitepaper!
Q: Will there be an option (in the future) to choose between an anonymous or public transaction?
A: Zerocoin is an evolution of the current coin mixing feature. Both allow an individual to decide how they would like to send their transactions.
Q: What exchange has highest volume for ColossusXT, and are there any plans for top exchanges soon ?
A: Currently Cryptopia carries the majority of ColossusXT volume. We are speaking with many different exchanges, and preparing requested documentation for different exchanges. ColossusXT intends to be traded on every major exchange globally.
Q: What is the TPS speed that colx blockchain achieves?
A: ColossusXT currently achieves between 65 and 67 TPS, depending on network conditions.
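As a sanity check, a throughput figure like this follows from block size, block interval, and average transaction size. The parameters below are assumptions for illustration (not confirmed COLX values), picked only to show how a number in the quoted range can arise:

```python
# Back-of-envelope TPS estimate. All three parameters are assumptions for
# illustration, not confirmed COLX protocol values.
block_size_bytes = 1_000_000  # assume ~1 MB blocks
block_interval_s = 60         # assume a 60-second block target
avg_tx_bytes = 250            # assume ~250-byte average transactions

tps = block_size_bytes / avg_tx_bytes / block_interval_s
print(f"~{tps:.1f} TPS")      # ~66.7, consistent with the quoted 65-67 range
```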
Q: Plans on expanding the dev team?
A: As development funds allow it, the team will be expanded. Development costs are high for a unique product like ColossusXT, and a good majority of our budget is allocated to it.
Q: Can you explain what the Colossus Grid is and what the full purpose of the ColossusXT Grid project is?
A: The Colossus Grid is explained in the whitepaper. The uses for grid computing and storage are vast, and we are only starting to scratch the surface of what this type of computing power can do. There is also a description of the Colossus Grid earlier in this AMA.
Q: Is there mobile wallet for Android and iOS? If not, is there a roadmap?
A: The Android wallet is out of beta and on the Google Play Store; an iOS wallet is planned for development.
The roadmap can be found here: https://colossusxt.io/roadmap/
Q: Is ColossusXT planning on partnering up with other cryptocurrency projects? Such as: Bread and EQUAL.
A: ColossusXT plans on partnering with other crypto projects that make sense. We look for projects that can help alleviate some of our development work or provide quality-of-life upgrades to our investors, so that we can focus on Colossus Grid development. We absolutely love it when the community comes to us with great projects to explore.
Q: Did you ever consider a coinburn? Don't you think a coinburn would increase the COLX price and sustain mass adoption? Do you plan on keeping the price of COLX in a range so that potential big investors can invest in a less volatile project?
A: There are no plans to do a coinburn at this time. Please check out our section in the whitepaper about the supply.
Q: What is the next big exchange COLX will be listed on?
A: There are several exchanges that will be listing ColossusXT soon. Stay tuned for updates within the community as some have already been announced and future announcements.
  1. CryptalDash
  2. NextExchange
  3. CoinPulse
  4. CoinSwitch (Crowdfunding)
  5. Plaak (Crowdfunding)
Q: How will Colx compete with other privacy coins which claim to be better like Privacy?
A: ColossusXT is not competing with other privacy coins. ColossusXT will evolve into the Colossus Grid, which is built on the backbone of a privacy blockchain. In our vision, all these other privacy coins are competing for relevancy with ColossusXT. There are also similar responses to other questions in this AMA that touch on specifics.
Q: Does COLX have a finite number of coins like bitcoin?
A: No, ColossusXT is Proof of Stake. https://en.wikipedia.org/wiki/Proof-of-stake
Q: What are the advantages of COLX over other competitor coins (eg. ECA)?
A: The only similarity between ColossusXT and Electra is that we are both privacy blockchains. ColossusXT is very much an entirely different project than any other privacy coin in the blockchain world today. The Colossus Grid will be a huge advantage over any other privacy coin, offering the ability for a desktop machine to rent power from others contributing to the Colossus Grid and compute high-level tasks.
Q: How do you feel about some countries frowning upon privacy coins and how do you plan to change their minds (and what do you plan to do about it?)
A: The ColossusXT team tries to view opinions from multiple perspectives so that we can understand each line of thinking. As blockchain technology becomes more widely adopted, so will the understanding of the importance of the privacy features within ColossusXT. Privacy is freedom.
Q: How do you see COLX in disrupting cloud gaming services such as PlayStation Now?
A: Cloud gaming services have not been discussed. Initial marketing of our private grid computing framework will be targeted at home users, governments, and cyber security firms who may require more discretion and anonymity in their work.
Q: Since COLX is a privacy coin known for private transactions, through which money laundering and scams could take place, would COLX and its community be affected by this? And if so, how could we try to prevent it?
A: ColossusXT intends to be known for the Colossus Grid. Colossus Grid development will be moved up from Q1 2019 to Q3 2018 to reflect this message and prevent further miscommunication about what privacy means for the future of ColossusXT. Previous answers within this AMA may further elaborate on this question.
Q: When do you plan to list your coin on other "bigger" exchanges?
A: ColossusXT is speaking with many different exchanges. These things have many different factors. Exchanges decide on listing dates and we expect to see ColossusXT listed on larger exchanges as we approach the Colossus Grid Beta. The governance system can further assist in funding.
Q: What was the rationale behind naming your coin ColossusXT?
A: Colossus was a set of computers developed by British codebreakers in the years 1943–1945. XT symbolises ‘extended’ as the coin was forked from the original Cv2 coin.
Q: Can you give any details about the E Commerce Marketplace, and its progress?
A: The Ecommerce Marketplace is a project that will receive attention after our development pass on important privacy features for the grid. In general, our roadmap will be changing to put an emphasis on grid development.
Q: How will someone access the grid, and how will you monetize using the grid? Will there be an interface that charges COLX for time on the grid or data usage?
A: The Colossus Grid will be integrated within the ColossusXT wallet. Buying & Selling resources will happen within the wallet interface. You won't be able to charge for "time" on the grid, and have access to unlimited resources. The goal is to have users input what resources they need, and the price they are willing to pay. The Colossus Grid will then look for people selling resources at a value the buyer is willing to pay. Time may come into play based on which resources you are specifically asking for.
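As a rough illustration of the matching logic described in this answer, consider the sketch below. The Offer structure, CPU-hours as the unit, and cheapest-first selection are hypothetical choices for illustration, not the actual wallet implementation:

```python
# Hypothetical sketch of buyer/seller matching as described above: buyers
# state what they need and their maximum price; the grid fills the request
# from the cheapest available sellers. Not the actual COLX implementation.
from dataclasses import dataclass

@dataclass
class Offer:
    seller: str
    cpu_hours: float
    price_per_hour: float  # price in COLX

def match(offers, need_hours, max_price):
    """Pick the cheapest offers until the requested CPU-hours are covered."""
    matched, remaining = [], need_hours
    for offer in sorted(offers, key=lambda o: o.price_per_hour):
        if offer.price_per_hour > max_price:
            break  # everything further is too expensive
        matched.append(offer)
        remaining -= offer.cpu_hours
        if remaining <= 0:
            return matched  # request fully covered
    return []  # demand cannot be met at this price

offers = [Offer("alice", 5, 2.0), Offer("bob", 10, 1.5), Offer("carol", 4, 3.0)]
print(match(offers, need_hours=12, max_price=2.5))  # bob, then alice
```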
Q: Are there any plans to launch an official YouTube channel with instructional videos about basic use of the wallets and features of COLX? Most people are visual learners and pick up wallets much faster when they see it done before trying it themselves. This might attract people to ColossusXT and also teach people about basic use of blockchain and cryptocurrency wallets. I ask this because I see a lot of users on Discord and Telegram who are still learning and asking a lot of really basic questions.
A: ColossusXT has an official YouTube channel with instructional videos: https://www.youtube.com/channel/UCCmMLUSK4YoxKvrLoKJnzng
Q: What are the USPs of COLX in comparison to other privacy coins?
A: Privacy coins are a dime a dozen. ColossusXT has different end goals than most privacy coins, and this cannot be stated enough: our goal is not just to be another currency, but to build a sophisticated computing resource sharing architecture on top of the privacy blockchain.
Q: A new exchange would probably bring more liquidity to our coin. If you could choose 3 exchanges to get COLX listed on, what would be your top 3?
A: ColossusXT intends to be listed on all major exchanges globally. :)
Q: What is the future of privacy coins? What will be the future colx userbase (beyond the first adopters and enthusiasts)?
A: The future of privacy is the same as it has always been: privacy is something each and every person owns until they give it away to someone else. Who is in control of your privacy? You, or another person or entity? The future ColossusXT user base will comprise early adopters, enthusiasts, computer science professionals, and artificial intelligence and computational linguistics professionals, all of whom can utilize the Colossus Grid for a wide range of needs.
Q: Will ColossusXT join more exchanges soon??
A: Yes. :)
Q: So when will Colossus put out more advertisements on the various social media sites to get better known? Like YouTube videos, etc.
A: As we get closer to a product launch of the Colossus Grid, you’ll begin to see more advertisements, YouTubers, and interviews. We’re looking to also provide some presentations at blockchain conferences in 2018, and 2019.
Q: In your opinion, what are some of the issues holding COLX back from wider adoption? In that vein, what are some of the steps the team is considering to help address those issues?
A: One of the main issues holding ColossusXT back from wider adoption is that our endgame, the Colossus Grid, is very different from other privacy coins'. In order to address this, the ColossusXT team intends to have a Colossus Grid beta out by the end of Q4, and we will move development of the Colossus Grid from Q1 2019 to Q3 2018.
Q: Or to see it from another perspective - what are some of the biggest issues with crypto-currency and how does COLX address those issues?
A: The biggest issue is that cryptocurrency is seen as a means to make quick money, with attention going to whichever project gets the biggest "pump" of the week, and there is not enough focus on building blockchain technologies that solve problems or create legitimate business use cases.
For the most part we believe the base of ColossusXT supporters see our end-game, and are willing to provide us with the time and support to complete our vision. The ColossusXT team keeps its head down and keeps pushing forward.
Q: I know it's still early in the development phase but can you give a little insight into what to look forward to regarding In-wallet voting and proposals system for the community? How much power will the community have regarding the direction COLX development takes in the future?
A: The budget and proposal system is detailed in the whitepaper. Masternode owners vote on and guide the development of ColossusXT by voting on proposals put forth by the community and business partners.
Our goal is to make this process as easy and accessible as possible to our community.
Q: Will there be an article explaining the significance of each partnership formed thus far?
A: Yes, the ColossusXT team will announce partners on social media, and community outlets. A detailed article of what partnerships mean will be available on our Medium page: https://medium.com/@colossusxt
Q: What potential output from the Grid is expected, and what would its use be?
For example, x teraflops which could process y solutions to protein folding in z time.
A: There are many uses for grid computing: a crypto enthusiast mining crypto, a cyber security professional cracking a password using brute force, or a scientist producing climate prediction models.
The resources available to put towards grid projects will be determined by the number of nodes sharing resources, and the amount of resources an individual is willing to purchase with COLX.
All individuals will not have access to infinite grid resources.
Q: Is there a paper wallet available?
A: Yes, see https://mycolxwallet.org
Q: Is there a possibility of implementing quantum computer measures in the future?
A: This is a great idea for potentially another project in the future; currently it is not possible with the Colossus Grid. Instead of the bits that conventional computers use, a quantum computer uses quantum bits, known as qubits. In classical computing, a bit is a single piece of information that can exist in two states, 1 or 0. A qubit is a quantum system with two states, but unlike a usual bit it can store much more information than just 1 or 0, because it can exist in any superposition of these values.
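In standard notation, the superposition described above is written as follows (a textbook identity, nothing COLX-specific):

```latex
% A qubit is a superposition of the two basis states:
\[
\lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
\qquad \alpha, \beta \in \mathbb{C},
\qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1 .
\]
```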
Q: Do you plan to do a coin burn?
A: No future coin burns are planned. Anything like this would go through a governance proposal and Masternode owners would vote on this. This is not anything we’ve seen within the community being discussed.
Q: Can I check the exact number of current COLX masternodes and staking nodes?
A: Yes. You can view the Masternodes and the amount of ColossusXT (COLX) being staked by viewing the block explorer.
Block explorer: https://chainz.cryptoid.info/colx/#!extraction
Q: What incentive could we give a youtuber to do the BEST video of ColossusXT (COLX)?
A: We've been approached by several YouTubers. The best thing a YouTuber can do is understand what ColossusXT is, join the community, ask questions if there is something they don't understand.
The problem with many YouTubers is that some of them are just trying to get paid, they don't really care to provide context or research a project.
Disclaimer: This is not all YouTubers, but many.
Q: In which ways is the Colossus Grid different from other supercomputer / distributed computing projects out there? Golem comes to mind. Thanks!
A: The main difference is that we are focused on the end users privacy, and the types of users that we will be targeting will be those that need more discretion / anonymity in their work. We are building framework that will continue to push the boundaries of user privacy as it relates to grid computing.
Q: Can we please complete our roadmap ahead of schedule? I find most other coins that do this actually excel in terms of price and community members. Keep on top of the game :)
A: The ColossusXT roadmap is a very fluid document, and it is always evolving. Some items are moved up in priority, and others are moved back. The roadmap should not be thought of as something set in stone.
Q: Does COLX have master nodes?
A: Yes. ColossusXT has masternodes.
Q: Have you thought about providing a way to insert a COLX payment option into any page that wants to use cryptocurrencies, in a fast and simple way, to drive mass adoption?
A: There is already an option for this: https://mycryptocheckout.com/coins/
Q: What do you think of your community's progress so far?
A: The community has grown greatly in the last 3 months. We’re very excited to go from 13 to 100 questions in our quarterly AMA. Discord, Telegram, and Twitter are growing everyday.
Q: I noticed on the roadmap: Coinomi and Shapeshift wallet integration. Can you tell me more about this? I am new to crypto and a new COLX investor, so I don't know much about this. Thanks, and keep up the good work.
A: Coinomi is a universal wallet, and ColossusXT will have multiple wallet platforms available to it. Shapeshift allows you to switch one crypto directly for another without the use of an intermediary coin (BTC).
Q: Is "A general-purpose decentralized marketplace" written in the whitepaper the same as "E-COMMERCE MARKETPLACE" written on the roadmap?
Please tell me about "A general-purpose decentralized marketplace" or "E-COMMERCE MARKETPLACE" in detail.
A: Details will be posted as we get closer to the marketplace. It will be similar to other marketplaces within blockchain. Stay tuned for more information by following us on Twitter.
Q: History has shown that feature-based technologies always get replaced by platforms that incorporate those features; what is Colossus's big picture?
A: The Colossus Grid. Which has been explained within this AMA in a few different ways.
Q: What are the main objectives for the COLX team this year? Give me 5 reasons why COLX will survive from a long-term perspective. Have you considered masternodes working in a private, easy-to-set-up wallet on a DEX network? Already a big fan, have a nice day!
A: Going into Q3, our main objective is to get a working product of the Colossus Grid by the end of Q4.
  1. Community - Our community is growing every day as knowledge about what we're building grows. When the Colossus Grid is online, we expect expansion to continue at a rapid pace as users connect to share resources.
  2. Team - The ColossusXT team will continue to grow. We are stewards of a great community and an amazing project, providing a level of support through Discord that is currently unseen in many other projects. The team cohesion and activity within the community set a standard we intend to uphold across blockchain communities.
  3. Features - ColossusXT and the Colossus Grid will have a user friendly UI. We understand the difficulties users face when first entering blockchain products: the confusion between keys, sending/receiving addresses, and the available features. Guides will always be published for Windows/Mac/Linux, with updates, so that these features can be easily understood.
  4. Colossus Grid - The Colossus Grid answers real world problems, and provides multiple solutions while also reducing energy consumption.
  5. Use Case - Many of the 1000+ other coins on the market don’t have the current use-case that ColossusXT has, let alone the expansion of utility use-cases in multiple sectors.
Q: Will the whitepaper be available in Portuguese?
A: Yes. We will be adding some language bounties to the website in the future. Stay tuned.
Q: I notice in your white paper there are future plans for decentralised governance and masternode voting. While all that is great, how do you plan on mitigating malicious proposals from getting through by gaming the system (i.e. bot votes, multiple accounts, spam, etc.)?
A: You cannot game the system: each masternode gets exactly one vote, and since every masternode requires locked collateral, bot votes and multiple accounts would require acquiring proportionally more COLX.
Q: I've been a massive fan of this project since December last year. Anyway, what was the reason you guys put XT at the end of Colossus? :)
A: XT symbolizes ‘extended’ as the coin was forked from the original Cv2 coin.
Q: Do you plan a partnership within the banking industry to capitalize on such large amounts of money being moved continuously?
A: The focus will be on the Colossus Grid and grid computing, with the option to participate in the financial sector of blockchain through PolisPay and other partnerships that may be announced in the future.
Q: When will COLX be supported by the Ledger wallet?
A: Integration with a cold storage wallet is planned. I myself (PioyPioyPioy) have a Ledger Nano S and I love it!
Q: Where do you see yourself in five years?
A: The goal 5 years from now is to be a leading competitor in cloud computing and storage, providing governments, private cybersecurity firms, and individuals with efficient supercomputing and cloud storage solutions through blockchain infrastructure. I would like to see hardware options for connecting to the grid to utilize resources once the Colossus Grid is online, and I think this can contribute to many use-case scenarios.
Q: How can I suggest business partnerships and strategic ideas etc to the ColossusXT team?
A: Join us in Discord. Members of the team here are active daily, you can also contact us at: [[email protected]](mailto:[email protected])
Q: A great project requires good funding. How do you plan to incorporate fund sourcing and management into the long-term planning of this project?
A: Check out our governance section within the whitepaper. :)
Website: https://colossusxt.io
Whitepaper: https://colossuscoinxt.org/whitepape
Roadmap: https://colossuscoinxt.org/roadmap/
Follow ColossusXT on:
Twitter: https://twitter.com/colossuscoinxt
Facebook Page: https://www.facebook.com/ColossusCoin/
Telegram: https://web.telegram.org/#/im?p=s1245563208_12241980906364004453
Discord: https://discord.gg/WrnAPcx
Apply to join the team: https://docs.google.com/forms/d/1YcOoY6nyCZ6aggJNyMU-Y5me8_gLTHkuDY4SrQPRe-4/viewform?edit_requested=true
Contribute an idea: https://colossusxt.fider.io/
Q2 AMA Questions: https://www.reddit.com/ColossuscoinX/comments/8ppkxf/official_colossusxt_ama_q2/
Previous AMA: https://www.reddit.com/ColossuscoinX/comments/8bia7o/official_colossusxt_ama/
submitted by PioyPioyPioy to ColossuscoinX

The Origins of the Blocksize Debate

On May 4, 2015, Gavin Andresen wrote on his blog:
I was planning to submit a pull request to the 0.11 release of Bitcoin Core that will allow miners to create blocks bigger than one megabyte, starting a little less than a year from now. But this process of peer review turned up a technical issue that needs to get addressed, and I don’t think it can be fixed in time for the first 0.11 release.
I will be writing a series of blog posts, each addressing one argument against raising the maximum block size, or against scheduling a raise right now... please send me an email ([email protected]) if I am missing any arguments
In other words, Gavin proposed a hard fork via a series of blog posts, bypassing all developer communication channels altogether and asking for personal, private emails from anyone interested in discussing the proposal further.
On May 5 (1 day after Gavin submitted his first blog post), Mike Hearn published The capacity cliff on his Medium page. 2 days later, he posted Crash landing. In these posts, he argued:
A common argument for letting Bitcoin blocks fill up is that the outcome won’t be so bad: just a market for fees... this is wrong. I don’t believe fees will become high and stable if Bitcoin runs out of capacity. Instead, I believe Bitcoin will crash.
...a permanent backlog would start to build up... as the backlog grows, nodes will start running out of memory and dying... as Core will accept any transaction that's valid, without any limit, a node crash is eventually inevitable.
He also, in the latter article, explained that he disagreed with Satoshi's vision for how Bitcoin would mature[1][2]:
Neither me nor Gavin believe a fee market will work as a substitute for the inflation subsidy.
Gavin continued to publish the series of blog posts he had announced while Hearn made these predictions. [1][2][3][4][5][6][7]
Matt Corallo brought Gavin's proposal up on the bitcoin-dev mailing list after a few days. He wrote:
Recently there has been a flurry of posts by Gavin at http://gavinandresen.svbtle.com/ which advocate strongly for increasing the maximum block size. However, there hasn't been any discussion on this mailing list in several years as far as I can tell...
So, at the risk of starting a flamewar, I'll provide a little bait to get some responses and hope the discussion opens up into an honest comparison of the tradeoffs here. Certainly a consensus in this kind of technical community should be a basic requirement for any serious commitment to blocksize increase.
Personally, I'm rather strongly against any commitment to a block size increase in the near future. Long-term incentive compatibility requires that there be some fee pressure, and that blocks be relatively consistently full or very nearly full. What we see today are transactions enjoying next-block confirmations with nearly zero pressure to include any fee at all (though many do because it makes wallet code simpler).
This allows the well-funded Bitcoin ecosystem to continue building systems which rely on transactions moving quickly into blocks while pretending these systems scale. Thus, instead of working on technologies which bring Bitcoin's trustlessness to systems which scale beyond a blockchain's necessarily slow and (compared to updating numbers in a database) expensive settlement, the ecosystem as a whole continues to focus on building centralized platforms and advocate for changes to Bitcoin which allow them to maintain the status quo
Shortly thereafter, Corallo explained further:
The point of the hard block size limit is exactly because giving miners free rule to do anything they like with their blocks would allow them to do any number of crazy attacks. The incentives for miners to pick block sizes are nowhere near compatible with what allows the network to continue to run in a decentralized manner.
Tier Nolan considered possible extensions and modifications that might improve Gavin's proposal and argued that soft caps could be used to mitigate against the dangers of a blocksize increase. Tom Harding voiced support for Gavin's proposal.
Peter Todd mentioned that a limited blocksize provides the benefit of protecting against the "perverse incentives" behind potential block withholding attacks.
Slush didn't have a strong opinion one way or the other, and neither did Eric Lombrozo, though Eric was interested in developing hard-fork best practices and wanted to:
explore all the complexities involved with deployment of hard forks. Let’s not just do a one-off ad-hoc thing.
Matt Whitlock voiced his opinion:
I'm not so much opposed to a block size increase as I am opposed to a hard fork... I strongly fear that the hard fork itself will become an excuse to change other aspects of the system in ways that will have unintended and possibly disastrous consequences.
Bryan Bishop strongly opposed Gavin's proposal, and offered a philosophical perspective on the matter:
there has been significant public discussion... about why increasing the max block size is kicking the can down the road while possibly compromising blockchain security. There were many excellent objections that were raised that, sadly, I see are not referenced at all in the recent media blitz. Frankly I can't help but feel that if contributions, like those from #bitcoin-wizards, have been ignored in lieu of technical analysis, and given the absence of discussion on this mailing list, then perhaps there are other subtle and extremely important technical details that are completely absent from this--and other--proposals.
Secured decentralization is the most important and most interesting property of bitcoin. Everything else is rather trivial and could be achieved millions of times more efficiently with conventional technology. Our technical work should be informed by the technical nature of the system we have constructed.
There's no doubt in my mind that bitcoin will always see the most extreme campaigns and the most extreme misunderstandings... for development purposes we must hold ourselves to extremely high standards before proposing changes, especially to the public, that have the potential to be unsafe and economically unsafe.
There are many potential technical solutions for aggregating millions (trillions?) of transactions into tiny bundles. As a small proof-of-concept, imagine two parties sending transactions back and forth 100 million times. Instead of recording every transaction, you could record the start state and the end state, and end up with two transactions or less. That's a 100-million-fold reduction, without modifying the max block size and without potentially compromising secured decentralization.
The MIT group should listen up and get to work figuring out how to measure decentralization and its security. Getting this measurement right would be really beneficial because we would have a more academic and technical understanding to work with.
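Bishop's two-party example can be made concrete with a few lines of code. This is a toy sketch of the netting idea only; the function and names are hypothetical and imply no specific protocol:

```python
# Toy sketch of the aggregation idea quoted above: two parties trade payments
# off-chain many times, and only the net start -> end state needs to settle
# on-chain. Names and structure are hypothetical, not any real protocol.
def settle(start_a: int, start_b: int, payments):
    """Apply a long list of (payer, amount) payments; return only the end state."""
    a, b = start_a, start_b
    for payer, amount in payments:
        if payer == "A":
            a, b = a - amount, b + amount
        else:
            a, b = a + amount, b - amount
    return a, b

# A stand-in for "100 million" round trips: millions of payments collapse
# into at most one on-chain transaction recording the final balances.
payments = [("A", 1), ("B", 1)] * 1_000_000
print(settle(1000, 1000, payments))  # (1000, 1000): net zero movement
```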
Gregory Maxwell echoed and extended that perspective:
When Bitcoin is changed fundamentally, via a hard fork, to have different properties, the change can create winners or losers...
There are non-trivial number of people who hold extremes on any of these general belief patterns; Even among the core developers there is not a consensus on Bitcoin's optimal role in society and the commercial marketplace.
there is at least a two-fold concern on this particular ("Long term Mining incentives") front:
One is that the long-held argument is that security of the Bitcoin system in the long term depends on fee income funding autonomous, anonymous, decentralized miners profitably applying enough hash-power to make reorganizations infeasible.
For fees to achieve this purpose, there seemingly must be an effective scarcity of capacity.
The second is that when subsidy has fallen well below fees, the incentive to move the blockchain forward goes away. An optimal rational miner would be best off forking off the current best block in order to capture its fees, rather than moving the blockchain forward...
tools like the Lightning network proposal could well allow us to hit a greater spectrum of demands at once--including secure zero-confirmation (something that larger blocksizes reduce if anything), which is important for many applications. With the right technology I believe we can have our cake and eat it too, but there needs to be a reason to build it; the security and decentralization level of Bitcoin imposes a hard upper limit on anything that can be based on it.
Another key point here is that the small bumps in blocksize which wouldn't clearly knock the system into a largely centralized mode--small constants--are small enough that they don't quantitatively change the operation of the system; they don't open up new applications that aren't possible today
the procedure I'd prefer would be something like this: if there is a standing backlog, we-the-community of users look to indicators to gauge if the network is losing decentralization and then double the hard limit with proper controls to allow smooth adjustment without fees going to zero (see the past proposals for automatic block size controls that let miners increase up to a hard maximum over the median if they mine at quadratically harder difficulty), and we don't increase if it appears it would be at a substantial increase in centralization risk.
Hardfork changes should only be made if they're almost completely uncontroversial--where virtually everyone can look at the available data and say "yea, that isn't undermining my property rights or future use of Bitcoin; it's no big deal".
Unfortunately, every indicator I can think of except fee totals has been going in the wrong direction almost monotonically along with the blockchain size increase since 2012 when we started hitting full blocks and responded by increasing the default soft target. This is frustrating.
many people--myself included--have been working feverishly hard behind the scenes on Bitcoin Core to increase the scalability. This work isn't small-potatoes boring software engineering stuff; I mean even my personal contributions include things like inventing a wholly new generic algebraic optimization applicable to all EC signature schemes that increases performance by 4%, and that is before getting into the R&D stuff that hasn't really borne fruit yet, like fraud proofs.
Today Bitcoin Core is easily >100 times faster to synchronize and relay than when I first got involved on the same hardware, but these improvements have been swallowed by the growth.
The ironic thing is that our frantic efforts to keep ahead and not lose decentralization have both not been enough (by the best measures, full node usage is the lowest it's been since 2011 even though the user base is huge now) and yet also so much that people could seriously talk about increasing the block size to something gigantic like 20MB. This sounds less reasonable when you realize that even at 1MB we'd likely have a smoking hole in the ground if not for existing enormous efforts to make scaling not come at a loss of decentralization.
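The parenthetical proposal Maxwell alludes to (letting miners exceed a recent-median block size, up to a hard maximum, by mining at quadratically harder difficulty) can be sketched roughly as below. The constants and the exact penalty curve are illustrative assumptions, not a specification of any deployed protocol:

```python
# Rough sketch of a "pay for bigger blocks with harder difficulty" rule, as
# alluded to in the quote above. The quadratic curve and all constants here
# are illustrative assumptions, not a specified or deployed protocol.
def required_difficulty(base_difficulty: float, block_size: int,
                        median_size: int, hard_max: int) -> float:
    """Difficulty a miner must meet to publish a block of block_size bytes."""
    if block_size > hard_max:
        raise ValueError("block exceeds the hard maximum")
    if block_size <= median_size:
        return base_difficulty  # no penalty at or below the recent median
    excess = (block_size - median_size) / (hard_max - median_size)  # in [0, 1]
    return base_difficulty * (1 + excess ** 2)  # quadratically harder

# A block halfway between the median and the hard cap costs 25% more work:
print(required_difficulty(1.0, 1_500_000, 1_000_000, 2_000_000))  # 1.25
```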
Peter Todd also summarized some academic findings on the subject:
In short, without either a fixed blocksize or fixed fee per transaction Bitcoin will not survive as there is no viable way to pay for PoW security. The latter option - fixed fee per transaction - is non-trivial to implement in a way that's actually meaningful - it's easy to give miners "kickbacks" - leaving us with a fixed blocksize.
Even a relatively small increase to 20MB will greatly reduce the number of people who can participate fully in Bitcoin, creating an environment where the next increase requires the consent of an even smaller portion of the Bitcoin ecosystem. Where does that stop? What's the proposed mechanism that'll create an incentive and social consensus to not just 'kick the can down the road'(3) and further centralize but actually scale up Bitcoin the hard way?
Some developers (e.g. Aaron Voisine) voiced support for Gavin's proposal, repeating Mike Hearn's "crash landing" arguments.
Pieter Wuille said:
I am - in general - in favor of increasing the size of blocks...
Controversial hard forks. I hope the mailing list here today already proves it is a controversial issue. Independent of personal opinions pro or against, I don't think we can do a hard fork that is controversial in nature. Either the result is effectively a fork, and pre-existing coins can be spent once on both sides (effectively failing Bitcoin's primary purpose), or the result is one side forced to upgrade to something they dislike - effectively giving a power to developers they should never have. Quoting someone: "I did not sign up to be part of a central banker's committee".
The reason for increasing is "need". If "we need more space in blocks" is the reason to do an upgrade, it won't stop after 20 MB. There is nothing fundamental possible with 20 MB blocks that isn't with 1 MB blocks.
Misrepresentation of the trade-offs. You can argue all you want that none of the effects of larger blocks are particularly damaging, so everything is fine. They will damage something (see below for details), and we should analyze these effects, be honest about them, and present them as a trade-off we choose to make to scale the system better. If you just ask people if they want more transactions, of course you'll hear yes. If you ask people if they want to pay less taxes, I'm sure the vast majority will agree as well.
Miner centralization. There is currently, as far as I know, no technology that can relay and validate 20 MB blocks across the planet, in a manner fast enough to avoid very significant costs to mining. There is work in progress on this (including Gavin's IBLT-based relay, or Greg's block network coding), but I don't think we should be basing the future of the economics of the system on undemonstrated ideas. Without those (or even with), the result may be that miners self-limit the size of their blocks to propagate faster, but if this happens, larger, better-connected, and more centrally-located groups of miners gain a competitive advantage by being able to produce larger blocks. I would like to point out that there is nothing evil about this - a simple feedback to determine an optimal block size for an individual miner will result in larger blocks for better connected hash power. If we do not want miners to have this ability, "we" (as in: those using full nodes) should demand limitations that prevent it. One such limitation is a block size limit (whatever it is).
Ability to use a full node.
Skewed incentives for improvements... without actual pressure to work on these, I doubt much will change. Increasing the size of blocks now will simply make it cheap enough to continue business as usual for a while - while forcing a massive cost increase (and not just a monetary one) on the entire ecosystem.
Fees and long-term incentives.
I don't think 1 MB is optimal. Block size is a compromise between scalability of transactions and verifiability of the system. A system with 10 transactions per day that is verifiable by a pocket calculator is not useful, as it would only serve a few large banks' settlements. A system which can deal with every coffee bought on the planet, but requires a Google-scale data center to verify, is also not useful, as it would be trivially out-competed by a VISA-like design. Usefulness lies in a balance, and there is no optimal choice for everyone. We can choose where that balance lies, but we must accept that this is done as a trade-off, and that that trade-off will have costs such as hardware costs, decreasing anonymity, less independence, smaller target audience for people able to fully validate, ...
Choose wisely.
Mike Hearn responded:
this list is not a good place for making progress or reaching decisions.
if Bitcoin continues on its current growth trends it will run out of capacity, almost certainly by some time next year. What we need to see right now is leadership and a plan that fits in the available time window.
I no longer believe this community can reach consensus on anything protocol related.
When the money supply eventually dwindles I doubt it will be fee pressure that funds mining
What I don't see from you yet is a specific and credible plan that fits within the next 12 months and which allows Bitcoin to keep growing.
Peter Todd then pointed out that, contrary to Mike's claims, developer consensus had been achieved within Core plenty of times recently. Btc-drak asked Mike to "explain where the 12 months timeframe comes from?"
Jorge Timón wrote an incredibly prescient reply to Mike:
We've successfully reached consensus for several softfork proposals already. I agree with others that hard forks need to be uncontroversial and there should be consensus about them. If you have other ideas for the criteria for hardfork deployment, I'm all ears. I just hope that by "What we need to see right now is leadership" you don't mean something like "when Gavin and Mike agree it's enough to deploy a hardfork" when you go from vague to concrete.
Oh, so your answer to "bitcoin will eventually need to live on fees and we would like to know more about what it will look like then" is "bitcoin is broken long term, but that's far away in the future, so let's just worry about the present". I agree that it's hard to predict that future, but having some competition for block space would actually help us get more data on a similar situation and predict that future better. What you want to avoid at all costs (the block size actually being used) is what I see as the best opportunity we have to look into the future.
this is my plan: we wait 12 months... and start having full blocks and people sometimes having to wait 2 blocks for their transactions to be confirmed. That would be the beginning of a true "fee market", something that Gavin used to say was his #1 priority not so long ago (which seems to contradict his current efforts to prevent exactly that). Having a true fee market seems clearly an advantage. What are the supposedly disastrous parts of this plan that make an alternative plan (i.e. increasing the block size) so necessary and obvious? I think the advocates of the size increase are failing to explain the disadvantages of maintaining the current size. It feels like the explanations are missing because it should somehow be obvious how the sky will burn if we don't increase the block size soon. But, well, it is not obvious to me, so please elaborate on why having a fee market (instead of just a price estimator for a market that doesn't really exist yet) would be a disaster.
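Timón's "true fee market" is mechanically simple: once demand exceeds block space, a revenue-maximizing miner orders the mempool by feerate, and the lowest feerate that still fits becomes the going price. A minimal sketch with invented transactions (all sizes and fees are hypothetical):

```python
def fill_block(mempool, capacity_bytes):
    """Greedy feerate-ordered selection: a revenue-maximizing miner takes
    the highest sat/byte transactions that fit in the block."""
    ordered = sorted(mempool, key=lambda tx: tx["fee"] / tx["size"], reverse=True)
    block, used = [], 0
    for tx in ordered:
        if used + tx["size"] <= capacity_bytes:
            block.append(tx)
            used += tx["size"]
    return block

# Hypothetical mempool: five 250-byte transactions with invented fees (satoshis).
mempool = [{"size": 250, "fee": f} for f in (2500, 5000, 1000, 250, 12500)]
block = fill_block(mempool, capacity_bytes=750)  # toy limit: only three fit
print("marginal feerate:", min(tx["fee"] / tx["size"] for tx in block), "sat/byte")
```

With room for only three of the five transactions, the clearing feerate here is 10 sat/byte; the two below it wait for a later block, which is the occasional two-block delay Timón is willing to accept.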
Some suspected Gavin/Mike were trying to rush the hard fork for personal reasons.
Mike Hearn's response was to demand a "leader" who could unilaterally steer the Bitcoin project and make decisions unchecked:
No. What I meant is that someone (theoretically Wladimir) needs to make a clear decision. If that decision is "Bitcoin Core will wait and watch the fireworks when blocks get full", that would be showing leadership
I will write more on the topic of what will happen if we hit the block size limit... I don't believe we will get any useful data out of such an event. I've seen distributed systems run out of capacity before. What will happen instead is technological failure followed by rapid user abandonment...
we need to hear something like that from Wladimir, or whoever has the final say around here.
Jorge Timón responded:
it is true that "universally uncontroversial" (which is what I think the requirement should be for hard forks) is a vague qualifier that's not formally defined anywhere. I guess we should only consider rational arguments. You cannot just nack something without further explanation. If his explanation was "I will change my mind after we increase block size", I guess the community should say "then we will just ignore your nack because it makes no sense". In the same way, when people use fallacies (purposely or not) we must expose that and say "this fallacy doesn't count as an argument". But yeah, it would probably be good to define better what constitutes a "sensible objection" or something. That doesn't seem simple though.
it seems that some people would like to see that happening before the subsidies are low (not necessarily null), while other people are fine waiting for that but don't want to ever be close to the scale limits anytime soon. I would also like to know for how long we need to prioritize short term adoption in this way. As others have said, if the answer is "forever, adoption is always the most important thing" then we will end up with an improved version of Visa. But yeah, this is progress, I'll wait for your more detailed description of the tragedies that will follow hitting the block limits, assuming for now that it will happen in 12 months. My previous answer to the nervous "we will hit the block limits in 12 months if we don't do anything" was "not sure about 12 months, but whatever, great, I'm waiting for that to observe how fees get affected". But it should have been a question "what's wrong with hitting the block limits in 12 months?"
Mike Hearn again asserted the need for a leader:
There must be a single decision maker for any given codebase.
Bryan Bishop attempted to explain why this did not make sense with git architecture.
Finally, Gavin announced his intent to merge the patch into Bitcoin XT to bypass the peer review he had received on the bitcoin-dev mailing list.
submitted by sound8bits to Bitcoin [link] [comments]

Samson Mow on 8BTC’s AMA: BU Are Low-Level Copycats and We Do Support Onchain Scaling

Samson Mow, the COO of BTCC, completed his AMA on 8btc on 2 Dec.
Samson faced all the harsh questions raised, said BU is “awful”, and affirmed that he supports onchain scaling.
We have moved all the answers Mr. Mow typed in person here.
Let’s see:
Q: How do you comment on BU?
A: I think BU is indeed awful software. It’s really just a rework of Bitcoin Core, as 99% of the code is still Bitcoin Core’s; BU has only made some tiny changes. In developing BU, several bugs have appeared, but they claim these are just bugs from Bitcoin Core itself. Core members can tell that the so-called “bugs from Bitcoin Core itself” were simply introduced by BU’s developers. BU is bad at coding, and BU’s code has not been through thorough testing. Many coders, Chinese and Western alike, think BU’s code is bad.
Besides, the BU team has actually achieved nothing so far. If Bitcoin is a Ferrari maintained by 100 Core members, then the BU team simply doesn’t know what a Ferrari is; BU only repairs bikes, and even bikes may be beyond their ability. That is because BU has never created or maintained any cryptocurrency; they have never even released an altcoin. I would sooner trust the MaidSafeCoin or Dash teams than BU.
Furthermore, BU changed bitcoin’s consensus-based principle. BU is not based on consensus. Bitcoin’s rules are not made for mining; they are for users to decide the blocksize based on consensus. In order to gain support, BU now suddenly says bitcoin was created for mining, which is not even what BU’s own developers think. BU’s developers have also said they need to make some changes to conform to bitcoin’s consensus-based principle.
BU is just a vehicle for political maneuvering, like Bitcoin XT and Bitcoin Classic before it. Those who support BU are not really all for BU; they want to achieve ulterior motives by supporting it, say, to scale the blocksize, to alienate the Core team, or just to prove themselves correct. Their reasons for supporting BU are all far-fetched or wrong.
(from ID Bitcoiners) Q: Does BTCC support onchain scaling?
A: Yes, BTCC supports onchain scaling. :)
We support any scaling plan, on-chain or off-chain, as long as it is safe and has been thoroughly tested. SegWit is in essence onchain scaling, as it makes blocks bigger and enlarges the effective capacity of the blockchain.
Many people still think SW is not onchain scaling, but in fact SW is the fastest onchain scaling plan available today. Most of the community opposes a hasty hard fork; if we can reach consensus on SW, we can achieve onchain scaling within several months, delivering a bigger blocksize and capacity for more transactions.
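The sense in which SegWit "makes the block size bigger" is the weight rule of BIP 141: block weight is three times the non-witness bytes plus the total bytes, capped at 4,000,000, so discounted witness data lets more total bytes into a block. A quick sketch (the witness-share values are illustrative; real capacity depends on the transaction mix):

```python
MAX_WEIGHT = 4_000_000  # BIP 141 block weight limit

def max_block_bytes(witness_fraction: float) -> float:
    """Largest serialized block that fits the weight limit, given the fraction
    of its bytes that are witness data.
    weight = 3 * base + total, where base = total * (1 - witness_fraction)."""
    return MAX_WEIGHT / (3.0 * (1.0 - witness_fraction) + 1.0)

for wf in (0.0, 0.5, 0.6, 1.0):
    print(f"witness share {wf:4.0%}: up to ~{max_block_bytes(wf) / 1e6:.2f} MB")
```

At a 0% witness share the formula reduces to the old 1 MB limit; contemporary estimates for typical transaction mixes put the effective capacity somewhere around 1.7 to 2 MB.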
Q: Does BTCC support SW as a mining pool (miners) or as an exchange?
A: We support SW because we believe it can improve bitcoin and enlarge block capacity, making outstanding technologies like Lightning possible. This is a win for all of bitcoin’s traders, miners, buyers, and holders. We made the decision to support it based on our analysis of SW and its future potential.
Q: Under what circumstances would BTCC give up running a bitcoin app in production with the SW soft fork activated?
A: So far I don’t see any reason to give up SegWit, as it will bring many improvements to bitcoin. It fixes bitcoin’s transaction malleability, and if SW is activated, the Lightning Network becomes possible. So from a technical angle, I will not give up SW.
But there are circumstances under which we might give up SW. If other mining pools put pressure on us, we may make concessions, and if SW’s activation period expires, we might also give it up. In general, though, I still see no reason not to support SW. SW is technical progress, not a political fight; it should not be swayed by anyone’s emotions or preferences. SW is a technical change to bitcoin’s core code.
If political fights in the bitcoin community result in joint pressure on us, that is not a situation we want to see. We need to make decisions based on the pros and cons of SW and on the consensus of the Core team, as Core’s members are all excellent programmers. These coders spent a lot of time working out the best scaling solution to problems that most ordinary people find hard to understand. If pressure from others keeps us from running SW, or we pressure others into running it, that is a bad situation. Everyone should decide based on the technical pros and cons.
Currently there are many rumors and misgivings within the Chinese community, and many people are maligning SW: claims that Core will change PoW into PoS, that SW is poison, that SW is not onchain scaling, or that the Lightning Network will carve up miners’ profits… all of these are rumors without any source. SW is indeed onchain scaling. Outside of BU, no developer or engineer would say SW is not onchain scaling.
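The malleability fix Mow mentions can be shown in miniature: a legacy txid hashes the entire serialized transaction, signatures included, so a third party who re-encodes a signature without invalidating it changes the txid; a SegWit txid no longer commits to witness data. The byte strings below are toy stand-ins, not real transaction serialization:

```python
import hashlib

def sha256d(b: bytes) -> bytes:
    """Bitcoin's double-SHA256."""
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

# Toy stand-ins for transaction parts, not a real serialization format.
body = b"version|inputs|outputs|locktime"
sig = b"sig-encoding-A"            # a valid signature
sig_malleated = b"sig-encoding-B"  # the same signature, re-encoded in flight

legacy_before = sha256d(body + sig)            # legacy txid commits to signatures
legacy_after = sha256d(body + sig_malleated)
segwit_before = sha256d(body)                  # segwit txid omits the witness
segwit_after = sha256d(body)

print("legacy txid survives malleation:", legacy_before == legacy_after)   # False
print("segwit txid survives malleation:", segwit_before == segwit_after)   # True
```

This is also why SW matters for Lightning: a payment channel must reference the txid of its funding transaction before that transaction confirms, which is only safe once the txid cannot be mutated in transit.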
Q: Won’t BTCC follow the 2015 Beijing Pool Declaration and 2016 Hong Kong Consensus anymore?
A: This seems to be a question of common concern, so I would like to reply in detail and hopefully make things clearer for everyone.
There is a long story behind the 2015 Beijing Mining Pool Declaration. You can’t say that what happened a year ago applies equally to today’s situation, as both the internet and the crypto space evolve fast. The Declaration was actually a response to Bitcoin XT, when Gavin Andresen and Mike Hearn first incited the political fight within the bitcoin community that many mining pools witnessed.
At that time, Mike and Gavin contacted us quite frequently. They lobbied us to use their Bitcoin XT, saying it could scale the blocksize to 20MB, that blocks were going to be full, and that action had to be taken. Only now do we realize that it’s natural for blocks to be full: if blocks are never full, miners earn no profit, and block space must stay scarce to be valuable. But at the time we were not familiar with the technical details and didn’t know how capable Mike and Gavin really were. We just knew 20MB was much bigger than 1MB, and since many other mining pools also felt the need to act, we were a bit worried too. After some consideration, we believed an 8MB block size was fairly safe; the 8MB figure was chosen with reference to Bitcoin XT’s plan of scaling to 20MB. We didn’t even really intend to scale to 8MB. After the Beijing conference, Bitcoin XT distorted our intention by claiming our roadmap was to scale from 8MB to 8GB. Many mining pools felt betrayed.
I don’t think anyone should be required to conform to the 2015 Beijing Mining Pool Consensus. If conforming to it were mandatory, then BU should never have gained any support, since under it we would only scale to 8MB.
The 2016 Hong Kong consensus, in turn, was actually a response to Bitcoin Classic. Bitcoin Classic misled us by saying that everyone supported them; everyone at the time believed that everyone else supported Bitcoin Classic, so it appeared that everyone did. It was in this context that we held the Hong Kong conference. The consensus stated that Core would write hard-fork code, so many people took it as an agreement between BTCC and Core, but it was really a response to Bitcoin Classic. Five Core members were on site and signed the consensus. But Bitcoin Core is neither a company nor an organization; it is only individuals and companies who support the development of Bitcoin Core. No one can compel Bitcoin Core to do anything, and Core will not compel others to do anything either. That is just the nature of bitcoin. Bitcoin is alive; it’s not a company that can post something on an official site. Likewise, Bitcoin Core’s software will not update automatically (Apple and Microsoft push you a new version and you have to update). Updating Bitcoin Core is your free choice, and you can also downgrade.
In fact, other parts of the Hong Kong consensus have not been followed either, such as Core not completing SW’s development on time. But that just proves their prudence: they will not accept SW without thorough, sound testing. We made some mistakes during the Hong Kong Consensus period; we were not familiar with bitcoin development. We have kept learning and improving over these years.
Actually the Core team has indeed written hard-fork code, published as BIP drafts. Please see: https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2016-February/012377.html https://petertodd.org/2016/hardforks-after-the-segwit-blocksize-increase https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2016-February/012342.html https://lists.linuxfoundation.org/pipermail/bitcoin-dev/2016-December/013332.html
At the conference in San Francisco this summer, Core presented this code, but the community gave little response. Core members are writing code as best they can, and the process is continuing. No one can compel Core to publish the hard fork, as that requires consensus among all Core members; there is no leader in Core.
Core also released 0.13, a version without SW, for those who want the latest techniques but are not willing to use SW. This version contains state-of-the-art techniques like Schnorr signing.
Q: Does BTCC have any contingency plan for the bugs that have been discussed on reddit?
A: Reddit is only a platform for people to share news and discuss anything. The so-called bugs discussed on /btc are just random guesses by people who do not know the technical details.
If you really want to discuss SW bug issues, subscribe to Bitcoin Core’s mailing list and go to their IRC channel; that’s where bug issues are discussed effectively. Core publishes all of its Slack, IRC, and mailing-list records on the internet, though people won’t go there and read them. People like to go to reddit, but reddit is not for technical discussion; it’s for… catfights. These so-called bugs have already been discussed among Core members. It is because of these bug-elimination discussions and tests that SW has come out later than expected: Core wants to provide reliable, bug-free code to support the 11-billion-USD bitcoin industry.
Now look at BU by comparison: it has few test reports. Core has actually reported bugs in BU, and BU gave no response.
Activity of BU on GitHub: http://i.imgur.com/ElZ71vv.jpg
Activity of Core on GitHub: http://i.imgur.com/XbNGUqz.jpg
Core has done many tests; they have even found bugs in the C++ libraries Bitcoin uses. https://github.com/bitcoin/bitcoin/commit/507145d78595e052ce13368e122f72c85093992c https://github.com/bitcoin/bitcoin/pull/9229
Q: Does BTCC support keeping the 1MB blocksize permanently, or does it believe the blocksize should be scaled at a proper time in the future?
A: http://i.imgur.com/P1duZTn.png
It is a common misunderstanding in the Chinese community that Core wants the block size to remain 1MB forever.
Core’s roadmap does include a hard fork, but optimization should precede the hard fork. Core never said they will hold the 1MB block size permanently, and we don’t want blocks of only 1MB either. But if bitcoin doesn’t stay decentralized, bitcoin is useless; it would be something like a database. In that sense, the smaller the blocksize, the better bitcoin is, because everyone can run it. You can’t just take care of yourself: a hard disk may be extremely expensive for poor people. Since those calling for bigger blocks do not represent the whole community, what we can do is lower the threshold as much as possible.
Many people have never involved themselves in the Ethereum community. We wanted to run our ETC mining pool but encountered many problems simply because the block size is too big. You can’t just picture writing blocks to a disk without considering communication, synchronization, and orphan rates. Scaling is not that easy, and what many people do not understand is that it shouldn’t be done without due consideration. If we put all the data from Google and YouTube on everyone’s computer, the way blockchain ledgers are replicated, then doubling the data of Google and YouTube means doubling the data of everyone. This puts increasing pressure on the whole network. You have to pay a price for scaling, and those who think the costs are nothing for them simply cannot speak for everyone.
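The "double the data of everyone" point is just the cost structure of full replication: every full node stores and verifies the entire ledger, so the ecosystem-wide bill is ledger size times node count. A trivial sketch with invented numbers:

```python
def total_replicated_gb(ledger_gb: float, full_nodes: int) -> float:
    """Full replication: every node stores the whole ledger, so the
    ecosystem-wide storage cost is ledger size times node count."""
    return ledger_gb * full_nodes

# Invented figures for illustration only.
for ledger_gb in (100, 200):
    total = total_replicated_gb(ledger_gb, full_nodes=5_000)
    print(f"{ledger_gb} GB ledger x 5,000 nodes = {total / 1_000:.0f} TB ecosystem-wide")
```

Doubling the ledger doubles the cost for every single node, unlike a sharded or centralized design where the extra load can be spread.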
SW really will scale the blocksize, and the Core team has further optimization techniques such as Schnorr signing. Under perfect conditions, Schnorr can compress 16MB of transactions into a 1MB block; the current theoretical figure is compressing 4MB of data into a 1MB blocksize. There are many other methods to make a 1MB block handle more data. But if needed, we can scale the blocksize to 2MB.
Added: the Core team is highly transparent; all their meetings are available on the internet. See https://bitcoincore.org/en/meetings/
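The compression figures Mow quotes come from signature aggregation: ECDSA carries roughly one 72-byte signature per input, while an aggregated Schnorr scheme could in principle use a single 64-byte signature per transaction. A rough size model under those assumed byte counts (the overheads are invented, and the 16:1 best case additionally assumes heavy batching of inputs that this toy does not capture):

```python
ECDSA_SIG_BYTES = 72    # typical DER-encoded ECDSA signature (assumption)
SCHNORR_AGG_BYTES = 64  # one aggregate Schnorr signature per tx (assumption)

def tx_bytes(n_inputs: int, aggregated: bool) -> int:
    """Very rough size model: fixed overhead, per-input data, two outputs,
    plus either one signature per input or a single aggregate signature."""
    sigs = SCHNORR_AGG_BYTES if aggregated else n_inputs * ECDSA_SIG_BYTES
    return 10 + n_inputs * 41 + 2 * 34 + sigs

for n in (2, 10, 100):
    before, after = tx_bytes(n, False), tx_bytes(n, True)
    print(f"{n:>3} inputs: {before:>6} B -> {after:>5} B ({before / after:.1f}x smaller)")
```

Even this crude model shows the savings growing with input count, which is why the technique pays off most for consolidation-heavy blocks.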
Q: Has BTCC Pool’s support of SW gained understanding and support from the miners in your pool? To put it another way, has BTCC Pool explained the pros and cons of the various options? Can any relevant explanatory material be shared with other pools for reference?
A: We have a professional mining pool management team and maintain active communication with our miners. Just last week I went to Chengdu in Sichuan Province to meet miners there; we explained the benefits of SW and they expressed their support. BTCC does explain to our miners the pros and cons of the different scaling plans, and we also provide reference documents on our Weibo and WeChat for miners, traders, and bitcoin fans. We invited a Lightning founder to Shanghai for a meeting with friends there, and next week (11th Nov) we will invite some Core members to Shanghai to discuss SW as well. We have provided information on Bitcoin, SW, and the scaling plans not only to miners but to all BTCC users.
Q: Has BTCC Pool done extensive testing on the 0.13.1 SegWit code? Can you release a test report?
A: Sure, thorough testing is necessary. In early April 2016, Core contacted China’s miners, including BTCC, F2Pool, AntPool, and BW, to test SW on SegNet. In late April our pool mined blocks containing SW transactions; in May, mining pools including BW completed the SegNet tests and had mined SW blocks; in October, BTCC began testing Bitcoin Core 0.13.1, and improvements to 0.13.1 have continued since; on 18 Oct the SW vote officially kicked off. Sorry, I don’t have test files for you, but so far, judging from the mining pool’s operation, everything is fine.
The AMA was conducted in Chinese.
Knowing that this AMA really matters for both the Chinese and Western communities to understand each other’s ideas and thoughts, we have tried our best to preserve the original meaning and tone in plain English.
To see the original Chinese AMA text,
please first sign in on news.8btc.com, the international site of 8btc, and then go directly to the thread:
http://8btc.com/thread-42814-1-1.html
Tune in to http://news.8btc.com/ for more first-hand information on the CN community.
submitted by 8btccom to Bitcoin [link] [comments]

The Origins of the (Modern) Blocksize Debate

On May 4, 2015, Gavin Andresen wrote on his blog:
I was planning to submit a pull request to the 0.11 release of Bitcoin Core that will allow miners to create blocks bigger than one megabyte, starting a little less than a year from now. But this process of peer review turned up a technical issue that needs to get addressed, and I don’t think it can be fixed in time for the first 0.11 release.
I will be writing a series of blog posts, each addressing one argument against raising the maximum block size, or against scheduling a raise right now... please send me an email ([email protected]) if I am missing any arguments
In other words, Gavin proposed a hard fork via a series of blog posts, bypassing all developer communication channels altogether and asking for personal, private emails from anyone interested in discussing the proposal further.
On May 5 (1 day after Gavin submitted his first blog post), Mike Hearn published The capacity cliff on his Medium page. 2 days later, he posted Crash landing. In these posts, he argued:
A common argument for letting Bitcoin blocks fill up is that the outcome won’t be so bad: just a market for fees... this is wrong. I don’t believe fees will become high and stable if Bitcoin runs out of capacity. Instead, I believe Bitcoin will crash.
...a permanent backlog would start to build up... as the backlog grows, nodes will start running out of memory and dying... as Core will accept any transaction that’s valid without any limit a node crash is eventually inevitable.
He also, in the latter article, explained that he disagreed with Satoshi's vision for how Bitcoin would mature[1][2]:
Neither me nor Gavin believe a fee market will work as a substitute for the inflation subsidy.
Gavin continued to publish the series of blog posts he had announced while Hearn made these predictions. [1][2][3][4][5][6][7]
Matt Corallo brought Gavin's proposal up on the bitcoin-dev mailing list after a few days. He wrote:
Recently there has been a flurry of posts by Gavin at http://gavinandresen.svbtle.com/ which advocate strongly for increasing the maximum block size. However, there hasnt been any discussion on this mailing list in several years as far as I can tell...
So, at the risk of starting a flamewar, I'll provide a little bait to get some responses and hope the discussion opens up into an honest comparison of the tradeoffs here. Certainly a consensus in this kind of technical community should be a basic requirement for any serious commitment to blocksize increase.
Personally, I'm rather strongly against any commitment to a block size increase in the near future. Long-term incentive compatibility requires that there be some fee pressure, and that blocks be relatively consistently full or very nearly full. What we see today are transactions enjoying next-block confirmations with nearly zero pressure to include any fee at all (though many do because it makes wallet code simpler).
This allows the well-funded Bitcoin ecosystem to continue building systems which rely on transactions moving quickly into blocks while pretending these systems scale. Thus, instead of working on technologies which bring Bitcoin's trustlessness to systems which scale beyond a blockchain's necessarily slow and (compared to updating numbers in a database) expensive settlement, the ecosystem as a whole continues to focus on building centralized platforms and advocate for changes to Bitcoin which allow them to maintain the status quo
Shortly thereafter, Corallo explained further:
The point of the hard block size limit is exactly because giving miners free rule to do anything they like with their blocks would allow them to do any number of crazy attacks. The incentives for miners to pick block sizes are no where near compatible with what allows the network to continue to run in a decentralized manner.
Tier Nolan considered possible extensions and modifications that might improve Gavin's proposal and argued that soft caps could be used to mitigate against the dangers of a blocksize increase. Tom Harding voiced support for Gavin's proposal
Peter Todd mentioned that a limited blocksize provides the benefit of protecting against the "perverse incentives" behind potential block withholding attacks.
Slush didn't have a strong opinion one way or the other, and neither did Eric Lombrozo, though Eric was interested in developing hard-fork best practices and wanted to:
explore all the complexities involved with deployment of hard forks. Let’s not just do a one-off ad-hoc thing.
Matt Whitlock voiced his opinion:
I'm not so much opposed to a block size increase as I am opposed to a hard fork... I strongly fear that the hard fork itself will become an excuse to change other aspects of the system in ways that will have unintended and possibly disastrous consequences.
Bryan Bishop strongly opposed Gavin's proposal, and offered a philosophical perspective on the matter:
there has been significant public discussion... about why increasing the max block size is kicking the can down the road while possibly compromising blockchain security. There were many excellent objections that were raised that, sadly, I see are not referenced at all in the recent media blitz. Frankly I can't help but feel that if contributions, like those from #bitcoin-wizards, have been ignored in lieu of technical analysis, and the absence of discussion on this mailing list, that I feel perhaps there are other subtle and extremely important technical details that are completely absent from this--and other-- proposals.
Secured decentralization is the most important and most interesting property of bitcoin. Everything else is rather trivial and could be achieved millions of times more efficiently with conventional technology. Our technical work should be informed by the technical nature of the system we have constructed.
There's no doubt in my mind that bitcoin will always see the most extreme campaigns and the most extreme misunderstandings... for development purposes we must hold ourselves to extremely high standards before proposing changes, especially to the public, that have the potential to be unsafe and economically unsafe.
There are many potential technical solutions for aggregating millions (trillions?) of transactions into tiny bundles. As a small proof-of-concept, imagine two parties sending transactions back and forth 100 million times. Instead of recording every transaction, you could record the start state and the end state, and end up with two transactions or less. That's a 100 million fold, without modifying max block size and without potentially compromising secured decentralization.
The MIT group should listen up and get to work figuring out how to measure decentralization and its security.. Getting this measurement right would be really beneficial because we would have a more academic and technical understanding to work with.
Gregory Maxwell echoed and extended that perspective:
When Bitcoin is changed fundamentally, via a hard fork, to have different properties, the change can create winners or losers...
There are non-trivial number of people who hold extremes on any of these general belief patterns; Even among the core developers there is not a consensus on Bitcoin's optimal role in society and the commercial marketplace.
there is a at least a two fold concern on this particular ("Long term Mining incentives") front:
One is that the long-held argument is that security of the Bitcoin system in the long term depends on fee income funding autonomous, anonymous, decentralized miners profitably applying enough hash-power to make reorganizations infeasible.
For fees to achieve this purpose, there seemingly must be an effective scarcity of capacity.
The second is that when subsidy has fallen well below fees, the incentive to move the blockchain forward goes away. An optimal rational miner would be best off forking off the current best block in order to capture its fees, rather than moving the blockchain forward...
tools like the Lightning network proposal could well allow us to hit a greater spectrum of demands at once--including secure zero-confirmation (something that larger blocksizes reduce if anything), which is important for many applications. With the right technology I believe we can have our cake and eat it too, but there needs to be a reason to build it; the security and decentralization level of Bitcoin imposes a hard upper limit on anything that can be based on it.
Another key point here is that the small bumps in blocksize which wouldn't clearly knock the system into a largely centralized mode--small constants--are small enough that they don't quantitatively change the operation of the system; they don't open up new applications that aren't possible today
the procedure I'd prefer would be something like this: if there is a standing backlog, we-the-community of users look to indicators to gauge if the network is losing decentralization and then double the hard limit with proper controls to allow smooth adjustment without fees going to zero (see the past proposals for automatic block size controls that let miners increase up to a hard maximum over the median if they mine at quadratically harder difficulty), and we don't increase if it appears it would be at a substantial increase in centralization risk. Hardfork changes should only be made if they're almost completely uncontroversial--where virtually everyone can look at the available data and say "yea, that isn't undermining my property rights or future use of Bitcoin; it's no big deal". Unfortunately, every indicator I can think of except fee totals has been going in the wrong direction almost monotonically along with the blockchain size increase since 2012 when we started hitting full blocks and responded by increasing the default soft target. This is frustrating
many people--myself included--have been working feverishly hard behind the scenes on Bitcoin Core to increase the scalability. This work isn't small-potatoes boring software engineering stuff; I mean even my personal contributions include things like inventing a wholly new generic algebraic optimization applicable to all EC signature schemes that increases performance by 4%, and that is before getting into the R&D stuff that hasn't really borne fruit yet, like fraud proofs. Today Bitcoin Core is easily >100 times faster to synchronize and relay than when I first got involved on the same hardware, but these improvements have been swallowed by the growth. The ironic thing is that our frantic efforts to keep ahead and not lose decentralization have both not been enough (by the best measures, full node usage is the lowest its been since 2011 even though the user base is huge now) and yet also so much that people could seriously talk about increasing the block size to something gigantic like 20MB. This sounds less reasonable when you realize that even at 1MB we'd likely have a smoking hole in the ground if not for existing enormous efforts to make scaling not come at a loss of decentralization.
Peter Todd also summarized some academic findings on the subject:
In short, without either a fixed blocksize or fixed fee per transaction Bitcoin will will not survive as there is no viable way to pay for PoW security. The latter option - fixed fee per transaction - is non-trivial to implement in a way that's actually meaningful - it's easy to give miners "kickbacks" - leaving us with a fixed blocksize.
Even a relatively small increase to 20MB will greatly reduce the number of people who can participate fully in Bitcoin, creating an environment where the next increase requires the consent of an even smaller portion of the Bitcoin ecosystem. Where does that stop? What's the proposed mechanism that'll create an incentive and social consensus to not just 'kick the can down the road'(3) and further centralize but actually scale up Bitcoin the hard way?
Some developers (e.g. Aaron Voisine) voiced support for Gavin's proposal which repeated Mike Hearn's "crash landing" arguments.
Pieter Wuille said:
I am - in general - in favor of increasing the size blocks...
Controversial hard forks. I hope the mailing list here today already proves it is a controversial issue. Independent of personal opinions pro or against, I don't think we can do a hard fork that is controversial in nature. Either the result is effectively a fork, and pre-existing coins can be spent once on both sides (effectively failing Bitcoin's primary purpose), or the result is one side forced to upgrade to something they dislike - effectively giving a power to developers they should never have. Quoting someone: "I did not sign up to be part of a central banker's committee".
The reason for increasing is "need". If "we need more space in blocks" is the reason to do an upgrade, it won't stop after 20 MB. There is nothing fundamental possible with 20 MB blocks that isn't with 1 MB blocks.
Misrepresentation of the trade-offs. You can argue all you want that none of the effects of larger blocks are particularly damaging, so everything is fine. They will damage something (see below for details), and we should analyze these effects, and be honest about them, and present them as a trade-off made we choose to make to scale the system better. If you just ask people if they want more transactions, of course you'll hear yes. If you ask people if they want to pay less taxes, I'm sure the vast majority will agree as well.
Miner centralization. There is currently, as far as I know, no technology that can relay and validate 20 MB blocks across the planet, in a manner fast enough to avoid very significant costs to mining. There is work in progress on this (including Gavin's IBLT-based relay, or Greg's block network coding), but I don't think we should be basing the future of the economics of the system on undemonstrated ideas. Without those (or even with), the result may be that miners self-limit the size of their blocks to propagate faster, but if this happens, larger, better-connected, and more centrally-located groups of miners gain a competitive advantage by being able to produce larger blocks. I would like to point out that there is nothing evil about this - a simple feedback to determine an optimal block size for an individual miner will result in larger blocks for better connected hash power. If we do not want miners to have this ability, "we" (as in: those using full nodes) should demand limitations that prevent it. One such limitation is a block size limit (whatever it is).
Ability to use a full node.
Skewed incentives for improvements... without actual pressure to work on these, I doubt much will change. Increasing the size of blocks now will simply make it cheap enough to continue business as usual for a while - while forcing a massive cost increase (and not just a monetary one) on the entire ecosystem.
Fees and long-term incentives.
I don't think 1 MB is optimal. Block size is a compromise between scalability of transactions and verifiability of the system. A system with 10 transactions per day that is verifiable by a pocket calculator is not useful, as it would only serve a few large bank's settlements. A system which can deal with every coffee bought on the planet, but requires a Google-scale data center to verify is also not useful, as it would be trivially out-competed by a VISA-like design. The usefulness needs in a balance, and there is no optimal choice for everyone. We can choose where that balance lies, but we must accept that this is done as a trade-off, and that that trade-off will have costs such as hardware costs, decreasing anonymity, less independence, smaller target audience for people able to fully validate, ...
Choose wisely.
Mike Hearn responded:
this list is not a good place for making progress or reaching decisions.
if Bitcoin continues on its current growth trends it will run out of capacity, almost certainly by some time next year. What we need to see right now is leadership and a plan, that fits in the available time window.
I no longer believe this community can reach consensus on anything protocol related.
When the money supply eventually dwindles I doubt it will be fee pressure that funds mining
What I don't see from you yet is a specific and credible plan that fits within the next 12 months and which allows Bitcoin to keep growing.
Peter Todd then pointed out that, contrary to Mike's claims, developer consensus had been achieved within Core plenty of times recently. Btc-drak asked Mike to "explain where the 12 months timeframe comes from?"
Jorge Timón wrote an incredibly prescient reply to Mike:
We've successfully reached consensus for several softfork proposals already. I agree with others that hardfork need to be uncontroversial and there should be consensus about them. If you have other ideas for the criteria for hardfork deployment all I'm ears. I just hope that by "What we need to see right now is leadership" you don't mean something like "when Gaving and Mike agree it's enough to deploy a hardfork" when you go from vague to concrete.
Oh, so your answer to "bitcoin will eventually need to live on fees and we would like to know more about how it will look like then" it's "no bitcoin long term it's broken long term but that's far away in the future so let's just worry about the present". I agree that it's hard to predict that future, but having some competition for block space would actually help us get more data on a similar situation to be able to predict that future better. What you want to avoid at all cost (the block size actually being used), I see as the best opportunity we have to look into the future.
this is my plan: we wait 12 months... and start having full blocks and people having to wait 2 blocks for their transactions to be confirmed some times. That would be the beginning of a true "fee market", something that Gavin used to say was his #1 priority not so long ago (which seems contradictory with his current efforts to avoid that from happening). Having a true fee market seems clearly an advantage. What are supposedly disastrous negative parts of this plan that make an alternative plan (ie: increasing the block size) so necessary and obvious. I think the advocates of the size increase are failing to explain the disadvantages of maintaining the current size. It feels like the explanation are missing because it should be somehow obvious how the sky will burn if we don't increase the block size soon. But, well, it is not obvious to me, so please elaborate on why having a fee market (instead of just an price estimator for a market that doesn't even really exist) would be a disaster.
Some suspected Gavin/Mike were trying to rush the hard fork for personal reasons.
Mike Hearn's response was to demand a "leader" who could unilaterally steer the Bitcoin project and make decisions unchecked:
No. What I meant is that someone (theoretically Wladimir) needs to make a clear decision. If that decision is "Bitcoin Core will wait and watch the fireworks when blocks get full", that would be showing leadership
I will write more on the topic of what will happen if we hit the block size limit... I don't believe we will get any useful data out of such an event. I've seen distributed systems run out of capacity before. What will happen instead is technological failure followed by rapid user abandonment...
we need to hear something like that from Wladimir, or whoever has the final say around here.
Jorge Timón responded:
it is true that "universally uncontroversial" (which is what I think the requirement should be for hard forks) is a vague qualifier that's not formally defined anywhere. I guess we should only consider rational arguments. You cannot just nack something without further explanation. If his explanation was "I will change my mind after we increase block size", I guess the community should say "then we will just ignore your nack because it makes no sense". In the same way, when people use fallacies (purposely or not) we must expose that and say "this fallacy doesn't count as an argument". But yeah, it would probably be good to define better what constitutes a "sensible objection" or something. That doesn't seem simple though.
it seems that some people would like to see that happening before the subsidies are low (not necessarily zero), while other people are fine waiting for that but don't ever want to be close to the scale limits anytime soon. I would also like to know for how long we need to prioritize short term adoption in this way. As others have said, if the answer is "forever, adoption is always the most important thing" then we will end up with an improved version of Visa. But yeah, this is progress, I'll wait for your more detailed description of the tragedies that will follow hitting the block limits, assuming for now that it will happen in 12 months. My previous answer to the nervous "we will hit the block limits in 12 months if we don't do anything" was "not sure about 12 months, but whatever, great, I'm waiting for that to observe how fees get affected". But it should have been a question: "what's wrong with hitting the block limits in 12 months?"
Mike Hearn again asserted the need for a leader:
There must be a single decision maker for any given codebase.
Bryan Bishop attempted to explain why this did not make sense with git architecture.
Finally, Gavin announced his intent to merge the patch into Bitcoin XT to bypass the peer review he had received on the bitcoin-dev mailing list.
submitted by sound8bits to sound8bits [link] [comments]

FOR IMMEDIATE RELEASE: After Butterfly Labs collapses, engineers find new jobs at 21 Inc.

BEGIN BLOG POST

After Butterfly Labs collapses, engineers find new jobs at 21 Inc.

A bitcoin miner has shipped on time. Yes, that is news. A new venture-capital backed company, 21 Inc., has released a miniature bitcoin miner that they call a "Bitcoin computer". For $399.99, you get a Raspberry Pi, an SHA-256 ASIC board, and a giant fan.
Again, this is news: normally, a manufacturer of bitcoin miners would overdesign and underengineer their equipment, or, if they managed to ship something functional, it would be so poorly engineered -- and over budget -- that it would be an explosion waiting to happen and/or priced comparably to a four-door sedan.
21 Inc. has done something remarkable in the Bitcoin world: they started a company that operates like a legitimate business. They're even listed on Amazon.com, a company that's so strict with vendors that Nintendo was kicked off their system for not kissing enough customer ass.
Okay, enough with the praise.

This thing sucks.

The 21.co "computer" certainly deserves a place in the VC world, along with the other products consisting of wild promises and inane use cases. For the price of 4 Raspberry Pi computer kits, you get the following:
(If you have a remote desire to develop applications that use bitcoin, stop here. Go through that list and buy just those items above. You don't need anything else. If you're looking for comedy, or if you're a sucker with too much money, read on...)

Is that all I get for my money?

Those products alone don't allow you to make Bitcoin applications, apparently. You need these things, too:

How about the software demos?

It's difficult to justify developing a $400 computer that can't do much. So, to entice some customers, 21 Inc. included demos that try really hard to make customers feel inspired. Here are just a few things that 21 Inc. claims were totally impossible before their product existed:

What are the real customers saying?

The packaging is slick:
"This @21dotco computer came already opened..."
The hardware is reliable:
"...it must have lost power, which caused my SSH keys to become corrupted."
The software is revolutionary:
"...it will be more expensive to pay for your spotify subscription via your electricity bill, but a lot of people don't care."

I want to buy it anyway!

Go ahead. I won't stop you. Oh, and 21 Inc. doesn't accept bitcoins.
END BLOG POST
submitted by theirmoss to Buttcoin [link] [comments]

🔹How Did it all Begin?

Blockchain can seem very abstract on the surface, with very little meaning for those who don't know about it. A blockchain (previously written as two words, "block chain") is a growing list of records, called blocks, which are linked using cryptography.
Before blockchain was ever used for cryptocurrency, it had its beginnings in the field of computer science, especially in cryptography and data structures.
Stuart Haber and W. Scott Stornetta first described their work on a cryptographically secured chain of blocks in 1991, when they tried to build a system in which document timestamps could not be tampered with. From there, the first blockchain was conceptualized in 2008, improving upon the earlier design. The following year, the design was implemented as a core component of the cryptocurrency bitcoin. The blockchain has been growing ever since.
By 2014, a new term, "blockchain 2.0", was introduced. It refers to new applications of the distributed blockchain database. The capability of this second-generation programmable blockchain was described as a programming language that lets people write more sophisticated smart contracts - for example, invoices that pay themselves when a shipment arrives.
Structurally, a blockchain is a decentralized, distributed and public digital ledger that records transactions across many different computers - making it impossible to alter any particular record without altering all subsequent blocks.
The "ledger" refers to the record that the blockchain keeps of all the data exchanges. Each data exchange is a transaction, which is verified by a distributed system - a peer-to-peer network of nodes. Once signed and verified, the transaction is added to the list and cannot be altered.
The blockchain is often described as a value-exchange protocol. Exchange through a blockchain is quicker, safer and cheaper than through traditional systems.
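To make the "chain of linked blocks" idea concrete, here is a minimal Python sketch - illustrative only, not the actual Bitcoin block format (real blocks use a binary 80-byte header hashed with double SHA-256):

    import hashlib, json, time

    def block_hash(block):
        # Hash the block's entire contents, including the previous
        # block's hash, so changing any old record breaks every
        # later link in the chain.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def make_block(prev_hash, records):
        return {"prev": prev_hash, "time": time.time(), "records": records}

    genesis = make_block("0" * 64, ["genesis"])
    b1 = make_block(block_hash(genesis), ["alice -> bob: 1"])
    b2 = make_block(block_hash(b1), ["bob -> carol: 1"])

    # Tampering with b1 would change block_hash(b1), which would no
    # longer match the "prev" field stored in b2.
    assert b2["prev"] == block_hash(b1)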
#bitcoin #btc #starttheblockchain #blockchain #cryptocurrency #crypto #bitcoins #digitaltechnology
submitted by Benice_tools to u/Benice_tools [link] [comments]

Testnet is currently on non-BIP101 (legacy) chains. More fork tests will follow.

We turned off the hashpower on the BIP101 fork at the end of our testing today. After I went to sleep, it seems someone started hashing on the legacy fork, and the work on that chain overtook the BIP101 fork. These are from an XT node that I know was following the big-block fork, and has several 9 MB blocks in its database:
    ./bitcoin-cli getblock 000000003199e2651d08bb2282d0896128d04ac3bf82f344d3ab92bf0061c80b | grep height
        "height" : 585471
    ./bitcoin-cli getblockhash 585471
    0000000000324544abe2531548faec6525a856999f3b46ecf9128a3f5b273d24
Those are not the same hash. Not the same block. The first block was orphaned at height 585471, but the second block is currently confirmed. Here's another interesting block for the records:
    ./bitcoin-cli getblock 0000000015cbe557995bf95861d66231d01513ba99f06d23854522272c9049c3 | less
    {
        "hash" : "0000000015cbe557995bf95861d66231d01513ba99f06d23854522272c9049c3",
        "confirmations" : -1,
        "size" : 9115445,
        "height" : 586921,
        "version" : 536870919,
        "merkleroot" : "6fcc61f29be259c74bc95e8f57c249678572b82b343b6b2bb62897ce1cfd7283",
        "tx" : [ .... (9.1 MB of tx go here) ... ],
        "time" : 1447195582,
        "nonce" : 3149336320,
        "bits" : "1c276440",
        "difficulty" : 6.49874805,
        "chainwork" : "0000000000000000000000000000000000000000000000067d572257e0d8ef4b",
        "previousblockhash" : "0000000020e32b6f8f80950b3bb888294a8ece8e61b6ee4b0fed6800d0a44209"
    }
That's from a sequence of (IIRC) about five 9.1 MB blocks that we mined over the course of 7 minutes. It's since been reorged out by an empty Legacy block:
    ./bitcoin-cli getblock `./bitcoin-cli getblockhash 586921`
    {
        "hash" : "00000000004572422af0fd54ac158ff430bf60003c54db27eb2e953d4ae2aa4c",
        "confirmations" : 14985,
        "size" : 262,
        "height" : 586921,
        "version" : 4,
        "merkleroot" : "857b0723d7999341cd0b9ff9fa0e112dbbb19535dc32c7f06d3721a528c0021a",
        "tx" : [ "857b0723d7999341cd0b9ff9fa0e112dbbb19535dc32c7f06d3721a528c0021a" ],
        "time" : 1447140611,
        "nonce" : 2663112535,
        "bits" : "1c24a880",
        "difficulty" : 6.98332357,
        "chainwork" : "00000000000000000000000000000000000000000000000677759d3b1ee4737d",
        "previousblockhash" : "00000000007947bac4fb9dbc759704958bc38c2050f7cd236c2a7e0b8c859958",
        "nextblockhash" : "000000000048c36f0ed2bb68145ea4376ed4facfbbc020726204162f95692387"
    }
The second block was mined 54971 seconds or 15.27 hours later.
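Side note: if you want to locate the fork point yourself, the height-by-height comparison above is easy to automate. A rough sketch - the datadir paths are hypothetical, and it only uses the getblockhash RPC shown above:

    import subprocess

    def blockhash(cli, height):
        # Ask one node for its block hash at a given height.
        return subprocess.check_output(cli + ["getblockhash", str(height)]).decode().strip()

    # Hypothetical configs for the XT (BIP101) node and a legacy node.
    XT     = ["./bitcoin-cli", "-datadir=/data/xt"]
    LEGACY = ["./bitcoin-cli", "-datadir=/data/legacy"]

    def last_common_height(lo, hi):
        # Binary search for the highest height where both chains agree;
        # everything above it sits on one side of the fork or the other.
        while lo < hi:
            mid = (lo + hi + 1) // 2
            if blockhash(XT, mid) == blockhash(LEGACY, mid):
                lo = mid
            else:
                hi = mid - 1
        return lo

    print("forked after height:", last_common_height(0, 585471))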
So far, the testnet chain has forked twice when it should have, and it has reorged twice when it should have. The first reorg was caused by us switching to Core to overtake the BIP101 branch. The second reorg was caused by us switching off our hashpower, and someone else overtaking the BIP101 branch. So far I have not found any behavior that is incorrect.
Edit Nov 12th: We're currently focusing on developing tools to collect data from subsequent tests. We should be able to fork pretty much whenever we want, as long as we leave the hashrate off when we're not using it. If we leave the hashrate on, it's (a) expensive, and (b) harder to test the forking behavior, since we'd have to hash on Core to get it to catch up first.
submitted by jtoomim to bitcoinxt [link] [comments]

Is anyone else freaked out by this whole blocksize debate? Does anyone else find themself often agreeing with *both* sides - depending on whichever argument you happen to be reading at the moment? And do we need some better algorithms and data structures?

Why do both sides of the debate seem “right” to me?
I know, I know, a healthy debate is healthy and all - and maybe I'm just not used to the tumult and jostling which would be inevitable in a real live open major debate about something as vital as Bitcoin.
And I really do agree with the starry-eyed idealists who say Bitcoin is vital. Imperfect as it may be, it certainly does seem to represent the first real chance we've had in the past few hundred years to try to steer our civilization and our planet away from the dead-ends and disasters which our government-issued debt-based currencies keep dragging us into.
But this particular debate, about the blocksize, doesn't seem to be getting resolved at all.
Pretty much every time I read one of the long-form major arguments contributed by Bitcoin "thinkers" who I've come to respect over the past few years, this weird thing happens: I usually end up finding myself nodding my head and agreeing with whatever particular piece I'm reading!
But that should be impossible - because a lot of these people vehemently disagree!
So how can both sides sound so convincing to me, simply depending on whichever piece I currently happen to be reading?
Does anyone else feel this way? Or am I just a gullible idiot?
Just Do It?
When you first look at it or hear about it, increasing the size seems almost like a no-brainer: The "big-block" supporters say just increase the blocksize to 20 MB or 8 MB, or do some kind of scheduled or calculated regular increment which tries to take into account the capabilities of the infrastructure and the needs of the users. We do have the bandwidth and the memory to at least increase the blocksize now, they say - and we're probably gonna continue to have more bandwidth and memory in order to be able to keep increasing the blocksize for another couple decades - pretty much like everything else computer-based we've seen over the years (some of this stuff is called by names such as "Moore's Law").
On the other hand, whenever the "small-block" supporters warn about the utter catastrophe that a failed hard-fork would mean, I get totally freaked by their possible doomsday scenarios, which seem totally plausible and terrifying - so I end up feeling that the only way I'd want to go with a hard-fork would be if there was some pre-agreed "triggering" mechanism where the fork itself would only actually "switch on" and take effect provided that some "supermajority" of the network (of who? the miners? the full nodes?) had signaled (presumably via some kind of totally reliable p2p trustless software-based voting system?) that they do indeed "pre-agree" to actually adopt the pre-scheduled fork (and thereby avoid any possibility whatsoever of the precious blockchain somehow tragically splitting into two and pretty much killing this cryptocurrency off in its infancy).
So in this "conservative" scenario, I'm talking about wanting at least 95% pre-adoption agreement - not the mere 75% which I recall some proposals call for, which seems like it could easily lead to a 75/25 blockchain split.
But this time, with this long drawn-out blocksize debate, the core devs, and several other important voices who have become prominent opinion shapers over the past few years, can't seem to come to any real agreement on this.
Weird split among the devs
As far as I can see, there's this weird split: Gavin and Mike seem to be the only people among the devs who really want a major blocksize increase - and all the other devs seem to be vehemently against them.
But then on the other hand, the users seem to be overwhelmingly in favor of a major increase.
And there are meta-questions about governance, about why this didn't come out as a BIP, and about what the availability of Bitcoin XT means.
And today or yesterday there was this really cool big-blockian exponential graph based on doubling the blocksize every two years for twenty years, reminding us of the pure mathematical fact that 2^10 is indeed about 1000 - but not really addressing any of the game-theoretic points raised by the small-blockians. So a lot of the users seem to like it, but when so few devs say anything positive about it, I worry: is this just yet more exponential chart porn?
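(For what it's worth, the arithmetic behind those charts is trivial to check - a toy sketch, assuming BIP101's 8 MB starting point:

    size_mb = 8
    for doubling in range(10):  # ten doublings = twenty years at one per two years
        size_mb *= 2
    print(size_mb)              # 8192 MB: roughly a thousandfold, since 2**10 = 1024

)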
On the one hand, Gavin's and Mike's blocksize increase proposal initially seemed like a no-brainer to me.
And on the other hand, all the other devs seem to be against them. Which is weird - not what I'd initially expected at all (but maybe I'm just a fool who's seduced by exponential chart porn?).
Look, I don't mean to be rude to any of the core devs, and I don't want to come off like someone wearing a tinfoil hat - but it has to cross people's minds that the powers that be (the Fed and the other central banks and the governments that use their debt-issued money to run this world into a ditch) could very well be much more scared shitless than they're letting on. If we assume that the powers that be are using their usual playbook and tactics, then it could be worth looking at the book "Confessions of an Economic Hitman" by John Perkins, to get an idea of how they might try to attack Bitcoin. So, what I'm saying is, they do have a track record of sending in "experts" to try to derail projects and keep everyone enslaved to the Creature from Jekyll Island. I'm just saying. So, without getting ad hominem - let's just make sure that our ideas can really stand scrutiny on their own - as Nick Szabo says, we need to make sure there is "more computer science, less noise" in this debate.
When Gavin Andresen first came out with the 20 MB thing - I sat back and tried to imagine if I could download 20 MB in 10 minutes (which seems to be one of the basic mathematical and technological constraints here - right?)
I figured, "Yeah, I could download that" - even with my crappy internet connection.
And I guess the telecoms might be nice enough to continue to double our bandwidth every two years for the next couple decades – if we ask them politely?
On the other hand - I think we should be careful about entrusting the financial freedom of the world into the greedy hands of the telecoms companies - given all their shady shenanigans over the past few years in many countries. After decades of the MPAA and the FBI trying to chip away at BitTorrent, lately PirateBay has been hard to access. I would say it's quite likely that certain persons at institutions like JPMorgan and Goldman Sachs and the Fed might be very, very motivated to see Bitcoin fail - so we shouldn't be too sure about scaling plans which depend on the willingness of companies like Verizon and AT&T to double our bandwidth every two years.
Maybe the real important hardware buildout challenge for a company like 21 (and its allies such as Qualcomm) to take on now would not be "a miner in every toaster" but rather "Google Fiber Download and Upload Speeds in every Country, including China".
I think I've read all the major stuff on the blocksize debate from Gavin Andresen, Mike Hearn, Greg Maxwell, Peter Todd, Adam Back, Jeff Garzik and several other major contributors - and, oddly enough, all their arguments seem reasonable - heck, even Luke-Jr seems reasonable to me on the blocksize debate, and I always thought he was a whackjob overly influenced by superstition and numerology - and now today I'm reading the article by Bram Cohen - the inventor of BitTorrent - and I find myself agreeing with him too!
I say to myself: What's going on with me? How can I possibly agree with all of these guys, if they all have such vehemently opposing viewpoints?
I mean, think back to the glory days of a couple of years ago, when all we were hearing was how this amazing unprecedented grassroots innovation called Bitcoin was going to benefit everyone from all walks of life, all around the world:
...basically the entire human race transacting everything into the blockchain.
(Although let me say that I think that people's focus on ideas like driverless cabs creating realtime fare markets based on supply and demand seems to be setting our sights a bit low as far as Bitcoin's abilities to correct the financial world's capital-misallocation problems which seem to have been made possible by infinite debt-based fiat. I would have hoped that a Bitcoin-based economy would solve much more noble, much more urgent capital-allocation problems than driverless taxicabs creating fare markets or refrigerators ordering milk on the internet of things. I was thinking more along the lines that Bitcoin would finally strangle dead-end debt-based deadly-toxic energy industries like fossil fuels and let profitable clean energy industries like Thorium LFTRs take over - but that's another topic. :=)
Paradoxes in the blocksize debate
Let me summarize the major paradoxes I see here:
(1) Regarding the people (the majority of the core devs) who are against a blocksize increase: Well, the small-blocks arguments do seem kinda weird, and certainly not very "populist", in the sense that: When on earth have end-users ever heard of a computer technology whose capacity didn't grow pretty much exponentially year-on-year? All the cool new technology we've had - from hard drives to RAM to bandwidth - started out pathetically tiny and grew to unimaginably huge over the past few decades - and all our software has in turn gotten massively powerful and big and complex (sometimes bloated) to take advantage of the enormous new capacity available.
But now suddenly, for the first time in the history of technology, we seem to have a majority of the devs, on a major p2p project - saying: "Let's not scale the system up. It could be dangerous. It might break the whole system (if the hard-fork fails)."
I don't know, maybe I'm missing something here, maybe someone else could enlighten me, but I don't think I've ever seen this sort of thing happen in the last few decades of the history of technology - devs arguing against scaling up p2p technology to take advantage of expected growth in infrastructure capacity.
(2) But... on the other hand... the dire warnings of the small-blockians about what could happen if a hard-fork were to fail - wow, they do seem really dire! And these guys are pretty much all heavyweight, experienced programmers and/or game theorists and/or p2p open-source project managers.
I must say, that nearly all of the long-form arguments I've read - as well as many, many of the shorter comments I've read from many users in the threads, whose names I at least have come to more-or-less recognize over the past few months and years on reddit and bitcointalk - have been amazingly impressive in their ability to analyze all aspects of the lifecycle and management of open-source software projects, bringing up lots of serious points which I could never have come up with, and which seem to come from long experience with programming and project management - as well as dealing with economics and human nature (eg, greed - the game-theory stuff).
So a lot of really smart and experienced people with major expertise in various areas ranging from programming to management to game theory to politics to economics have been making some serious, mature, compelling arguments.
But, as I've been saying, the only problem to me is: in many of these cases, these arguments are vehemently in opposition to each other! So I find myself agreeing with pretty much all of them, one by one - which means the end result is just a giant contradiction.
I mean, today we have Bram Cohen, the inventor of BitTorrent, arguing (quite cogently and convincingly to me), that it would be dangerous to increase the blocksize. And this seems to be a guy who would know a few things about scaling out a massive global p2p network - since the protocol which he invented, BitTorrent, is now apparently responsible for like a third of the traffic on the internet (and this despite the long-term concerted efforts of major evil players such as the MPAA and the FBI to shut the whole thing down).
Was the BitTorrent analogy too "glib"?
By the way - I would like to go on a slight tangent here and say that one of the main reasons why I felt so "comfortable" jumping on the Bitcoin train back a few years ago, when I first heard about it and got into it, was the whole rough analogy I saw with BitTorrent.
I remembered the perhaps paradoxical fact that when a torrent is more popular (eg, a major movie release that just came out last week), then it actually becomes faster to download. More people want it, so more people have a few pieces of it, so more people are able to get it from each other. A kind of self-correcting economic feedback loop, where more demand directly leads to more supply.
(BitTorrent manages to pull this off by essentially adding a certain structure to the file being shared, so that it's not simply like an append-only list of 1 MB blocks, but rather more like a random-access or indexed array of 1 MB chunks. Say you're downloading a film which is 700 MB. As soon as your "client" program has downloaded a single 1-MB chunk - say chunk #99 - your "client" program instantly turns into a "server" program as well - offering that chunk #99 to other clients. From my simplistic understanding, I believe the Bitcoin protocol does something similar, to provide a p2p architecture. Hence my - perhaps naïve - assumption that Bitcoin already had the right algorithms / architecture / data structure to scale.)
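(To make the chunk-indexing idea concrete, a toy Python sketch - the file name is a made-up stand-in:

    CHUNK = 1 << 20  # 1 MB pieces

    def read_chunk(path, index):
        # Random access: a peer holding piece `index` can serve it
        # without having the rest of the file.
        with open(path, "rb") as f:
            f.seek(index * CHUNK)
            return f.read(CHUNK)

    # As soon as a client has downloaded piece 99, it can start serving
    # piece 99 to other peers - every downloader doubles as an uploader.
    piece = read_chunk("film.mkv", 99)

)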
The efficiency of the BitTorrent network seemed to jibe with that "network law" (Metcalfe's Law?) about fax machines. This law states that the more fax machines there are, the more valuable the network of fax machines becomes. Or the value of the network grows on the order of the square of the number of nodes.
This is in contrast with other technology like cars, where the more you have, the worse things get. The more cars there are, the more traffic jams you have, so things start going downhill. I guess this is because highway space is limited - after all, we can't pave over the entire countryside, and we never did get those flying cars we were promised, as David Graeber laments in a recent essay in The Baffler magazine :-)
And regarding the "stress test" supposedly happening right now in the middle of this ongoing blocksize debate, I don't know what worries me more: the fact that it apparently is taking only $5,000 to do a simple kind of DoS on the blockchain - or the fact that there are a few rumors swirling around saying that the unknown company doing the stress test shares the same physical mailing address with a "scam" company?
Or maybe we should just be worried that so much of this debate is happening on a handful of forums which are controlled by some guy named theymos who's already engaged in some pretty "contentious" or "controversial" behavior like blowing a million dollars on writing forum software (I guess he never heard that reddit.com software is open-source)?
So I worry that the great promise of "decentralization" might be more fragile than we originally thought.
Scaling
Anyways, back to Metcalfe's Law: with virtual stuff, like torrents and fax machines, the more the merrier. The more people downloading a given movie, the faster it arrives - and the more people own fax machines, the more valuable the overall fax network.
So I kindof (naïvely?) assumed that Bitcoin, being "virtual" and p2p, would somehow scale up the same magical way BitTorrent did. I just figured that more people using it would somehow automatically make it stronger and faster.
But now a lot of devs have started talking in terms of the old "scarcity" paradigm, talking about blockspace being a "scarce resource" and talking about "fee markets" - which seems kinda scary, and antithetical to much of the earlier rhetoric we heard about Bitcoin (the stuff about supporting our favorite creators with micropayments, and the stuff about Africans using SMS to send around payments).
Look, when some asshole is in line in front of you at the cash register and he's holding up the line so they can run his credit card to buy a bag of Cheeto's, we tend to get pissed off at the guy - clogging up our expensive global electronic payment infrastructure to make a two-dollar purchase. And that's on a fairly efficient centralized system - and presumably after a year or so, VISA and the guy's bank can delete or compress the transaction in their SQL databases.
Now, correct me if I'm wrong, but if some guy buys a coffee on the blockchain, or if somebody pays an online artist $1.99 for their work - then that transaction, a few bytes or so, has to live on the blockchain forever?
Or is there some "pruning" thing that gets rid of it after a while?
And this could lead to another question: Viewed from the perspective of double-entry bookkeeping, is the blockchain "world-wide ledger" more like the "balance sheet" part of accounting, i.e. a snapshot showing current assets and liabilities? Or is it more like the "cash flow" part of accounting, i.e. a journal showing historical revenues and expenses?
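(My naïve understanding: the chain itself is the journal, and something like the "balance sheet" - the set of currently unspent outputs - can be derived by replaying it. A toy Python sketch of that fold, with made-up transactions rather than real Bitcoin structures:

    # A toy "journal": each tx spends old outputs and creates new ones.
    txs = [
        {"spends": [],          "creates": {("t1", 0): 50}},
        {"spends": [("t1", 0)], "creates": {("t2", 0): 30, ("t2", 1): 20}},
        {"spends": [("t2", 1)], "creates": {("t3", 0): 20}},
    ]

    # Replaying the journal yields the "balance sheet": the set of
    # unspent outputs (a toy UTXO set).
    utxos = {}
    for tx in txs:
        for outpoint in tx["spends"]:
            del utxos[outpoint]      # spent outputs leave the snapshot
        utxos.update(tx["creates"])  # new outputs enter it

    print(utxos)  # {('t2', 0): 30, ('t3', 0): 20}

)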
When I think of thousands of machines around the globe having to lug around multiple identical copies of a multi-gigabyte file containing some asshole's coffee purchase forever and ever... I feel like I'm ideologically drifting in one direction (where I'd end up also being against really cool stuff like online micropayments and Africans banking via SMS)... so I don't want to go there.
But on the other hand, when really experienced and battle-tested veterans with major experience in the world of open-source programming and project management (the "small-blockians") warn of the catastrophic consequences of a possible failed hard-fork, I get freaked out and I wonder if Bitcoin really was destined to be a settlement layer for big transactions.
Could the original programmer(s) possibly weigh in?
And I don't mean to appeal to authority - but heck, where the hell is Satoshi Nakamoto in all this? I do understand that he/she/they would want to maintain absolute anonymity - but on the other hand, I assume SN wants Bitcoin to succeed (both for the future of humanity - or at least for all the bitcoins SN allegedly holds :-) - and I understand there is a way that SN can cryptographically sign a message - and I understand that as the original developer of Bitcoin, SN had some very specific opinions about the blocksize... So I'm kinda wondering if Satoshi could weigh in from time to time. Just to help out a bit. I'm not saying "Show us a sign" like a deity or something - but damn, it sure would be fascinating and possibly very helpful if Satoshi gave us his/her/their 2 satoshis' worth at this really confusing juncture.
Are we using our capacity wisely?
I'm not a programming or game-theory whiz, I'm just a casual user who has tried to keep up with technology over the years.
It just seems weird to me that here we have this massive supercomputer (500 times more powerful than all the supercomputers in the world combined) doing fairly straightforward "embarrassingly parallel" number-crunching operations to secure a p2p world-wide ledger called the blockchain to keep track of a measly 2.1 quadrillion tokens spread out among a few billion addresses - and a couple of years ago you had people like Rick Falkvinge saying the blockchain would someday be supporting multi-million-dollar letters of credit for international trade and you had people like Andreas Antonopoulos saying the blockchain would someday allow billions of "unbanked" people to send remittances around the village or around the world dirt-cheap - and now suddenly in June 2015 we're talking about blockspace as a "scarce resource" and talking about "fee markets" and partially centralized, corporate-sponsored "Level 2" vaporware like Lightning Network and some mysterious company is "stress testing" or "DoS-ing" the system by throwing away a measly $5,000 and suddenly it sounds like the whole system could eventually head right back into PayPal and Western Union territory again, in terms of expensive fees.
When I got into Bitcoin, I really was heavily influenced by vague analogies with BitTorrent: I figured everyone would just have a tiny little utorrent-type program running on their machine (ie, Bitcoin-QT or Armory or Mycelium etc.).
I figured that just like anyone can host their own blog or webserver, anyone would be able to host their own bank.
Yeah, Google and Mozilla and Twitter and Facebook and WhatsApp did come along and build stuff on top of TCP/IP, so I did expect a bunch of companies to build layers on top of the Bitcoin protocol as well. But I still figured the basic unit of bitcoin client software powering the overall system would be small and personal and affordable and p2p - like a bittorrent client - or at the most, like a cheap server hosting a blog or email server.
And I figured there would be a way at the software level, at the architecture level, at the algorithmic level, at the data structure level - to let the thing scale - if not infinitely, at least fairly massively and gracefully - the same way the BitTorrent network has.
Of course, I do also understand that with BitTorrent, you're sharing a read-only object (eg, a movie) - whereas with Bitcoin, you're achieving distributed trustless consensus and appending it to a write-only (or append-only) database.
So I do understand that the problem which BitTorrent solves is much simpler than the problem which Bitcoin sets out to solve.
But still, it seems that there's got to be a way to make this thing scale. It's p2p and it's got 500 times more computing power than all the supercomputers in the world combined - and so many brilliant and motivated and inspired people want this thing to succeed! And Bitcoin could be our civilization's last chance to steer away from the oncoming debt-based ditch of disaster we seem to be driving into!
It just seems that Bitcoin has got to be able to scale somehow - and all these smart people working together should be able to come up with a solution which pretty much everyone can agree - in advance - will work.
Right? Right?
A (probably irrelevant) tangent on algorithms and architecture and data structures
I'll finally weigh in with my personal perspective - although I might be biased due to my background (which is more on the theoretical side of computer science).
My own modest - or perhaps radical - suggestion would be to ask whether we're really looking at all the best possible algorithms and architectures and data structures out there.
From this perspective, I sometimes worry that the overwhelming majority of the great minds working on the programming and game-theory stuff might come from a rather specific, shall we say "von Neumann" or "procedural" or "imperative" school of programming (ie, C and Python and Java programmers).
It seems strange to me that such a cutting-edge and important computer project would have so little participation from the great minds at the other end of the spectrum of programming paradigms - namely, the "functional" and "declarative" and "algebraic" (and co-algebraic!) worlds.
For example, I was struck in particular by statements I've seen here and there (which seemed rather hubristic or lackadaisical to me - for something as important as Bitcoin), that the specification of Bitcoin and the blockchain doesn't really exist in any form other than the reference implementation(s) (in procedural languages such as C or Python?).
Curry-Howard anyone?
I mean, many computer scientists are aware of the Curry-Howard isomorphism, which basically says that the relationship between a theorem and its proof is equivalent to the relationship between a specification and its implementation. In other words, there is a long tradition in mathematics (and in computer programming) of first stating *what* you are going to do (the theorem, or the specification) - and only then showing *how* you do it (the proof, or the implementation).
And it's not exactly "turtles all the way down" either: a specification is generally simple and compact enough that a good programmer can usually simply visually inspect it to determine if it is indeed "correct" - something which is very difficult, if not impossible, to do with a program written in a procedural, implementation-oriented language such as C or Python or Java.
So I worry that we've got this tradition, from the open-source github C/Java programming tradition, of never actually writing our "specification", and only writing the "implementation". In mission-critical military-grade programming projects (which often use languages like Ada or Maude) this is simply not allowed. It would seem that a project as mission-critical as Bitcoin - which could literally be crucial for humanity's continued survival - should also use this kind of military-grade software development approach.
And I'm not saying rewrite the implementations in these kind of theoretical languages. But it might be helpful if the C/Python/Java programmers in the Bitcoin imperative programming world could build some bridges to the Maude/Haskell/ML programmers of the functional and algebraic programming worlds to see if any kind of useful cross-pollination might take place - between specifications and implementations.
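To make the spec-versus-implementation distinction concrete, here's a toy illustration in Python rather than Maude (so, yes, an implementation language - but enough to show the shape of the idea): the specification says *what* must hold, the implementation says *how*, and we can at least test one against the other:

    import random
    from collections import Counter

    # Specification: *what* a sort must produce - an ordered
    # permutation of its input.
    def meets_spec(inp, out):
        ordered = all(out[i] <= out[i + 1] for i in range(len(out) - 1))
        permutation = Counter(inp) == Counter(out)
        return ordered and permutation

    # Implementation: *how* it is done (here, insertion sort).
    def insertion_sort(xs):
        result = []
        for x in xs:
            i = len(result)
            while i > 0 and result[i - 1] > x:
                i -= 1
            result.insert(i, x)
        return result

    # Random testing checks the implementation against the spec -
    # formal tools like Maude go further and *prove* the relationship.
    for _ in range(1000):
        xs = [random.randint(0, 99) for _ in range(random.randint(0, 20))]
        assert meets_spec(xs, insertion_sort(xs))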
For example, the JavaFAN formal analyzer for multi-threaded Java programs (developed using tools based on the Maude language) was applied to the Remote Agent AI program aboard NASA's Deep Space 1 probe, written in Java - and it took only a few minutes using formal mathematical reasoning to detect a potential deadlock which would have occurred years later during the space mission, when the damn spacecraft was already way out in deep space.
And "the Maude-NRL (Naval Research Laboratory) Protocol Analyzer (Maude-NPA) is a tool used to provide security proofs of cryptographic protocols and to search for protocol flaws and cryptosystem attacks."
These are open-source formal reasoning tools developed by DARPA and used by NASA and the US Navy to ensure that program implementations satisfy their specifications. It would be great if some of the people involved in these kinds of projects could contribute to help ensure the security and scalability of Bitcoin.
But there is a wide abyss between the kinds of programmers who use languages like Maude and the kinds of programmers who use languages like C/Python/Java - and it can be really hard to get the two worlds to meet. There is a bit of rapprochement between these language communities in languages which might be considered as being somewhere in the middle, such as Haskell and ML. I just worry that Bitcoin might be turning into an exclusively C/Python/Java project (with the algorithms and practitioners traditionally of that community), when it could be more advantageous if it also had some people from the functional and algebraic-specification and program-verification communities involved. The thing is, though: the theoretical practitioners are big on "semantics" - I've heard them say stuff like "Yes, but a C / C++ program has no easily identifiable semantics". So to get them involved, you really have to first be able to talk about what your program does (specification) - before proceeding to describe how it does it (implementation). And writing high-level specifications is typically very hard using the syntax and semantics of languages like C and Java and Python - whereas specs are fairly easy to write in Maude - and not only that, they're executable, and you can state and verify properties about them - which provides for the kind of debate Nick Szabo was advocating ("more computer science, less noise").
Imagine if we had an executable algebraic specification of Bitcoin in Maude, where we could formally reason about and verify certain crucial game-theoretical properties - rather than merely hand-waving and arguing and deploying and praying.
And so in the theoretical programming community you've got major research on various logics such as Girard's Linear Logic (which is resource-conscious) and Bruni and Montanari's Tile Logic (which enables "pasting" bigger systems together from smaller ones in space and time), and executable algebraic specification languages such as Meseguer's Maude (which would be perfect for game theory modeling, with its functional modules for specifying the deterministic parts of systems and its system modules for specifying non-deterministic parts of systems, and its parameterized skeletons for sketching out the typical architectures of mobile systems, and its formal reasoning and verification tools and libraries which have been specifically applied to testing and breaking - and fixing - cryptographic protocols).
And somewhat closer to the practical hands-on world, you've got stuff like Google's MapReduce and lots of Big Data database languages developed by Google as well. And yet here we are with a mempool growing dangerously big for RAM on a single machine, and a 20-GB append-only list as our database - and not much debate on practical results from Google's Big Data databases.
(And by the way: maybe I'm totally ignorant for asking this, but I'll ask anyways: why the hell does the mempool have to stay in RAM? Couldn't it work just as well if it were stored temporarily on the hard drive?)
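(A disk-backed pool is certainly buildable - here's a toy sketch using SQLite, with a hypothetical file name; whether the lookup latency would be acceptable for transaction relay and mining is the real question:

    import sqlite3

    # Unconfirmed transactions live in a SQLite file instead of RAM:
    # slower lookups than an in-memory map, but not bounded by memory,
    # and the pool survives restarts.
    db = sqlite3.connect("mempool.db")
    db.execute("CREATE TABLE IF NOT EXISTS mempool "
               "(txid TEXT PRIMARY KEY, fee INTEGER, raw BLOB)")

    def add_tx(txid, fee, raw):
        db.execute("INSERT OR REPLACE INTO mempool VALUES (?, ?, ?)",
                   (txid, fee, raw))
        db.commit()

    def best_paying(n):
        # Miners want the highest-fee transactions first.
        return db.execute("SELECT txid, fee FROM mempool "
                          "ORDER BY fee DESC LIMIT ?", (n,)).fetchall()

    add_tx("deadbeef", 10000, b"...")
    print(best_paying(10))

)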
And you've got CalvinDB out of Yale which apparently provides an ACID layer on top of a massively distributed database.
Look, I'm just an armchair follower cheering on these projects. I can barely manage to write a query in SQL, or read through a C or Python or Java program. But I would argue two points here: (1) these languages may be too low-level and "non-formal" for writing and modeling and formally reasoning about and proving properties of mission-critical specifications - and (2) there seem to be some Big Data tools already deployed by institutions such as Google and Yale which support global petabyte-size databases on commodity boxes with nice properties such as near-real-time and ACID - and I sometimes worry that the "core devs" might be failing to review the literature (and reach out to fellow programmers) to see if there might be some formal program-verification and practical Big Data tools out there which could be applied to coming up with rock-solid, 100% consensus proposals to handle an issue such as blocksize scaling, which seems to have become much more intractable than many people might have expected.
I mean, the protocol solved the hard stuff: the elliptic-curve stuff and the Byzantine Generals stuff. How the heck can we be falling down on the comparatively "easier" stuff - like scaling the blocksize?
It just seems like defeatism to say "Well, the blockchain is already 20-30 GB and it's gonna be 20-30 TB ten years from now - and we need 10 Mbps bandwidth now and 10,000 Mbps bandwidth 20 years from now - assuming the evil Verizon and AT&T actually give us that - so let's just become a settlement platform and give up on buying coffee or banking the unbanked or doing micropayments, and let's push all that stuff into some corporate-controlled vaporware without even a whitepaper yet."
So you've got Peter Todd doing some possibly brilliant theorizing and extrapolating on the idea of "treechains" - there is a Let's Talk Bitcoin podcast from about a year ago where he sketches the rough outlines of this idea out in a very inspiring, high-level way - although the specifics have yet to be hammered out. And we've got Blockstream also doing some hopeful hand-waving about the Lightning Network.
Things like Peter Todd's treechains - which may be similar to the spark in some devs' eyes called Lightning Network - are examples of the kind of algorithm or architecture which might manage to harness the massive computing power of miners and nodes in such a way that certain kinds of massive and graceful scaling become possible.
It just seems like a kindof tiny dev community working on this stuff.
Being a C or Python or Java programmer should not be a pre-req to being able to help contribute to the specification (and formal reasoning and program verification) for Bitcoin and the blockchain.
XML and UML are crap modeling and specification languages, and C and Java and Python are even worse (as specification languages - although as implementation languages, they are of course fine).
But there are serious modeling and specification languages out there, and they could be very helpful at times like this - where what we're dealing with is questions of modeling and specification (ie, "needs and requirements").
One just doesn't often see the practical, hands-on world of open-source github implementation-level programmers and the academic, theoretical world of specification-level programmers meeting very often. I wish there were some way to get these two worlds to collaborate on Bitcoin.
Maybe a good first step to reach out to the theoretical people would be to provide a modular executable algebraic specification of the Bitcoin protocol in a recognized, military/NASA-grade specification language such as Maude - because that's something the theoretical community can actually wrap their heads around, whereas it's very hard to get them to pay attention to something written only as a C / Python / Java implementation (without an accompanying specification in a formal language).
They can't check whether the program does what it's supposed to do - if you don't provide a formal mathematical definition of what the program is supposed to do.
Specification : Implementation :: Theorem : Proof
You have to remember: the theoretical community is very aware of the Curry-Howard isomorphism. Just like it would be hard to get a mathematician's attention by merely showing them a proof without also telling them what theorem the proof is proving - by the same token, it's hard to get the attention of a theoretical computer scientist by merely showing them an implementation without showing them the specification that it implements.
Bitcoin is currently confronted with a mathematical or "computer science" problem: how to secure the network while getting high enough transactional throughput, while staying within the limited RAM, bandwidth and hard drive space limitations of current and future infrastructure.
The problem only becomes a political and economic problem if we give up on trying to solve it as a mathematical and "theoretical computer science" problem.
There should be a plethora of whitepapers out now proposing algorithmic solutions to these scaling issues. Remember, all we have to do is apply the Byzantine General consensus-reaching procedure to a worldwide database which shuffles 2.1 quadrillion tokens among a few billion addresses. The 21 company has emphatically pointed out that racing to compute a hash to add a block is an "embarrassingly parallel" problem - very easy to decompose among cheap, fault-prone, commodity boxes, and recompose into an overall solution - along the lines of Google's highly successful MapReduce.
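("Embarrassingly parallel" really is the right term - here's a toy sketch of the hash race in Python, with a fake header and a trivial difficulty (nothing like real Bitcoin header hashing), just to show that the workers share nothing but the answer:

    import hashlib
    from multiprocessing import Pool

    HEADER = b"toy block header"
    TARGET = "0000"  # toy difficulty: hash must start with 4 hex zeros

    def search(nonce_range):
        # Each worker scans its own disjoint nonce range - no shared
        # state, no coordination until a winner reports in.
        for nonce in range(*nonce_range):
            digest = hashlib.sha256(HEADER + str(nonce).encode()).hexdigest()
            if digest.startswith(TARGET):
                return nonce, digest
        return None

    if __name__ == "__main__":
        step = 250_000
        ranges = [(i * step, (i + 1) * step) for i in range(8)]
        with Pool(8) as pool:
            for hit in pool.imap_unordered(search, ranges):
                if hit:
                    print("found:", hit)
                    break

)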
I guess what I'm really saying is (and I don't mean to be rude here), is that C and Python and Java programmers might not be the best qualified people to develop and formally prove the correctness of (note I do not say: "test", I say "formally prove the correctness of") these kinds of algorithms.
I really believe in the importance of getting the algorithms and architectures right - look at Google Search itself, it uses some pretty brilliant algorithms and architectures (eg, MapReduce, Paxos) which enable it to achieve amazing performance - on pretty crappy commodity hardware. And look at BitTorrent, which is truly p2p, where more demand leads to more supply.
So, in this vein, I will close this lengthy rant with an oddly specific link - which may or may not be able to make some interesting contributions to finding suitable algorithms, architectures and data structures which might help Bitcoin scale massively. I have no idea if this link could be helpful - but given the near-total lack of people from the Haskell and ML and functional worlds in these Bitcoin specification debates, I thought I'd be remiss if I didn't throw this out - just in case there might be something here which could help us channel the massive computing power of the Bitcoin network in such a way as to enable us to simply sidestep this kind of desperate debate where both sides seem right because the other side seems wrong.
https://personal.cis.strath.ac.uk/neil.ghani/papers/ghani-calco07
The above paper is about "higher dimensional trees". It uses a bit of category theory (not a whole lot) and a bit of Haskell (again not a lot - just a simple data structure called a Rose tree, which has a wikipedia page) to develop a very expressive and efficient data structure which generalizes from lists to trees to higher dimensions.
I have no idea if this kind of data structure could be applicable to the current scaling mess we apparently are getting bogged down in - I don't have the game-theory skills to figure it out.
I just thought that since the blockchain is like a list, and since there are some tree-like structures which have been grafted on for efficiency (eg Merkle trees) and since many of the futuristic scaling proposals seem to also involve generalizing from list-like structures (eg, the blockchain) to tree-like structures (eg, side-chains and tree-chains)... well, who knows, there might be some nugget of algorithmic or architectural or data-structure inspiration there.
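(For the curious, a rose tree really is that simple - a value plus an arbitrary list of subtrees - and you can hash one Merkle-style in a few lines. A toy Python sketch; a plain chain is just the degenerate one-child case:

    import hashlib

    class RoseTree:
        # One value plus any number of subtrees.
        def __init__(self, value, children=()):
            self.value = value
            self.children = list(children)

    def merkle_hash(node):
        # Hash a node's value together with its children's hashes,
        # so the root hash commits to the entire tree.
        h = hashlib.sha256(str(node.value).encode())
        for child in node.children:
            h.update(merkle_hash(child).encode())
        return h.hexdigest()

    # A blockchain is the one-child case; side-chains / tree-chains
    # would hang extra subtrees off of it.
    chain = RoseTree("block0", [RoseTree("block1", [RoseTree("block2")])])
    print(merkle_hash(chain))

)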
So... TL;DR:
(1) I'm freaked out that this blocksize debate has splintered the community so badly and dragged on so long, with no resolution in sight, and both sides seeming so right (because the other side seems so wrong).
(2) I think Bitcoin could gain immensely by using high-level formal, algebraic and co-algebraic program specification and verification languages (such as Maude including Maude-NPA, Mobile Maude parameterized skeletons, etc.) to specify (and possibly also, to some degree, verify) what Bitcoin does - before translating to low-level implementation languages such as C and Python and Java saying how Bitcoin does it. This would help to communicate and reason about programs with much more mathematical certitude - and possibly obviate the need for many political and economic tradeoffs which currently seem dismally inevitable - and possibly widen the collaboration on this project.
(3) I wonder if there are some Big Data approaches out there (eg, along the lines of Google's MapReduce and BigTable, or Yale's CalvinDB), which could be implemented to allow Bitcoin to scale massively and painlessly - and to satisfy all stakeholders, ranging from millionaires to micropayments, coffee drinkers to the great "unbanked".
submitted by BeYourOwnBank to Bitcoin [link] [comments]

Bitcoin vs. Bitcoin Cash vs. Bitcoin SV: Blockspace Demand!! (2017-2019)
Quick Tour of Medium Size Bitcoin Mining Facility
EB82 – Mike Hearn - Blocksize Debate At The Breaking Point
Teranode: The Transition Plan for Terabyte Size Blocks and Enterprise-Class BSV Software
Bitcoin Q&A: Block capacity and embedded data

The activation state of the fork is stored in the block tree database; it is written when the threshold is met (and unwritten if the threshold block is re-orged out of the best chain), and read at startup. Code review and bug fixes by Mike Hearn and Michael Ruddy. gavinandresen authored and mikehearn committed Aug 6, 2015.
Ran from cmd prompt, kept cache size at 1024 MB, added '-debug' and '-txindex' flags: C:\Program Files\BitcoinCore>"C:\Program Files\BitcoinCore\bin\bitcoin-qt.exe" -debug -dbcache=1024 -txindex. The issue reproduced - CPU/HDD/NET utilization went to 0%, the number of processed blocks got stuck, and after a graceful shutdown of Bitcoin Core the block database was corrupted. It always gets stuck roughly ...
Currently the average block size is 0.4 MB, so if usage remains at the same level the estimated blockchain size will be around 2.64 terabytes. A minimal block contains a single 1-input/1-output transaction. The tx size in this case is: 1 x 148 + 1 x 34 + 10 + 1 = 193 bytes. Adding 4 bytes for the magic number, 4 bytes for the blocksize ...
Bitcoin Core runs as a full network node and maintains a local copy of the block chain. This data independence improves wallet privacy and security. Unlike some SPV wallets that leak addresses to peers, Bitcoin Core stores all transactions locally. With local access to the complete set of headers and transactions, Bitcoin Core can use full verification to tell when peers lie about payments.



In his presentation, Bitcoin SV Node Project Lead Developer Daniel Connolly shares what would be the benefits of Teranode for the Bitcoin SV ecosystem. Teranode allows terabyte-sized blocks, which ...
How many transactions can fit in a block, with or without SegWit? What other types of data can you put in a block? What is metadata? What is OP_RETURN? What is OpenTimestamps and what are the use ...
In this episode, I talk with Samson Mow, Chief Strategy Officer at Blockstream. We talk about censorship, Bitcoin scaling and the Lightning Network, explicit...
The mean daily block size in megabytes (MB) for each coin. This allows the demand for blockspace to be visualized from 2017 to 2019. Notice how Bitcoin consistently churns out larger blocks as the ...
