Homomorphic Encryption & Blockchain - 48coins

Daily General Discussion - February 12, 2018

Welcome to the Daily General Discussion thread of /EthTrader.
Thread guidelines:
Resources and other information:
  • Newcomers who have basic questions about Ethereum can find answers by visiting /EthereumNoobies or our Ethereum Education wiki page, see here.
  • To view live streaming comments for this thread, click here. Account permissions are required to post comments through Reddit-Stream.com.
Enjoy!
submitted by AutoModerator to ethtrader [link] [comments]

Why you should invest in OCEAN Protocol

Why I am investing in Ocean Protocol
tl;dr
Unlocking data for AI
Partnered with: Unilever, Roche, Johnson & Johnson, Aviva, MOBI (BMW, Ford, GM)
Currently at $0.03, IEO price $0.12, ICO price $0.2.
Staking coming Q2.
THE PROBLEM
The world has a data problem. The more data we create, the more of it we are forced to entrust to a shrinking number of data monopolies to profit from.
Data is also siloed, and generally hosted on proprietary databases across vast systems, geographies and business units. Whilst there have been fixes and APIs that have helped improve the sharing of corporate and public data, fundamentally this doesn’t change the fact that client-server architecture and corporate IT networks are inherently designed to prevent data sharing.
Regulation and privacy laws combine to make organisations concerned about sharing data both internally and publicly unless forced to do so. The Health Insurance Portability and Accountability Act (HIPAA) in the US or the Data Protection Act in the UK explicitly state how and what data can and cannot be shared. But these are complicated policies. The technical difficulty of implementing them, combined with bad UX means people err on the side of caution when approaching these issues. There is simply no incentive to outweigh the risk and hassle of sharing data.
Even where sharing is encouraged, current infrastructure makes monetising data through open-source licensing complex and equally difficult to enforce. So ultimately, you are left with two options: give your data away for free (which is what most individuals do) or hoard it and see if you can make sense of it at some time in the future (which is what most companies do). Neither is very efficient or effective.
The consequence is that a few increasingly powerful companies get the vast majority of data at little cost, while large amounts of valuable data sit dormant in siloed databases.
Simply put, there is no economic incentive to share data. This is a massive issue in the AI market (expected to be worth $70 billion in 2020 according to BoA Merrill).
The best AI techniques today, such as deep learning, need lots (and lots) of quality, relevant datasets to deliver any kind of meaningful value, starving most new entrants (such as startups and SMEs) of the ability to compete.
AI expertise and talent is expensive and hard to come by, typically concentrating within organisations that already have the data to play with, or that promise to generate vast quantities of it in the future. Companies like Google, Facebook, Microsoft and Baidu swallow up almost all the best computer science and AI PhDs before they even reach the jobs market.
This creates a self-propagating cycle, increasingly benefiting a few established organisations who go on to dominate their respective markets, extracting a premium for the privilege. Think of Facebook & Google in the ad market, or Amazon in retail, and now imagine that happening across every single industry vertical. Data leads to data network effects, and subsequent AI advantages which are extremely hard to catch up with once the flywheel starts. The way things are going, the driverless car market will likely consolidate around a single software provider. As old industries like education, healthcare and utilities digitize their operations and start utilizing data, the same will likely happen there too.
The benefits of the 4th Industrial Revolution are in the hands of fewer and fewer organisations.
The current expectation is that companies, rather than trying to compete (if they want to stay in business), will concede their data to one of the big tech clouds like Amazon or Microsoft in order to extract value from it, further extending those suppliers' unfair advantage and deepening their own dependency. Look at autonomous vehicles: German manufacturers unable to compete with Silicon Valley's self-driving AIs could be left simply making the low-value hardware, whilst conceding the higher-value (and higher-margin) software that drives the intelligence controlling them.
I’ve always argued companies don’t want Big Data. They want actionable intelligence. But currently most large organisations have vast dumb data in silos that they simply don’t know what to do with.
But what if…
they could securely allow AI developers to run algorithms on it whilst keeping it stored encrypted, on-premise.
And open up every database at a ‘planetary level’ and turn them into a single data marketplace.
Who would own or control it? Frankly, it would require unprecedented levels of trust. Data is generally sensitive, revealing, and something you typically would not want to share with your competitors. In a domain like consumer health, how could that be possible under complex privacy laws?
What’s needed is a decentralised data marketplace to connect AI developers to data owners in a compliant, secure and affordable way. Welcome to Ocean Protocol.
Why decentralised and tokenised?
Primarily because of the need for provenance of IP, affordable payment channels, and the assurance that no single entity becomes a gatekeeper to a hoard of valuable data (a gatekeeper in the sense that it could arbitrarily ban or censor participants). Decentralisation also avoids the honeypot hacking problems we encounter in today's centralised world.
But aren’t there already decentralised data market projects?
The Ocean team have focused their design on enabling 'exchange protocols', which creates massive potential for partnerships with other players in the domain. As investors in IOTA, we find the question of how Ocean could work with IOTA's Data Marketplace an interesting case in point.
INNOVATIONS
What we like most about Ocean is they have been deploying many of the constituent parts that underpin this marketplace over the last 4 years via a number of initiatives which they are now bringing together into one unified solution:
  • Digital ownership & attribution
  • A high-throughput distributed database to allow for high transaction volumes
  • Scalability: built on proven BigchainDB / IPDB technology for “planetary scale”
  • A blockchain-ready, community-driven protocol for intellectual property licensing
What is being added is a protocol and token designed to incentivise and program rules and behaviours into the marketplace, ensuring that relevant, good-quality data is committed, made available and fairly remunerated. The design is prepared for processing confidential data for machine learning and aggregated analysis without exposing the raw data itself. Ocean will facilitate bringing the processing algorithms to the data through on-premise compute and, eventually, more advanced techniques such as homomorphic encryption as they mature.
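The "bring the algorithm to the data" idea can be sketched minimally: the provider runs the consumer's code locally and only the aggregate result leaves the premises. All names below are illustrative, not Ocean's actual API:

```python
# Toy compute-to-data flow: the consumer submits an algorithm, the data
# provider executes it on-premise, and only the result is returned.
# Raw rows never leave the provider's environment.

def run_on_premise(private_dataset, algorithm):
    """Execute the consumer's algorithm next to the data; return only its output."""
    return algorithm(private_dataset)

private_rows = [41, 45, 38, 52]   # stays with the data provider
mean = run_on_premise(private_rows, lambda rows: sum(rows) / len(rows))
# the consumer learns only the aggregate (44.0), not the individual rows
```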
OCEAN Token
Think of the Ocean Token as the 'crypto asset' that serves as the commodity of the data economy, incentivising the mass coordination of resources needed to secure and scale a network that turns data into actionable intelligence.
If Ocean is about trading data, can’t it use an existing cryptocurrency as its token, like Bitcoin or Ether?
While existing tokens might serve as a means of exchange, the Ocean protocol requires a token of its own because it relies on a specific form of monetary policy and rewards. Users are rewarded with newly minted tokens for providing high-quality, relevant data and keeping it available. This means the protocol requires control over the money supply, which rules out using any existing general-purpose protocol or token. Furthermore, from the perspective of Ocean users, volatility in an uncorrelated token would disrupt the orderly value exchange they desire between the various stakeholders in the marketplace.
OCEAN Data Providers (Supplying Data)
Actors who have data and want to monetise it, can make it available through Ocean for a price. When their data is used by Data Consumers, Data Providers receive tokens in return.
OCEAN Data Curators (Quality Control)
An interesting aspect of Ocean is its application of curation markets. Someone needs to decide which data on Ocean is good and which is bad. As Ocean is a decentralised system, there can't be a central committee to do this. Instead, anyone with domain expertise can participate as a Data Curator and earn newly minted tokens by separating the wheat from the chaff. Data Curators put an amount of tokens at stake to signal that a certain dataset is of high quality. Every time they do this correctly, they receive newly minted tokens in return.
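The staking mechanics above can be sketched in a few lines. This is a toy model under an assumed rule (payout proportional to stake; the names are invented here), not Ocean's published reward function:

```python
# Toy stake-weighted curation rewards: newly minted tokens are split among
# curators in proportion to the tokens they staked on a dataset.
# The proportional rule is an illustrative assumption, not Ocean's spec.

def distribute_rewards(stakes: dict, minted: float) -> dict:
    """Split `minted` tokens among curators proportionally to their stake."""
    total = sum(stakes.values())
    return {curator: minted * stake / total for curator, stake in stakes.items()}

# One curator staked 10 tokens, another staked 30; 100 tokens are minted.
rewards = distribute_rewards({"bob": 10, "alice": 30}, minted=100)
# bob receives 25.0, alice receives 75.0
```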
OCEAN Registry of Actors (Keeping Bad Actors Out)
Because Ocean is an open protocol, not only does it need mechanisms to curate data, it needs a mechanism to curate the participants themselves. For this reason a Registry of Actors is part of Ocean, again applying staking of tokens to make good behaviour more economically attractive than bad behaviour.
OCEAN Keepers (Making Data Available)
The nodes in the Ocean network are called Keepers. They run the Ocean software and make datasets available to the network. Keepers receive newly minted tokens to perform their function. Data Providers need to use one or more Keepers to offer data to the network.
BRINGING IT ALL TOGETHER
Ocean is building a platform to enable a ‘global data commons’. A platform where anyone can share and be rewarded for the data they contribute where the token and protocol is designed specifically to incentivise data sharing and remuneration.
So let’s see that in the context of a single use-case: Clinical Trial Data
Note: this use-case is provided for illustrative purposes only, to get a feel for how Ocean could work in practice. Some of the specifics of the Ocean protocol have yet to be finalised and published in the white paper, and might turn out differently than described here.
Bob is a clinical physician with a data science background who uses Ocean. He knows his industry well and has experience understanding what types of clinical data are useful in trials. Charlie works at a company that regularly runs medical trials. He has collected a large amount of data for a very specific trial which has now concluded, and he believes it could be valuable for others, but he doesn't know exactly how.

Charlie publishes the dataset through Ocean and, judging its value (based on the cost to produce and therefore replicate) as well as his confidence in its overall quality, he stakes 5 tokens on it (to prove it is his IP, which others must pay for if they want to use it). Charlie uses one of the Keeper nodes maintained by his company's IT department.

Bob, as a Data Curator of clinical trial data on Ocean, is notified of its submission and sees that no one has challenged its ownership. By looking at a sample he decides the data is of good quality, and based on how broad its utility could be he stakes 10 Ocean tokens to back his judgement. Bob is not alone, and quickly a number of other Data Curators with good reputation also evaluate the data and make a stake.

By this point a number of AI developers see that Charlie's dataset is becoming popular and purchase it through Ocean. Charlie, Bob and the other curators get rewarded in newly minted tokens, proportional to the amount they staked and the number of downloads. The Keeper node at Charlie's company regularly receives a request to cryptographically prove it still has the data available. Each time it answers correctly, it also receives some newly minted tokens.

When Bob and Charlie signed up to join Ocean, they staked some tokens to get added to the Registry of Actors. Eve also wants to join Ocean. She stakes 100 tokens to get added to the Registry of Actors. Eve is actually a malicious actor.
She purchases Charlie’s dataset through Ocean, then claims it’s hers and publishes it under her own account for a slightly lower price. Furthermore, she creates several more “sock puppet” accounts, each with some more tokens staked to join, to serve as Data Curators and vouch for her copy of the dataset. Bob and Charlie discover Eve’s malice. They successfully challenge Eve and her sock puppet accounts in the Registry of Actors. Eve and her sock puppet accounts get removed from the Registry of Actors and she loses all staking tokens.
APPROACH, TRACTION & TEAM
I am greatly encouraged by the fact that Ocean was aligned with building what we term a Community Token Economy (CTE), where multiple stakeholders partner early on to bring together complementary skills and assets.
As two existing companies (one already VC backed) they are committing real code and IP already worth several million in value*.
*This is an important point to remember when considering the valuation and token distribution of the offering.
The open, inclusive, transparent nature of the IPDB Foundation bodes well for how Ocean will be run, and for how it will solve complex governance issues as the network grows.
I am also impressed with the team's understanding of the importance of building a community. They understand that networks are only as powerful as the communities that support them. This is why they have already signed key partnerships with the XPrize Foundation, SingularityNET, Mattereum, Integration Alpha and the ixo Foundation, as well as agreeing an MOU with the Government of Singapore to provide coverage and indemnification for data-sharing sandboxes.
The team understands that the decentralisation movement is still in its early stages, and that collaboration and partnership are a more effective model than competition and going it alone.
PLACE IN THE CONVERGENCE ECOSYSTEM STACK
Ocean Protocol is a fundamental requirement of the Convergence Ecosystem Stack. It is a protocol that enables a thriving AI data marketplace, and it is complementary to our other investments in IOTA and SEED, which provide marketplaces for machine data and bots respectively.
Marketplaces are critical to the development of the Convergence Ecosystem because they enable new data-based and tokenised business models that were never possible before. Distributed ledgers, blockchains and other decentralisation technologies are powerful tools for authenticating, validating, securing and transporting data; but it is marketplaces that will enable companies to build sustainable businesses and crack open the incumbent data monopolies. IOTA, SEED and now Ocean are unlocking data for more equitable outcomes for users.
submitted by Econcrypt to CryptoMoonShots [link] [comments]

Celare: “Safe deposit box” for data and assets, protecting privacy and security

Celare: “Safe deposit box” for data and assets, protecting privacy and security

https://preview.redd.it/dlbc4vuw3vn41.jpg?width=960&format=pjpg&auto=webp&s=9553b6c323d285750eab6161ead41d5ab4292125
Privacy security is a growing problem.
On March 19, the sale on the dark web of data on 538 million Weibo users, priced in bitcoin, raised the issue of privacy security yet again.
36kr analyzed how the personal information was leaked from Weibo: hackers uploaded fake mobile-phone address books in batches through Weibo's contact-matching interface to match friends' account information, allowing them to link user accounts to real identities.
In fact, as technology develops, personal information leaks are becoming more and more dangerous. Even if you take precautions, your personal information can end up among the thousands of records that hackers sell.
With the arrival of the information era, data has become a resource that many businesses compete for. This has given rise to a series of privacy and security problems and spawned a series of grey industrial chains. Users' privacy has gradually become a commodity with a precise price tag. Facebook, Microsoft, Apple and other global giants have all been exposed for collecting users' private data, and this still happens.
Data networks can help people live better lives, yet a series of data privacy problems, such as data leaks by people they know and privacy exposure, violate users' rights and interests. Protecting privacy and security has become an urgent issue, but in a centralised system, how data is used depends on the controller's preference, and the user never holds the dominant position.
Decentralized blockchain is now becoming a better solution to privacy and security problems.
Celare's anonymous technology solution, built on blockchain, will effectively safeguard users' privacy.
https://preview.redd.it/y7mlz6e04vn41.jpg?width=4840&format=pjpg&auto=webp&s=ae3e6ea8e50344d2ffc878010901d59407639ca3
Anonymity is safety
How to ensure the privacy of users? The answer is anonymity.
There are many projects with anonymous technology in the blockchain.
Whether it is ZCash's zero-knowledge proof mechanism, Dash's CoinJoin scheme, or XMR's ring confidential transaction mechanism, each can ensure the anonymity of transactions to a certain extent and protect users' privacy and security.
Celare also uses anonymity to protect users' privacy. It is the first cross-chain anonymous privacy solution for all digital assets in the Polkadot ecosystem, based on blockchain decentralisation. Celare has designed a new anonymous mechanism, non-interactive zero-knowledge proof, on top of existing technology. Compared with interactive zero-knowledge proofs, the non-interactive scheme offers more robust anonymity, helping prevent transaction tracking and protecting user privacy.
https://preview.redd.it/2j2lqza44vn41.png?width=1738&format=png&auto=webp&s=84284e2d1d1059790318a61c75da7d4d82973f81
When choosing a curve for zk-SNARK zero-knowledge proofs, Celare chose the BLS12-381 curve, which offers a higher security level than BN128, to support strong privacy and anonymity.
The method of zero-knowledge proof in practical application is as follows:
When a user registers, their identity information is stored on the server in the form of a digital commitment. During authentication, the user proves to the server that they are one of the registered users via a membership-proof scheme, avoiding the need to present their identity information to the server on every login.
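The commitment step of that flow can be illustrated with a simple hash commitment. Note this toy sketch covers only committing and opening, not a full zero-knowledge membership proof, and all names are illustrative:

```python
import hashlib, secrets

def commit(identity: bytes):
    """Commit to identity data without revealing it: C = SHA-256(identity || nonce)."""
    nonce = secrets.token_bytes(16)
    return hashlib.sha256(identity + nonce).hexdigest(), nonce

def verify(commitment: str, identity: bytes, nonce: bytes) -> bool:
    """Check that an opened (identity, nonce) pair matches the stored commitment."""
    return hashlib.sha256(identity + nonce).hexdigest() == commitment

c, r = commit(b"alice@example.com")     # the server stores only `c`
assert verify(c, b"alice@example.com", r)
assert not verify(c, b"mallory@example.com", r)
```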
This is just one fundamental part of Celare's efforts to protect users' privacy.
Celare also adds a fully homomorphic encryption scheme on-chain, which can perform arbitrary calculations on ciphertext without decryption. This means data privacy problems can be addressed without losing computability.
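Paillier encryption makes the "compute on ciphertext" idea concrete, though it is only *additively* homomorphic, not the arbitrary computation of full FHE, and Celare's actual scheme is not specified here. A toy sketch with deliberately tiny primes:

```python
import math, secrets

# Minimal Paillier cryptosystem with tiny primes (never use in production).
# Multiplying two ciphertexts yields a ciphertext of the *sum* of the
# plaintexts, so sums can be computed without ever decrypting.
p, q = 10007, 10009
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                     # valid because g = n + 1

def encrypt(m: int) -> int:
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:          # r must be invertible mod n
            return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

c1, c2 = encrypt(12), encrypt(30)
assert decrypt((c1 * c2) % n2) == 42     # 12 + 30, computed on ciphertexts
```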
A comprehensive security technology system is one aspect of Celare's protection of user privacy. In addition, Celare uses authorisation technology so that users truly own their data and can control it freely.
https://preview.redd.it/z7o9ips64vn41.jpg?width=900&format=pjpg&auto=webp&s=ad1cef93b7007b8a55931cc29ff3c986bd33a5f4
Safe and efficient
Safety is only the first step.
What Celare seeks is safety and efficiency.
As is widely known, the three main anonymous tokens, Dash, XMR and ZEC, remain confined to the field of payments and cannot expand further. The reason is that their systems do not support smart contracts, and their scalability is too low for large-scale commercial use, especially at the data-interaction level.
To avoid the limits of anonymous cryptocurrencies and bring anonymous technology to a broader field, Celare has introduced smart contracts into its system, which significantly improves efficiency and lays a good foundation for large-scale commercial use.
Since Celare is a public chain built on Polkadot's Substrate framework, it follows Polkadot's PoS consensus algorithm and contract technology. To maintain the speed and efficiency of on-chain data transmission, Celare will establish a large-scale PoS node network capable of supporting nearly a thousand consensus nodes, minimising block time and bounding confirmation latency.
The high TPS brought by a large-scale node network will provide a technical guarantee for Celare's widespread adoption. It also means users' transactions can be conducted at high speed in an anonymous environment, which not only protects users' privacy but also gives them a frictionless experience, genuinely promoting the adoption of blockchain technology.
https://preview.redd.it/ds5oqkmc4vn41.jpg?width=800&format=pjpg&auto=webp&s=bbde16cee5e55393984853258240d878d7de3f63
Break the information isolated island, link multiple public chains
Security and efficiency are only part of the Celare blockchain infrastructure. Also, Celare has built a cross-chain technology to interconnect multiple public chains.
For a long time, information could not be transferred and digital assets could not be traded between public chains, which significantly limited the application space of blockchain. Cross-chain technology came into being to solve this, and Polkadot is an outstanding example.
Celare's cross-chain technology also relies on Polkadot. Its internal logic is that the user locks assets on the original chain, and mapped assets are then issued on the target chain. Conversely, the user can apply for a withdrawal on the target chain and unlock the assets on the original chain.
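The lock-and-mint flow can be sketched as a toy two-sided ledger. The `Bridge` class and its methods are hypothetical illustrations, not Celare's actual interface:

```python
# Toy lock-and-mint bridge ledger: locked balances on the original chain
# always mirror the mapped assets issued on the target chain.

class Bridge:
    def __init__(self):
        self.locked = {}   # balances locked on the original chain
        self.minted = {}   # mapped assets issued on the target chain

    def deposit(self, user: str, amount: int):
        """Lock assets on the original chain and mint mapped assets."""
        self.locked[user] = self.locked.get(user, 0) + amount
        self.minted[user] = self.minted.get(user, 0) + amount

    def withdraw(self, user: str, amount: int):
        """Redeem mapped assets and unlock the originals."""
        if self.minted.get(user, 0) < amount:
            raise ValueError("insufficient mapped balance")
        self.minted[user] -= amount
        self.locked[user] -= amount

b = Bridge()
b.deposit("alice", 50)
b.withdraw("alice", 20)
assert b.locked["alice"] == 30 == b.minted["alice"]   # books always balance
```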
https://preview.redd.it/hvhnjkqg4vn41.png?width=1406&format=png&auto=webp&s=4d7073000da580b1f897128b4d21f71d49dc4a62
Celare cross-chain technology will further protect users’ privacy and security, which means users can quickly transfer their data and digital assets from other chains to the Celare chain. It helps users consolidate all the data on different chains into a Celare account for easy management. With the help of Celare privacy protection technology, the security of users’ private data is truly guaranteed.
Since the project's establishment, Celare's mission has always been to protect users' privacy and security. Celare therefore keeps developing and expanding in order to better serve users and safeguard their privacy.
In the future, Celare will break the barrier of cross-chain assets and truly protect user privacy and anonymity.
Contact Us:
Twitter: @CelareCommunity
Telegram: Celare Community
submitted by Celarecommunity to u/Celarecommunity [link] [comments]

Technical: Upcoming Improvements to Lightning Network

Price? Who gives a shit about price when Lightning Network development is a lot more interesting?????
One thing about LN is that because there's no need for consensus before implementing things, figuring out the status of things is quite a bit more difficult than on Bitcoin. On one hand this lets larger groups of people work on improving LN faster without having to coordinate so much. On the other hand it leads to some fragmentation of the LN space, with compatibility problems occasionally coming up.
The below is just a smattering of LN stuff I personally find interesting. There's a bunch of other stuff, like splicing and dual-funding, that I won't cover --- the post is long enough as-is, and besides, some of the things below aren't as well-known.
Anyway.....

"eltoo" Decker-Russell-Osuntokun

Yeah the exciting new Lightning Network channel update protocol!

Advantages

Myths

Disadvantages

Multipart payments / AMP

Splitting up large payments into smaller parts!

Details

Advantages

Disadvantages

Payment points / scalars

Using the magic of elliptic curve homomorphism for fun and Lightning Network profits!
Basically, currently on Lightning an invoice has a payment hash, and the receiver reveals a payment preimage which, when inputted to SHA256, returns the given payment hash.
Instead of using payment hashes and preimages, just replace them with payment points and scalars. An invoice will now contain a payment point, and the receiver reveals a payment scalar (private key) which, when multiplied with the standard generator point G on secp256k1, returns the given payment point.
This is basically Scriptless Script usage on Lightning, instead of HTLCs we have Scriptless Script Pointlocked Timelocked Contracts (PTLCs).
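The point/scalar relationship can be sketched using modular exponentiation as a stand-in for EC scalar multiplication: `pow(g, s, p)` plays the role of the point s·G. The parameters below are illustrative, not secp256k1:

```python
import secrets

# Toy discrete-log group standing in for the secp256k1 elliptic curve.
# Not cryptographically sound parameters; for illustration only.
p = 2**127 - 1     # a Mersenne prime
g = 5

payment_scalar = secrets.randbelow(p - 2) + 1    # receiver's secret, like a preimage
payment_point = pow(g, payment_scalar, p)        # published in the invoice

# Settling the PTLC means revealing the scalar; anyone can check it matches:
assert pow(g, payment_scalar, p) == payment_point
```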

Advantages

Disadvantages

Pay-for-data

Ensuring that payers cannot access data or other digital goods without proof of having paid the provider.
In a nutshell: the payment preimage used as a proof-of-payment is the decryption key of the data. The provider gives the encrypted data, and issues an invoice. The buyer of the data then has to pay over Lightning in order to learn the decryption key, with the decryption key being the payment preimage.
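The scheme can be sketched end-to-end, assuming a simple SHA-256-based keystream cipher in place of whatever cipher a real implementation would use:

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR with a SHA-256-derived keystream (demo only;
    a real implementation would use an authenticated cipher)."""
    stream, counter = b"", 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

preimage = b"\x01" * 32                               # decryption key
payment_hash = hashlib.sha256(preimage).hexdigest()   # goes into the invoice
ciphertext = keystream_xor(preimage, b"the secret dataset")

# Paying the invoice reveals `preimage`, which is simultaneously the
# proof-of-payment and the decryption key:
assert hashlib.sha256(preimage).hexdigest() == payment_hash
assert keystream_xor(preimage, ciphertext) == b"the secret dataset"
```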

Advantages

Disadvantages

Stuckless payments

No more payments getting stuck somewhere in the Lightning network without knowing whether the payee will ever get paid!
(that's actually a bit of an overclaim: payments can still get stuck, but what "stuckless" really enables is that we can now safely run another payment attempt in parallel until any one of the attempts gets through).
Basically, by using the ability to add points together, the payer can enforce that the payee can only claim the funds if it knows two pieces of information:
  1. The payment scalar corresponding to the payment point in the invoice signed by the payee.
  2. An "acknowledgment" scalar provided by the payer to the payee via another communication path.
This allows the payer to make multiple payment attempts in parallel, unlike the current situation where we must wait for an attempt to fail before trying another route. The payer only needs to ensure it generates different acknowledgment scalars for each payment attempt.
Then, if at least one of the payment attempts reaches the payee, the payee can acquire the acknowledgment scalar from the payer, and with it claim the payment. If the payee attempts to acquire multiple acknowledgment scalars for the same payment, the payer just gives out one and then tells the payee "LOL don't try to scam me". The payee can therefore only acquire a single acknowledgment scalar, meaning it can claim a payment only once; it can't claim multiple parallel payments.
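The two-scalar lock can be sketched in the same modular-arithmetic stand-in for EC points, where point addition becomes multiplication mod p. Parameters are illustrative, not secp256k1:

```python
import secrets

# pow(g, s, p) plays the role of the point s*G; point addition becomes
# multiplication mod p.  Not secure parameters; illustration only.
p, g = 2**127 - 1, 5
order = p - 1

payment_scalar = secrets.randbelow(order)    # known only to the payee
ack_scalar = secrets.randbelow(order)        # fresh per payment attempt

payment_point = pow(g, payment_scalar, p)
ack_point = pow(g, ack_scalar, p)

# The route locks funds to the combined point; claiming requires knowing
# *both* scalars, since g^(s1+s2) = g^s1 * g^s2.
combined_point = (payment_point * ack_point) % p
claim_scalar = (payment_scalar + ack_scalar) % order
assert pow(g, claim_scalar, p) == combined_point
```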

Advantages

Disadvantages

Non-custodial escrow over Lightning

The "acknowledgment" scalar used in stuckless can be reused here.
The acknowledgment scalar is derived as an ECDH shared secret between the payer and the escrow service. On arrival of payment to the payee, the payee queries the escrow to determine if the acknowledgment point is from a scalar that the escrow can derive using ECDH with the payer, plus a hash of the contract terms of the trade (for example, to transfer some goods in exchange for Lightning payment). Once the payee gets confirmation from the escrow that the acknowledgment scalar is known by the escrow, the payee performs the trade, then asks the payer to provide the acknowledgment scalar once the trade completes.
If the payer refuses to give the acknowledgment scalar even though the payee has given over the goods to be traded, then the payee contacts the escrow again, reveals the contract terms text, and requests to be paid. If the escrow finds in favor of the payee (i.e. it determines the goods have arrived at the payer as per the contract text) then it gives the acknowledgment scalar to the payee.
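The shared acknowledgment scalar can be sketched with classic Diffie-Hellman in a prime field standing in for ECDH; all names, parameters, and the contract text are illustrative:

```python
import hashlib, secrets

# Finite-field Diffie-Hellman as a stand-in for ECDH on secp256k1.
# Illustrative parameters, not a secure production group.
p, g = 2**127 - 1, 5

payer_secret = secrets.randbelow(p - 2) + 1
escrow_secret = secrets.randbelow(p - 2) + 1
payer_pub = pow(g, payer_secret, p)
escrow_pub = pow(g, escrow_secret, p)

contract_hash = hashlib.sha256(b"ship goods X for 1000 sats").digest()

def ack_scalar(my_secret: int, their_pub: int) -> int:
    """Derive the acknowledgment scalar from the DH shared secret plus a
    hash of the contract terms; payer and escrow compute the same value."""
    shared = pow(their_pub, my_secret, p)
    h = hashlib.sha256(shared.to_bytes(16, "big") + contract_hash).digest()
    return int.from_bytes(h, "big")

# Either party can derive it, so the payee can obtain it from the escrow
# if the payer stonewalls after the goods are delivered:
assert ack_scalar(payer_secret, escrow_pub) == ack_scalar(escrow_secret, payer_pub)
```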

Advantages

Disadvantages

Payment decorrelation

Because elliptic curve points can be added (unlike hashes), for every forwarding node we can add a "blinding" point / scalar. This prevents multiple forwarding nodes from discovering that they are on the same payment route. This is unlike the current payment hash + preimage scheme, where the same hash is used along the entire route.
In fact, the acknowledgment scalar we use in stuckless and escrow can simply be the sum of each blinding scalar used at each forwarding node.
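Per-hop blinding can be sketched in the same toy group (modular exponentiation in place of EC points; illustrative parameters):

```python
import secrets

# pow(g, s, p) stands in for s*G; illustrative, insecure parameters.
p, g = 2**127 - 1, 5
order = p - 1

payment_scalar = secrets.randbelow(order)
blinding = [secrets.randbelow(order) for _ in range(3)]  # one per hop

# Each hop locks to the payment point plus the cumulative blinding so far,
# so no two hops ever see the same lock value.
acc, hop_points = payment_scalar, []
for b in blinding:
    acc = (acc + b) % order
    hop_points.append(pow(g, acc, p))

assert len(set(hop_points)) == 3   # every hop sees a different point

# The acknowledgment scalar can simply be the sum of the blinding scalars:
ack = sum(blinding) % order
assert pow(g, (payment_scalar + ack) % order, p) == hop_points[-1]
```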

Advantages

Disadvantages

submitted by almkglor to Bitcoin [link] [comments]

Greg Maxwell /u/nullc (CTO of Blockstream) has sent me two private messages in response to my other post today (where I said "Chinese miners can only win big by following the market - not by following Core/Blockstream."). In response to his private messages, I am publicly posting my reply, here:

Note:
Greg Maxwell nullc sent me 2 short private messages criticizing me today. For whatever reason, he seems to prefer messaging me privately these days, rather than responding publicly on these forums.
Without asking him for permission to publish his private messages, I do think it should be fine for me to respond to them publicly here - only quoting 3 phrases from them, namely: "340GB", "paid off", and "integrity" LOL.
There was nothing particularly new or revealing in his messages - just more of the same stuff we've all heard before. I have no idea why he prefers responding to me privately these days.
Everything below is written by me - I haven't tried to upload his 2 PMs to me, since he didn't give permission (and I didn't ask). The only stuff below from his 2 PMs is the 3 phrases already mentioned: "340GB", "paid off", and "integrity". The rest of this long wall of text is just my "open letter to Greg."
TL;DR: The code that maximally uses the available hardware and infrastructure will win - and there is nothing Core/Blockstream can do to stop that. Also, things like the Berlin Wall or the Soviet Union lasted for a lot longer than people expected - but, conversely, they also got swept away a lot faster than anyone expected. The "vote" for bigger blocks is an ongoing referendum - and Classic is running on 20-25% of the network (and can and will jump up to the needed 75% very fast, when investors demand it due to the inevitable "congestion crisis") - which must be a massive worry for Greg/Adam/Austin and their backers from the Bilderberg Group. The debate will inevitably be decided in favor of bigger blocks - simply because the market demands it, and the hardware / infrastructure supports it.
Hello Greg Maxwell nullc (CTO of Blockstream) -
Thank you for your private messages in response to my post.
I respect (most of) your work on Bitcoin, but I think you were wrong on several major points in your messages, and in your overall economic approach to Bitcoin - as I explain in greater detail below:
Correcting some inappropriate terminology you used
As everybody knows, Classic or Unlimited or Adaptive (all of which I did mention specifically in my post) do not support "340GB" blocks (which I did not mention in my post).
It is therefore a straw-man for you to claim that big-block supporters want "340GB" blocks. Craig Wright may want that - but nobody else supports his crazy posturing and ridiculous ideas.
You should know that what actual users / investors (and Satoshi) actually do want, is to let the market and the infrastructure decide on the size of actual blocks - which could be around 2 MB, or 4 MB, etc. - gradually growing in accordance with market needs and infrastructure capabilities (free from any arbitrary, artificial central planning and obstructionism on the part of Core/Blockstream, and its investors - many of whom have a vested interest in maintaining the current debt-backed fiat system).
You yourself (nullc) once said somewhere that bigger blocks would probably be fine - ie, they would not pose a decentralization risk. I found the link:
https://np.reddit.com/btc/comments/43mond/even_a_year_ago_i_said_i_though_we_could_probably/
I am also surprised that you now seem to be among those making unfounded insinuations that posters such as myself must somehow be "paid off" - as if intelligent observers and participants could not decide on their own, based on the empirical evidence, that bigger blocks are needed, when the network is obviously becoming congested and additional infrastructure is obviously available.
Random posters on Reddit might say and believe such conspiratorial nonsense - but I had always thought that you, given your intellectual abilities, would have been able to determine that people like me are able to arrive at supporting bigger blocks quite entirely on our own, based on two simple empirical facts, ie:
  • the infrastructure supports bigger blocks now;
  • the market needs bigger blocks now.
In the present case, I will simply assume that you might be having a bad day, for you to erroneously and groundlessly insinuate that I must be "paid off" in order to support bigger blocks.
Using Occam's Razor
The much simpler explanation is that bigger-block supporters believe they will get "paid off" in the form of bigger gains on their investment in Bitcoin.
Rational investors and users understand that bigger blocks are necessary, based on the apparent correlation (not necessarily causation!) between volume and price (as mentioned in my other post, and backed up with graphs).
And rational network capacity planners (a group which you should be in - but for some mysterious reason, you're not) also understand that bigger blocks are necessary, and quite feasible (and do not pose any undue "centralization risk".)
As I have been on the record for months publicly stating, I understand that bigger blocks are necessary based on the following two objective, rational reasons:
  • because I've seen the graphs; and
  • because I've seen the empirical research in the field (from guys like Gavin and Toomim) showing that the network infrastructure (primarily bandwidth and latency - but also RAM and CPU) would also support bigger blocks now (I believe they showed that 3-4MB blocks would definitely work fine on the network now - possibly even 8 MB - without causing undue centralization).
Bigger-block supporters are being objective; smaller-block supporters are not
I am surprised that you no longer talk about this debate in those kind of objective terms:
  • bandwidth, latency (including Great Firewall of China), RAM, CPU;
  • centralization risk
Those are really the only considerations which we should be discussing in this debate - because those are the only rational considerations which might justify the argument for keeping 1 MB.
And yet you, and Adam Back adam3us, and your company Blockstream (financed by the Bilderberg Group, which has significant overlap with central banks and the legacy, debt-based, violence-backed fiat money system that has been running and slowly destroying our world) never make such objective, technical arguments anymore.
And when you make unfounded conspiratorial, insulting insinuations saying people who disagree with you on the facts must somehow be "paid off", then you are now talking like some "nobody" on Reddit - making wild baseless accusations that people must be "paid off" to support bigger blocks, something I had always thought was "beneath" you.
Instead, Occams's Razor suggests that people who support bigger blocks are merely doing so out of:
  • simple, rational investment policy; and
  • simple, rational capacity planning.
At this point, the burden is on guys like you (nullc) to explain why you support a so-called scaling "roadmap" which is not aligned with:
  • simple, rational investment policy; and
  • simple, rational capacity planning
The burden is also on guys like you to show that you do not have a conflict of interest, due to Blockstream's highly-publicized connections (via insurance giant AXA - whose CEO is also the Chairman of the Bilderberg Group; and companies such as the "Big 4" accounting firm PwC) to the global cartel of debt-based central banks with their infinite money-printing.
In a nutshell, the argument of big-block supporters is simple:
If the hardware / network infrastructure supports bigger blocks (and it does), and if the market demands it (and it does), then we certainly should use bigger blocks - now.
You have never provided a counter-argument to this simple, rational proposition - for the past few years.
If you have actual numbers or evidence or facts or even legitimate concerns (regarding "centralization risk" - presumably your only argument) then you should show such evidence.
But you never have. So we can only assume either incompetence or malfeasance on your part.
As I have also publicly and privately stated to you many times, with the utmost of sincerity: We do of course appreciate the wealth of stellar coding skills which you bring to Bitcoin's cryptographic and networking aspects.
But we do not appreciate the obstructionism and centralization which you also bring to Bitcoin's economic and scaling aspects.
Bitcoin is bigger than you.
The simple reality is this: If you can't / won't let Bitcoin grow naturally, then the market is going to eventually route around you, and billions (eventually trillions) of investor capital and user payments will naturally flow elsewhere.
So: You can either be the guy who wrote the software to provide simple and safe Bitcoin scaling (while maintaining "reasonable" decentralization) - or the guy who didn't.
The choice is yours.
The market, and history, don't really care about:
  • which "side" you (nullc) might be on, or
  • whether you yourself might have been "paid off" (or under a non-disclosure agreement written perhaps by some investors associated the Bilderberg Group and the legacy debt-based fiat money system which they support), or
  • whether or not you might be clueless about economics.
Crypto and/or Bitcoin will move on - with or without you and your obstructionism.
Bigger-block supporters, including myself, are impartial
By the way, my two recent posts this past week on the Craig Wright extravaganza...
...should have given you some indication that I am being impartial and objective, and I do have "integrity" (and I am not "paid off" by anybody, as you so insultingly insinuated).
In other words, much like the market and investors, I don't care who provides bigger blocks - whether it would be Core/Blockstream, or Bitcoin Classic, or (the perhaps confusingly-named) "Bitcoin Unlimited" (which isn't necessarily about some kind of "unlimited" blocksize, but rather simply about liberating users and miners from being "limited" by controls imposed by any centralized group of developers, such as Core/Blockstream and the Bilderbergers who fund you).
So, it should be clear by now I don't care one way or the other about Gavin personally - or about you, or about any other coders.
I care about code, and arguments - regardless of who is providing such things - eg:
  • When Gavin didn't demand crypto proof from Craig, and you said you would have: I publicly criticized Gavin - and I supported you.
  • When you continue to impose needless obstacles to bigger blocks, then I continue to criticize you.
In other words, as we all know, it's not about the people.
It's about the code - and what the market wants, and what the infrastructure will bear.
You of all people should know that that's how these things should be decided.
Fortunately, we can take what we need, and throw away the rest.
Your crypto/networking expertise is appreciated; your dictating of economic parameters is not.
As I have also repeatedly stated in the past, I pretty much support everything coming from you, nullc:
  • your crypto and networking and game-theoretical expertise,
  • your extremely important work on Confidential Transactions / homomorphic encryption.
  • your desire to keep Bitcoin decentralized.
And I (and the network, and the market/investors) will always thank you profusely and quite sincerely for these massive contributions which you make.
But open-source code is (fortunately) à la carte. It's mix-and-match. We can use your crypto and networking code (which is great) - and we can reject your cripple-code (artificially small 1 MB blocks), throwing it where it belongs: in the garbage heap of history.
So I hope you see that I am being rational and objective about what I support (the code) - and that I am also always neutral and impartial regarding who may (or may not) provide it.
And by the way: Bitcoin is actually not as complicated as certain people make it out to be.
This is another point which might be lost on certain people.
And that point is this:
The crypto code behind Bitcoin actually is very simple.
And the networking code behind Bitcoin is actually also fairly simple as well.
Right now you may be feeling rather important and special, because you're part of the first wave of development of cryptocurrencies.
But if the cryptocurrency which you're coding (Core/Blockstream's version of Bitcoin, as funded by the Bilderberg Group) fails to deliver what investors want, then investors will dump you so fast your head will spin.
Investors care about money, not code.
So bigger blocks will eventually, inevitably come - simply because the market demand is there, and the infrastructure capacity is there.
It might be nice if bigger blocks would come from Core/Blockstream.
But who knows - it might actually be nicer (in terms of anti-fragility and decentralization of development) if bigger blocks were to come from someone other than Core/Blockstream.
So I'm really not begging you - I'm warning you, for your own benefit (your reputation and place in history), that:
Either way, we are going to get bigger blocks.
Simply because the market wants them, and the hardware / infrastructure can provide them.
And there is nothing you can do to stop us.
So the market will inevitably adopt bigger blocks either with or without you guys - given that the crypto and networking tech behind Bitcoin is not all that complex, and it's open-source, and there is massive pent-up investor demand for cryptocurrency - to the tune of multiple billions (or eventually trillions) of dollars.
It ain't over till the fat lady sings.
Regarding the "success" which certain small-block supporters are (prematurely) gloating about, during this time when a hard-fork has not happened yet: they should bear in mind that the market has only begun to speak.
And the first thing it did when it spoke was to dump about 20-25% of Core/Blockstream nodes in a matter of weeks. (And the next thing it did was that Gemini added Ethereum trading.)
So a sizable percentage of nodes are already using Classic. Despite desperate, irrelevant attempts of certain posters on these forums to "spin" the current situation as a "win" for Core - it is actually a major "fail" for Core.
Because if Core/Blockstream were not "blocking" Bitcoin's natural, organic growth with that crappy little line of temporary anti-spam kludge-code which you and your minions have refused to delete despite Satoshi explicitly telling you to back in 2010 ("MAX_BLOCKSIZE = 1000000"), then there would be something close to 0% nodes running Classic - not 25% (and many more addable at the drop of a hat).
This vote is ongoing.
This "voting" is not like a normal vote in a national election, which is over in one day.
Unfortunately for Core/Blockstream, the "voting" for Classic and against Core is actually a two-year-long referendum.
It is still ongoing, and it can rapidly swing in favor of Classic at any time between now and Classic's install-by date (around January 1, 2018 I believe) - at any point when the market decides that it needs and wants bigger blocks (ie, due to a congestion crisis).
You know this, Adam Back knows this, Austin Hill knows this, and some of your brainwashed supporters on censored forums probably know this too.
This is probably the main reason why you're all so freaked out and feel the need to even respond to us unwashed bigger-block supporters, instead of simply ignoring us.
This is probably the main reason why Adam Back feels the need to keep flying around the world, holding meetings with miners, making PowerPoint presentations in English and Chinese, and possibly also making secret deals behind the scenes.
This is also why Theymos feels the need to censor.
And this is perhaps also why your brainwashed supporters from censored forums feel the need to constantly make their juvenile, content-free, drive-by comments (and perhaps also why you evidently feel the need to privately message me your own comments now).
Because, once again, for the umpteenth time in years, you've seen that we are not going away.
Every day you get another worrisome, painful reminder from us that Classic is still running on 25% of "your" network.
And every day you get another worrisome, painful reminder that Classic could easily jump to 75% in a matter of days - as soon as investors see their $7 billion wealth starting to evaporate when the network goes into a congestion crisis due to your obstructionism and insistence on artificially small 1 MB blocks.
If your code were good enough to stand on its own, then none of Core's globetrotting and campaigning and censorship would be necessary.
But you know, and everyone else knows, that your cripple-code does not include simple and safe scaling - and the competing code (Classic, Unlimited) does.
So your code cannot stand on its own - and that's why you and your supporters feel that it's necessary to keep up the censorship and the lies and the snark. It's shameful that a smart coder like you would be involved with such tactics.
Oppressive regimes always last longer than everyone expects - but they also collapse faster than anyone expects.
We already have interesting historical precedents showing how grassroots resistance to centralized oppression and obstructionism tends to work out in the end. The phenomenon is two-fold:
  • The oppression usually drags on much longer than anyone expects; and
  • The liberation usually happens quite abruptly - much faster than anyone expects.
The Berlin Wall stayed up much longer than everyone expected - but it also came tumbling down much faster than everyone expected.
Examples of oppressive regimes that held on surprisingly long, and collapsed surprisingly fast, are rather common - eg, the collapse of the Berlin Wall, or the collapse of the Soviet Union.
(Both examples are actually quite germane to the case of Blockstream/Core/Theymos - as those despotic regimes were also held together by the fragile chewing gum and paper clips of denialism and censorship, and the brainwashed but ultimately complacent and fragile yes-men that inevitably arise in such an environment.)
The Berlin Wall did indeed seem like it would never come down. But the grassroots resistance against it was always there, in the wings, chipping away at the oppression, trying to break free.
And then when it did come down, it happened in a matter of days - much faster than anyone had expected.
That's generally how these things tend to go:
  • oppression and obstructionism drag on forever, and the people suppressing freedom and progress erroneously believe that they are "winning" (in this case: Blockstream/Core and you and Adam and Austin - and the clueless yes-men on censored forums like r\bitcoin who mindlessly support you, and the obedient Chinese miners who, thus far, have apparently been too polite to oppose you);
  • then one fine day, the market (or society) mysteriously and abruptly decides one day that "enough is enough" - and the tsunami comes in and washes the oppressors away in the blink of an eye.
So all these non-entities with their drive-by comments on these threads and their premature gloating and triumphalism are irrelevant in the long term.
The only thing that really matters is investors and users - who are continually applying grassroots pressure on the network, demanding increased capacity to keep the transactions flowing (and the price rising).
And then one day: the Berlin Wall comes tumbling down - or in the case of Bitcoin: a bunch of mining pools have to switch to Classic, and they will switch so fast it will make your head spin.
Because there will be an emergency congestion crisis where the network is causing the price to crash and threatening to destroy $7 billion in investor wealth.
So it is understandable that your supporters might sometimes prematurely gloat, or you might feel the need to try to comment publicly or privately, or Adam might feel the need to jet around the world.
Because a large chunk of people have rejected your code.
And because many more can and will - and they'll do so in the blink of an eye.
Classic is still out there, "waiting in the wings", ready to be installed, whenever the investors tell the miners that it is needed.
Fortunately for big-block supporters, in this "election", the polls don't stay open for just one day, like in national elections.
The voting for Classic is on-going - it runs for two years. It is happening now, and it will continue to happen until around January 1, 2018 (which is when Classic-as-an-option has been set to officially "expire").
To make a weird comparison with American presidential politics: It's kinda like if either Hillary or Trump were already in office - but meanwhile there was also an ongoing election (where people could change their votes as often as they want), and the day people get fed up with the incompetent incumbent, they can throw them out (and install someone like Bernie instead) in the blink of an eye.
So while the inertia does favor the incumbent (because people are lazy: it takes them a while to become informed, or fed up, or panicked), this kind of long-running, basically never-ending election favors the insurgent (because once the incumbent visibly screws up, the insurgent gets adopted - permanently).
Everyone knows that Satoshi explicitly defined Bitcoin to be a voting system, in and of itself. Not only does the network vote on which valid block to append next to the chain - the network also votes on the very definition of what a "valid block" is.
Go ahead and re-read the anonymous PDF that was recently posted on the subject of how you are dangerously centralizing Bitcoin by trying to prevent any votes from taking place:
https://np.reddit.com/btc/comments/4hxlquhoh_a_warning_regarding_the_onset_of_centralised/
The insurgent (Classic, Unlimited) is right (they maximally use available bandwidth) - while the incumbent (Core) is wrong (it needlessly throws bandwidth out the window, choking the network, suppressing volume, and hurting the price).
And you, and Adam, and Austin Hill - and your funders from the Bilderberg Group - must be freaking out that there is no way you can get rid of Classic (due to the open-source nature of cryptocurrency and Bitcoin).
Cripple-code will always be rejected by the network.
Classic is already running on about 20%-25% of nodes, and there is nothing you can do to stop it - except commenting on these threads, or having guys like Adam flying around the world doing PowerPoints, etc.
Everything you do is irrelevant when compared against billions of dollars in current wealth (and possibly trillions more down the road) which needs and wants and will get bigger blocks.
You guys no longer even make technical arguments against bigger blocks - because there are none: Classic's codebase is 99% the same as Core, except with bigger blocks.
So when we do finally get bigger blocks, we will get them very, very fast: because it only takes a few hours to upgrade the software to keep all the good crypto and networking code that Core/Blockstream wrote - while tossing that single line of 1 MB "max blocksize" cripple-code from Core/Blockstream into the dustbin of history - just like people did with the Berlin Wall.
submitted by ydtm to btc [link] [comments]

GMaxwell in 2006, during his Wikipedia vandalism episode: "I feel great because I can still do what I want, and I don't have to worry what rude jerks think about me ... I can continue to do whatever I think is right without the burden of explaining myself to a shreaking [sic] mass of people."

https://en.wikipedia.org/w/index.php?title=User_talk:Gmaxwell&diff=prev&oldid=36330829
Is anyone starting to notice a pattern here?
Now we're starting to see that it's all been part of a long-term pattern of behavior for the last 10 years with Gregory Maxwell, who has deep-seated tendencies towards:
After examining his long record of harmful behavior on open-source software projects, it seems fair to summarize his strengths and weaknesses as follows:
(1) He does have excellent programming skills.
(2) He doesn't just like to be in control - he needs to be in control.
(3) He always believes that whatever he's doing is "right" - even if a consensus of other highly qualified people happen to disagree with him (whom he rudely dismisses as "shrieking masses", etc.)
(4) Because of (1), (2), and (3) we are now seeing how dangerous it can be to let him assume power over an open-source software project.
This whole mess could have been avoided.
It only happened because people let Gregory Maxwell "be in charge" of Bitcoin development as CTO of Blockstream.
The whole reason the Bitcoin community is divided right now is simply because Gregory Maxwell is dead-set against any increase in "max blocksize" even to a measly 2 MB (he actually threatened to leave the project if it went over 1 MB).
This whole problem would go away if he could simply be man enough to step up and say to the Bitcoin community:
"I would like to offer my apologies for having been so stubborn and divisive and trying to always be in control. Although it is still my honest personal belief that a 1 MB 'max blocksize' would be the best for Bitcoin, many others in the community evidently disagree with me strongly on this, as they have been vehement and unrelenting in their opposition to me for over a year now. I now see that any imagined damage to the network resulting from allowing big blocks would be nothing in comparison to the very real damage to the community resulting from forcing small blocks. Therefore I have decided that I will no longer attempt to force my views onto the community, and I shall no longer oppose a 'max blocksize' increase at this time."
Good luck waiting for that kind of an announcement from GMax! We have about as much a chance of GMax voluntarily stepping down as leader of Bitcoin, as Putin voluntarily stepping down as leader of Russia. It's just not in their nature.
As we now know - from his 10-year history of divisiveness and vandalism, and from his past year of stonewalling - he would never compromise like this, compromise is simply not part of his vocabulary.
So he continues to try to impose his wishes on the community, even in the face of ample evidence that the blocksize could easily be not only 2 MB but even 3-4 MB right now - ie, both the infrastructure and the community have been empirically surveyed and it was found that the people and the bandwidth would both easily support 3-4 MB already.
But instead, Greg would rather use his position as "Blockstream CTO" to overrule everyone who supports bigger blocks, telling us that it's impossible.
And remember, this is the same guy who a few years ago was also telling us that Bitcoin itself was "mathematically impossible".
So here's a great plan get rich:
(1) Find a programmer who's divisive and a control freak and who overrides consensus and who didn't believe that Bitcoin was possible and doesn't believe that it can do simple "max blocksize"-based scaling (even in the face of massive evidence to the contrary).
(2) Invest $21+55 million in a private company and make him the CTO (and make Adam Back the CEO - another guy who also didn't believe that Bitcoin would work).
(3) ???
(4) Profit!
Greg and his supporters say bigblocks "might" harm Bitcoin someday - but they ignore the fact that smallblocks are already harming Bitcoin now.
Everyone from Core / Blockstream mindlessly repeats Greg's mantra that "allowing 2 MB blocks could harm the network" - somehow, someday (but actually, probably not: see Footnotes [1], [2], [3], and [4] below).
Meanwhile, the people who foolishly put their trust in Greg are ignoring the fact that "constraining to 1 MB blocks is harming the community" - right now (ie, people's investments and businesses are already starting to suffer).
This is the sad situation we're in.
And everybody could end up paying the price - which could reach millions or billions of dollars if people don't wake up soon and get rid of Greg Maxwell's toxic influence on this project.
At some point, no matter how great Gregory Maxwell's coding skills may be, the "money guys" behind Blockstream (Austin Hill et al.), and their newer partners such as the international accounting consultancy PwC - and also the people who currently hold $5-6 billion dollars in Bitcoin wealth - and the miners - might want to consider the fact that Gregory Maxwell is so divisive and out-of-touch with the community, that by letting him continue to play CTO of Bitcoin, they may be in danger of killing the whole project - and flushing their investments and businesses down the toilet.
Imagine how things could have been right now without GMax.
Just imagine how things would be right now if Gregory Maxwell hadn't wormed his way into getting control of Bitcoin:
There is a place for everyone.
Talented, principled programmers like Greg Maxwell do have their place on software development projects.
Things would have been fine if we had just let him work on some complicated mathematical stuff like Confidential Transactions (Adam Back's "homomorphic encryption") - because he's great for that sort of thing.
(I know Greg keeps taking this as a "back-handed (ie, insincere) compliment" from me nullc - but I do mean it with all sincerity: I think he has great programming and cryptography skills, and I think his work on Confidential Transactions could be a milestone for Bitcoin's privacy and fungibility. But first Bitcoin has to actually survive as a going project, and it might not survive if he continues to insist on trying to impose his will in areas where he's obviously less qualified, such as this whole "max blocksize" thing where the infrastructure and the market should be in charge, not a coder.)
But Gregory Maxwell is too divisive and too much of a control freak (and too out-of-touch about what the technology and the market are actually ready for) to be "in charge" of this software development project as a CTO.
So this is your CTO, Bitcoin. Deal with it.
He dismissed everyone on Wikipedia back then as "shrieking masses" and he dismisses /btc as a "cesspool" now.
This guy is never gonna change. He was like this 10 years ago, and he's still like this now.
He's one of those arrogant C/C++ programmers, who thinks that because he understands C/C++, he's smarter than everyone else.
It doesn't matter if you also know how to code (in C/C++ or some other language).
It doesn't matter if you understand markets and economics.
It doesn't matter if you run a profitable company.
It doesn't even matter if you're Satoshi Nakamoto:
Satoshi Nakamoto, October 04, 2010, 07:48:40 PM "It can be phased in, like: if (blocknumber > 115000) maxblocksize = largerlimit / It can start being in versions way ahead, so by the time it reaches that block number and goes into effect, the older versions that don't have it are already obsolete."
https://np.reddit.com/btc/comments/3wo9pb/satoshi_nakamoto_october_04_2010_074840_pm_it_can/
Gregory Maxwell is in charge of Bitcoin now - and he doesn't give a flying fuck what anyone else thinks.
He has and always will simply "do whatever he thinks is right without the burden of explaining himself to you" - even he has to destroy the community and the project in the process.
That's just the kind of person he is - 10 years ago on Wikipedia (when he was just one of many editors), and now (where he's managed to become CTO of a company which took over Satoshi's respository and paid off most of its devs).
We now have to make a choice:
Footnotes:
[1]
If Bitcoin usage and blocksize increase, then mining would simply migrate from 4 conglomerates in China (and Luke-Jr's slow internet =) to the top cities worldwide with Gigabit broadband - and price and volume would go way up. So how would this be "bad" for Bitcoin as a whole??
https://np.reddit.com/btc/comments/3tadml/if_bitcoin_usage_and_blocksize_increase_then/
[2]
"What if every bank and accounting firm needed to start running a Bitcoin node?" – bdarmstrong
https://np.reddit.com/btc/comments/3zaony/what_if_every_bank_and_accounting_firm_needed_to/
[3]
It may well be that small blocks are what is centralizing mining in China. Bigger blocks would have a strongly decentralizing effect by taming the relative influence China's power-cost edge has over other countries' connectivity edge. – ForkiusMaximus
https://np.reddit.com/btc/comments/3ybl8it_may_well_be_that_small_blocks_are_what_is/
[4]
Blockchain Neutrality: "No-one should give a shit if the NSA, big businesses or the Chinese govt is running a node where most backyard nodes can no longer keep up. As long as the NSA and China DON'T TRUST EACH OTHER, then their nodes are just as good as nodes run in a basement" - ferretinjapan
https://np.reddit.com/btc/comments/3uwebe/blockchain_neutrality_noone_should_give_a_shit_if/
submitted by ydtm to btc [link] [comments]

Adam Back & Greg Maxwell are experts in mathematics and engineering, but not in markets and economics. They should not be in charge of "central planning" for things like "max blocksize". They're desperately attempting to prevent the market from deciding on this. But it *will*, despite their efforts.

Adam Back apparently missed the boat on being an early adopter, even after he was personally informed about Bitcoin in an email from Satoshi.
So Adam didn't mine or buy when bitcoins were cheap.
And he didn't join Bitcoin's Github repo until the price was at an all-time high.
He did invent HashCash, and on his Twitter page he proudly claims that "Bitcoin is just HashCash plus inflation control."
But even with all his knowledge of math and cryptography, he obviously did not understand enough about markets and economics - so he missed the boat on Bitcoin - and now he's working overtime to try to make up for his big mistake, with $21+55 million in venture-capital fiat backing him and his new company, Blockstream (founded in November 2014).
Meanwhile, a lot of the rest of us, without a PhD in math and crypto, were actually smarter than Adam about markets and economics.
And this is really the heart of the matter in these ongoing debates we're still forced to keep having with him.
So now it actually might make a certain amount of economic sense for us to spend some of our time trying to get adam3us Adam Back (and nullc Gregory Maxwell) to stop hijacking our Bitcoin codebase.
Satoshi didn't give the Bitcoin repo to a couple of economically clueless C/C++ devs so that they could cripple it by imposing artificial scarcity on blockchain capacity.
Satoshi was against central economic planners, and he gave Bitcoin to the world so that it could grow naturally as a decentralized, market-based emergent phenomenon.
Adam Back didn't understand the economics of Bitcoin back then - and he still doesn't understand it now.
And now we're also discovering that he apparently has a very weak understanding of legal concepts as well.
And that he also has a very weak understanding of negotiating techniques as well.
Who is he to tell us we should not do simple "max blocksize"-based scaling now - simply because he might have some pie-in-the-sky Rube-Goldberg-contraption solution months or years down the road?
He hasn't even figured out how to do decentralized path-finding in his precious Lightning Network.
So really what he's saying is:
I have half a napkin sketch here for a complicated expensive Rube-Goldberg-contraption solution with a cool name "Lightning Network"...
which might work several months or years down the road...
except I'm still stuck on the decentralized path-finding part...
but that's only a detail!
just like that little detail of "inflation control" which I was also too dumb to add to HashCash for years and years...
and which I was also too dumb to even recognize when someone shoved a working implementation of it in my face and told me I might be able to get rich off of it...
So trust me...
My solution will be much safer than that "other" ultra-simple KISS solution (Classic)...
which only involved changing a 1 MB to a 2 MB in some code, based on empirical tests which showed that the miners and their infrastructure would actually already probably support as much as 3 MB or 4 MB...
and which is already smoothly running on over 1,000 nodes on the network!
That's his roadmap: pie-in-the-sky, a day late and a dollar short.
That's what he has been trying to force on the community for over a year now - relying on censorship of online forums and international congresses, relying on spreading lies and FUD - and now even making vague ridiculous legal threats...
...but we still won't be intimidated by him, even after a year of his FUD and lies, with his PhD and his $21+55 million in VC backing.
Because he appears to be just plain wrong again.
Just like he was wrong about Bitcoin when he first heard about it.
Adam Back needs to face the simple fact that he does not understand how markets and economics work in the real world.
And he also evidently does not understand how negotiating and law and open-source projects work in the real world.
If he didn't have Theymos supporting him via censorship, and Austin Hill and the other venture capitalists backing him with millions of dollars, then Adam Back would probably be just another unknown Bitcoin researcher right now, toiling away over yet another possible scaling solution candidate which nobody was paying much attention to yet, and which might make a splash a few months or years down the road (provided he eventually figures out that one nagging little detail of "decentralized path-finding"!).
In the meantime, Adam Back has hijacked our code to use as his own little pet C/C++ crypto programming project, for his maybe-someday scaling solution - and he is doing everything he can to suppress Satoshi's original, much simpler scaling plan.
Adam is impeding Bitcoin's natural growth in adoption and price.
The Transactions-vs-Price graph showed an amazingly tight correlation from 2011 to 2014. Then Blockstream was founded in November 2014 - and the correlation decoupled and the price stagnated.
Seriously, look closely at the graph in that imgur link:
https://imgur.com/jLnrOuK
What's going on in that graph?
So it seems logical to hypothesize that Blockstream's artificial cap on blockchain capacity is what decoupled the price from transaction growth.
This, in a nutshell, is the hypothesis which the market is eager to test.
Via a hard-fork.
Which was not controversial to anyone in the Bitcoin community previously.
Including Satoshi Nakamoto:
Satoshi Nakamoto, October 04, 2010, 07:48:40 PM "It can be phased in, like: if (blocknumber > 115000) maxblocksize = largerlimit / It can start being in versions way ahead, so by the time it reaches that block number and goes into effect, the older versions that don't have it are already obsolete."
https://np.reddit.com/btc/comments/3wo9pb/satoshi_nakamoto_october_04_2010_074840_pm_it_can/
Including Adam Back:
Adam Back: 2MB now, 4MB in 2 years, 8MB in 4 years, then re-assess
https://np.reddit.com/Bitcoin/comments/3ihf2b/adam_back_2mb_now_4mb_in_2_years_8mb_in_4_years/
Including Greg Maxwell:
"Even a year ago I said I though we could probably survive 2MB" - nullc
https://np.reddit.com/btc/comments/43mond/even_a_year_ago_i_said_i_though_we_could_probably/
Including Theymos:
Theymos: "Chain-forks [='hardforks'] are not inherently bad. If the network disagrees about a policy, a split is good. The better policy will win" ... "I disagree with the idea that changing the max block size is a violation of the 'Bitcoin currency guarantees'. Satoshi said it could be increased."
https://np.reddit.com/btc/comments/45zh9d/theymos_chainforks_hardforks_are_not_inherently/
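Satoshi's height-based phase-in, quoted above, can be sketched in a few lines. A minimal Python illustration (not actual Bitcoin Core code): the 2 MB figure is Classic's proposal mentioned earlier, and 115,000 is just the example height from Satoshi's quote.

```python
# Satoshi's height-based phase-in, as a toy function (not real Core code).
# New-limit code ships far in advance; it only activates at a pre-agreed
# block height, by which time un-upgraded nodes are already obsolete.

LEGACY_LIMIT = 1_000_000       # 1 MB, the current hard cap
LARGER_LIMIT = 2_000_000       # 2 MB, as proposed by Classic
ACTIVATION_HEIGHT = 115_000    # Satoshi's example block number

def max_block_size(block_number: int) -> int:
    """Return the consensus block-size limit in effect at a given height."""
    if block_number > ACTIVATION_HEIGHT:
        return LARGER_LIMIT
    return LEGACY_LIMIT

print(max_block_size(115_000))  # 1000000 -- old limit still in force
print(max_block_size(115_001))  # 2000000 -- larger limit phased in
```

Because every node runs the same deterministic rule, all upgraded nodes switch limits at exactly the same block - which is why Satoshi considered this kind of hard fork uncontroversial.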
And the market probably will test this. As soon as it needs to.
Because Blockstream's $21+55 million in VC funding is just a drop in the bucket next to Bitcoin's $5-6 billion market capitalization - which smart Bitcoin investors will do everything they can to preserve and increase.
The hubris and blindness of certain C/C++ programmers
In Adam's mind, he's probably a "good guy" - just some innocent programmer into crypto who thinks he understands Bitcoin and "knows best" how to scale it.
But he's wrong about the economics and scaling of Bitcoin now - just like he was wrong about the economics and scaling of Bitcoin back when he missed the boat on being an early adopter.
His vision back then (when he missed the boat) was too pessimistic - and his scaling plan right now (when he assents to the roadmap published by Gregory Maxwell) is too baroque (ie, needlessly complex) - and "too little, too late".
A self-fulfilling prophecy?
In some very real sense, there is a risk here that Adam's own pessimism about Bitcoin could turn into a self-fulfilling prophecy.
In other words, he never thought Bitcoin would succeed - and now maybe it really won't succeed, now that he has unfairly hijacked its main repo and is attempting to steer it in a direction which Satoshi clearly never intended.
It's even quite possible that there could be a subtle psychological phenomenon at play here: at some (unconscious) level, maybe Adam wants to prove that he was "right" when he missed the boat on Bitcoin because he thought it would never work.
After all, if Bitcoin fails (even due to him unfairly hijacking the code and the debate), then in some sense, it would be a kind of vindication for him.
Adam Back has simply never believed in Bitcoin and supported it the way most of the rest of us do. So he may (subconsciously) actually want to see it fail.
Subconscious "ego" issues may be at play.
There may be some complex, probably subconscious "ego" issues at play here.
I know this is a serious accusation - but after years of this foot-dragging and stonewalling from Adam, trying to strangle Bitcoin's natural growth, he shouldn't be surprised if people start accusing him (his ego, his blindness, his lack of understanding of markets and economics) of being one of the main risk factors which could seriously hurt Bitcoin.
This is probably a much more serious problem than he himself can ever comprehend. For it goes to one of his "blind spots" - which (by definition) he can never see - but the rest of the community can.
He thinks he's just some smart guy who is trying to help Bitcoin - and he is smart about certain things and he can help Bitcoin in certain ways.
For example, I was a big fan of Adam's back when I read his posts on bitcointalk.org about "homomorphic encryption" (which I guess now has been renamed as "Confidential Transactions" - "CT").
But, regarding his work on the so-called "Lightning Network", many people are still unconvinced on a few major points - eg:
  • LN would be quite complex and is still unproven, so we have no assurance that it doesn't contain some minor but fatal flaw which would prevent it from working altogether;
  • In particular, there isn't even a "napkin sketch" or working concept for the most important component of LN - "decentralized path-finding":
https://np.reddit.com/bitcoin_uncensored/comments/3gjnmd/lightning_may_not_be_a_scaling_solution/
https://np.reddit.com/btc/comments/43sgqd/unullc_vs_buttcoiner_on_decentralized_routing_of/
https://np.reddit.com/btc/comments/43oi26/lightning_network_is_selling_as_a_decentralized/
  • It is simply unconscionable for Adam to oppose simpler "max blocksize"-based, on-chain scaling solutions now, apparently due to his unproven belief that a more complex, off-chain, still-unimplemented scaling solution such as LN later would somehow be preferable (especially when LN still lacks any plan for providing the key component of "decentralized path-finding").
Venture capitalists and censors have made Adam much more important than he should be.
If this were a "normal" or "traditional" flame war on a dev mailing list (ie, if there were no censorship from Theymos helping Adam, and no $21+55 million in VC helping Adam) - then the community would be ignoring Adam.
He'd be just another lonely math PhD toiling away on some half-baked pet project, ignored by the community instead of "leading" it.
So Adam (and Greg) are not smart about everything.
In particular, they do not appear to have a deep understanding how markets and economics work.
And we have proof of this - in the form of their own mistaken predictions about Bitcoin.
Satoshi was an exception. He knew enough about markets and math, and enough about engineering and economics, to release the Bitcoin code which has worked almost flawlessly for 7 years now.
But guys like Adam and Greg are only good at engineering - they're terrible at economics.
As programmers, they have an engineer's mindset, where something is a "solution" only if it satisfies certain strict mathematical criteria.
But look around. A lot of technologies have become massively successful, despite being imperfect from the point of view of programming / mathematics, strictly speaking.
Just look at HTML / JavaScript / CSS - certainly not the greatest of languages in the opinions of many serious programmers - and yet here we are today, where they have become the de facto low-level languages which most of the world uses to interact on the Internet.
The "perfect" is the enemy of the "good".
The above saying captures much of the essence of the arguments continually being made against guys like Adam and Greg.
They don't understand how a solution which is merely "good enough" can actually take over the world.
They tend to "over-engineer" stuff, and they tend to ignore important issues about how markets and programs can interact in the real world.
In other words, they fail to understand that sometimes it's more important to get something "imperfect" out the door now, instead of taking too long to release something "perfect"...
... because time and tide waits for no man, and Bitcoin / Blockstream / Core are not the only cryptocurrency game in town.
If Adam and Greg can't provide the scaling which the market needs, when it needs it, then the market can and will look elsewhere.
This is why so many of us are arguing that (as paradoxical and deflating as it may feel for certain coders with massive egos) they don't actually always know best - and maybe, just maybe, Bitcoin would thrive even better if they would simply get out of the way and let the market decide certain things.
Coders often think they're the smartest guys in the room.
Many people involved in Bitcoin know that coders like Adam and Greg are used to thinking that they're the smartest guys in the room.
In particular, we know this because many of us have gone through this same experience in our own fields of expertise (but evidently most of us have acquired enough social skills and self awareness to be able to "compensate" for this much better than they have).
So we know how this can lead to a kind of hubris - where they simply automatically brush off and disregard the objections of "the unwashed masses" who happen to disagree with them.
Many of us also have had the experience of talking to "that C/C++ programmer guy" - in a class, at a seminar, at a party - and realizing that "he just doesn't get" many of the things that everyone else does get.
Which is why some of us continue to lecture Adam and Greg like this.
Because we know guys like them - and we know that they aren't as smart about everything as they think they are.
They should really sit down and seriously analyze a comment such as the following:
https://np.reddit.com/btc/comments/44qr31/gregory_maxwell_unullc_has_evidently_never_heard/czs7uis
He [Greg Maxwell] is not alone. Most of his team shares his ignorance.
Here's everything you need to know: The team considers the limit simply a question of engineering, and will silence discussion on its economic impact since "this is an engineering decision."
It's a joke. They are literally re-creating the technocracy of the Fed through a combination of computer science and a complete ignorance of the way the world works.
If ten smart guys in a room could outsmart the market, we wouldn't need Bitcoin.
~ tsontar
Adam and Greg probably read comments like that and just brush them off.
They probably think guys like tsontar are irrelevant.
They probably say to themselves: "That guy doesn't have a PhD in mathematics, and he doesn't know how to do C pointer arithmetic - so what can he possibly know about Bitcoin?"
But history has already shown that a lot of times, a non-mathematician, non-C-coder does know more about Bitcoin than a cryptography expert with a PhD in math.
Clearly, tsontar understands markets way better than adam3us or nullc.
Do they really grasp the seriousness of the criticism being leveled at them?
They are literally re-creating the technocracy of the Fed through a combination of computer science and a complete ignorance of the way the world works.
If ten smart guys in a room could outsmart the market, we wouldn't need Bitcoin.
https://np.reddit.com/btc/comments/44qr31/gregory_maxwell_unullc_has_evidently_never_heard/czs7uis
Do Adam and Greg really understand what this means?
Do they really understand what a serious indictment of their intellectual faculties this apparently off-handed remark really is?
These are the real issues now - issues about markets and economics.
And as we keep saying: if they don't understand the real issues, then they should please just get out of the way.
After months and months of them failing to mount any kind of intelligent response to such utterly scathing criticisms - and their insistence on closing their eyes and pretending that Bitcoin doesn't need a simple scaling solution as of "yesterday" - the Bitcoin-using public is finally figuring out that Adam and Greg cannot deliver what we need, when we need it.
One of the main things that the Bitcoin-using public doesn't want is the artificial "max blocksize" which Adam and Greg are stubbornly and blindly trying to force on us via the code repo which they hijacked from us.
One of the main things the Bitcoin-using public does want is for Bitcoin to be freed from the shackles of any artificial scarcity on the blockchain capacity, which guys like Adam and Greg insist on imposing upon it - in their utter cluelessness about how markets and economics and emergent phenomena actually work.
People's money is on the line. Taking our code back from them may actually be the most important job many of us have right now.
This isn't some kind of academic exercise, nor is it some kind of joke.
For many of us, this is dead serious.
There is currently $5-6 billion of wealth on the line (and possibly much, much more someday).
And many people think that Adam and Greg are the main parties responsible for jeopardizing this massive wealth - with their arrogance and their obtuseness and their refusal to understand that they aren't smarter than the market.
So, most people's only hope now is that the market itself will stop Adam and Greg from interfering in issues of markets and economics and simple scaling which are clearly beyond their comprehension.
And after a year of their increasingly desperate FUD and lies and stone-walling and foot-dragging, it looks like the market is eventually going to simply route around them.
submitted by ydtm to btc [link] [comments]

A thing to consider while reading this sub and up-voted comments...

Having been here for a while, I think this community is now roughly 80% ether traders and 20% bitcoin maximalists spreading FUD to pump their investment - which is lame, as ethereum's growth doesn't limit bitcoin's, and actually the main gateway into ethereum is still bitcoin (something to consider when looking at bitcoin's trading volume). And if you think about it, whatever the outcome of "the halvening", it will be good for eth too!
I believe that around 70% of the market cap belongs to early adopters and mostly holders (like me) focused on ethereum development (people who actually know about homomorphic encryption, zk-snarks, PoS, sharding, etc) and the other 30% are day traders that care mostly about short term money, playing what John Bogle calls "the losers game" because most of them lose money.
And here's a tip: people deciding ethereum's economic and monetary policies have a vested interest in its success, as they themselves hold ether, and will probably come out with a solution that first benefits ethereum's future as a platform and second benefits ethereum holders. Remember, this community actually has the ability to do hard forks considering all stakeholders (including miners, of course).
So with that in mind, consider that only some of the traders are experienced, and even some of those might be coordinated whales - so price action is tricky, technical analysis doesn't mean much in this landscape, and coordinated whales might have slowly been selling since the price broke six dollars on its way towards the 15-dollar moon, maybe saving some for a coordinated dump to scare weak hands and pick up some cheap eth. But don't be a fool: fundamentals are super strong, and ethereum is regarded as the blockchain as it was supposed to be - and it's already eating the world WHILE IN BETA!
EDIT: this post went from 10 upvotes in an hour to 6, then to 8, maybe that tells you everything about what's going on here...
submitted by hmontalvo369 to ethtrader [link] [comments]

Blinded bearer certificates

A while ago I wrote a post about quickly scaling via federated sidechains, which generated a lot of good discussion. Now I'd like to bring people up-to-speed on another semi-centralized solution to scaling quickly, and this one also solves anonymity. It's called blinded bearer certificates. This is an idea well-known by Bitcoin experts, based on a paper by David Chaum written in 1983. Because all of the experts are familiar with the idea and are not terribly interested in it on an intellectual level, it is not discussed much. But people who don't study this stuff probably won't know about it, and it could be very usefully applied to Bitcoin.
As applied to Bitcoin, it would work like this:
  1. A "bank" holds bitcoins on deposit for its users.
  2. To withdraw, you generate a random serial number for each certificate you want, blind each serial number, and send the blinded serial numbers to the bank.
  3. The bank signs each blinded serial number, debiting your account 1 satoshi per certificate, and returns the blind signatures.
  4. You unblind the signatures, leaving you with valid bank signatures on serial numbers the bank has never actually seen.
  5. To pay someone, you hand over signed certificates. The recipient immediately exchanges them at the bank for fresh certificates (or for satoshis), and the bank checks each serial number against its spent list to prevent double-spending.
For simplicity, above, each certificate was worth 1 satoshi. You can instead do a mixture of certificate denominations for increased space and time efficiency. If the sender doesn't have the correct denominations for a particular trade, then he can have the bank transform some higher-denomination certificates into lower-denomination certificates; there's no need for change from the recipient. For anonymity, there should be only a small number of different denominations. It's possible to make certificates divisible, but this complicates the scheme and probably harms anonymity.
Blinded bearer certificates have a number of excellent properties:
  • Transactions are instant and essentially free - nothing touches the blockchain until someone deposits or withdraws, so this scales far beyond on-chain capacity.
  • Payments are anonymous and unlinkable: because of the blinding, not even the bank can connect the certificate it signed to the certificate that is later redeemed.
  • Clients are extremely lightweight - verifying a certificate is just checking a single signature.
The obvious flaw, and the reason why this idea never took off despite being known and possible since 1983, is that the bank is a single point of failure and can steal all of the money. However, when built on Bitcoin's decentralized contracts system, it's possible to spread the trust out far more than was possible previously. If the "bank" is composed of 20 trustworthy and totally-independent entities in 20 different countries, and 80% of them have to turn evil in order to steal the money, then it seems to me that the system is secure enough for smaller values (less than a few hundred dollars in total value tied up in these centralized certificates, say). It'd additionally be possible for wallets supporting these certificates to automatically diversify among different issuing banks, spreading out the risk even more. For example, a wallet might ask the user how much money it trusts BankA with, how much money it trusts BankB with, etc., and then automatically trade certificates between all of the banks to ensure that these limits are never exceeded.
This would be an excellent shorter-term solution to the problems of scaling and anonymity. It requires no changes to Bitcoin itself, and the technologies are not that difficult, requiring no complicated P2P network or anything. (Later on, as technologies mature, I'd expect decentralized solutions such as Lightning to replace these certificates in almost all cases.)
Open Transactions already mostly implements this idea, but IMO it's way over-designed, to the extent that hardly anybody is willing to figure it out. I've long wished that somebody would write a quick-and-dirty certificate server and wallet just in order to popularize this idea, and then development on more complicated stuff can proceed from there.
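For the curious, the blinding step Chaum described can be sketched in a few lines of Python using textbook RSA - an illustration only, not production cryptography; the key size, serial number, and variable names here are invented for readability.

```python
import secrets
from math import gcd

# Textbook RSA blind signature (Chaum, 1983). The bank signs a blinded
# serial number without ever seeing it; the unblinded signature still
# verifies. Toy key size -- real deployments would use 2048+ bit keys.

p, q = 1_000_003, 1_000_033          # bank's secret primes (toy-sized)
n = p * q                            # public modulus
e = 65537                            # public verification exponent
d = pow(e, -1, (p - 1) * (q - 1))    # bank's private signing exponent

serial = 123456789                   # certificate serial number (the "coin")

# User picks a random blinding factor coprime to n.
while True:
    r = secrets.randbelow(n - 2) + 2
    if gcd(r, n) == 1:
        break

blinded = (serial * pow(r, e, n)) % n     # user -> bank: blinded serial
blind_sig = pow(blinded, d, n)            # bank signs without seeing serial
sig = (blind_sig * pow(r, -1, n)) % n     # user unblinds the signature

# Anyone can check the bank's signature on a serial the bank never saw:
assert pow(sig, e, n) == serial
```

Because the bank only ever sees `blinded`, it cannot later link the certificate it signed to the certificate being redeemed - which is exactly what makes the scheme anonymous.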
submitted by theymos to Bitcoin [link] [comments]

A summary of Hadron's whitepaper

Hadron

submitted by shrieko to HadronCoin [link] [comments]

Intergalactic Money: The deep impact of a self-evolving infinitely-scalable general-purpose realtime unforkable public blockchain federation

Prologue: This article is a strategic response to the following crypto-related papers published in 2017: 1. “An (Institutional) Investor’s Take on Cryptoassets” by John Pfeffer of Pfeffer Capital and 2. “Plasma: Scalable Autonomous Smart Contracts” by Joseph Poon of Lightning Network and Vitalik Buterin of Ethereum Foundation.
John Pfeffer in his paper titled “An (Institutional) Investor’s Take on Cryptoassets” claims that “scaling solutions for blockchains in particular and decentralized networks including (implied) DAG-based networks such as PoS, Sharding, etc. are bullish for adoption and users/consumers but bearish for token value/investors. Even without those technology shifts, the cost of using decentralized protocols is deflationary, since the cost of processing power, storage and bandwidth are deflationary.” Further, he states: “It’s a mistake to compare monopoly network effects of Facebook or other centralized platforms to blockchain protocols because blockchain protocols can be forked to a functionally identical blockchain with the same history and users up to the moment if a parent chain persists in being arbitrarily expensive to use (i.e. rent-seeking). Like TCP/IP but unlike Facebook, blockchain protocols are open-source software that anyone can copy or fork freely.” To this he adds regulatory pressures on bitcoin and public permissionless currencies, and their negative impact.


It's obvious from his statements that John is not aware of the latest R&D projects focused on improving decentralized networks, or of advances in decentralized protocols - especially "Unforkable Realtime Blockchains" such as Algorand, Bitlattice and Orch.Network, based on Recursive STARKs and FHE/SHE. He is also ignorant of the fact that there are several projects working on self-evolving, censor-proof, quantum-safe protocols such as Orch Network (token symbol: ORC and URL: https://orch.network/). These protocols have adopted a continuous development strategy while getting ready for the next paradigm shifts in technology, e.g. practical quantum computing and the quantum internet. He also does not understand that a futuristic protocol token with infinite divisibility, integrated with a hybrid quantum-classical computational infrastructure, can easily counteract and neutralize the deflationary nature of its own tokens and its limited supply hardcap, making it infinitely scalable and elastic.
While I agree with his following statement: “A non-sovereign, non-fiat, trustless, censorship-resistant cryptoasset would be a far better alternative for most foreign currency international reserves. IMF SDRs are already a synthetic store of value, so could also be easily and sensibly replaced by such a cryptoasset.”, this does not necessarily make BTC the right candidate, for several reasons: 1. BTC is not a self-improving, self-evolvable, fully censorship-resistant cryptoasset - which is a must for it to qualify as a viable reserve asset and appeal to long-term institutional and high-networth investors.
Bitcoin's miners are mostly corporate entities with large investments in ASIC-based mining equipment. It's not impossible for a centralized, resourceful entity to corner 51% of the mining power, compromising double-spend protection and the other built-in trustless security measures. So BTC is not truly decentralized. 2. The underlying hash algorithm and encryption protocol of BTC, known as SHA-256, can be broken by multi-qubit quantum circuits and quantum computers under active development in labs across the world. So BTC is not future-proof, and its very existence is threatened unless its core developers continuously modify and improve its underlying security model and technology. 3. Bitcoin is not infinitely divisible - that is, it is non-scalable not only upward but downward as well. In fact BTC has only 8 decimal places, the smallest unit being the satoshi (1 satoshi = 0.00000001 BTC).
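The 8-decimal-place floor is easy to verify with a quick sketch in Python - the constants are Bitcoin's actual hard-coded values; the helper function is just for illustration.

```python
from decimal import Decimal

SATS_PER_BTC = 10 ** 8   # Bitcoin hard-codes exactly 8 decimal places

def btc_to_sats(amount_btc: str) -> int:
    """Convert a BTC amount to its integer satoshi representation."""
    return int(Decimal(amount_btc) * SATS_PER_BTC)

print(btc_to_sats("0.00000001"))   # 1 -- the smallest unit; nothing below this
print(btc_to_sats("21000000"))     # 2100000000000000 -- total supply in sats
```

Every on-chain amount is ultimately an integer number of satoshis, so no transaction can ever move less than 0.00000001 BTC - that is the divisibility limit the author is pointing at.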
Futuristic protocol tokens such as infinitely scalable minerless Orch(ORC) should be more attractive to long-term investors looking for an alternative non-sovereign, non-fiat, and trustless, censorship-resistant privacy preserving high-velocity cryptoasset.
In their paper titled “Plasma: Scalable Autonomous Smart Contracts”, Joseph Poon and Vitalik Buterin define their proposal as follows: “Plasma is a proposed framework for incentivized and enforced execution of ‘smart contracts’ which is scalable to a significant amount of state updates per second (potentially billions) enabling the blockchain to be able to represent a significant amount of decentralized financial applications worldwide.” Now the first thing is that it's not clear what they mean by "Autonomous Smart Contracts", or which component of Plasma the "autonomous" specifically refers to. For example, an autonomous weapon would set the target and hit it on its own without any humans in the loop, and an autonomous self-driving car would drive to a destination point without any human navigating it.
Now, contrary to their claims, their off-chain, second-layer scaling solution with Ethereum (ETH) as the root blockchain is neither censor-proof nor truly scalable, since it requires state-channel-based masternodes/validators. So it's not a feasible solution at all, because trust issues will crop up at every moment.
Moreover, scalable Multi-Party Computation is feasible only on a platform that guarantees functional encryption - i.e. query, exchange and computation between encrypted objects, data and entities - which is possible only via recursive STARKs and lattice-based FHE (Fully Homomorphic Encryption). A second-layer protocol like Plasma does not have the capability of providing functional encryption to all distributed anonymous parties having zero mutual trust.
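As a concrete (and much more modest) taste of "computation between encrypted objects", here is a toy Paillier sketch in Python. Note the hedge: Paillier is only additively homomorphic - far short of the full FHE the article invokes - and the tiny key is for readability, not security.

```python
import secrets
from math import gcd, lcm

# Toy Paillier cryptosystem: multiplying two ciphertexts *adds* their
# plaintexts, so sums can be computed without ever decrypting.
# Illustration only -- additive homomorphism, not full FHE.

p, q = 1_000_003, 1_000_033          # toy-sized primes
n, n2 = p * q, (p * q) ** 2
g = n + 1                            # standard generator choice
lam = lcm(p - 1, q - 1)              # private key
mu = pow(lam, -1, n)                 # precomputed decryption helper

def encrypt(m: int) -> int:
    while True:                      # random factor coprime to n
        r = secrets.randbelow(n - 2) + 2
        if gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

a, b = encrypt(20), encrypt(22)
total = (a * b) % n2                 # homomorphic addition on ciphertexts
print(decrypt(total))                # 42 -- computed without seeing 20 or 22
```

A party holding only `a` and `b` can produce an encryption of their sum without learning either value - the simplest working instance of the "computation between encrypted entities" idea.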
There is a repeated effort to push some dangerous products under the guise of advanced blockchains and decentralized platforms - for instance, hidden external oracles and corporate-entity-controlled decentralized platforms. Blockchain applications live in their own digital realm, totally orthogonal to the real world and environment we live in. Be it a decentralized application or a smart contract, their reach is limited to the space they can control. Any use-case projection onto our reality eventually confronts the following hard fact: how can an app efficiently and securely interact with the physical world? Now hidden external oracles like that of oraclize.it and hardware pythias are being marketed as the solutions to this problem. But (IMHO) internal encrypted entities of the Orch (ORC) platform, known as Degents, having access to cryptographically reliable external software/hardware sensors and actuators, will transparently and securely interact with the external world/environment.
Only minerless future-proof general-purpose decentralized networks such as Orch(ORC) designed from scratch as an MPC(Multiparty Computation) platform can deliver truly scalable MPC solutions flawlessly and reliably to millions of consumers simultaneously without compromising on security and trustlessness.
The far-reaching impact of a self-evolving, infinitely-scalable, general-purpose, realtime, unforkable public blockchain with built-in quantum-safe privacy and multicompute features will be immeasurable and profound.
It would transform the whole universe of blockchains and decentralized networks, including all blockchain-based and blockchain-free platforms such as DAG-based and DHT-based platforms, e.g. IOTA, Nano and Holochain.
Orch Network (native token symbol: ORC and URL: https://orch.network) will enable and power the following dapps and use-cases:

  1. Privacy-preserving Infinitely-divisible Hypercurrency and Confidential Global Payment System with integrated encrypted decentralized chat service
  2. Unmanned Decentralized Cryptoasset Exchanges
  3. Large-scale Federated IoT Networks
  4. Decentralized DNS Clusters
  5. Anonymous trading of Tokenized Financial Assets and Derivatives Contracts
  6. Automated Hedge Funds
  7. Crypto darkpools
  8. Temporal Insurance Products
  9. Global Supply chain and unmanned cargo ships and drones
  10. Realtime Encrypted Video Communication capable Anonymous Web Infrastructure
  11. High-velocity Non-sovereign Reserve Asset
  12. Near-Perfect Coin Mixer
  13. Decentralized Marketplace App
  14. Transparent Robust Stable Coins
  15. Decentralized P2P Storage of functionally encrypted data
  16. Permissionless ICO Platforms
  17. Decentralized and Encrypted Facebook, gmail, Twitter and google-like search/answer engines
  18. Decentralized CDNs
  19. Customizable Decentralized Governance System for blockchains and dapps
Another important thing that will boost the price and value of the Orch Network token ORC is its integrated Turing-incomplete cyber contract protocol, running Turing-incomplete cyber contracts written in Crackcity (a Turing-incomplete language derived from Crack and Simplicity) that run on top of Crack Machine(s). Crack Machines are Orch's blockchain virtual machines.
Ethereum's main deficiency and Achilles' heel is its Turing-complete smart contract programming language, Solidity.

  1. Turing-complete languages are fundamentally inappropriate for writing “smart contracts” — because such languages are inherently undecidable, which makes it impossible to know what a “smart contract” will do before running it.
  2. We should learn from Wall Street's existing DSLs (domain-specific languages) for financial products and smart contracts, based on declarative and functional languages such as Ocaml and Haskell - instead of doing what the "Web 2.0 programmers" behind Solidity did, and what Peter Todd is also apparently embarking upon: ie, ignoring the lessons that Wall Street has already learned, and "reinventing the wheel", using less-suitable languages such as C++ and JavaScript-like languages (Solidity), simply because they seem "easier" for the "masses" to use.
  3. We should also consider using specification languages (to say what a contract does) along with implementation languages (saying how it should do it) - because specifications are higher-level and easier for people to read than implementations, which are lower-level and meant for machines to run - and also because ecosystems of specification/implementation language pairs (such as Coq/Ocaml) support formal reasoning and verification tools which could be used to mathematically prove that a smart contract's implementation is "correct" (ie, that it satisfies its specification) before even running it.
Turing-complete languages lead to “undecidable” programs (ie, you cannot figure out what you do until after you run them)
One hint: recall that Gödel’s incompleteness theorem proved that any mathematical system which is (Turing)-complete, must also be inconsistent incomplete [hat tip] — that is, in any such system, it must be possible to formulate propositions which are undecidable within that system.
This is related to things like the Halting Problem.
And by the way, Ethereum’s concept of “gas” is not a real solution to the Halting Problem: Yes, running out of “gas” means that the machine will “stop” eventually, but this naïve approach does not overcome the more fundamental problems regarding undecidability of programs written using a Turing-complete language.
The take-away is that:
When using any Turing-complete language, it will always be possible for someone (eg, the DAO hacker, or some crook like Bernie Madoff, or some well-meaning but clueless dev from slock.it) to formulate a “smart contract” whose meaning cannot be determined in advance by merely inspecting the code: ie, it will always be possible to write a smart contract whose meaning can only be determined after running the code.
Take a moment to contemplate the full, deep (and horrifying) implications of all this.
Some of the greatest mathematicians and computer scientists of the 20th century already discovered and definitively proved (much to the consternation most of their less-sophisticated (naïve) colleagues — who nevertheless eventually were forced to come around and begrudgingly agree with them) that: Given a “smart contract” written in a Turing-complete language, it is impossible to determine the semantics / behavior of that “smart contract” in advance, by mere inspection — either by a human, or even by a machine such as a theorem prover or formal reasoning tool (because such tools unfortunately only work on more-restricted languages, not on Turing-complete languages — for info on such more-restricted languages, see further below on “constructivism” and “intuitionistic logic”).
The horrifying conclusion is that: the only way to determine the semantics / behavior of a “smart contract” is “after-the-fact” — ie, by actually running it on some machine (eg, the notorious EVM) — and waiting to see what happens (eg, waiting for a hacker to “steal” tens of millions of dollars — simply because he understood the semantics / behavior of the code better than the developers did.
Last but not the least, increasing regulatory pressures on Bitcoin, Ethereum and other permissionless public cryptocurrencies/cryptotokens will impact their prices negatively in the medium to long-term.
The need for a hyperfast private zero-knowledge proof cryptocurrency that keeps payer-payee and payment data private and secure along with a decentralized scalable multicomputation platform can’t be overemphasized.
submitted by OrchNetwork to u/OrchNetwork [link] [comments]

Intergalactic Money: The deep impact of a self-evolving infinitely-scalable general-purpose realtime unforkable public blockchain federation

Prologue: This article is a strategic response to the following crypto-related papers published in 2017: 1. “An (Institutional) Investor’s Take on Cryptoassets” by John Pfeffer of Pfeffer Capital and 2. “Plasma: Scalable Autonomous Smart Contracts” by Joseph Poon of Lightning Network and Vitalik Buterin of Ethereum Foundation.
John Pfeffer, in his paper titled “An (Institutional) Investor’s Take on Cryptoassets”, claims that “scaling solutions for blockchains in particular and decentralized networks including (implied) DAG-based networks such as PoS, Sharding, etc. are bullish for adoption and users/consumers but bearish for token value/investors. Even without those technology shifts, the cost of using decentralized protocols is deflationary, since the cost of processing power, storage and bandwidth are deflationary.” Further, he states: “It’s a mistake to compare monopoly network effects of Facebook or other centralized platforms to blockchain protocols because blockchain protocols can be forked to a functionally identical blockchain with the same history and users up to the moment if a parent chain persists in being arbitrarily expensive to use (i.e. rent-seeking). Like TCP/IP but unlike Facebook, blockchain protocols are open-source software that anyone can copy or fork freely.” To this he adds the negative impact of regulatory pressure on Bitcoin and public permissionless currencies.
It is obvious from these statements that John is not aware of the latest R&D projects focused on improving decentralized networks, or of advances in decentralized protocols, especially “unforkable realtime blockchains” such as Algorand, Bitlattice and Orch.Network, based on recursive STARKs and FHE/SHE. He is also ignorant of the fact that several projects are working on self-evolving, censor-proof, quantum-safe protocols, such as Orch Network (token symbol: ORC, URL: https://orch.network/). These protocols have adopted a continuous development strategy while getting ready for the next paradigm shifts in technology, e.g. practical quantum computing and the quantum internet. Nor does he see that a futuristic protocol token with infinite divisibility, integrated with a hybrid quantum-classical computational infrastructure, can easily counteract the deflationary nature of its own tokens and its limited supply hardcap, making it infinitely scalable and elastic.

While I agree with his statement that “A non-sovereign, non-fiat, trustless, censorship-resistant cryptoasset would be a far better alternative for most foreign currency international reserves. IMF SDRs are already a synthetic store of value, so could also be easily and sensibly replaced by such a cryptoasset.”, this does not necessarily make BTC the right candidate, for several reasons: 1. BTC is not a self-improving, self-evolving, fully censorship-resistant cryptoasset, which is a must for it to qualify as a viable reserve asset and appeal to long-term institutional and high-net-worth investors. Bitcoin miners are mostly corporate entities with large investments in ASIC-based mining equipment, and it is not impossible for a centralized, resourceful entity to corner 51% of the mining power, compromising double-spend protection and the other built-in trustless security measures. So BTC is not truly decentralized. 2. The underlying hash algorithm of BTC, SHA-256, can be weakened by multi-qubit quantum circuits and the quantum computers under active development in labs across the world. So BTC is not future-proof, and its very existence is threatened unless its core developers continuously modify and improve its underlying security model and technology. 3. Bitcoin is not infinitely divisible: it is limited not only in upward scalability but in downward scalability as well. In fact, BTC has only 8 decimal places, the smallest unit being the satoshi (1 satoshi = 0.00000001 BTC).
Futuristic protocol tokens such as the infinitely scalable, minerless Orch (ORC) should be more attractive to long-term investors looking for an alternative: a non-sovereign, non-fiat, trustless, censorship-resistant, privacy-preserving, high-velocity cryptoasset.
In their paper titled “Plasma: Scalable Autonomous Smart Contracts”, Joseph Poon and Vitalik Buterin define their proposal as follows: “Plasma is a proposed framework for incentivized and enforced execution of ‘smart contracts’ which is scalable to a significant amount of state updates per second (potentially billions) enabling the blockchain to be able to represent a significant amount of decentralized financial applications worldwide.” First, it is not clear what they mean by “Autonomous Smart Contracts”, or which component of Plasma the term refers to. For example, an autonomous weapon would select and hit its target without any human in the loop, and an autonomous self-driving car would drive to a destination without a human navigating it.
Contrary to their claims, their off-chain, second-layer scaling solution with Ethereum (ETH) as the root blockchain is neither censor-proof nor truly scalable, as it requires state-channel-based masternodes/validators. It is therefore not a feasible solution at all, since trust issues will crop up at every step.
Moreover, scalable multi-party computation is feasible only on a platform that guarantees functional encryption, i.e. query, exchange and computation between encrypted objects, data and entities, which is possible only via recursive STARKs and lattice-based FHE (Fully Homomorphic Encryption). A second-layer protocol like Plasma cannot provide functional encryption to distributed anonymous parties that have zero mutual trust.
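The article invokes lattice-based FHE without showing what homomorphic encryption looks like in practice. As a much simpler illustration of the underlying idea, here is a toy sketch of the Paillier cryptosystem, which is only *additively* homomorphic (a partially homomorphic scheme, not FHE/SHE), with toy key sizes that provide no real security:

```python
# Toy Paillier cryptosystem: additively homomorphic encryption.
# Key sizes here are illustrative only and provide NO security;
# real deployments use moduli of 2048 bits or more.
import random
from math import gcd

def keygen():
    p, q = 293, 433                  # toy primes, for demonstration only
    n = p * q
    lam = (p - 1) * (q - 1)          # valid here since gcd(n, (p-1)(q-1)) == 1
    mu = pow(lam, -1, n)             # modular inverse of lam, valid for g = n + 1
    return (n, n + 1), (lam, mu)     # public key (n, g), private key (lam, mu)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)
    while gcd(r, n) != 1:            # random r must be invertible mod n
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pub, priv, c):
    n, _ = pub
    lam, mu = priv
    u = (pow(c, lam, n * n) - 1) // n   # Paillier's L function
    return (u * mu) % n
```

Because Enc(a) · Enc(b) mod n² decrypts to a + b, an untrusted aggregator can sum encrypted values without ever seeing any plaintext; fully homomorphic schemes extend this from addition to arbitrary computation.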
There is a repeated effort to push dangerous products under the guise of advanced blockchains and decentralized platforms: for instance, hidden external oracles and decentralized platforms controlled by corporate entities. Blockchain applications live in their own digital realm, totally orthogonal to the real world and environment we live in. Be it a decentralized application or a smart contract, their reach is limited to the space they can control. Any use-case projection into our reality eventually confronts a hard fact: how can an app efficiently and securely interact with the physical world? Hidden external oracles like those of oraclize.it, and hardware pythias, are being marketed as solutions to this problem. But (IMHO) the internal encrypted entities of the Orch (ORC) platform, known as Degents, with access to cryptographically reliable external software/hardware sensor-actors, will interact with the external world transparently and securely.
Only minerless, future-proof, general-purpose decentralized networks such as Orch (ORC), designed from scratch as an MPC (Multiparty Computation) platform, can deliver truly scalable MPC solutions flawlessly and reliably to millions of consumers simultaneously, without compromising on security and trustlessness.
The far reaching impact of a self-evolving infinitely-scalable general-purpose realtime unforkable public blockchain with built-in quantum safe privacy and multicompute features will be immeasurable and profound.
It would transform the whole universe of blockchains and decentralized networks, including all blockchain-based and blockchain-free platforms, such as the DAG-based and DHT-based platforms IOTA, Nano and Holochain.
Orch Network (native token symbol: ORC, URL: https://orch.network) will enable and power the following dapps and use cases:

  1. Privacy-preserving Infinitely-divisible Hypercurrency and Confidential Global Payment System with integrated encrypted decentralized chat service
  2. Unmanned Decentralized Cryptoasset Exchanges
  3. Large-scale Federated IoT Networks
  4. Decentralized DNS Clusters
  5. Anonymous trading of Tokenized Financial Assets and Derivatives Contracts
  6. Automated Hedge Funds
  7. Crypto darkpools
  8. Temporal Insurance Products
  9. Global Supply chain and unmanned cargo ships and drones
  10. Realtime Encrypted Video Communication capable Anonymous Web Infrastructure
  11. High-velocity Non-sovereign Reserve Asset
  12. Near-Perfect Coin Mixer
  13. Decentralized Marketplace App
  14. Transparent Robust Stable Coins
  15. Decentralized P2P Storage of functionally encrypted data
  16. Permissionless ICO Platforms
  17. Decentralized and encrypted Facebook-, Gmail-, Twitter- and Google-like search/answer engines
  18. Decentralized CDNs
  19. Customizable Decentralized Governance System for blockchains and dapps
Another important factor that will boost the price and value of the Orch Network token ORC is its integrated Turing-incomplete cyber contract protocol, which runs Turing-incomplete cyber contracts written in Crackcity (a Turing-incomplete language derived from Crack and Simplicity) on top of Crack Machine(s), Orch's blockchain virtual machines.
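Crackcity and the Crack Machine are the project's own components and are not specified here. As a generic sketch of what Turing incompleteness buys, consider a toy loop-free expression language in Python: because every construct is finite, an exact worst-case step count is computable from the syntax tree alone, before the "contract" is ever run, which is precisely what a Turing-complete language cannot offer.

```python
# Toy Turing-incomplete "contract" language: arithmetic expressions
# with no loops, recursion or jumps. Totality means a worst-case
# step bound can be computed statically, BEFORE evaluation.

def cost(expr):
    """Static step bound, computed without evaluating expr."""
    if isinstance(expr, int):
        return 1
    _, left, right = expr
    return 1 + cost(left) + cost(right)

def evaluate(expr):
    """Returns (steps_taken, value); steps_taken never exceeds cost(expr)."""
    if isinstance(expr, int):
        return 1, expr
    op, left, right = expr
    ls, lv = evaluate(left)
    rs, rv = evaluate(right)
    result = lv + rv if op == "+" else lv * rv
    return 1 + ls + rs, result

# Example: ("+", 2, ("*", 3, 4)) encodes 2 + 3 * 4.
```

The same static-bound property is what total languages such as Simplicity exploit: fee and resource limits can be checked by inspection instead of by execution.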
Ethereum's main deficiency and Achilles' heel is its Turing-complete smart contract programming language, Solidity.

  1. Turing-complete languages are fundamentally inappropriate for writing “smart contracts” — because programs in such languages are inherently undecidable, which makes it impossible to know what a “smart contract” will do before running it.
  2. We should learn from Wall Street’s existing DSLs (domain-specific languages) for financial products and smart contracts, based on declarative and functional languages such as Ocaml and Haskell — instead of doing what the “Web 2.0 programmers” behind Solidity did, and what Peter Todd is also apparently embarking upon: ie, ignoring the lessons that Wall Street has already learned, and “reinventing the wheel”, using less-suitable languages such as C++ and JavaScript-like languages (Solidity), simply because they seem “easier” for the “masses” to use.
  3. We should also consider using specification languages (to say what a contract does) along with implementation languages (saying how it should do it) — because specifications are higher-level and easier for people to read than implementations, which are lower-level and meant for machines to run — and also because ecosystems of specification/implementation language pairs (such as Coq/Ocaml) support formal reasoning and verification tools which could be used to mathematically prove that a smart contract’s implementation is “correct” (ie, that it satisfies its specification) before even running it.
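A full Coq/OCaml machine-checked proof is beyond the scope of a post like this, but a lightweight executable analogue of the specification/implementation split can be sketched in Python: the spec states what a correct result looks like, the implementation states how to compute it, and property testing checks one against the other. This is a check, not a proof, and all names here are illustrative.

```python
# Executable specification vs implementation, property-testing style.
# The spec says WHAT a correct answer looks like; the implementation
# says HOW to compute it; random inputs check one against the other.
import random
from collections import Counter

def spec_sorted(xs, ys):
    """Spec: ys has exactly the elements of xs, in ascending order."""
    same_elements = Counter(xs) == Counter(ys)
    ordered = all(a <= b for a, b in zip(ys, ys[1:]))
    return same_elements and ordered

def impl_insertion_sort(xs):
    """Implementation under test: plain insertion sort."""
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out

def check(trials=200):
    """Property test: the implementation satisfies the spec on random inputs."""
    for _ in range(trials):
        xs = [random.randrange(-50, 50) for _ in range(random.randrange(12))]
        if not spec_sorted(xs, impl_insertion_sort(xs)):
            return False
    return True
```

In a Coq/OCaml pair the `spec_sorted` predicate would be a theorem proved once for all inputs; property testing only samples, which is why the distinction between verification and testing matters for contracts holding real money.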
Turing-complete languages lead to “undecidable” programs (ie, you cannot figure out what they do until after you run them).
One hint: recall that Gödel’s incompleteness theorem proved that any consistent mathematical system expressive enough to encode arithmetic must be incomplete [hat tip] — that is, in any such system, it must be possible to formulate propositions which are undecidable within that system.
This is related to things like the Halting Problem.
And by the way, Ethereum’s concept of “gas” is not a real solution to the Halting Problem: Yes, running out of “gas” means that the machine will “stop” eventually, but this naïve approach does not overcome the more fundamental problems regarding undecidability of programs written using a Turing-complete language.
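To make that distinction concrete, here is a toy gas-metered stack machine (a simplified, assumed instruction set, not the real EVM): gas guarantees that any program eventually stops, but where it stops and what state it leaves behind are still only discoverable by running it.

```python
# Toy gas-metered stack machine (simplified; NOT the real EVM).
# Gas bounds execution, yet program behaviour stays undecidable
# in advance: the halting point and final state of an arbitrary
# program can still only be learned by running it.

def run(program, gas):
    """program: list of ("PUSH", n), ("ADD",) or ("JUMP", target).
    Returns (status, pc, stack)."""
    stack, pc = [], 0
    while pc < len(program):
        if gas <= 0:
            return "out_of_gas", pc, stack   # forced halt, state unknown a priori
        gas -= 1                             # every instruction costs one unit
        op = program[pc]
        if op[0] == "PUSH":
            stack.append(op[1])
            pc += 1
        elif op[0] == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
            pc += 1
        elif op[0] == "JUMP":
            pc = op[1]
    return "halted", pc, stack
```

Running `[("PUSH", 1), ("JUMP", 0)]` with any finite gas always terminates, but only by exhaustion, and the stack it leaves behind depends on the gas supplied, which is exactly the sense in which metering is not a decidability result.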
The take-away is that:
When using any Turing-complete language, it will always be possible for someone (eg, the DAO hacker, or some crook like Bernie Madoff, or some well-meaning but clueless dev from slock.it) to formulate a “smart contract” whose meaning cannot be determined in advance by merely inspecting the code: ie, it will always be possible to write a smart contract whose meaning can only be determined after running the code.
Take a moment to contemplate the full, deep (and horrifying) implications of all this.
Some of the greatest mathematicians and computer scientists of the 20th century already discovered and definitively proved (much to the consternation of most of their less-sophisticated (naïve) colleagues — who nevertheless eventually came around and begrudgingly agreed with them) that: given a “smart contract” written in a Turing-complete language, it is impossible to determine the semantics / behavior of that “smart contract” in advance, by mere inspection — either by a human, or even by a machine such as a theorem prover or formal reasoning tool (because such tools unfortunately only work on more-restricted languages, not on Turing-complete languages — for info on such more-restricted languages, see further below on “constructivism” and “intuitionistic logic”).
The horrifying conclusion is that the only way to determine the semantics / behavior of a “smart contract” is “after the fact” — ie, by actually running it on some machine (eg, the notorious EVM) and waiting to see what happens (eg, waiting for a hacker to “steal” tens of millions of dollars, simply because he understood the semantics / behavior of the code better than the developers did).
Last but not least, increasing regulatory pressure on Bitcoin, Ethereum and other permissionless public cryptocurrencies/cryptotokens will impact their prices negatively in the medium to long term.
The need for a hyperfast private zero-knowledge proof cryptocurrency that keeps payer-payee and payment data private and secure along with a decentralized scalable multicomputation platform can’t be overemphasized.
submitted by OrchNetwork to u/OrchNetwork [link] [comments]

Special consultant Guo Yuhang remarked, “ITC is worth looking forward to.”

The ITC project recently announced that it has obtained investment from Guo Yuhang of Xinghe Capital and has invited him to be a consultant to the project.
Undoubtedly, capital has always driven the innovation and development of technology, through to its promotion and popularization. With the soaring price of Bitcoin, the concept of blockchain is moving from a niche audience to the general public and attracting wide attention. In the area of blockchain + IoT applications, ITC has also become a new focus both inside and outside the market with its listing on trading platforms such as Huobi and OKEx, attracting many well-known investors and trading platforms, including Guo Yuhang of Xinghe Capital.
Guo Yuhang commented, “Blockchain is a new direction of financial technology.”
Guo Yuhang began angel investing in 2008 and established an investment institution in 2016. As the founder of Dianrong Network and the sponsor of Xinghe Capital, Guo Yuhang has turned from an internet finance entrepreneur into an investor in the financial technology field, but he has always focused on pioneering financial-technology projects and on technological change in the sector. Blockchain R&D is one of his investment directions.
As Guo Yuhang said, blockchain and AI are new directions in financial technology. Based on distributed ledger technology, blockchain was first used in the financial sector; many financial institutions, such as JPMorgan and the Bank of Canada, plan to launch blockchain products or set up digital banks.
“ITC project is worth looking forward to.” Guo Yuhang said.
In fact, the application of blockchain technology has spread from the financial sector to all walks of life. Many industries that need to record and supervise transactions are rushing to board the blockchain technology train, inevitably followed by blockchain projects of mixed quality. In Guo Yuhang's view, many projects are currently only wrapped in a sugar coating of blockchain technology, while real blockchain projects must have a blockchain R&D team with accumulated technology and development experience.
Regarding the ITC project, Guo Yuhang said that many large companies have begun to lay out blockchain plans, with many opportunities and great potential ahead, but the challenge of competing for talent is serious as well. If the team adheres to the right direction and keeps building its talent pool, the project's future is worth looking forward to.
ITC: Focus on IoT operating system
As the combination of two of the hottest concepts, blockchain and IoT, ITC has always drawn attention, and security has become the key to the project. Since its establishment, ITC has been dedicated to solving IoT security issues through blockchain technology.
Based on blockchain, ITC is a secure light operating system for the IoT, designed to solve the serious security issues of today's IoT and to support highly concurrent usage scenarios that enable the interoperability of everything. The ITC solution incorporates blockchain technology, asymmetric cryptography, semi-homomorphic encryption, and a distributed architecture without data centers.
At the same time, ITC is also short for IOT onchain Token, which is used to drive ITC, the decentralized IoT operating system. Every smart device that joins the ITC ecosystem has built-in ITC for function-driven consumption. Meanwhile, every big-data analysis request in the ecosystem also consumes ITC, and, based on the principle that data sovereignty belongs to the user, ITC is assigned to users according to each user's data contribution.
The healthy development of ITC's data ecology and the growth of smart devices not only guarantee growing demand for ITC within the ecosystem, but also secure the operability of ITC's target scenarios and the projects to be carried out.
Following LinkVC, ITC has again been awarded investment, this time from Guo Yuhang of Xinghe Capital, who was invited to be an ITC project consultant and to jointly build the ITC ecosystem. Blockchain technology, known for its revolutionary potential, may, like AI and big data, be among the technologies that change the world, which, of course, is a gradual process.
As mentioned above, the role of capital in promoting technology cannot be overstated. The continuous injection of capital will undoubtedly provide more help for the development of the ITC project. It will not only speed up the development and improvement of the IoT Chain (ITC) ecosystem, but will also help advance ITC's IoT solutions across various industries and applications.
With the blessing of capital, ITC's future is undoubtedly worth the wait!
More ITC project information please visit:
ITC official website: https://iotchain.io/
ITC Telegram Group: https://t.me/IoTChain
ITC Github : https://github.com/IoTChainCode
ITC White Paper: https://iotchain.io/pdf/web/ITCWHITEPAPER.pdf
submitted by xiezhuopeng to u/xiezhuopeng [link] [comments]

  • Fully Homomorphic Encryption from the Ground Up
  • Threshold Cryptosystems From Threshold Fully Homomorphic Encryption
  • Fully Homomorphic Encryption without Modulus Switching
  • Homomorphic Secret Sharing from Lattices Without FHE
  • BMH2018 Pitch

The use of homomorphic encryption will not only offer privacy protection, it will also allow ready access to encrypted data on a public blockchain for auditing and other purposes. In other words, using homomorphic encryption to store data on a public blockchain offers the best of both public and private blockchains in a single package. One use case of homomorphic encryption on the blockchain, as explained by Kobi Gurkan of the Shield128 blockchain security platform, takes an Ethereum smart contract for managing employee expenses: employees who do not want their peers to know about their expenses can encrypt the expense details and send them to the smart contract, and the encrypted expense is then added without being revealed. In such a scenario, homomorphic encryption allows the transmission of private data that can still be manipulated by a third party. Some cryptographic techniques, like zero-knowledge proofs (ZKPs), already implement a form of homomorphic encryption. Sounds too good to be true? To some extent, it is. The concept of homomorphic encryption is not new. The first attempt to develop such a ...


Fully Homomorphic Encryption from the Ground Up

  • Why you need threshold signatures to protect your Bitcoin wallet - Arvind Narayanan
  • Winter School on Cryptography: Fully Homomorphic Encryption - Craig Gentry
  • Pitch of the team “Homomorphic Blockchain: Quantum computing safe and GDPR conform encryption for DLT / Blockchain” at the Blockchained Mobility Hackathon 2018 (July 20th to 22nd, 2018)
