These crates contain experimental pure Rust implementations of scalar field arithmetic for the respective elliptic curves (secp256k1, NIST P-256). These implementations are new, unaudited, and haven't received much public scrutiny. We have explicitly labeled them as being at a "USE AT YOUR OWN RISK" level of maturity. That said, these implementations follow modern best practices for this class of elliptic curves (complete projective formulas providing constant-time scalar multiplication). In particular:
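The "constant-time scalar multiplication" mentioned above can be illustrated with a Montgomery ladder, which performs the same sequence of group operations regardless of the key bits. Below is a minimal Python sketch over secp256k1 in affine coordinates; it only illustrates the ladder's structure, since Python big-integer arithmetic is not actually constant time, and real implementations (like the crates above) use complete projective formulas instead.

```python
# Toy Montgomery ladder over secp256k1 (affine coordinates).
# Illustrates the *structure* of constant-time scalar multiplication only:
# Python integers are not constant time, and production code uses
# complete projective formulas rather than affine arithmetic.
p = 2**256 - 2**32 - 977  # secp256k1 field prime
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def point_add(P, Q):
    """Add two points on y^2 = x^3 + 7; None is the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    if P[0] == Q[0] and (P[1] + Q[1]) % p == 0:
        return None  # P + (-P) = infinity
    if P == Q:
        lam = 3 * P[0] * P[0] * pow(2 * P[1], -1, p) % p
    else:
        lam = (Q[1] - P[1]) * pow(Q[0] - P[0], -1, p) % p
    x = (lam * lam - P[0] - Q[0]) % p
    return (x, (lam * (P[0] - x) - P[1]) % p)

def scalar_mult(k, P):
    """Montgomery ladder: one add and one double per bit, whatever the bit is."""
    R0, R1 = None, P
    for bit in bin(k)[2:]:
        if bit == '1':
            R0, R1 = point_add(R0, R1), point_add(R1, R1)
        else:
            R0, R1 = point_add(R0, R0), point_add(R0, R1)
    return R0
```

Note that each loop iteration performs exactly one addition and one doubling, so the operation count does not depend on the secret scalar; only the data flow differs.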
This release has been a cross-functional effort, with contributions from some of the best Rust elliptic curve cryptography experts. I'd like to thank everyone who's contributed, and hope that these crates are useful, especially for embedded cryptography and cryptocurrency use cases. EDIT: the version in the title is incorrect. The correct version is v0.4.0, unfortunately the title cannot be edited.
Anyswap - A completely decentralized swap exchange that supports all your coins
Hello crypto enthusiasts, After the recent run up of DEFI products and massive price movements, I’ve come across an innovative product with tremendous upside potential. If you have used Uniswap in the past, and are bound to only swapping Ethereum with Ethereum-based tokens, a pressing problem arises... ‘How come I can’t use my Bitcoin, XRP, Litecoin, etc. to make the swap? Why do I have to trade into Ether, to gain access to these tokens?’ Enter Anyswap... Anyswap is the first completely decentralized swap exchange that will allow you to use any coins or tokens (ECDSA and EdDSA as signature algorithms - 98% of all blockchains) with one another. No third party risk. The ANY token issued is a governance token, which will allow voting rights for holders to choose which coins will be listed next. No ICO, no fundraising, no airdrop, no premine. Get them while they're hot. Mark your calendar... July 20th 2020, ANY token will be available for purchase. Join their Telegram group for more info Anyswap TG Thanks for listening. And to the mooooon 🚀🚀🚀
Thanks to all who submitted questions for Shiv Malik in the GAINS AMA yesterday, it was great to see so much interest in Data Unions! You can read the full transcript here:
Gains x Streamr AMA Recap
https://preview.redd.it/o74jlxia8im51.png?width=1236&format=png&auto=webp&s=93eb37a3c9ed31dc3bf31c91295c6ee32e1582be Thanks to everyone in our community who attended the GAINS AMA yesterday with Shiv Malik. We were excited to see that so many people attended and happily overwhelmed by the number of questions we got from you on Twitter and Telegram. We decided to do a little recap of the session for anyone who missed it, and to archive some points we haven’t previously discussed with our community. Happy reading and thanks to Alexandre and Henry for having us on their channel! What is the project about in a few simple sentences? At Streamr we are building a real-time network for tomorrow’s data economy. It’s a decentralized, peer-to-peer network which we are hoping will one day replace centralized message brokers like Amazon’s AWS services. On top of that, one of the things I’m most excited about is Data Unions. With Data Unions anyone can join the data economy and start monetizing the data they already produce. Streamr’s Data Union framework provides a really easy way for devs to start building their own data unions and can also be easily integrated into any existing apps. Okay, sounds interesting. Do you have a concrete example you could give us to make it easier to understand? The best example of a Data Union is the first one that has been built out of our stack. It's called Swash and it's a browser plugin. You can download it here: http://swashapp.io/ And basically it helps you monetize the data you already generate (day in, day out) as you browse the web. It's the sort of data that Google already knows about you. But this way, with Swash, you can actually monetize it yourself. The more people that join the union, the more powerful it becomes and the greater the rewards are for everyone as the data product sells to potential buyers. Very interesting. What stage is the project/product at? It's live, right? Yes. It's live. 
And the Data Union framework is in public beta. The Network is on course to be fully decentralized at some point next year. How much can a regular person browsing the Internet expect to make, for example? So that's a great question. The answer is no one quite knows yet. We do know that this sort of data (consumer insights) is worth hundreds of millions and really isn't available in high quality. So with a union of a few million people, everyone could be getting 20-50 dollars a year. But it'll take a few years at least to realise that growth. Of course Swash is just one data union amongst many possible others (which are now starting to get built out on our platform!) With Swash, I believe they now have 3,000 members. They need to get to 50,000 before they become really viable but they are yet to do any marketing. So all that is organic growth. I assume the data is anonymized btw? Yes. And there are in fact a few privacy-protecting tools Swash supplies to its users. How does Swash compare to Brave? So Brave really is about consent for people's attention and getting paid for that. They don't sell your data as such. Swash can of course be a plugin with Brave and therefore you can make passive income browsing the internet, whilst also consenting to advertising if you want to earn BAT. Of course it's Streamr that is powering Swash. And we're looking at powering other DUs - say for example mobile applications. The holy grail might be having already existing apps and platforms out there integrating DU tech into their apps so people can consent (or not) to having their data sold - and then getting a cut of that revenue when it does sell. The other thing to recognise is that the big tech companies monopolise data on a vast scale - data that we of course produce for them. That is stifling innovation. Take for example a competitor map app. To effectively compete with Google Maps or Waze, they need millions of users feeding real-time data into it. 
Without that - it's like Google Maps used to be - static and a bit useless. Right, so how do you convince these big tech companies that are producing these big apps to integrate with Streamr? Does it mean they wouldn't be able to monetize data as well on their end if it becomes more available through an aggregation of individuals? If a map application does manage to scale to that level then inevitably Google buys them out - that's what happened with Waze. But if you have a data union which bundles together the raw location data of millions of people, then any application builder can come along and license that data for their app. This encourages all sorts of innovation and breaks the monopoly. We're currently having conversations with mobile network operators to see if they want to pilot this new approach to data monetization. And that's even more exciting. Just be explicit with users - do you want to sell your data? Okay, if yes, then which data points do you want to sell. The mobile network operator (like T-Mobile, for example) then organises the sale of the data of those who consent and everyone gets a cut. Streamr - in this example - provides the backend to port and bundle the data, and also the token and payment rail for the payments. So for big companies (mobile operators in this case), it's less logistics, handing over the implementation to you, and simply taking a cut? It's a vision that we'll be able to talk about more concretely in a few weeks time 😁 Compared to having to make sense of that data themselves (in the past) and selling it themselves? Sort of. We provide the backend to port the data and the template smart contracts to distribute the payments. They get to focus on finding buyers for the data and ensuring that the data that is being collected from the app is the kind of data that is valuable and useful to the world. (Through our sister company TX, we also help build out the applications for them and ensure a smooth integration). 
The other thing to add is that the reason why this vision is working is that the current data economy is under attack. Not just from privacy laws such as GDPR, but also from Google shutting down cookies, bidstream data being investigated by the FTC (for example) and Apple making changes to iOS 14 to make third-party data sharing more explicit for users. All this means that the only real places for thousands of multinationals to buy the sort of consumer insights they need to ensure good business decisions will be owned by Google/FB etc, or from SDKs, or through this method - from overt, rich consent from the consumer in return for a cut of the earnings. A couple of questions to get a better feel about Streamr as a whole now and where it came from. How many people are in the team? For how long have you been working on Streamr? We are around 35 people with one office in Zug, Switzerland and another one in Helsinki. But there are team members all over the globe; we’ve people in the US, Spain, the UK, Germany, Poland, Australia and Singapore. I joined Streamr back in 2017 during the ICO craze (but not for that reason!) And did you raise funds so far? If so, how did you handle them? Are you planning to do any future raises? We did an ICO back in Sept/Oct 2017 in which we raised around 30 million CHF. The funds give us enough runway for around five/six years to finalize our roadmap. We’ve also simultaneously opened up a sister consultancy business, TX, which helps enterprise clients implement the Streamr stack. We've got no more plans to raise more! What is the token use case? How did you make sure it captures the value of the ecosystem you're building? The token is used for payments on the Marketplace (such as for Data Union products, for example) and also for the broker nodes in the Network (we haven't talked much about the P2P network but it's our project's secret sauce). The broker nodes will be paid in DATAcoin for providing bandwidth. 
We are currently working together with BlockScience on our token economics. We’ve just started the second phase in their consultancy process and will soon be able to share more on the Streamr Network’s token economics. But if you want to sum up the Network in a sentence or two - imagine the BitTorrent network being run by nodes who get paid to do so. Except that instead of passing around static files, it's real-time data streams. That of course means it's really well suited for the IoT economy. Well, let's continue with questions from Twitter and this one comes at the perfect time. Can Streamr Network be used to transfer data from IoT devices? Is the network bandwidth sufficient? How is it possible to monetize the received data from a huge number of IoT devices? From u/EgorCypto Yes, IoT devices are a perfect use case for the Network. When it comes to the network’s bandwidth and speed - the Streamr team just recently did extensive research to find out how well the network scales. The result was that it is on par with centralized solutions. We ran experiments with network sizes between 32 and 2048 nodes, and in the largest network of 2048 nodes, 99% of deliveries happened within 362 ms globally. To put these results in context, PubNub, a centralized message brokering service, promises to deliver messages within 250 ms — and that’s a centralized service! So we're super happy with those results. Here's a link to the paper: https://medium.com/streamrblog/streamr-network-performance-and-scalability-whitepaper-adb461edd002 While we're on the technical side, second question from Twitter: Can you be sure that valuable data is safe and not shared with service providers? Are you using any encryption methods? From u/CryptoMatvey Yes, the messages in the Network are encrypted. Currently all nodes are still run by the Streamr team. This will change in the Brubeck release - our last milestone on the roadmap - when end-to-end encryption is added. 
This release adds end-to-end encryption and automatic key exchange mechanisms, ensuring that node operators cannot access any confidential data. BTW - if you want to get very technical, the encryption algorithms we are using are: AES (AES-256-CTR) for encryption of data payloads, RSA (PKCS #1) for securely exchanging the AES keys, and ECDSA (secp256k1) for data signing (same as Bitcoin and Ethereum). Last question from Twitter, less technical now :) In their AMA ad, they say that Streamr has three unions: Swash, Tracey and MyDiem. Why does Tracey help fisherfolk in the Philippines monetize their catch data? Do they only work with this country or do they plan to expand? From u/alej_pacedo So yes, Tracey is one of the first Data Unions on top of the Streamr stack. Currently we are working together with WWF-Philippines and the UnionBank of the Philippines on doing a first pilot with local fishing communities in the Philippines. WWF is interested in the catch data to protect wildlife and make sure that no overfishing happens. And at the same time the fisherfolk are incentivized to record their catch data by being able to access micro-loans from banks, which in turn helps them make their business more profitable. So far, we have lots of interest from other places in Southeast Asia which would like to use Tracey, too. In fact TX have already had explicit interest in building out the use cases in other countries, and not just for seafood tracking but also for many other agricultural products. (I think they had a call this week about a use case involving cows 😂) I recall late last year that the Streamr Data Union framework was launched into private beta, and now public beta was recently released. What are the differences? Any added new features? By u/Idee02 The main difference will be that the DU 2.0 release will be more reliable and also more transparent, since the sidechain we are using for micropayments is now also based on blockchain consensus (PoA). 
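The AES-256-CTR mode mentioned above works by XORing the payload with a keystream derived from a key, a nonce, and an incrementing counter; decryption is the identical operation. As a rough illustration of that structure only (this toy derives the keystream from SHA-256, not real AES; use a vetted AES-256-CTR implementation in practice):

```python
import hashlib

# Toy counter-mode (CTR) cipher: XOR the data with a keystream generated
# from (key, nonce, counter). NOT real AES - SHA-256 stands in for the
# block cipher purely to show the CTR structure.
def ctr_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

# Encryption and decryption are the same XOR operation.
key, nonce = b"k" * 32, b"n" * 16
ciphertext = ctr_xor(key, nonce, b"real-time stream payload")
plaintext = ctr_xor(key, nonce, ciphertext)  # round-trips to the original
```

Because both directions are a single XOR against the same keystream, CTR mode is cheap and parallelizable, which suits a high-throughput pub/sub network.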
Are there plans in the pipeline for Streamr to focus on the consumer-facing products themselves, or will the emphasis be on the further development of the underlying engine? By u/Andromedamin We're all about what's under the hood. We want third-party devs to take on the challenge of building the consumer-facing apps. We know it would be foolish to try and do it all! As a project how do you consider the progress of the project to fully developed (in % of progress plz)? By u/Hash2T We're about 60% through I reckon! What tools does Streamr offer developers so that they can create their own DApps and monetize data? What is Streamr's architecture? How do the Ethereum blockchain and the Streamr network and Streamr Core applications interact? By u/CryptoDurden We'll be releasing the Data Union framework in a few weeks from now, and I think DApp builders will be impressed with what they find. We all know that blockchain has many disadvantages as well, so why did Streamr choose blockchain as a combination for its technology? What's your plan to merge blockchain with your technologies to make it safer and more convenient for your users? By u/noonecanstopme So we're not a blockchain ourselves - that's important to note. The P2P network only uses blockchain tech for the payments. Why on earth, for example, would you want to store every single piece of info on a blockchain? You should only store what you want to store. And that should probably happen off chain. So we think we got the mix right there. What were the requirements needed for node setup? By u/John097 Good q - we're still working on that, but those specs will be out in the next release. How does the Streamr team ensure good data is entered into the blockchain by participants? By u/kartika84 Another great Q there! From the product buying end, this will be done by reputation. But ensuring the quality of the data as it passes through the network - if that is what you also mean - is all about getting the architecture right. 
In a decentralised network that's not easy, as data points in streams have to arrive in the right order. It's one of the biggest challenges, but we think we're solving it in a really decentralised way. What are the requirements for integrating applications with Data Union? What role does the DATA token play in this case? By u/JP_Morgan_Chase There are no specific requirements as such, just that your application needs to generate some kind of real-time data. Data Union members and administrators are both paid in DATA by data buyers coming from the Streamr marketplace. Regarding security and legality, how does Streamr guarantee that the data uploaded by a given user belongs to him and that he can monetize and capitalize on it? By u/kherrera22 So that's a sort of million-dollar question for anyone involved in a digital industry. Within our system there are ways of ensuring that, but in the end the negotiation of data licensing will still, in many ways, be done human to human and via legal licenses rather than smart contracts - at least when it comes to sizeable data products. There are more answers to this but it's a long one! Okay, thank you all for all of those! The AMA took place in the GAINS Telegram group 10/09/20. Answers by Shiv Malik.
ABCMint is a quantum resistant cryptocurrency with the Rainbow Multivariable Polynomial Signature Scheme.
Good day, the price is going up to 0.3 USDT. ABCMint Second Foundation ABCMint Second Foundation has been the first third-party organization that focuses on post-quantum cryptography research and technology, and it has aimed to help improve the ecology of ABCMint technology since 2018. https://abcmintsf.com https://abcmintsf.com/exchange What is ABCMint? ABCMint is a quantum resistant cryptocurrency with the Rainbow Multivariable Polynomial Signature Scheme. Cryptocurrencies and blockchain technology have attracted a significant amount of attention since 2009. While some cryptocurrencies, including Bitcoin, are used extensively in the world, these cryptocurrencies will eventually become obsolete and be replaced when quantum computers become available. For instance, Bitcoin uses the elliptic curve signature algorithm ECDSA. If a bitcoin user's public key is exposed on the public chain, a quantum computer would be able to derive the private key in a short period of time. It means that should an attacker decide to use a quantum computer to break ECDSA, he/she will be able to spend the bitcoin in the wallet. The ABCMint Foundation has improved the structure of the coin's core to resist quantum computers, using the Rainbow Multivariable Polynomial Signature Scheme, which is quantum resistant, as the core. This is a fundamental solution to the major threat to digital money posed by future quantum computers. In addition, the ABCMint Foundation has implemented "ABCardO", a new form of arithmetic proof-of-work (mining), which is different from Bitcoin's mining. This algorithm is believed to be beneficial to the development of the mathematical field of multivariate polynomials. Rainbow Signature - the quantum resistant signature based on the Multivariable Polynomial Signature Scheme Unbalanced Oil and Vinegar (UOV) is one of the oldest and most well-researched signature schemes in the field of multivariate cryptography. 
It was designed by J. Patarin in 1997 and has withstood more than two decades of cryptanalysis. The UOV scheme is a very simple, small, and fast signature scheme. However, the main drawback of UOV is its large public key, which is not conducive to use in blockchain applications.
The Rainbow signature is an improvement on the oil and vinegar signature which increases the efficiency of unbalanced oil and vinegar. The basic concept is a multi-layered structure and generalization of oil and vinegar. PQC - Post-Quantum Cryptography The public key cryptosystem was a breakthrough in modern cryptography in the late 1970s, and it has become an increasingly important part of our communications infrastructure. The Internet and other communication systems rely heavily on the Diffie-Hellman key exchange, RSA encryption, and the use of DSA, ECDSA, or related algorithms for digital signatures. The security of these cryptosystems depends on the difficulty of number theory problems such as integer factorization and the discrete logarithm problem. In 1994, Peter Shor demonstrated that quantum computers can solve all these problems in polynomial time, undermining the security of these cryptosystems. The field studying cryptosystems designed to resist quantum attacks is known as "post-quantum cryptography" (PQC). In August 2015, the U.S. National Security Agency (NSA) released an announcement regarding its plans to transition to quantum-resistant algorithms. In December 2016, the National Institute of Standards and Technology (NIST) announced a call for proposals for quantum-resistant algorithms. The submission deadline was November 30, 2017, and the submissions included the Rainbow signature scheme used by ABCMint.
Bitcoin (BTC) is a peer-to-peer cryptocurrency that aims to function as a means of exchange that is independent of any central authority. BTC can be transferred electronically in a secure, verifiable, and immutable way.
Launched in 2009, BTC is the first virtual currency to solve the double-spending issue by timestamping transactions before broadcasting them to all of the nodes in the Bitcoin network. The Bitcoin Protocol offered a solution to the Byzantine Generals’ Problem with a blockchain network structure, a notion first created by Stuart Haber and W. Scott Stornetta in 1991.
Bitcoin’s whitepaper was published pseudonymously in 2008 by an individual, or a group, with the pseudonym “Satoshi Nakamoto”, whose underlying identity has still not been verified.
The Bitcoin protocol uses an SHA-256d-based Proof-of-Work (PoW) algorithm to reach network consensus. Its network has a target block time of 10 minutes and a maximum supply of 21 million tokens, with a decaying token emission rate. To prevent fluctuation of the block time, the network’s block difficulty is re-adjusted through an algorithm based on the past 2016 block times.
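The SHA-256d proof of work described above amounts to hashing a candidate block header twice with SHA-256 and checking whether the resulting value falls below a difficulty target. A minimal Python sketch (hypothetical header bytes and a deliberately easy target; real Bitcoin headers are 80 structured bytes and real targets are astronomically smaller):

```python
import hashlib

def sha256d(data: bytes) -> bytes:
    """Bitcoin's double SHA-256."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def mine(header: bytes, target: int, max_nonce: int = 1_000_000):
    """Search for a nonce whose double-SHA-256 digest falls below the target."""
    for nonce in range(max_nonce):
        digest = sha256d(header + nonce.to_bytes(4, "little"))
        if int.from_bytes(digest, "big") < target:
            return nonce, digest
    return None

# Easy target: roughly 1 success per 256 attempts, so this finds a nonce quickly.
result = mine(b"hypothetical-block-header", 2**248)
```

Lowering the target makes a valid nonce exponentially rarer, which is exactly the knob the difficulty re-adjustment turns every 2016 blocks.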
With a block size limit capped at 1 megabyte, the Bitcoin Protocol has supported both the Lightning Network, a second-layer infrastructure for payment channels, and Segregated Witness, a soft-fork to increase the number of transactions on a block, as solutions to network scalability.
Network validators, who are often referred to as miners, participate in the SHA-256d-based Proof-of-Work consensus mechanism to determine the next global state of the blockchain.
The Bitcoin protocol has a target block time of 10 minutes, and a maximum supply of 21 million tokens. The only way new bitcoins can be produced is when a block producer generates a new valid block.
The protocol has a token emission rate that halves every 210,000 blocks, or approximately every 4 years.
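The 21 million cap follows directly from this halving schedule: summing 210,000 blocks' worth of subsidy at each successively halved rate (starting at 50 BTC, with integer division in satoshis, as the protocol uses) converges just under 21 million. A quick Python check:

```python
def total_supply(initial_subsidy_btc: int = 50, halving_interval: int = 210_000) -> float:
    """Sum the block subsidies over all halvings, using integer satoshi math."""
    subsidy = initial_subsidy_btc * 100_000_000  # subsidy per block, in satoshis
    total = 0
    while subsidy > 0:
        total += subsidy * halving_interval  # all blocks in this halving era
        subsidy //= 2                        # subsidy halves every 210,000 blocks
    return total / 100_000_000               # back to BTC
```

The geometric series 50 + 25 + 12.5 + ... sums toward 100 BTC per block-interval pair, and the integer rounding in satoshis leaves the final total slightly below 21,000,000 BTC.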
Unlike public blockchain infrastructures supporting the development of decentralized applications (e.g., Ethereum), the Bitcoin protocol is primarily used for payments, and has only very limited support for smart contract-like functionality (Bitcoin “Script” is mostly used to specify conditions that must be met before bitcoins can be spent).
In the Bitcoin network, anyone can join the network and become a bookkeeping service provider, i.e., a validator. All validators are allowed in the race to become the block producer for the next block, yet only the first to complete a computationally heavy task will win. This feature is called Proof of Work (PoW). The probability that any single validator finishes the task first is equal to that validator's share of the total network computation power, or hash power. For instance, a validator with 5% of the total network computation power will have a 5% chance of completing the task first, and therefore becoming the next block producer. Since anyone can join the race, competition is prone to increase. In the early days, Bitcoin mining was mostly done by personal computer CPUs. As of today, Bitcoin validators, or miners, have opted for dedicated and more powerful devices such as machines based on Application-Specific Integrated Circuits (“ASICs”). Proof of Work secures the network as block producers must have spent resources external to the network (i.e., money to pay for electricity), and can provide proof to other participants that they did so. With various miners competing for block rewards, it becomes difficult for one single malicious party to gain network majority (defined as more than 51% of the network’s hash power in the Nakamoto consensus mechanism). The ability to rearrange transactions via 51% attacks indicates another feature of the Nakamoto consensus: the finality of transactions is only probabilistic. Once a block is produced, it is then propagated by the block producer to all other validators to check on the validity of all transactions in that block. The block producer will receive rewards in the network’s native currency (i.e., bitcoin) as all validators approve the block and update their ledgers.
The Bitcoin protocol utilizes the Merkle tree data structure in order to organize hashes of numerous individual transactions into each block. This concept is named after Ralph Merkle, who patented it in 1979. With the use of a Merkle tree, though each block might contain thousands of transactions, it will have the ability to combine all of their hashes and condense them into one, allowing efficient and secure verification of this group of transactions. This single hash is called the Merkle root, which is stored in the Block Header of a block. The Block Header also stores other meta information of a block, such as a hash of the previous Block Header, which enables blocks to be associated in a chain-like structure (hence the name “blockchain”). An illustration of block production in the Bitcoin Protocol is demonstrated below. https://preview.redd.it/m6texxicf3151.png?width=1591&format=png&auto=webp&s=f4253304912ed8370948b9c524e08fef28f1c78d
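The pair-and-hash process that condenses many transaction hashes into a single Merkle root can be sketched in a few lines of Python (dummy transaction IDs; like Bitcoin, an odd-length level duplicates its last hash):

```python
import hashlib

def sha256d(data: bytes) -> bytes:
    """Bitcoin's double SHA-256, used for both txids and Merkle nodes."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

def merkle_root(txids: list) -> bytes:
    """Hash pairs of transaction IDs upward until a single root remains."""
    level = list(txids)
    while len(level) > 1:
        if len(level) % 2:             # odd count: Bitcoin duplicates the last hash
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

txids = [sha256d(bytes([i])) for i in range(5)]  # five dummy transactions
root = merkle_root(txids)                        # 32-byte root for the Block Header
```

Changing any single transaction changes its hash, which ripples up every level and produces a different root, so the 32-byte root in the header commits to the whole transaction set.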
Block time and mining difficulty
Block time is the period required to create the next block in a network. As mentioned above, the node that solves the computationally intensive task will be allowed to produce the next block. Therefore, block time is directly correlated to the amount of time it takes for a node to find a solution to the task. The Bitcoin protocol sets a target block time of 10 minutes, and attempts to achieve this by introducing a variable named mining difficulty. Mining difficulty refers to how difficult it is for the node to solve the computationally intensive task. If the network sets a high difficulty for the task while miners have low computational power, which is often referred to as “hashrate”, it would statistically take longer for the nodes to get an answer for the task. If the difficulty is low, but miners have rather strong computational power, statistically, some nodes will be able to solve the task quickly. Therefore, the 10-minute target block time is achieved by constantly and automatically adjusting the mining difficulty according to how much computational power there is amongst the nodes. The average block time of the network is evaluated after a certain number of blocks, and if it is greater than the expected block time, the difficulty level will decrease; if it is less than the expected block time, the difficulty level will increase.
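The adjustment rule above can be expressed as a one-line proportional retarget. The sketch below works on the PoW target (a higher target means lower difficulty) and, like Bitcoin, clamps the per-period adjustment to a factor of 4 in either direction; the numbers are illustrative, not the protocol's exact nBits encoding:

```python
def retarget(old_target: int, actual_timespan_s: int,
             expected_timespan_s: int = 2016 * 600) -> int:
    """Scale the PoW target by (actual / expected) time over the last 2016 blocks.

    Higher target = lower difficulty. The adjustment is clamped to 4x either way,
    as Bitcoin does, to dampen wild swings.
    """
    clamped = max(expected_timespan_s // 4,
                  min(actual_timespan_s, expected_timespan_s * 4))
    return old_target * clamped // expected_timespan_s

# Blocks arrived twice as fast as intended -> target halves (difficulty doubles).
new_target = retarget(1_000_000, actual_timespan_s=2016 * 300)
```

So if hashrate rises and the 2016 blocks arrive early, the target shrinks and valid hashes become rarer, pushing the average block time back toward 10 minutes.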
What are orphan blocks?
In a PoW blockchain network, if the block time is too low, it would increase the likelihood of nodes producing orphan blocks, for which they would receive no reward. Orphan blocks are produced by nodes who solved the task but did not broadcast their results to the whole network the quickest due to network latency. It takes time for a message to travel through a network, and it is entirely possible for 2 nodes to complete the task and start to broadcast their results to the network at roughly the same time, while one’s messages are received by all other nodes earlier as the node has low latency. Imagine there is a network latency of 1 minute and a target block time of 2 minutes. A node could solve the task in around 1 minute but his message would take 1 minute to reach the rest of the nodes that are still working on the solution. While his message travels through the network, all the work done by all other nodes during that 1 minute, even if these nodes also complete the task, would go to waste. In this case, 50% of the computational power contributed to the network is wasted. The percentage of wasted computational power would proportionally decrease if the mining difficulty were higher, as it would statistically take longer for miners to complete the task. In other words, if the mining difficulty, and therefore targeted block time is low, miners with powerful and often centralized mining facilities would get a higher chance of becoming the block producer, while the participation of weaker miners would become in vain. This introduces possible centralization and weakens the overall security of the network. However, given a limited amount of transactions that can be stored in a block, making the block time too long would decrease the number of transactions the network can process per second, negatively affecting network scalability.
3. Bitcoin’s additional features
Segregated Witness (SegWit)
Segregated Witness, often abbreviated as SegWit, is a protocol upgrade proposal that went live in August 2017. SegWit separates witness signatures from transaction-related data. Witness signatures in legacy Bitcoin blocks often take more than 50% of the block size. By removing witness signatures from the transaction block, this protocol upgrade effectively increases the number of transactions that can be stored in a single block, enabling the network to handle more transactions per second. As a result, SegWit increases the scalability of Nakamoto consensus-based blockchain networks like Bitcoin and Litecoin. SegWit also makes transactions cheaper. Since transaction fees are derived from how much data is being processed by the block producer, the more transactions that can be stored in a 1MB block, the cheaper individual transactions become. https://preview.redd.it/depya70mf3151.png?width=1601&format=png&auto=webp&s=a6499aa2131fbf347f8ffd812930b2f7d66be48e The legacy Bitcoin block has a block size limit of 1 megabyte, and any change on the block size would require a network hard-fork. On August 1st 2017, the first hard-fork occurred, leading to the creation of Bitcoin Cash (“BCH”), which introduced an 8 megabyte block size limit. Conversely, Segregated Witness was a soft-fork: it never changed the transaction block size limit of the network. Instead, it added an extended block with an upper limit of 3 megabytes, which contains solely witness signatures, to the 1 megabyte block that contains only transaction data. This new block type can be processed even by nodes that have not completed the SegWit protocol upgrade. Furthermore, the separation of witness signatures from transaction data solves the malleability issue with the original Bitcoin protocol. Without Segregated Witness, these signatures could be altered before the block is validated by miners. 
Indeed, alterations can be done in such a way that if the system does a mathematical check, the signature would still be valid. However, since the values in the signature are changed, the two signatures would create vastly different hash values. For instance, if a witness signature states “6,” it has a mathematical value of 6, and would create a hash value of 12345. However, if the witness signature were changed to “06”, it would maintain a mathematical value of 6 while creating a (faulty) hash value of 67890. Since the mathematical values are the same, the altered signature remains a valid signature. This would create a bookkeeping issue, as transactions in Nakamoto consensus-based blockchain networks are documented with these hash values, or transaction IDs. Effectively, one can alter a transaction ID to a new one, and the new ID can still be valid. This can create many issues, as illustrated in the below example:
Alice sends Bob 1 BTC, and Bob sends Merchant Carol this 1 BTC for some goods.
Bob sends Carol this 1 BTC while the transaction from Alice to Bob is not yet validated. Carol sees the incoming transaction of 1 BTC to her, and immediately ships the goods to Bob.
At this moment, the transaction from Alice to Bob is still not confirmed by the network, so Bob can change the witness signature, thereby changing this transaction ID from 12345 to 67890.
Now Carol will not receive her 1 BTC, as the network looks for transaction 12345 to ensure that Bob's wallet balance is valid.
As this particular transaction ID changed from 12345 to 67890, the transaction from Bob to Carol will fail, and Bob will get his goods while still holding his BTC.
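The bookkeeping problem above can be sketched in a few lines. This is a toy model, not real Bitcoin serialization: the "transaction" byte format is invented for illustration, but the double-SHA-256 transaction ID matches how Bitcoin derives txids.

```python
import hashlib

def txid(tx_bytes: bytes) -> str:
    """Toy transaction ID: double SHA-256, as Bitcoin uses for txids."""
    return hashlib.sha256(hashlib.sha256(tx_bytes).digest()).hexdigest()

# Two signature encodings with the same mathematical meaning ("6" vs "06"),
# embedded in an invented transaction format.
tx_original = b"alice->bob:1BTC|sig=6"
tx_malleated = b"alice->bob:1BTC|sig=06"

assert int(b"6") == int(b"06")                   # same numeric value...
assert txid(tx_original) != txid(tx_malleated)   # ...but different txids
```

Both encodings pass a purely numeric validity check, yet they hash to unrelated transaction IDs, which is exactly why a pending transaction's ID could be changed out from under Carol.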
With the Segregated Witness upgrade, such instances cannot happen again. This is because the witness signatures are moved outside of the transaction block into an extended block, so altering a witness signature no longer affects the transaction ID. Since the transaction malleability issue is fixed, Segregated Witness also enables the proper functioning of second-layer scalability solutions on the Bitcoin protocol, such as the Lightning Network.
Lightning Network is a second-layer micropayment solution for scalability. Specifically, Lightning Network aims to enable near-instant and low-cost payments between merchants and customers that wish to use bitcoins. Lightning Network was conceptualized in a whitepaper by Joseph Poon and Thaddeus Dryja in 2015. Since then, it has been implemented by multiple companies. The most prominent of them include Blockstream, Lightning Labs, and ACINQ. A list of curated resources relevant to Lightning Network can be found here.

In the Lightning Network, if a customer wishes to transact with a merchant, both of them need to open a payment channel, which operates off the Bitcoin blockchain (i.e., off-chain vs. on-chain). None of the transaction details from this payment channel are recorded on the blockchain, and only when the channel is closed will the end result of both parties' wallet balances be updated to the blockchain. The blockchain only serves as a settlement layer for Lightning transactions. Since all transactions done via the payment channel are conducted independently of the Nakamoto consensus, both parties involved do not need to wait for network confirmation on transactions. Instead, transacting parties pay transaction fees to Bitcoin miners only when they decide to close the channel.

One limitation of the Lightning Network is that it requires a person to be online to receive transactions addressed to them. Another limitation in user experience is that one needs to lock up funds every time one opens a payment channel, and those funds can only be used within that channel. However, this does not mean one needs to create a new channel for every person one wishes to transact with on the Lightning Network.
If Alice wants to send money to Carol, but they do not have a payment channel open, they can ask Bob, who has payment channels open to both Alice and Carol, to help make that transaction. Alice will be able to send funds to Bob, and Bob to Carol. Hence, the number of “payment hubs” (i.e., Bob in the previous example) correlates with both the convenience and the usability of the Lightning Network for real-world applications.
Schnorr Signature upgrade proposal
Elliptic Curve Digital Signature Algorithm ("ECDSA") signatures are used to sign transactions on the Bitcoin blockchain. However, many developers now advocate for replacing ECDSA with Schnorr signatures. Once Schnorr signatures are implemented, multiple parties can collaborate in producing a single signature that is valid for the sum of their public keys. This would primarily benefit network scalability: when multiple addresses conduct transactions to a single address, each transaction currently requires its own signature. With Schnorr signatures, all these signatures can be combined into one. As a result, the network would be able to store more transactions in a single block.

The reduced signature size implies reduced transaction fees: the group of senders can split the fee for that one group signature, instead of each paying for a personal signature individually. Schnorr signatures also improve network privacy and token fungibility. A third-party observer will not be able to detect whether a user is sending a multi-signature transaction, since the signature will be in the same format as a single-signature transaction.
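The aggregation property comes from the linearity of Schnorr signatures, and can be sketched with a toy scheme. Assumptions: exponentiation modulo a prime stands in for secp256k1 point arithmetic (adding scalars corresponds to multiplying group elements), and naive key addition is used; real Bitcoin Schnorr (BIP 340) and multi-signature schemes like MuSig additionally tweak keys to prevent rogue-key attacks.

```python
import hashlib

# Toy group: g^x mod p stands in for x*G on secp256k1 (assumption).
p = 2**127 - 1   # a Mersenne prime, toy modulus
g = 5            # toy generator

x1, x2 = 123456789, 987654321          # two parties' private keys
P1, P2 = pow(g, x1, p), pow(g, x2, p)  # their public keys

# Aggregate public key: the product of keys is the key of the summed secrets.
assert (P1 * P2) % p == pow(g, x1 + x2, p)

# One combined signature for both signers:
k1, k2 = 111, 222                       # per-signer nonces
R = pow(g, k1 + k2, p)                  # aggregate nonce
e = int(hashlib.sha256(str(R).encode()).hexdigest(), 16)  # challenge
s = (k1 + e * x1) + (k2 + e * x2)       # each signer contributes s_i = k_i + e*x_i

# Verifies against the single aggregate key, exactly like a 1-of-1 signature:
assert pow(g, s, p) == (R * pow(P1 * P2, e, p)) % p
```

This is why an observer cannot tell a multi-party signature from an ordinary one: the verification equation is identical in both cases.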
4. Economics and supply distribution
The Bitcoin protocol utilizes the Nakamoto consensus, and nodes validate blocks via Proof-of-Work mining. The bitcoin token was not pre-mined, and has a maximum supply of 21 million. The initial reward was 50 BTC per block, and block mining rewards halve every 210,000 blocks. Since the average time for block production on the blockchain is 10 minutes, block reward halving events take place approximately every 4 years. As of May 12th 2020, the block mining reward is 6.25 BTC per block. Transaction fees also represent a minor revenue stream for miners.
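The 21 million cap follows directly from the reward schedule above; a quick sketch, tracking amounts in satoshis (100,000,000 per BTC) with integer division for the halving, as Bitcoin itself does:

```python
# 50 BTC per block, halving every 210,000 blocks, amounts in satoshis.
def total_supply_satoshi() -> int:
    reward = 50 * 100_000_000   # initial block reward in satoshis
    total = 0
    while reward > 0:
        total += 210_000 * reward
        reward //= 2            # halving (integer division, as in Bitcoin)
    return total

print(total_supply_satoshi() / 100_000_000)  # just under 21 million BTC
```

The geometric series converges to slightly less than 21,000,000 BTC (20,999,999.9769 BTC) because the satoshi-level rounding truncates each halved reward.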
Technical: Upcoming Improvements to Lightning Network
Price? Who gives a shit about price when Lightning Network development is a lot more interesting????? One thing about LN is that because there's no need for consensus before implementing things, figuring out the status of things is quite a bit more difficult than on Bitcoin. On one hand it lets larger groups of people work on improving LN faster without having to coordinate so much. On the other hand it leads to some fragmentation of the LN space, with compatibility problems occasionally coming up. The below is just a smattering sample of LN stuff I personally find interesting. There's a bunch of other stuff, like splicing and dual-funding, that I won't cover --- post is long enough as-is, and besides, some of the below aren't as well-known. Anyway.....
Decker-Russell-Osuntokun: yeah, the exciting new Lightning Network channel update protocol!
Solves "toxic waste" problem. In the current Poon-Dryja update protocol, old state ("waste") is dangerous ("toxic") because if your old state is acquired by your most hated enemy, they can use that old state to publish a stale unilateral close transaction, which your counterparty must treat as a theft attempt and punish you, causing you to lose funds. With Decker-Russell-Osuntokun old state is not revoked, but is instead gainsaid by later state: instead of actively punishing old state, it simply replaces the old state with a later state.
Allows multiple participants in the update protocol. This can be used as the update protocol for a channel factory with 3 or more participants, for example (channels are not practical for multiple participants since the loss of any one participant makes the channel completely unusable; it's more sensible to have a multiple-participant factory that splits up into 2-participant channels). Poon-Dryja only supports two participants. Another update protocol, Decker-Wattenhofer, also supports multiple participants, but requires much larger locktimes in case of a unilateral close (measurable in weeks, whereas Poon-Dryja and Decker-Russell-Osuntokun can be measured in hours or days).
It uses nLockTime in a very clever way.
No, it does not solve the "watchtower needed" problem. Decker-Russell-Osuntokun still requires watchtowers if you're planning to be offline for a long time.
What might cause confusion is that it was initially thought that watchtowers under Decker-Russell-Osuntokun could be made more efficient by having the channel participant update a single "slot" in the watchtower, rather than having to consume one "slot" per update as in Poon-Dryja. However, the existence of the "poisoned blob" attack by ZmnSCPxj means that having a replaceable "slot" is risky if the other participant of the channel can spoof you. And the safest way to prevent spoofing somebody is to identify that somebody --- but now that means the watchtower can surveil the activities of somebody it has identified, losing privacy.
Requires base layer change --- SIGHASH_NOINPUT / SIGHASH_ANYPREVOUT. This is still being worked out and may potentially not reach Bitcoin anytime soon.
Determining costs of routes is somewhat harder, and may complicate routefinding algorithms. In particular: every channel today has a "CLTV Delta", a number of blocks by which the total maximum delay of the payment is increased. This maximum delay is the maximum amount of time by which an outgoing payment can be locked, and needs to be reduced for UX purposes. Decker-Russell-Osuntokun will also add a "CSV minimum", a number of blocks, which must be smaller than the delay of an HTLC going through the channel. Current routefinding algos are good at minimizing a summed-up cost (like the "CLTV Delta") so the "CSV minimum" may require discovering / developing new routefinding algos.
Due to the "CSV minimum" above, existing nodes that don't understand Decker-Russell-Osuntokun cannot reliably route over Decker-Russell-Osuntokun channels, as they might not impose this minimum properly.
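The routing complication can be illustrated with a brute-force search over a tiny hypothetical channel graph (all channel names and values are invented for illustration). The "CLTV delta" sums along a route, which Dijkstra-style algorithms handle well; the "CSV minimum" instead acts as a per-hop constraint that filters routes out, which is why it doesn't fit a purely additive cost model.

```python
# Hypothetical channel graph: (src, dst) -> (cltv_delta, csv_minimum).
channels = {
    ("A", "B"): (40, 0), ("B", "D"): (40, 0),
    ("A", "C"): (10, 20), ("C", "D"): (10, 20),
}

def routes(src, dst, path=None):
    """Enumerate simple paths (fine for a toy graph; real graphs need Dijkstra)."""
    path = path or [src]
    if src == dst:
        yield path
        return
    for (a, b) in channels:
        if a == src and b not in path:
            yield from routes(b, dst, path + [b])

def best_route(src, dst, max_csv):
    candidates = []
    for r in routes(src, dst):
        hops = list(zip(r, r[1:]))
        # CSV minimum: a per-hop filter, not a summed cost.
        if all(channels[h][1] <= max_csv for h in hops):
            # CLTV delta: an ordinary additive cost.
            candidates.append((sum(channels[h][0] for h in hops), r))
    return min(candidates)[1] if candidates else None

# With a strict CSV budget, the low-CLTV route through C is unusable:
assert best_route("A", "D", max_csv=10) == ["A", "B", "D"]
assert best_route("A", "D", max_csv=30) == ["A", "C", "D"]
```

The toy search simply enumerates every path; the open question in the text is how to fold such a filter into efficient routefinding without enumeration.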
Multipart payments / AMP
Splitting up large payments into smaller parts!
There are at least three variants of multipart payments: Original, Base, and High.
Original is the original AMP proposed by Lightning Labs. It sacrifices proof-of-payment in order to allow each path to have a different payment hash. This is done by having the payer use a derivation scheme to generate each part's payment preimage from a seed, then splitting the seed (using secret sharing) across the parts. The receiver can only reconstruct the seed if all parts reach it.
Base simply uses the same payment hash for all routes. This retains proof-of-payment (i.e. an invoice is undeniably signed by the receiver, including a payment hash in the invoice; public knowledge of the payment preimage is proof that the receiver has in fact received money, and any third party can be convinced of this by being shown the signed invoice and the preimage). The receiver could just take one part of the payment, claim to be underpaid by the payer, and then deny service. But claiming any one part is enough to publish the payment preimage, creating a proof-of-payment: the receiver can provably be made liable even if it took just one part. Thus the receiver's incentive is to only take in the payment once all parts have arrived.
High requires elliptic curve points / scalars. It combines both Original and Base, retaining proof-of-payment (sacrificed by Original) and ensuring cryptographically-secure waiting for all parts (rather than the merely economically-incentivized waiting of Base). This is done by using the elliptic curve homomorphism between addition of scalars and addition of points to add together the payer-provided preimage (really a scalar) of Original with the payee-provided preimage (really a scalar) of Base.
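The Original-style splitting can be sketched minimally. Assumptions: a SHA-256-based derivation scheme and plain XOR secret sharing are used as stand-ins for illustration; the actual AMP proposal's derivation and sharing details differ.

```python
import hashlib, os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Payer side: derive one preimage per part from a random seed...
seed = os.urandom(32)
n_parts = 3
preimages = [hashlib.sha256(seed + bytes([i])).digest() for i in range(n_parts)]

# ...and split the seed into n XOR shares, one riding along with each part.
shares = [os.urandom(32) for _ in range(n_parts - 1)]
last = seed
for s in shares:
    last = xor(last, s)
shares.append(last)   # XOR of all shares reconstructs the seed

# Receiver side: only with ALL shares can the seed (and preimages) be rebuilt,
# so the receiver cannot settle any part until every part has arrived.
recovered = bytes(32)
for s in shares:
    recovered = xor(recovered, s)
assert recovered == seed
assert [hashlib.sha256(recovered + bytes([i])).digest()
        for i in range(n_parts)] == preimages
```

Each part carries a different payment hash (the hash of its own preimage), which is exactly what costs Original its proof-of-payment property.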
Better expected reliability. Channels are limited by capacity. By splitting up into many smaller payments, you can fit into more channels and be more likely to successfully reach the payee.
Capacity on multiple of your channels can be used to pay. Currently if you have 0.05BTC on one channel and 0.05BTC on another channel, you can't pay 0.06BTC without first rebalancing your channels (and paying fees for the rebalance, whether the payment succeeds or not). With multipart you can now combine the capacities of multiple of your channels, and only pay fees for combining them if the payment pushes through.
Wumbo payments (oversized payments) come "for free" without having to be explicitly supported by the nodes of the network: you just split up wumbo payments into parts smaller than the wumbo limit.
Multipart will have higher fees. Part of the feerate of each channel is a flat-rate fee. Going through multiple paths means paying more of this flat-rate fee.
It's not clear how to split up payments. Heuristics for payment splitting have to be derived and developed and tested.
Payment points / scalars
Using the magic of elliptic curve homomorphism for fun and Lightning Network profits! Basically, currently on Lightning an invoice has a payment hash, and the receiver reveals a payment preimage which, when inputted to SHA256, returns the given payment hash. Instead of using payment hashes and preimages, just replace them with payment points and scalars. An invoice will now contain a payment point, and the receiver reveals a payment scalar (private key) which, when multiplied with the standard generator point G on secp256k1, returns the given payment point. This is basically Scriptless Script usage on Lightning, instead of HTLCs we have Scriptless Script Pointlocked Timelocked Contracts (PTLCs).
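The swap from hashes to points can be sketched side by side. Assumption: exponentiation modulo a prime stands in for scalar-times-G on secp256k1, so "adding a scalar to the lock" becomes multiplying group elements.

```python
import hashlib

# HTLC today: the lock is a hash, the claiming secret is its preimage.
preimage = b"payment-preimage-0001"
payment_hash = hashlib.sha256(preimage).digest()       # goes in the invoice
assert hashlib.sha256(preimage).digest() == payment_hash   # node's claim check

# PTLC: the lock is a point, the claiming secret is its scalar (discrete log).
# Toy group: pow(g, scalar, p) stands in for scalar*G (assumption).
p, g = 2**127 - 1, 5
payment_scalar = 0xC0FFEE
payment_point = pow(g, payment_scalar, p)              # goes in the invoice
assert pow(g, payment_scalar, p) == payment_point      # node's claim check

# The win: points can be re-randomized additively (hashes cannot), which is
# what enables per-hop blinding / payment decorrelation.
blind = 0xB11D
assert (payment_point * pow(g, blind, p)) % p == pow(g, payment_scalar + blind, p)
```

A forwarding node's check looks the same in both worlds; the homomorphism in the last line is the extra structure that hashes simply don't have.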
Enables a shit-ton of improvements: payment decorrelation, stuckless payments, noncustodial escrow over Lightning (the Hodl Hodl Lightning escrow is custodial, read the fine print), High multipart.
It's the same coolness that makes Schnorr Signatures cool. ECDSA, despite being based on elliptic curves, is not cool because the hash-the-nonce operation needed to prevent it from infringing Schnorr's fatherfucking patent also prevents ECDSA from using the cool elliptic curve homomorphism of addition over scalars.
Requires Schnorr on Bitcoin layer.
Actually, we can work with 2p-ECDSA without waiting for Schnorr. We get back the nice elliptic curve homomorphism by passing the ECDSA nonce through another cryptosystem, Paillier. This gets us the ability to do Scriptless Script. I think it has only 80-bit security because of going through Paillier though.
Basically the conundrum is: we could implement 2p-ECDSA now, hope we never have to test the 80-bit security anytime soon, then switch to Schnorr with 128-bit security later (which means reimplementing a bunch of things, because the calculations are different and the data that needs to be exchanged between channel participants is very different between the 2p-ECDSA and Schnorr). Reimplementing is painful and is more dev work. If we don't implement with 2p-ECDSA now, though, we will be delaying all the nice elliptic curve goodness (stuckless, noncustodial escrow, payment decorrelation) until Bitcoin gets Schnorr.
Elliptic curve discrete log problem is theoretically quantum-vulnerable. If we can't find a quantum-resistant homomorphic construction, we'll have to give up the advantages (payment decorrelation, stuckless payments, noncustodial escrow over Lightning) we got from using elliptic curve points and go back to boring old hashes.
Ensuring that payers cannot access data or other digital goods without proof of having paid the provider. In a nutshell: the payment preimage used as a proof-of-payment is the decryption key of the data. The provider gives the encrypted data, and issues an invoice. The buyer of the data then has to pay over Lightning in order to learn the decryption key, with the decryption key being the payment preimage.
Enables data providers to sell data. This could be sensors, livestreams, blogs, articles, whatever.
There's no scheme to determine if the data provider is providing actually-useful data. The data-provider could just stream https://random.org for example. This is a potentially-impossible problem. Even if the data-provider provides a "sample" of the data, and is able to derive some proof that the sample is indeed a true snippet of the encrypted data, the rest of the data outside of the sample might just be random junk.
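The preimage-as-decryption-key mechanics can be sketched minimally. Assumption: the XOR stream cipher derived from SHA-256 is a toy stand-in for a real cipher, and the "goods" payload is invented for illustration.

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR with a SHA-256-derived keystream (illustration only)."""
    out, counter = b"", 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ k for d, k in zip(data, out))

# Seller: pick a key, encrypt the goods, invoice with payment_hash = H(key).
key = b"\x07" * 32
goods = b"sensor reading: 42.1C"
ciphertext = keystream_xor(key, goods)
payment_hash = hashlib.sha256(key).digest()

# Buyer: settling the Lightning payment reveals the preimage, which IS the key.
revealed_preimage = key   # learned on payment settlement
assert hashlib.sha256(revealed_preimage).digest() == payment_hash
assert keystream_xor(revealed_preimage, ciphertext) == goods
```

The preimage doubles as both proof-of-payment and decryption key, so a completed payment atomically delivers the goods; nothing in the scheme, however, proves the plaintext was worth buying.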
No more payments getting stuck somewhere in the Lightning Network without knowing whether the payee will ever get paid! (That claim is actually a bit overstated: payments can still get stuck, but what "stuckless" really enables is that we can safely run another parallel payment attempt until any one of the payment attempts gets through.) Basically, by using the ability to add points together, the payer can enforce that the payee can only claim the funds if it knows two pieces of information:
The payment scalar corresponding to the payment point in the invoice signed by the payee.
An "acknowledgment" scalar provided by the payer to the payee via another communication path.
This allows the payer to make multiple payment attempts in parallel, unlike the current situation where we must wait for an attempt to fail before trying another route. The payer only needs to ensure it generates different acknowledgment scalars for each payment attempt. Then, if at least one of the payment attempts reaches the payee, the payee can then acquire the acknowledgment scalar from the payer. Then the payee can acquire the payment. If the payee attempts to acquire multiple acknowledgment scalars for the same payment, the payer just gives out one and then tells the payee "LOL don't try to scam me", so the payee can only acquire a single acknowledgment scalar, meaning it can only claim a payment once; it can't claim multiple parallel payments.
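The two-scalar lock can be sketched with the same toy group as before (assumption: exponentiation modulo a prime stands in for point arithmetic on secp256k1; all scalar values are invented).

```python
# Toy group: pow(g, x, p) stands in for x*G on secp256k1 (assumption).
p, g = 2**127 - 1, 5

payment_scalar = 424242                      # payee's secret; its point is invoiced
payment_point = pow(g, payment_scalar, p)

# Payer locks each attempt to payment_point + ack_point, fresh ack per attempt.
ack_1, ack_2 = 1111, 2222                    # two parallel attempts
lock_1 = (payment_point * pow(g, ack_1, p)) % p
lock_2 = (payment_point * pow(g, ack_2, p)) % p

# A lock opens only with BOTH scalars; the payer hands out exactly one ack
# scalar, so at most one attempt can ever be claimed.
assert pow(g, payment_scalar + ack_1, p) == lock_1
assert pow(g, payment_scalar + ack_2, p) == lock_2
assert lock_1 != lock_2   # parallel attempts are independent locks
```

Because each attempt carries a distinct lock, racing several attempts in parallel is safe: settling one reveals nothing that opens the others.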
Can safely run multiple parallel payment attempts as long as you have the funds to do so.
Needs payment point + scalar
Non-custodial escrow over Lightning
The "acknowledgment" scalar used in stuckless can be reused here. The acknowledgment scalar is derived as an ECDH shared secret between the payer and the escrow service. On arrival of payment to the payee, the payee queries the escrow to determine if the acknowledgment point is from a scalar that the escrow can derive using ECDH with the payer, plus a hash of the contract terms of the trade (for example, to transfer some goods in exchange for Lightning payment). Once the payee gets confirmation from the escrow that the acknowledgment scalar is known by the escrow, the payee performs the trade, then asks the payer to provide the acknowledgment scalar once the trade completes. If the payer refuses to give the acknowledgment scalar even though the payee has given over the goods to be traded, then the payee contacts the escrow again, reveals the contract terms text, and requests to be paid. If the escrow finds in favor of the payee (i.e. it determines the goods have arrived at the payer as per the contract text) then it gives the acknowledgment scalar to the payee.
True non-custodial escrow: the escrow service never holds any funds.
Needs payment point + scalar.
Because elliptic curve points can be added (unlike hashes), for every forwarding node we can add a "blinding" point / scalar. This prevents multiple forwarding nodes from discovering that they have been on the same payment route. This is unlike the current payment hash + preimage, where the same hash is used along the route. In fact, the acknowledgment scalar we use in stuckless and escrow can simply be the sum of each blinding scalar used at each forwarding node.
Privacy! Multiple forwarding nodes cannot coordinate to try to uncover the payer and payee of each payment.
*These questions are sourced directly from Telegram*

Q: How do I shut down my Chaosnet Darknode?

A: Please follow these directions: https://docs.renproject.io/chaosnet/chaosnet-darknode/untitled-3

Q: Can I run a Chaosnet Darknode and a Mainnet Darknode at the same time (on the same computer)?

A: No, if you want to do that you'll have to run them on separate computers.

Q: You mentioned DCEP in your latest piece and "12 App Ideas", but it's going to run on a centralized private network. The Bank of England also just released a report on how they're thinking about their CBDC and DLT/centralization, and stress that a DLT could add resilience, but there's also no reason a currency couldn't be more centralized. The Block reported that other central banks (like the EU and Singapore) are considering third-party chains like Corda. Can you comment on which CBDC designs may or may not be compatible with RZL? You previously said "RZL sMPC provides ECDSA signatures because that's what is used by Ethereum, Bitcoin, etc. Whatever solution they come up with will be the solution that RZL has to be upgraded to use (the whole point of RenVM is not to tell other chains how to do things, and still provide interop; this means waiting on them to define their solution and then working with that)." So, what does centralization mean for RZL, and how can we think about compatibility between these designs on the technical side?

A: The topic of centralisation in interoperability comes down to the compounding effect of using multiple networks. Put another way: "you're only as decentralised as your most centralised component". While there are nuances to this, the core idea rings true. RenVM can be used to interoperate many different kinds of chains (anything using ECDSA, or naturally supporting lively threshold signatures, is a candidate to be included in RenVM). However, a centralised currency that has been bridged to a decentralised chain is not decentralised.
The centralised entity that controls the currency might say "nothing transferred to/from this other chain will be honoured". That's a risk you take with centralised currencies (take a look at the T&Cs for USDC, for example). The benefit of RenVM in these instances is to become a standard. Short-term, RenVM brings interoperability to some core chains. Medium-term, it expands that to other more interesting chains based on community demand. Long-term, it becomes the standard for how to implement interop. For example: you create a new chain and don't worry about interop explicitly, because you know RenVM will have your back. For centralised currencies this is still advantageous, because the issuing entity only has to manage one chain (theirs) but can still get their currency onto other chains/ecosystems. From a technical perspective, the Darknodes just have to be willing to adopt the chain/currency.

Q: dApps will have their own risk tolerances for centralized assets. E.g. USDC was a bigger deal for MakerDAO than Uniswap. If CBDC liquidity were suddenly bridgeable, some dApps would be more eager to adopt it than others, even despite the risks, because they provide native liquidity and can be used to store/hedge in it without cashing it out. My question is more technical as it relates to RenVM as the "Universal Stablecoin Converter". You sound convinced that RenVM can bridge Libra, DCEP, maybe other CBDCs in the future, but I'm skeptical how RenVM works with account-based currencies. (1) Are we even sure of DCEP's underlying design and whether it or other CBDCs even plan to use digital signatures? And (2) wouldn't RenVM need a KYC-approved account to even get an address on these chains? It seems like DCEP would have to go through a Chinese Circle, who would just issue an ERC20.

A: As far as underlying blockchain technology goes (e.g. the maths of it) I don't see there being any issues.
Until we know more about whether or not KYCd addresses are required (and if they are, how they work), I can't specifically comment on that. However, it is more than possible not to require RenVM to be KYCd (just like you can't "KYC Ethereum") and instead move that requirement to addresses on the host blockchain (e.g. KYC the Ethereum addresses receiving the cross-chain asset). Whether this happens or not would ultimately be up to whether the issuer wanted interoperability to be possible.

Q: In that scenario, how would RenVM even receive the funds to be transferred to the KYC'd Ethereum address? For Alice to send DCEP to Bob's KYC'd Ethereum address, RenVM would need a DCEP address of its own, no?

A: Again, this is impossible to say for certain without knowing the implementation of the origin chain. You could whitelist known RenVM scripts (by looking at their form, like RenVM itself does on Bitcoin). But most likely, these systems will have some level of smart contract capabilities, and this allows very flexible control. You can just whitelist the smart contract address that RenVM watches for cross-chain events. In origin chains with smart contracts, the smart contract holds the funds (and the keys the smart contract uses to authorise spends are handled as business logic). So there isn't really a "RenVM public address" in the same sense that there is in Bitcoin.

Q: The disbonding period for Darknodes seems long; what happens if there is a bug?

A: It's actually good for the network to have a long disbonding period in the face of a bug. If people were able to panic sell, then not only would the bug cause potential security issues, but so too would a mass exodus of Darknodes from the network. Having time to fix the bug means that Darknodes may as well stick around and continue securing the network as best they can. Because their REN is at stake (as you put it) they're incentivised to take any of the recommended actions and update their nodes as necessary.
This is also why it's critical for the Greycore to exist in the early days of the network, and why we are rolling out SubZero the way that we are. If such a bug becomes apparent (more likely in the early days than the later days), then the Greycore has a chance to react to it (the specifics of which would of course depend on the specifics of the bug). This becomes harder and slower as the network becomes more decentralised over time.

Q: Not mcap, but the price of bonded REN. Furthermore, the price will be determined by how much in fees Darknodes have collected. BTW, loongy, could you unveil based on what profit ratio/APR the price will be calculated?

A: This is up to the Darknodes to govern softly. This means there isn't a need for an explicit oracle. Darknodes assess L vs R individually and vote to increase fees to drive L down and drive R up. L is driven down by continue fees, whereas R is driven up by minting/burning fees.

Q: How do you think RenVM would perform on a day like today, when even CEXs are stretched? Would the system be able to keep up?

A: This will really depend on the number of shards that RenVM is operating. Shards operate in parallel, so more shards = more processing power.

Q: The main limiting factor is the speed of the underlying chain, rather than RenVM?

A: That's generally the case. Bitcoin peaks at about 7 TPS, so as long as we are faster than this, any extra TPS is "wasted". And you actually don't want to be faster than you have to be. This lets you drop hardware requirements, lowering the cost of running a Darknode. This has two nice effects: (a) being an operator generates more profit because costs are lower, and (b) it's more accessible to more people because it's a little cheaper to get started (albeit this is minor).

Q: Just getting caught up on governance, but what about: unbonded REN = 1 vote, bonded REN = (1 vote + time_served)?
That'd be > decentralization of Darknodes alone, an added incentive to be registered, and would counter exchanges wielding too much control.

A: You could also have different decay rates. For example, assuming that REN holders have to vote by "backing" the vote of Darknodes: let X be the amount of REN used to vote, backed behind a Darknode and bonded for T time, and let Y be the amount of time a Darknode has been active for. The voting power of the Darknode could be Sqrt(Y) * Log(X + T). Log(1,000,000,000) = ~21, so if you had every REN bonded behind you, your voting power would only be 21x the voting power of other nodes. This would force whales to either run Darknodes for a while and contribute actively to the ecosystem (or lock up their REN for an extended period for additional voting power), and would force exchanges to spread their voting out over many different nodes (giving power back to those running nodes). Obviously an exchange could just run lots of Darknodes, but they would have to do this over a long period of time (not feasible, because people need to be able to withdraw their REN).

Q: Like having superdelegates, i.e. nodes trusted by the community with higher voting power? Maybe like council nodes?

A: Well, this is essentially what the Greycore is: Darknodes that have been voted in by the community to act as a secondary signature on everything. (And, interestingly enough, you could vote out all members to remove the core entirely.)

Q: Think the expensive REN is a security feature as well. So, doubt this would impact security potentially? I don't know. I wouldn't vote to cut my earnings by 40%, for example, lol.

A: It can lead to centralisation over time though. If 100K REN becomes prohibitively expensive, then you will only see people running Darknodes that can afford a large upfront capital investment. In the mid/long-term this can have adverse effects on the trust in the system.
It's important that people "external" to the system (non-Darknodes) can get themselves into the system. Allowing non-Darknodes to have some governance (even if it's not over all things) would be critical to this.

Q: That Darknode option sounds very interesting, although it could get more centralized as the price of 100K REN rises. For instance, Darknodes may not want to vote to lower the threshold from 100K to 50K once REN gets too expensive.

A: A great point, and one of the reasons it would be ideal to be able to alter those parameters without just the Darknodes voting. Otherwise, you definitely risk long-term centralisation.

Q: BTC is deposited into a native BTC address, but who controls this address (where/how is this address's private key stored)?

A: This is precisely the magic behind RenVM. RenVM uses an MPC algorithm to generate the controlling private key. No one ever sees this private key, and no one can sign things with it without consensus from everyone else.
The family of public-key cryptosystems, a fundamental breakthrough in modern cryptography in the late 1970s, has increasingly become a part of our communication networks over the last three decades. The Internet and other communication systems rely principally on the Diffie-Hellman key exchange, RSA encryption, and digital signatures using DSA, ECDSA, or related algorithms. The security of these cryptosystems depends on the difficulty of number-theoretic problems such as Integer Factorization and the Discrete Log Problem. In 1994, Peter Shor showed that quantum computers could solve each of these problems in polynomial time, thus rendering the security of all cryptosystems based on such assumptions ineffective. In the academic world, the study of cryptography that resists quantum attacks bears the moniker Post-Quantum Cryptography (PQC).

In August 2015, the National Security Agency (NSA) published an online announcement stating plans to transition to quantum-resistant algorithms. In December 2016, the National Institute of Standards and Technology (NIST) announced a call for proposals of quantum-resistant algorithms with a deadline of November 30th 2017. In light of the threat that quantum computers pose to cryptosystems such as RSA and ECC, the once-distant need to develop and deploy quantum-resistant technologies is quickly becoming a reality.

Cryptocurrencies like Bitcoin are new financial instruments, created to make financial transactions more efficient, cheaper, and decentralized. Their fundamental building blocks are cryptographic algorithms such as ECC digital signatures, which are used to perform various functions ensuring the integrity and security of the whole system. However, the use of ECC signatures and similar cryptographic algorithms means that quantum computing could pose a fatal threat to the security of existing cryptocurrencies, which deploy number theory-based public key cryptosystems extensively.
The mission of the ABCMint Foundation is to successfully develop quantum-resistant blockchain technology. We also look to promote and support fundamental research for quantum computing technology and post-quantum algorithms.
These questions are sourced directly from Telegram, other monthly FAQ can be found here:https://docs.renproject.io/darknodes/community/monthly-community-faq Q: So RenVM is essentially a BFT protocol (with 1/3 malicious nodes) that does ECDSA threshold key generation and signing? Is that right? A: Yes, that's exactly what we have! We are exploring getting this to 1/2 and are confident it is possible, but the current implementation on Testnet is 1/3. Just today we also pushed an update that doubled the speed (and halved the bandwidth) of the sMPC signing algorithm. Q: Have any tests been done on the speed of Interoperability? A: The Testnet demo is live and open to the public, have a play with it and let us know about your experience (including speed). We have done some preliminary profiling; numbers look good so far. Fast enough for a single shard to keep up with Bitcoin. The next version of RZL sMPC is under development and will introduce pre-computations that significantly increase the peak performance of RenVM from 10 TPS to over 100 TPS (these numbers are based on our initial conservative estimates). Q: Currently, we see a quick performance of the swaps. When migrating to the mainnet (considering there will be real mainnet of say 250 Darknodes and real BTC, ETH, etc.) will it affect the speed? A: Speed is a complex issue when it comes to RenVM. I'll try and break it down: The biggest concern for speed is that RenVM needs to wait for a transaction to be confirmed on one chain before shifting the tokens to another chain. When working with Bitcoin this can take hours. -So latency is unavoidable (think of latency as how long a tunnel is) -So what about throughput (how wide the tunnel is)? First, how to solve the latency problem. Well, we cannot actually solve it because we cannot change Bitcoin. But we can work around it by using "Universal Interoperability." In this model, a third party takes on the confirmation risk. 
While RenVM waits for the confirmation of a transaction on Bitcoin, the third party steps in and fulfills the Ethereum side of the transaction with BTC that has already been shifted previously. When the Bitcoin transaction is finally confirmed, the third party is refunded using the newly shifted BTC. This means the third party is taking on risk (the Bitcoin transaction may be shuffled away), so they charge a fee to cover this + their services. This means that the shift can be almost instant, and the only thing we need to worry about is throughput. We believe we can get 10 TPS throughput, which is more than Bitcoin, so throughput isn't a problem (we only need to be as fast as Bitcoin). For other chains that are faster, we can introduce multiple shards. If one shard can do 10 TPS, then 10 shards can do 100 TPS. I've described this process with Bitcoin, but it works for any pair of chains. Also, the third party cannot be guaranteed to step in (maybe they don't want to take the risk today) but if they do not, then the transaction will still go through but just at the slower speed. If the third party does step in, they're guaranteed to be refunded. So the introduction of "Universal Interoperability" does not introduce any central trust into the system. Q: So Universal Interoperability is a partially centralized thing? A: No because any third party can step in and provide the service. Further, the processes involved are all handled by smart contracts. Q: Has there been a discussion of security in terms of sharding? Getting 1/3 stake and compromising a shard is obviously much easier than compromising the network, what's everyone's thoughts on that? A: Yes there has; once you move to a sharding model, the risk of an attacker gaining control of a shard becomes a probabilistic problem rather than an absolute one (for example if you're sampling with replacement, in theory, a single attacker can corrupt the whole network). 
Let's say an attacker owns enough of the network to have a 2^-1 chance of corrupting a shard (expected time to attack = ~2 days). If you are using a 20/20 multi-sig, where each shard controls one signature, then the chance of corrupting enough shards becomes 2^-20 (expected time to attack = ~2800 years). In line with this example, the shard could be around N=24 (which would have a corruption chance of ~0.56) so each shard can be very fast (and shards would be running in parallel). Obviously we want to avoid multisigs (they're expensive and not all blockchains can support them) but this is mostly an example of the larger concept: requiring multiple shards to work together. Q: Just got curious if the bug-fixing and developing have been overwhelming since the release of testnet? How do you feel it's been so far? A: I wouldn't say overwhelming. It's definitely keeping us busy. Finding bugs and fixing them is actually very satisfying work; it reduces stress by increasing confidence, and this helps improve motivation and productivity. It's also good to be able to revisit parts of the system and go about perfecting them. Often in software development, there is the adage "never optimize early". Well, the time has finally come to optimize (not just performance, but design, safety, etc.). Everyone wants the thing they build to be perfect, and being able to make that the focus is an awesome feeling. Q: Is there a reason for having private repos? A: It's important for the success of the network to maintain a competitive advantage, and important to avoid zero-day bugs from people who find them but don't report them (in the hope of taking advantage). We'll be getting the code (and our maths) reviewed and audited, and probably show it to early adopting groups so they can verify it themselves, and as Mainnet grows we will open-source everything, along with a Transparency Plan that outlines when and how repos will be open-sourced. Q: My Darknodes still show the old command center.
How do I view them on the new one? A: The new Command Center is for RenVM specifically (and it's only viewable on RenVM Testnet); once we switch Darknodes over to the RenVM network, they will utilize the new Command Center. To play around with it, put your MetaMask on Kovan Test Network. A video that a community member created can be found here: https://twitter.com/RenIsLyfe/status/1166091169853579265?s=20 Q: Digital Ocean (DO) sent me a message saying my VPS would be down for maintenance, is this an issue? A: Nope, this is just part and parcel of using a VPS. From time to time, they need to do maintenance. They will inform you if you need to take action. This is a real-world example of why it's crazy to expect a decentralized network to have all participants online all the time, and why you cannot "incentivize" being online by punishing being offline. It's unavoidable even when there are entire expert teams with years of experience on the job. The more nodes you have, the more likely any one of them is to experience an issue like this at any one time. Your REN is not at risk if your Darknode does go offline. It is also unlikely that a Darknode that is offline due to these kinds of circumstances will remain offline long enough to be forced out of the network. Q: Will the community darknodes be partaking in the RenVM Testnet, or are you using your own nodes to test it out, or is it a gradual deploy? A: The team has about 24 Testnet Darknodes that power it. We may open these Testnet nodes up to a few groups in the Working Group, but no public participation of Testnet Darknodes will be pursued at this time. Q: A couple of questions for the team: 1) Bonded REN value informs how much value can be securely shifted through RenVM at any given time. If bonded value drops below the threshold, are there any risks beyond incentive to collude which arise? is there any liquidation risk ala TBTCsigners? 2) Does RenVM enforce any time floors/ceilings on shifting/locking tokens? 
I assume anything like that would be enforced by a third party like Compound? A: 1. There are collusion risks, but we plan to mitigate them by letting Darknodes "tell on each other": if you are colluding with someone you don't trust 100%, you risk losing your bond, so attacks only really make sense if you own all the colluding Darknodes (which, by definition, isn't really collusion; it's buying up a bunch of REN). There is no liquidation risk. This is one key reason why we bond using REN, not another token; the "value of REN" is tied only to the use of RenVM. The safety of RenVM is predicated on the use of RenVM. RenVM is used = RenVM is safe
No time ceilings. We've been having discussions about how to keep Darknodes well incentivized to maintain long-term deposits, but (a) most of RenVM's UX is built around handling the native token, not a wrapped version of it (how is a BTC maxi going to get hold of ETH to use their ERC20 BTC?), (b) payments will be paid out to RenVM over time, not instantly, which creates a more stable income for the Darknodes instead of large but infrequent lumps of pay, (c) we've got another trick up our sleeve that I'll be adding to the GitHub any day now, and (d) if you have ideas about how to incentivize Darknodes to maintain BTC that is being deposited long-term, please feel free to let us know!
Q: Has there been a pattern established where third parties could pay the gas for the ETH transactions needed during shifting? For instance, would it be straightforward for an app dev to pay the gas for the user but add a small additional fee onto the RenVM transaction? They would pay the gas in ETH for the user in exchange for that value collected in BTC or zBTC? A: This is going to be very straightforward for devs. We are designing examples as we speak to set the standard for doing this and therefore make integration as easy as possible. Q: Can a RenVM gateway address be reused? As in, if a user creates a gateway address for 0.1 BTC, can they send exactly 0.1 BTC to that address, mint zBTC, and then repeat that process again without creating a new gateway? A: Currently no, a gateway can only be used once; but we are in the process of creating that feature and it should be ready within the next month or so. Q: What’s the best way to set up a Darknode if I only have Microsoft? A: We do not formally support a Windows CLI as of right now, but we are adding Windows CLI support prior to Mainnet, so please do stay tuned.
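The shard-corruption arithmetic from the sharding answer above (a 2^-1 per-shard corruption chance becoming 2^-20 across a 20-of-20 multisig) can be checked with a few lines; the numbers are the illustrative ones the answer itself uses:

```python
from fractions import Fraction

# Illustrative numbers from the answer above: an attacker who can corrupt a
# single shard with probability 2^-1 per reshuffle, and a 20-of-20 multisig
# where every shard must be corrupted simultaneously.
p_single = Fraction(1, 2)
shards = 20

p_all = p_single ** shards       # chance all 20 shards are corrupted at once
expected_attempts = 1 / p_all    # geometric expectation: 1/p attempts

print(p_all)                     # 1/1048576
print(int(expected_attempts))    # 1048576 reshuffles before expected success
```

At roughly one reshuffle attempt per day (which matches the "~2 days" expectation quoted for p = 2^-1), 2^20 attempts is about 2,870 years, in line with the "~2800 years" figure in the answer.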
Bitcoin Cash Hard Fork 15 May 2019 | Know Everything About Upcoming BCH Fork
https://preview.redd.it/idsupgh4k7y21.png?width=1500&format=png&auto=webp&s=0a00b768fdbad52a99bfb7f041c79e109d2b1c44 The price of Bitcoin Cash (BCH) surged dramatically once the news of the upcoming Bitcoin Cash fork came out. BCH broke over 300 USD with an increase of 13% as news of the Schnorr upgrade spread across the internet and the crypto space. The Schnorr upgrade was initially proposed by Pieter Wuille, the Blockstream co-founder. The Bitcoin Cash community has voted for the Schnorr upgrade, in contrast to its criticism in past discussions of Lightning, Segregated Witness (SegWit) and other technologies. The Bitcoin Cash hard fork is scheduled for May 15, 2019. Before that, a testnet has already been launched, which will help developers test before the official launch. You can track the BCH hard fork time here, where you can find the Bitcoin Cash hard fork countdown. Alyssa Hertig of CoinDesk tweeted that this change is going to be phenomenal, and is widely supported by community members:
Let us understand what difference the Schnorr upgrade would make to the BCH fork 2019:
Cryptographically, to prove that you own bitcoin and to send funds to others, you “sign” with a private key. As of now, this uses the Elliptic Curve Digital Signature Algorithm (ECDSA) scheme, which lacks scalability and privacy features. Schnorr signatures, however, can be verified several at a time in a batch, which is much faster than verifying each signature individually; this in turn will improve scalability and privacy, adding a degree of anonymity. Schnorr signatures will aggregate the signatures, public keys and messages of multiple transactions into one, enabling faster transactions. Read More - https://coinswitch.co/news/bitcoin-cash-hard-fork-15-may-2019-know-everything-about-upcoming-bch-fork
Peer-to-peer smart derivatives for any asset over any network!
Taurus0x Overview Distributed off-chain / on-chain protocol powering smart derivatives from end to end, for any asset over any network. Background of Taurus0x Remember around September 2017 when the world lost its cool over Bitcoin prices? It was nearly an ideological war for many. It occurred to me to create an app for people to bid on Bitcoin prices, and I would connect that app to a smart contract to execute bids on the blockchain. It took me a long couple of weeks to figure out how many licenses I would need to acquire to run such a business in the United States. It became evident that market making is a huge undertaking and is better off decentralized in an open-standard protocol to generate liquidity. The protocol needed to be fully decentralized as a primary requirement. Why? Because I believe in the philosophy of decentralization and creating fair market makers, governed by a public community. It is the right thing to do in order to create equal opportunity for consumers without centralized control and special privileges. It comes as no surprise to anyone at this point that the vast majority of “ICOs” were empty promises. Real-life utility was and is a necessity for any viable project. Transitioning from a centralized world to a tokenized and decentralized one cannot be abrupt. The protocol needed to support both worlds and allow for a free-market outcome as far as adoption. Scalability-wise, and as of today, Ethereum cannot handle a real-time full DEX that could compete with advanced and well-known centralized exchanges. And quite frankly, maybe it’s not meant to. This is when the off-chain thinking started, especially after witnessing a couple of the most successful projects adopting this approach, like Lightning and 0x Project. The trade-off was the complexity of handling cryptographic communications without the help of the blockchain. I had met my co-founder Brett Hayes at the time. I would need another 3 or 4 articles to explain Brett to you.
To the substance. What is Asymmetric Cryptography? Asymmetric cryptography is a form of cryptography that uses public and private key pairs. Each public key comes with its associated and unique private key. If you encrypt a piece of data with a private key, only the associated public key may be used to decrypt the data. And vice versa. If I send you a “hello” encrypted with my private key, you can try to decrypt it with my public key (which is no secret); if it decrypts fine, then you are positive that this “hello” came from me. This is what we call digital signatures. The figure below is from the Taurus0x whitepaper and describes the chosen digital signature algorithm (ECDSA). https://preview.redd.it/n8kavgofbm211.png?width=1000&format=png&auto=webp&s=289695a17cd413b68105b249d615b82bae1fe1dc What are Smart Derivatives? Well, what are derivatives in the first place? In the financial world, a derivative is a contract between two or more parties based upon an asset. Its price is determined by fluctuations in the underlying asset. The most common underlying assets include stocks, bonds, commodities, currencies, interest rates and market indexes. Futures contracts, forward contracts, options, swaps, cryptocurrency prices and warrants are common derivatives. Smart Derivatives are smart contracts that behave like financial derivatives. They possess enough information and funds to allow for execution with guaranteed and trusted outcomes. What is Taurus0x? Taurus0x is a distributed off-chain / on-chain protocol powering smart derivatives from end to end. Taurus0x is both asset and network-agnostic. The philosophy is to also become blockchain-agnostic as more blockchains come to life. Distributed = fully decentralized set of smart contracts and libraries. Off-chain = ad-hoc protocol not limited to a blockchain. On-chain = trusted outcome without intermediaries. Asset-agnostic = supports any asset, not limited to cryptocurrency.
Network-agnostic = contracts can be transmitted over any network (email, text, twitter, facebook, pen and paper, etc.) Who can use Taurus0x? The Taurus0x protocol is ultimately built to serve end consumers who trade derivative contracts. Participants may engage in peer-to-peer derivative contracts among each other without the need for a house in the middle. The Taurus0x team and advisory realize that the migration from a centralized world to a decentralized one cannot be abrupt, specifically in FinTech. Taurus0x is built to support existing business models as well as C2C peer-to-peer. Exchanges that want to take on the derivative market may use an open-source protocol without worrying about building a full backend to handle contract engagement and settlement. Taurus0x Exchanges would simply connect participants to each other, using matching algorithms. Taurus0x intends to standardize derivative trading in an open way. Having more exchanges using the protocol allows for creating public and permissioned pools to generate compounded liquidity of contracts. This helps smaller exchanges by lowering the entry-to-market barrier. How does Taurus0x work? The process is simple and straightforward. Implementation details are masked by the protocol, making it very easy to build on top of. The first 2 steps represent off-chain contract agreement, while 3 and 4 solidify and execute the contract on-chain. 1- Create A producer creates a contract from any client using the Taurus0x protocol, whether from an app, a website or a browser extension. The producer specifies a condition that is expected to happen sometime in the future. For example, I (the producer) might create a binary contract with the following condition: Apple stock > $200 by July 1, 2018 with a premium of 10 TOKENs (any ERC20 token) The contract will be automatically signed with my private key, which confirms that I created it. I can then share it (a long hexadecimal text) with anyone over any network I choose.
2- Sign When the consumer receives the signed contract, they will be able to load it via any client using Taurus0x. If the consumer disagrees with the producer on the specified condition, they will go ahead and sign the contract with their private key. Back to our example above, the consumer would think that Apple stock will remain under $200 by July 1, 2018. Now that we have collected both signatures, the contract is ready to get published on the blockchain. 3- Publish Anyone who possesses the MultiSig contract and its 2 signatures can go ahead and publish it to the Ethereum blockchain. That would most likely be either the producer, the consumer, or a party in the middle, like an exchange hosting off-chain orders. As soon as the contract is published, the Taurus0x proxy (an open-source smart contract) will pull the necessary funds from participating wallets into the newly created Smart Derivative. The funds will live in the derivative contract until successful execution. 4- Execute If at any point before the contract expiration date the specified condition becomes true (i.e. Apple stock > $200), the producer can go ahead and execute the derivative contract. The contract will calculate the outcome and transfer funds accordingly. In this binary derivative example, the producer will receive 20 TOKENs in their wallet upon executing the contract. If the expiration date comes and the producer has never successfully executed the contract, the consumer may execute it themselves and collect the 20 TOKENs. This figure from the Taurus0x whitepaper depicts the process: https://preview.redd.it/vr2y9b8ibm211.png?width=1250&format=png&auto=webp&s=1b7a8144fe2a41116a4f64d7418d3dacb4f42fc5 Summary Taurus0x is a highly versatile and modular protocol built using Ethereum-based smart contracts and wrapper JS libraries to bootstrap developer adoption.
While Smart Derivatives are the first application of Taurus0x, it is worth noting that the protocol is not limited to cryptocurrencies or even derivatives for that matter. It is an ad-hoc and scalable contract management solution meant to guarantee trusted outcomes in the future based on conditions specified today. The semi off-chain nature of the protocol helps remediate Ethereum’s scalability limitations and makes it a viable product. Finally, the plan for Taurus0x is to be governed by a Decentralized Autonomous Organization or DAO as outlined in the roadmap on https://taurus0x.com. This is an area of research and development as of today. Decentralization does not fulfill its purpose if governance remains centralized, therefore it is without compromise that Taurus0x follows a decentralized governance structure.
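The create → sign → publish → execute flow described above can be sketched as a minimal in-memory state machine. This is an illustration under stated assumptions, not the Taurus0x implementation: all names are hypothetical, HMAC keys stand in for the ECDSA private keys the protocol actually uses, and a dict stands in for the on-chain proxy contract.

```python
import hashlib
import hmac

# Hypothetical sketch of the four-step Taurus0x flow. Real Taurus0x signs
# with ECDSA keys and escrows funds in an Ethereum smart contract; here an
# HMAC over the contract text stands in for a digital signature.

def sign(key: bytes, payload: str) -> str:
    return hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()

# 1. Create: the producer states a condition and signs it off-chain.
contract = "Apple stock > $200 by July 1, 2018; premium 10 TOKEN"
producer_key, consumer_key = b"producer-secret", b"consumer-secret"
producer_sig = sign(producer_key, contract)

# 2. Sign: the consumer takes the other side and countersigns off-chain.
consumer_sig = sign(consumer_key, contract)

# 3. Publish: anyone holding both signatures escrows both premiums.
escrow = {"contract": contract, "funds": 10 + 10,
          "sigs": (producer_sig, consumer_sig)}

# 4. Execute: if the condition holds before expiry, the producer collects;
#    otherwise the consumer may execute after expiry and collect instead.
condition_true = True
winner = "producer" if condition_true else "consumer"
payout = escrow.pop("funds")
print(winner, "receives", payout, "TOKEN")   # producer receives 20 TOKEN
```

Note how steps 1 and 2 never touch a blockchain: the signed contract is just text that can travel over any network, which is what makes the protocol network-agnostic.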
Schnorr can do multisignature in a very straightforward and scalable way
-Gregory Maxwell, 2015 The theory is that Schnorr signatures are linear, so if (r1,s1) and (r2,s2) are two signatures, then (r1+r2,s1+s2) is a signature covering both. This cannot be applied directly to Bitcoin multisig because if signatures combine linearly, someone could forge a signature using the other public keys and “cancel out” the other signature. This problem is best described by Pieter Wuille:
This would mean that he could sign for both of them while everyone is assuming that we have created an address that is multisig that actually requires both of their signatures. This is the cancellation problem. You can choose your keys in such a way that other people's keys get canceled out.
-Pieter Wuille, 2016. So the linearity that natively supports multisig wallets also natively supports one member of a multisig wallet taking over the whole wallet. This problem could be fixed through delinearization, but that introduces new issues: specifically, it has not been proven secure, nor proven not to be a breaking change to the cryptographic algorithm. This is why the Schnorr signature implementation has been delayed over and over (it is now, I believe, at least 18 months away).
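The cancellation problem Wuille describes can be demonstrated with a toy Schnorr-style scheme over the multiplicative group mod a prime. This is strictly an illustration: the parameters are far too small to be secure, real Bitcoin proposals work over the secp256k1 elliptic curve, and the naive "multiply the public keys" aggregation shown here is exactly the broken construction being criticized.

```python
import hashlib
import secrets

# Toy Schnorr-style signatures over the multiplicative group mod a prime.
# Insecure illustration only; Bitcoin's proposal uses elliptic curves.
P = 2**127 - 1   # a Mersenne prime (far too small for real use)
Q = P - 1        # exponents are reduced mod P-1 (valid by Fermat)
G = 3            # group element used as the generator

def H(r, msg):
    digest = hashlib.sha256(f"{r}|{msg}".encode()).digest()
    return int.from_bytes(digest, "big") % Q

def keygen():
    x = secrets.randbelow(Q - 1) + 1          # private key
    return x, pow(G, x, P)                    # (private, public)

def sign(x, msg):
    k = secrets.randbelow(Q - 1) + 1
    r = pow(G, k, P)
    return r, (k + H(r, msg) * x) % Q         # s = k + e*x

def verify(pub, msg, sig):
    r, s = sig
    return pow(G, s, P) == (r * pow(pub, H(r, msg), P)) % P

# An honest user publishes a key; the attacker "cancels" it by announcing
# attacker_pub / honest_pub as their own key (the rogue-key attack).
_, honest_pub = keygen()
attacker_priv, attacker_pub = keygen()
rogue_pub = (attacker_pub * pow(honest_pub, -1, P)) % P

# Naive aggregation multiplies the keys, so the "2-of-2" aggregate
# collapses to a key the attacker alone controls:
aggregate = (honest_pub * rogue_pub) % P
assert aggregate == attacker_pub
print(verify(aggregate, "spend everything",
             sign(attacker_priv, "spend everything")))   # True
```

The delinearization mentioned above counters this by multiplying each public key by a hash-dependent coefficient before aggregating, so no participant can choose a key that cancels the others out; proving that construction secure is part of what has taken so long.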
Myth 2: a reduction of at least 25% in storage and bandwidth
Estimates are that this upgrade would reduce the use of storage and bandwidth by at least 25%.
-Bitcoincore.org, March 2017. This estimate is pure fantasy. The same article states “Assuming every historical signature would be reduced to 1 byte”. Schnorr signatures will never, ever reduce signatures to 1 byte. Just never. But do not take my word for it; once again, Pieter Wuille says it: Schnorr signatures are a fixed 64 bytes in size. For comparison, current signatures are at most 73 bytes, so Schnorr saves about 12% at best. This assumes that the delinearization process described in myth 1 does not incur a bigger signature or some additional data transfer. It is also, of course, assuming that everyone who uses multisig decides to use the Schnorr alternative.
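The arithmetic behind that best-case saving, using the two sizes quoted above, is a one-liner:

```python
# Size comparison from the figures quoted above: Schnorr signatures are a
# fixed 64 bytes; DER-encoded ECDSA signatures run up to about 73 bytes.
schnorr_len, ecdsa_max_len = 64, 73
best_case_saving = (ecdsa_max_len - schnorr_len) / ecdsa_max_len
print(f"{best_case_saving:.1%}")   # 12.3% — the best case, nowhere near 25%
```

And that 12.3% applies only to the signature bytes themselves, not to the rest of each transaction, so the whole-chain saving is smaller still.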
Myth 3: Schnorr signatures will improve privacy
This one is a bit tricky and, to be fair, highly dependent on the actual implementation.
Schnorr allows the entire policy of the multisig to be obscured and indistinguishable from a conventional single pubkey. In a threshold setup, it also becomes impossible for participants to reveal which of them authorized, or not, a transaction.
-Bitcoincore.org, March 2017. By design, one Schnorr signature would replace all the signatures that would normally be involved in a multisig transaction, thus hiding them. But this is only the theory. In practice, the holders of the keys in the multisig still need to communicate and exchange their signatures in order to generate that one Schnorr signature. It is extremely naive to believe that this communication step would not leave any public traces. This is particularly true if (as has been mentioned) a bitcoin node acts as an aggregator.
The idea behind signature aggregation is to enable system validators, i.e. Bitcoin nodes, to compute a single key and signature for all inputs of all transactions at the protocol level.
-Bitcoincore.org, March 2017. In that use case, the privacy would disappear the instant a third-party node is involved in the transaction. Let’s remember that anyone can run a node, since the network is by design decentralized and permissionless.
Why replace ECDSA with another cryptographic system? This seems quite pointless and quite a waste of time. While it is undeniable that it could yield some bandwidth reduction (nowhere near 25%!), the gains are far from offsetting the efforts. If anything, the cryptographic system should be changed for a quantum-computing-secure algorithm, not for another variant of the same. edit: spelling and link.
Ardor Improvement Proposal (AIP001?) - Adding Support for the "ed25519" Digital Signature Algorithm
This post is inspired by some of the ideas in this thread - https://www.reddit.com/Ardocomments/7qane0/confusion_and_inconsistent_instructions_about/. Is there a way to improve the Addresses and Public Keys in Ardor? Yes, there are a few ways... but a really good option might be to smuggle in some extra features too. Like killing two birds with one stone. Basically, since Bitcoin and NXT launched there has been a lot of very impressive work done on Digital Signature Algorithms, specifically in relation to elliptic curves. A lot of this work has even been motivated by the cryptocurrency space. Ardor has an opportunity to benefit from some of this innovation by making some simple but clever modifications to its code. Different cryptocurrencies settled on competing standards for their Digital Signature Algorithms during the design phase. For example, Bitcoin uses an elliptic curve called secp256k1 with a Digital Signature Algorithm called ECDSA. Ardor uses an elliptic curve called Curve25519 with a Digital Signature Algorithm called EC-KCDSA. There are various reasons why these choices were selected. The history might be of interest to some of you, but I won't go into that now. The elliptic curves are constants, so the progress over time is on the Digital Signature Algorithms. Curiously, some of the most impressive progress has been made on a DSA that neither Bitcoin nor Ardor use, called ed25519. But at least ed25519 builds on the same curve that Ardor already uses - Curve25519. So what is so great about ed25519? Well, it's new. You might say that's not necessarily a good thing. We want to see stability with a component like this. Well, it's seeing some rapid adoption because of its awesome new features. See here for adoption insight - https://ianix.com/pub/ed25519-deployment.html. I.e. it's quickly becoming a standard. So without further ado - what makes ed25519 really cool:
Hardware Wallets are ready for it (unlike EC-KCDSA, which is why Ardor and NXT haven't had access to any hardware wallets yet! And because it's fast becoming a standard, future Hardware Wallets will handle it even better!)
Native Multi-Sig with Schnorr-like signatures
Compact Signatures (even less blockchain-bloat/more scalability!)
Could potentially implement Ardor Account Control using two different cryptosystems (EC-KCDSA and ed25519), removing any possible single point of failure
Fast signature verification (good for nodes, minimizes their work/energy expenditure)
Some of the really technical benefits can be seen on the project's homepage here - https://ed25519.cr.yp.to/. Oh, and it's completely unencumbered by patents or licenses. And the reference code is "public domain". I probably haven't done ed25519 enough justice. It's really great and you can read more about it on the web. But what's this about killing two birds with one stone? Well, by making a reasonably big modification to the Ardor software like this, you could also take the opportunity to fix the Ardor Address Problem. Currently the address derivation scheme prioritizes short human-readable addresses at the expense of security. You can win back that lost security by making an outgoing transaction from your Ardor Address/Account (or making sure the incoming transaction broadcasts the recipient's public key). But even that's not ideal. If anyone followed the development of Bitcoin they'll know that moving from Pay-to-Pub-Key to P2PKH was a big step up. But with Ardor it seems like you have to regress to increase your account security, even though you've lost the security benefit of deriving (hashing) the address from the public key. So, while making the change of adding support for ed25519, that would be the perfect time to update the Ardor Address format (make the addresses longer to remove the collision problem with the short addresses). The market seems to have decided that long difficult-to-read addresses are not a problem. Wallets and the infrastructure around them have meant that people don't often have to resort to typing out these addresses anyway. If we do this right we can have stronger cold-storage than Bitcoin does and a more flexible DSA than Bitcoin does. Not to mention a more scalable blockchain, faster block times, a decentralized exchange, increased transaction capacity, a less wasteful consensus algorithm... etc. etc. =D Let me know what you, the community, think. I'm happy to take any questions.
Let me clarify common misconceptions about Bitcoin. Myth # 1. It's just something similar to other virtual currencies, nothing new All other virtual currencies are controlled by their regulatory center. This means that: they can be printed at the subjective whim of the currency regulator; they could be destroyed by an attack on this regulatory center; arbitrary rules can be imposed by the currency regulator. Bitcoins, being a decentralized currency from the start, solve all these problems. Myth # 2. Bitcoins do not solve any problems that gold and/or fiat money cannot solve Unlike gold, bitcoins are: easy to carry and store; easy to authenticate. Unlike fiat money, bitcoins: have predictable and decreasing emission; are not controlled by any regulatory center. Unlike fiat electronic money, bitcoins: can be anonymous (like cash); cannot have their accounts frozen. Myth # 3. Bitcoins are backed by CPU time It is incorrect to say that bitcoins are "backed" by CPU time. When a currency is said to be backed by something, that means it is centrally pegged to that thing at a fixed exchange rate. You cannot exchange bitcoins for the computing power spent on their generation (it is too high). In this sense, bitcoins are not backed by anything. They are a product valuable in itself. Think about it: is gold backed by anything? No, it's just gold. It's the same with bitcoins. The bitcoin currency is created with the use of processor power: the integrity of the blockchain is protected from all sorts of attacks by the existence of a large computer network. That's it. Myth # 4. Bitcoins are worthless because they are not backed by anything Gold is not backed by anything either, yet it is used and valued everywhere. See the previous myth. Myth # 5. The value of bitcoins is based on how much electricity and processing power is required to generate them This myth is an attempt to apply the labor theory of value to bitcoins, which does not apply to them and is probably false.
Just because something requires X resources to create doesn't mean that the final product will cost X. It can cost more or less than X, depending on its usefulness to users. In fact, the causal relationship runs the other way (this applies to the above theory as a whole). The value of bitcoins is based on how much they are valued. If bitcoins rise in price, more people will try to generate them (because bitcoin generation becomes more profitable), which will increase the difficulty of generation, which in turn raises the cost of mining them. If bitcoins fall in price, the reverse process occurs. These processes maintain a balance between the cost of generation and the value of the bitcoins generated. Myth # 6. Bitcoins have no intrinsic value (unlike some other things) Many things have intrinsic value, but it is usually well below the market value of the thing. Consider gold: if it were not used as an inflation-resistant store of value, and were used only for industrial purposes, it would not have today's value, since the industrial need for gold is much lower than the available supply. Historical value has helped establish some things as a means of exchange, but it is certainly not a necessary condition. Perhaps bitcoins will never be used as a raw material for industrial purposes, but they have many other useful qualities that are necessary for a means of exchange. The value of bitcoins is determined solely by people's desire to trade them - supply and demand. Myth # 7. Bitcoins are illegal because they are not legal tender Short answer: chickens are not legal tender, but bartering with chickens is not illegal. There are many currencies that are not legal tender. Currency, after all, is just a convenient unit of account.
Although national laws vary from country to country (you should definitely check the laws of your jurisdiction), in general, trading in any commodity, including digital goods (e.g. bitcoins, Second Life virtual goods, or WoW game currency), is not illegal.

Myth #8. Bitcoins are a form of domestic terrorism because they harm the economic stability of the state and its currency
Read the relevant Wikipedia article: an action is not considered terrorism if it is not violent. Bitcoins are not imposed on anyone by force, so they are not terrorism. Also, bitcoins are not "domestic"; Bitcoin is a worldwide project. Look at the auto-generated node map.

Myth #9. Bitcoins will only facilitate tax evasion, which will lead to the eventual fall of civilization
It is up to you whether you follow the laws of your country or face the consequences of breaking them.

Myth #10. Anyone can print/mint bitcoins, therefore they're useless
Generating coins requires significant computing power, and moreover, over time all the coins will have been generated.

Myth #11. Bitcoins are useless because they are based on unverified/unproven cryptography
The SHA-256 and ECDSA algorithms used in the Bitcoin software are well-known industrial cryptographic standards.

Myth #12. Early bitcoin users are unfairly rewarded
The first users were rewarded for taking on the higher risk of losing their time and money. More pragmatically, "fairness" is a subjective notion that a large number of people are unlikely to ever agree on; establishing "fairness" is not a goal of the Bitcoin project, as it would simply be impossible. The vast majority of the 21 million bitcoins have still not been distributed. If you start generating or purchasing bitcoins today, you can become one of those "early users" yourself.

Myth #13.
21 million coins is not enough; it is not commensurate with the needs of mankind
In fact, the Bitcoin network will ultimately contain 2,099,999,997,690,000 (just over two quadrillion) of the smallest indivisible units. One bitcoin is 100 million (one hundred million) of them; in other words, each bitcoin can be divided into 10^8 parts. If the value of bitcoins rises too much, people can simply work with smaller denominations such as milli-bitcoins (mBTC) and micro-bitcoins (µBTC) for convenience. Redenomination with coefficients of 1:10, 1:100 and so on is also possible.

Myth #14. Bitcoins are stored in wallet files; just copy the wallet and get more coins!
No. Your wallet file contains the secret private keys that give you the right to spend your bitcoins. Imagine you have a key issued by your bank to manage your account. If you give a copy of it to someone else, that does not increase the funds in your account; the funds can be spent either by you or by that third party.

Myth #15. Lost coins cannot be replaced, which is bad
The minimum bitcoin unit is 0.00000001, so this is not a problem. If you lose coins, all other coins rise slightly in price. Consider it a donation to all other bitcoin users. There is a related question: why is there no mechanism to replace lost coins? Because it is impossible to distinguish a lost coin from one that is simply not being used at the moment, sitting in someone's wallet waiting for its time to be useful.

Myth #16. It's a giant pyramid scheme
In financial pyramids (see Ponzi scheme and MMM), the founders convince investors that they will profit. Bitcoin makes no such guarantees. There is no regulatory center; there is just a group of people building a new economy. However, one should not confuse bitcoins themselves with the various projects on the Internet that accept bitcoins as a deposit and may well be pyramids.

Myth #17.
Limited emission and lost coins generate a deflationary spiral
Both deflationary forces may manifest themselves, but economic factors such as hoarding counteract the human factor, which reduces the chances of a deflationary spiral.

Myth #18. The idea of Bitcoin cannot work because there is no way to control inflation
Inflation is simply a rise in prices over time, generally the consequence of a depreciating currency; it is a function of supply and demand. Given that the supply of bitcoins is fixed (due to the way they are issued), unlike fiat money, the only way inflation could get out of control is for demand for bitcoins to disappear. It should also be remembered that bitcoins are a currency with a predictable, decentralized issuance. If demand falls to almost zero, bitcoins are doomed in any case, but that is unlikely to actually happen. The key point is that bitcoins cannot be devalued by a sharp increase in inflation by any person, organization or government, since there is no way to greatly increase the supply given how issuance works. In fact, the more likely scenario is growing demand for bitcoins due to growing popularity, which should lead to a steadily rising exchange rate and deflation.

Myth #19. The Bitcoin community consists of anarchists, conspiracy theorists, gold-standard advocates and geeks
Confirmed. However, keep in mind that this is only part of the full spectrum of the community.
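The exact figure quoted in Myth #13 follows directly from Bitcoin's issuance schedule: the block subsidy starts at 50 BTC (5,000,000,000 of the smallest units) and is halved, with integer truncation, every 210,000 blocks. A short sketch to check the arithmetic (variable names are mine, not from any particular library):

```python
# Sum Bitcoin's block subsidies over all halving eras, counted in the
# smallest indivisible units (1 BTC = 100,000,000 units).
subsidy = 50 * 100_000_000   # initial reward per block, in smallest units
blocks_per_era = 210_000     # blocks between halvings
total = 0
while subsidy > 0:
    total += blocks_per_era * subsidy
    subsidy //= 2            # integer halving, as the protocol truncates
print(total)                 # 2099999997690000 -- just over two quadrillion
```

The integer division is what makes the total come out slightly below 21 million × 10^8: the last fractions of a unit in each era are simply never created.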
Nice Article About How HPB Performs vs. EOS (and thus ETH)
Abstract. Bitcoin, being the most successful cryptocurrency, has been repeatedly attacked, with many users losing their funds. The industry's response to securing users' assets has been to offer tamper-resistant hardware wallets. Although such wallets are considered the most secure means of managing an account, no formal attempt had previously been made to identify, model and formally verify their properties. This paper provides the first formal model of Bitcoin hardware wallet operations. We identify the properties and security parameters of a Bitcoin wallet and formally define them in the Universal Composition (UC) Framework. We present a modular treatment of a hardware wallet ecosystem by realizing the wallet functionality in a hybrid setting defined by a set of protocols. This approach allows us to capture in detail the wallet's components, their interaction and the potential threats. We deduce the wallet's security by proving that it is secure under common cryptographic assumptions, provided that there is no deviation in the protocol execution. Finally, we define the attacks that succeed under a protocol deviation, and analyze the security of commercially available wallets.

References
Alois, J.: Ethereum Parity hack may impact ETH 500,000 or $146 million (2017)
Atzei, N., Bartoletti, M., Lande, S., Zunino, R.: A formal model of bitcoin transactions. Financial Cryptography and Data Security. LNCS, Springer (2018)
Badertscher, C., Maurer, U., Tschudi, D., Zikas, V.: Bitcoin as a transaction ledger: A composable treatment. In: Advances in Cryptology – CRYPTO 2017. pp. 324–356. Springer (2017)
Bamert, T., Decker, C., Wattenhofer, R., Welten, S.: Bluewallet: The secure bitcoin wallet. In: International Workshop on Security and Trust Management. pp. 65–80. Springer (2014)
Bonneau, J., Miller, A., Clark, J., Narayanan, A., Kroll, J.A., Felten, E.W.: SoK: Research perspectives and challenges for bitcoin and cryptocurrencies. In: Security and Privacy (SP), 2015 IEEE Symposium on. pp. 104–121. IEEE (2015)
Canetti, R.: Universally composable security: A new paradigm for cryptographic protocols. In: 42nd IEEE Symposium on Foundations of Computer Science (FOCS). pp. 136–145. IEEE (2001)
Canetti, R.: Universally composable signatures, certification and authentication. Cryptology ePrint Archive, Report 2003/239 (2003), http://eprint.iacr.org/2003/239
Canetti, R., Krawczyk, H.: Universally composable notions of key exchange and secure channels. Cryptology ePrint Archive, Report 2002/059 (2002), http://eprint.iacr.org/2002/059
Garay, J., Kiayias, A., Leonardos, N.: The bitcoin backbone protocol: Analysis and applications. In: Annual International Conference on the Theory and Applications of Cryptographic Techniques. pp. 281–310. Springer (2015)
Gentilal, M., Martins, P., Sousa, L.: Trustzone-backed bitcoin wallet. In: Proceedings of the Fourth Workshop on Cryptography and Security in Computing Systems. pp. 25–28. ACM (2017)
Gkaniatsou, A., Arapinis, M., Kiayias, A.: Low-level attacks in bitcoin wallets. In: International Conference on Information Security. pp. 233–253. Springer (2017)
Heilman, E., Kendler, A., Zohar, A., Goldberg, S.: Eclipse attacks on bitcoin’s peer-to-peer network. In: 24th USENIX Security Symposium (USENIX Security 15). USENIX Association (2015)
Hsiao, H.C., Lin, Y.H., Studer, A., Studer, C., Wang, K.H., Kikuchi, H., Perrig, A., Sun, H.M., Yang, B.Y.: A study of user-friendly hash comparison schemes. In: Computer Security Applications Conference, 2009. ACSAC’09. Annual. pp. 105–114. IEEE (2009)
Johnson, D., Menezes, A., Vanstone, S.: The elliptic curve digital signature algorithm (ECDSA). International Journal of Information Security 1(1), 36–63 (2001)
Lim, I.K., Kim, Y.H., Lee, J.G., Lee, J.P., Nam-Gung, H., Lee, J.K.: The analysis and countermeasures on security breach of bitcoin. In: International Conference on Computational Science and Its Applications. pp. 720–732. Springer (2014)
Nakamoto, S.: Bitcoin: A peer-to-peer electronic cash system (2008)
Pass, R., Seeman, L., Shelat, A.: Analysis of the blockchain protocol in asynchronous networks. In: Annual International Conference on the Theory and Applications of Cryptographic Techniques. pp. 643–673. Springer (2017)
Penard, W., van Werkhoven, T.: On the secure hash algorithm family. Cryptography in Context pp. 1–18 (2008)
Tan, J., Bauer, L., Bonneau, J., Cranor, L.F., Thomas, J., Ur, B.: Can unicorns help users compare crypto key fingerprints? In: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. pp. 3787–3798. ACM (2017)
Uzun, E., Karvonen, K., Asokan, N.: Usability analysis of secure pairing methods. In: International Conference on Financial Cryptography and Data Security. pp. 307–324. Springer (2007)
Vasek, M., Bonneau, J., Castellucci, R., Keith, C., Moore, T.: The bitcoin brain drain: a short paper on the use and abuse of bitcoin brain wallets. In: Financial Cryptography and Data Security. LNCS, Springer (2016)
Volotikin, S.: Software attacks on hardware wallets. Black Hat USA 2018 (2018)
The algorithm we are going to look at is ECDSA, a variant of the Digital Signature Algorithm applied to elliptic curves. ECDSA works on the hash of the message rather than on the message itself. The choice of hash function is up to us, but it should be obvious that a cryptographically secure hash function must be chosen. ECDSA stands for Elliptic Curve Digital Signature Algorithm: a Digital Signature Algorithm (DSA) that uses an elliptic curve cipher. The Bitcoin network uses it to ensure that only authorized parties can spend their bitcoins. The Bitcoin protocol currently uses this particular signature algorithm, ECDSA. In the not-too-distant future, a new signature algorithm called Schnorr signatures is expected to become a valid signature type on the Bitcoin network. Schnorr signatures help with two thorny problems in the Bitcoin network: scaling and ...
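To make the sign/verify flow concrete, here is a minimal pure-Python sketch of ECDSA over secp256k1 (Bitcoin's curve), using only textbook formulas. This is an educational toy, not an implementation to use: it is not constant-time, it skips edge-case checks, and in real use the nonce k must be fresh and unpredictable for every signature (e.g. derived per RFC 6979), since a repeated or biased k leaks the private key. The fixed key and nonce below are arbitrary demo values.

```python
import hashlib

# secp256k1 domain parameters (SEC 2): field prime, group order, base point.
p = 2**256 - 2**32 - 977
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def inv(a, m):
    """Modular inverse (Python 3.8+)."""
    return pow(a, -1, m)

def add(P, Q):
    """Elliptic-curve point addition; None is the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    if P[0] == Q[0] and (P[1] + Q[1]) % p == 0:
        return None                                  # P + (-P) = infinity
    if P == Q:
        lam = 3 * P[0] * P[0] * inv(2 * P[1], p) % p  # tangent slope
    else:
        lam = (Q[1] - P[1]) * inv(Q[0] - P[0], p) % p # chord slope
    x = (lam * lam - P[0] - Q[0]) % p
    return (x, (lam * (P[0] - x) - P[1]) % p)

def mul(k, P):
    """Scalar multiplication by double-and-add (NOT constant-time)."""
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

def sign(d, z, k):
    """Sign hash z with private key d and nonce k: r = (kG).x, s = k^-1(z + rd)."""
    r = mul(k, G)[0] % n
    s = inv(k, n) * (z + r * d) % n
    return (r, s)

def verify(Q, z, sig):
    """Verify (r, s) against public key Q: check (u1*G + u2*Q).x mod n == r."""
    r, s = sig
    u1 = z * inv(s, n) % n
    u2 = r * inv(s, n) % n
    return add(mul(u1, G), mul(u2, Q))[0] % n == r

d = 0x1234567890ABCDEF            # toy private key -- demo only
Q = mul(d, G)                     # public key Q = dG
z = int.from_bytes(hashlib.sha256(b"hello bitcoin").digest(), "big")
sig = sign(d, z, k=0x5555AAAA)    # fixed k ONLY for the demo
print(verify(Q, z, sig))          # True
```

Note that verification never needs the private key: it only checks an algebraic relation between the hash, the signature and the public point Q, which is exactly why only the holder of d can produce valid signatures.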
Dev++ 01-01-EN Foundational Math, ECDSA and Transactions - Jimmy Song
Jimmy Song explains the basics of cryptography that serve as a foundation for Bitcoin transactions. This course provides in-depth coverage of the Elliptic Curve Digital Signature Algorithm (ECDSA).