
Searching for the Unicorn Cryptocurrency

For someone first starting out as a cryptocurrency investor, a trustworthy manual for screening a cryptocurrency’s merits simply does not exist; we are still in the early, Wild West days of the cryptocurrency market. One would need to become deeply familiar with the inner workings of blockchain just to perform the bare minimum of due diligence.
One might come to believe, over time, that finding the perfect cryptocurrency is nothing short of futile. If a cryptocurrency purports infinite scalability, it is probably either lightweight with limited features or highly centralized among a limited number of nodes that perform consensus services, especially under Proof of Stake or Delegated Proof of Stake. Similarly, a cryptocurrency that purports comprehensive privacy may have technical obstacles to overcome if it aims to expand its applications, such as into smart contracts. The bottom line is that it is extremely difficult for one cryptocurrency to have every important feature jam-packed into itself.
The cryptocurrency space is stuck in the era of “dial-up internet,” in a manner of speaking. Currently blockchain can’t scale – not without certain tradeoffs – and it hasn’t fully resolved certain intractable issues, such as user-unfriendly long addresses and a blockchain size that grows forever, to name two.
In other words, we haven’t found the ultimate cryptocurrency. That is, we haven’t found the mystical unicorn cryptocurrency that ushers the era of decentralization while eschewing all the limitations of traditional blockchain systems.
“But wait – what about Ethereum once it implements sharding?”
“Wouldn’t IOTA be able to scale infinitely with smart contracts through its Qubic offering?”
“Isn’t Dash capable of having privacy, smart contracts, and instantaneous transactions?”
Those thoughts and comments may come from cryptocurrency investors who have done their research. It is natural for informed investors to invest in projects believed to bring cutting-edge technological transformation to blockchain. Sooner or later, though, the sinking realization hits that any variation of current blockchain technology will likely always have certain limitations.
Let us pretend that there indeed exists a unicorn cryptocurrency somewhere that may or may not be here yet. What would it look like, exactly? Let us set the 5 criteria of the unicorn cryptocurrency:
Unicorn Criteria
(1) Perfectly solves the blockchain trilemma:
o Infinite scalability
o Full security
o Full decentralization
(2) Zero or minimal transaction fee
(3) Full privacy
(4) Full smart contract capabilities
(5) Fair distribution and fair governance
For each of the above 5 criteria, there would not be any middle ground. For example, a cryptocurrency with just an in-protocol mixer would not be considered to have full privacy. As another example, an Initial Coin Offering (ICO) may violate criterion (5), since with an ICO the distribution and governance are often heavily favored towards an oligarchy – this in turn would defy the spirit of decentralization that Bitcoin was founded on.
There is no cryptocurrency currently that fits the above profile of the unicorn cryptocurrency. Let us examine an arbitrary list of highly hyped cryptocurrencies that meet the above list at least partially. The following list is by no means comprehensive but may be a sufficient sampling of various blockchain implementations:
Bitcoin (BTC)
Bitcoin is the very first and the best known cryptocurrency, the one that started it all. While Bitcoin is generally considered extremely secure, it suffers from mining centralization to a degree. Bitcoin is not anonymous, lacks smart contracts, and, most worrisomely, can only do about 7 transactions per second (TPS). Bitcoin is not the unicorn, notwithstanding all the Bitcoin maximalists.
Ethereum (ETH)
Ethereum is widely considered the gold standard for smart contracts, its scalability problem aside. Sharding, as part of the Casper release, is generally considered to be the solution to Ethereum’s scalability problem.
The goal of sharding is to split validating responsibilities among various groups, or shards. Ethereum’s sharding comes down to duplicating the existing blockchain architecture and sharing a token across the copies. This does not solve the core issue and simply kicks the can further down the road; after all, full nodes still need to exist one way or another.
Ethereum’s blockchain size problem is also an issue as will be explained more later in this article.
As a result, Ethereum is not the unicorn due to its incomplete approach to scalability and, to a degree, security.
Dash (DASH)
Dash’s masternodes are widely considered to be centralized due to their high funding requirements, and there are accounts of a pre-mine in the beginning. Dash is not the unicorn due to its questionable decentralization.
Nano (NANO)
Nano rightfully boasts of its instant, free transactions, but it lacks smart contracts and privacy, and it may be exposed to well-orchestrated DDoS attacks. It therefore goes without saying that Nano is not the unicorn.
EOS (EOS)
While EOS claims to execute millions of transactions per second, a quick glance reveals centralized parameters with 21 block-producing nodes and a questionable governance system. Therefore, EOS fails to achieve unicorn status.
Monero (XMR)
One of the best known and respected privacy coins, Monero lacks smart contracts and may fall short of infinite scalability due to CryptoNote’s design. The unicorn rank is out of Monero’s reach.
IOTA (MIOTA)
IOTA’s scalability depends on the number of transactions the network processes, so its supposedly infinite scalability fluctuates with the whims of the underlying transaction volume. While IOTA’s scalability approach is innovative and may work in the long term, remember that the unicorn cryptocurrency allows no middle ground: it would be expected to scale infinitely, and consistently, from the beginning.
In addition, IOTA’s Masked Authenticated Messaging (MAM) feature does not bring privacy to the masses in a highly convenient manner. Consequently, the unicorn is not found with IOTA.

PascalCoin as a Candidate for the Unicorn Cryptocurrency
Please allow me to present a candidate for the cryptocurrency unicorn: PascalCoin.
According to the website, PascalCoin claims the following:
“PascalCoin is an instant, zero-fee, infinitely scalable, and decentralized cryptocurrency with advanced privacy and smart contract capabilities. Enabled by the SafeBox technology to become the world’s first blockchain independent of historical operations, PascalCoin possesses unlimited potential.”
The above summary is a mouthful, to be sure, but let’s take a deep dive into how PascalCoin innovates with the SafeBox and more. Before we do, I encourage you to first become acquainted with PascalCoin by watching its video introduction.
The rest of this section will be split into 10 parts in order to illustrate most of the notable features of PascalCoin. Naturally, let’s start off with the SafeBox.
Part #1: The SafeBox
Unlike traditional UTXO-based cryptocurrencies, in which the blockchain records the specifics of each transaction (receiver address, sender address, amount of funds transferred, etc.), the blockchain in PascalCoin is only used to mutate the SafeBox. The SafeBox is a separate but equivalent cryptographic data structure that snapshots account balances. PascalCoin’s blockchain is comparable to a machine that feeds the most important data – namely, the state of an account – into the SafeBox. Any node can still independently compute and verify the cumulative Proof-of-Work required to construct the SafeBox.
The PascalCoin whitepaper elegantly highlights the unique historical independence that the SafeBox possesses:
“While there are approaches that cryptocurrencies could use such as pruning, warp-sync, "finality checkpoints", UTXO-snapshotting, etc, there is a fundamental difference with PascalCoin. Their new nodes can only prove they are on most-work-chain using the infinite history whereas in PascalCoin, new nodes can prove they are on the most-work chain without the infinite history.”
Some cryptocurrency old-timers might instinctively balk at the idea of full nodes eschewing the entire history for security, but such a reaction would showcase a lack of understanding of what the SafeBox really does.
A concrete example would go a long way to best illustrate what the SafeBox does. Let’s say I input the following operations in my calculator:
5 * 5 – 10 / 2 + 5
It does not take a genius to calculate the answer, 25. Now, the expression “5 * 5 – 10 / 2 + 5” would be forever imbued in a traditional blockchain’s history. But the SafeBox begs to differ: it says that the expression “5 * 5 – 10 / 2 + 5” should instead simply be “25,” so as to preserve simplicity, time, and space. In other words, the SafeBox simply preserves the account balance.
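A tiny Python sketch (with illustrative account names and amounts, not real PascalCoin data) makes the contrast concrete: a history-replaying node and a snapshot-keeping node arrive at the same balances, but only the former must keep the path:

```python
# Toy contrast between keeping a full transaction history and keeping
# only a balance snapshot (the SafeBox idea). Accounts and amounts
# are illustrative, not real PascalCoin data.

def replay_history(history):
    """A traditional node: recompute balances by replaying every transaction."""
    balances = {}
    for sender, receiver, amount in history:
        balances[sender] = balances.get(sender, 0) - amount
        balances[receiver] = balances.get(receiver, 0) + amount
    return balances

def snapshot(balances, history_chunk):
    """A SafeBox-style node: fold a chunk of operations into the snapshot;
    the chunk itself can then be discarded."""
    new_balances = dict(balances)
    for sender, receiver, amount in history_chunk:
        new_balances[sender] = new_balances.get(sender, 0) - amount
        new_balances[receiver] = new_balances.get(receiver, 0) + amount
    return new_balances

history = [("alice", "bob", 10), ("bob", "carol", 4), ("carol", "alice", 1)]

# Both approaches agree on the final state; only the snapshot forgets the path.
assert replay_history(history) == snapshot({}, history)
print(snapshot({}, history))  # {'alice': -9, 'bob': 6, 'carol': 3}
```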
But some might still be unsatisfied and claim that if one cannot trace the series of operations (transactions) that lead to the final number (balance) of 25, the blockchain is inherently insecure.
Here are four important security aspects of the SafeBox that some people fail to realize:
(1) SafeBox Follows the Longest Chain of Proof-of-Work
The SafeBox mutates itself every 100 blocks. Each new SafeBox mutation must reference both the previous SafeBox mutation and the preceding 100 blocks in order to be valid; the resultant hash of the new, mutated SafeBox must then be referenced by each of the subsequent blocks, and the process repeats itself forever.
The fact that each new SafeBox mutation must reference the previous SafeBox mutation is comparable to relying on the entire history. This is because the previous SafeBox mutation encapsulates the cumulative result of the entire history except for the latest 100 blocks, which is why each new SafeBox mutation requires both the previous SafeBox mutation and the preceding 100 blocks.
So in a sense, there is a single interconnected chain of inflows and outflows, supported by Byzantine Proof-of-Work consensus, instead of the entire history of transactions.
More concretely, the SafeBox follows the path of the longest chain of Proof-of-Work simply by design, and is thus cryptographically equivalent to the entire history even without tracing specific operations in the past. If the chain is rolled back with a 51% attack, only the attacker’s own account(s) in the SafeBox can be manipulated as is explained in the next part.
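A hedged Python sketch of this chaining, with a hypothetical serialization (the real SafeBox encoding differs), shows why tampering with any old block breaks every later SafeBox hash:

```python
import hashlib

def h(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def mutate_safebox(prev_safebox_hash: str, block_hashes: list) -> str:
    """Fold the previous SafeBox hash and a batch of 100 block hashes
    into the next SafeBox hash. The serialization here is hypothetical."""
    assert len(block_hashes) == 100
    payload = prev_safebox_hash + "".join(block_hashes)
    return h(payload.encode())

# Genesis SafeBox, then two mutations over two batches of 100 blocks each.
safebox = h(b"genesis")
batch1 = [h(f"block-{i}".encode()) for i in range(100)]
batch2 = [h(f"block-{i}".encode()) for i in range(100, 200)]

after_100 = mutate_safebox(safebox, batch1)
after_200 = mutate_safebox(after_100, batch2)

# Tampering with any historical block changes every later SafeBox hash,
# which is why the snapshot is as binding as the full history.
tampered = batch1.copy()
tampered[42] = h(b"forged block")
assert mutate_safebox(safebox, tampered) != after_100
```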
(2) A 51% Attack on PascalCoin Functions the Same as Others
A 51% attack on PascalCoin would work in a similar way as with other Proof-of-Work cryptocurrencies. An attacker cannot modify a transaction in the past without affecting the current SafeBox hash which is accepted by all honest nodes.
Someone might claim that if you roll back all the current blocks plus the 100 blocks prior to the SafeBox’s last mutation, one could create a forged SafeBox with different balances for all accounts. This is incorrect: with a 51% attack one would be able to manipulate only his or her own account(s) in the SafeBox – just as with other UTXO cryptocurrencies. The SafeBox stores the balances of all accounts, which are in turn irreversibly linked to their respective owners’ private keys.
(3) One Could Preserve the Entire History of the PascalCoin Blockchain
Nothing in PascalCoin’s design forces blockchain data to be deleted, even in the presence of the SafeBox. Since the SafeBox is cryptographically equivalent to a full node with the entire history, as explained above, PascalCoin full nodes are not expected to retain the infinite history. But for whatever reason(s) one may have, one could still keep the entire PascalCoin blockchain history alongside the SafeBox as an option, even though it would be redundant.
Without storing the entire history of the PascalCoin blockchain, you can still trace the specific operations of the 100 blocks prior to the point at which the SafeBox absorbs and reflects their net result (a single balance for each account). But if you’re interested in tracing operations further back in the past – as redundant as that may be – you have the option to do so by storing the entire history of the PascalCoin blockchain.
(4) The SafeBox is Equivalent to the Entire Blockchain History
Some skeptics may ask: “What if the SafeBox is forever lost? How would you be able to verify your accounts?” Asking this is tantamount to asking what would happen to Bitcoin if its entire history were erased. The result would be chaos, of course, but the SafeBox is still in line with the general security model of a traditional blockchain with respect to such black swans.
Now that we know the security of the SafeBox is not compromised, what are the implications of this new blockchain paradigm? Even a colorful illustration wouldn’t do justice to the subtle revolution that the SafeBox ushers in. The automobiles we see on the street are the bread-and-butter representation of traditional blockchain systems; the SafeBox supercharges those traditional cars into the Transformers of Michael Bay’s films.
The SafeBox is an entirely different blockchain architecture, impressive in its simplicity and ingenuity, and its design is only the opening act for PascalCoin’s vast arsenal. If the above were all that PascalCoin offered, it still wouldn’t come close to achieving unicorn status – but luckily, we have just scratched the surface. Keep reading if you want to learn how PascalCoin aims to shake the cryptocurrency industry to its core. Buckle up, as this is going to be a long read while we explore the SafeBox’s implications further.
Part #2: 0-Confirmation Transactions
To begin, 0-confirmation transactions are secure in PascalCoin thanks to the SafeBox.
The following paraphrases an explanation of PascalCoin’s 0-confirmations from the whitepaper:
“Since PascalCoin is not a UTXO-based currency but rather a State-based currency thanks to the SafeBox, the security guarantee of 0-confirmation transactions are much stronger than in UTXO-based currencies. For example, in Bitcoin if a merchant accepts a 0-confirmation transaction for a coffee, the buyer can simply roll that transaction back after receiving the coffee but before the transaction is confirmed in a block. The way the buyer does this is by re-spending those UTXOs to himself in a new transaction (with a higher fee) thus invalidating them for the merchant. In PascalCoin, this is virtually impossible since the buyer's transaction to the merchant is simply a delta-operation to debit/credit a quantity from/to accounts respectively. The buyer is unable to erase or pre-empt this two-sided, debit/credit-based transaction from the network’s pending pool until it either enters a block for confirmation or is discarded with respect to both sender and receiver ends. If the buyer tries to double-spend the coffee funds after receiving the coffee but before they clear, the double-spend transaction will not propagate the network since nodes cannot propagate a double-spending transaction thanks to the debit/credit nature of the transaction. A UTXO-based transaction is initially one-sided before confirmation and therefore is more exposed to one-sided malicious schemes of double spending.”
Phew, that explanation was technical, but it had to be done. In summary, PascalCoin possesses the only secure 0-confirmation transactions in the cryptocurrency industry, and it goes without saying that this makes PascalCoin extremely fast. In fact, PascalCoin is capable of 72,000 TPS even prior to any additional extensive optimizations down the road. In other words, PascalCoin is as instant as it gets and gives Nano a run for its money.
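The debit/credit protection described in the quote can be sketched as a toy pending pool: in an account-based model, a node simply refuses to relay a second spend that the sender’s balance can no longer cover. The class and conflict rule below are simplified illustrations, not PascalCoin’s actual node logic:

```python
# Toy pending pool for an account-based (debit/credit) model: a second
# spend that overdraws the sender's available balance never propagates.

class PendingPool:
    def __init__(self, balances):
        self.balances = dict(balances)   # confirmed balances
        self.pending = []                # accepted but unconfirmed operations

    def available(self, account):
        spent = sum(amt for snd, _, amt in self.pending if snd == account)
        return self.balances.get(account, 0) - spent

    def submit(self, sender, receiver, amount):
        """Nodes accept an operation only if the sender can still cover it
        after all pending debits; a double spend is simply dropped."""
        if self.available(sender) < amount:
            return False
        self.pending.append((sender, receiver, amount))
        return True

pool = PendingPool({"buyer": 5})
assert pool.submit("buyer", "merchant", 5) is True   # coffee payment
assert pool.submit("buyer", "accomplice", 5) is False  # double spend rejected
```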
Part #3: Zero Fee
Let’s circle back to PascalCoin’s 0-confirmation capability, because there’s a fun magical twist to it: 0-confirmation transactions are also zero-fee. You don’t pay a single cent in fees for a 0-confirmation transaction! There is just one tiny catch: if you create a second transaction within the same 5-minute block window, you pay a minimal fee. Imagine using Nano, but with significantly stronger anti-DDoS protection against spam! Even then there shouldn’t be any complaints, as this fee amounts to 0.0001 Pascal, or $0.00002 at the price of a Pascal at the time of this writing.
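The fee rule just described boils down to a one-liner; the sketch below is a simplified illustration of it, not the client’s actual code:

```python
# Simplified sketch of the fee rule described above: the first operation
# an account sends inside a block window is free; further operations in
# the same window pay a minimal fee. The constant matches the article.

MIN_FEE = 0.0001  # Pascal

def fee_for(ops_already_sent_this_window: int) -> float:
    return 0.0 if ops_already_sent_this_window == 0 else MIN_FEE

assert fee_for(0) == 0.0       # first transaction in the ~5-minute window
assert fee_for(1) == MIN_FEE   # a second one pays 0.0001 Pascal
```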
So, how come the fee for blazingly fast transactions is nonexistent? This is where the magic of the SafeBox arises in three ways:
(1) PascalCoin possesses the secure 0-confirmation feature as discussed above that enables this speed.
(2) There is no fee bidding competition of transaction priority typical in UTXO cryptocurrencies since, once again, PascalCoin operates on secure 0-confirmations.
(3) There is no fee incentive needed to run full nodes on behalf of the network’s security beyond the consensus rewards.
Part #4: Blockchain Size
Let’s expand on the third point above, using Ethereum as an example. Since Ethereum’s launch in 2015, its full blockchain has grown to around 2 TB, give or take – but let’s just call it 100 GB for now, to avoid offending the Ethereum elitists who insist there are lighter kinds of full nodes. Whoever runs Ethereum’s full nodes would expect storage fees on top of the typical consensus fees, as it takes significant resources to shoulder Ethereum’s full blockchain and, in turn, secure the network. What if I told you that PascalCoin’s full blockchain size will never exceed a few GB, even after thousands of years? That is exactly what the SafeBox enables PascalCoin to do. It is estimated that by 2072, PascalCoin’s full nodes will take only about 6 GB, which is low enough not to warrant any fee incentives for hosting full nodes. Remember, the SafeBox is an ultra-light cryptographic data structure that is cryptographically equivalent to a blockchain with the entire transaction history. In other words, the SafeBox is a compact spreadsheet of all account balances that functions as PascalCoin’s full node!
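As a sanity check on the 6 GB estimate, here is a back-of-envelope sketch. It assumes a 5-minute block time and 5 new accounts minted per block, figures commonly cited for PascalCoin, plus an assumed 200 bytes per account entry (the real encoding differs):

```python
# Back-of-envelope check of the 2072 figure above. Block cadence and
# accounts-per-block are commonly cited PascalCoin parameters; the
# bytes-per-account size is an assumption for illustration only.

BLOCKS_PER_YEAR = 365 * 24 * 12        # one block every 5 minutes
ACCOUNTS_PER_BLOCK = 5
BYTES_PER_ACCOUNT = 200                # assumed encoding size

years = 2072 - 2016                    # since PascalCoin's launch
accounts = BLOCKS_PER_YEAR * ACCOUNTS_PER_BLOCK * years
print(f"{accounts:,} accounts, ~{accounts * BYTES_PER_ACCOUNT / 10**9:.1f} GB")
# 29,433,600 accounts, ~5.9 GB
```

Under these assumptions the SafeBox stays in the single-digit-GB range after half a century, consistent with the article’s estimate.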
Not only does the SafeBox’s tiny footprint help reduce transaction fees by phasing out storage fees, but it also paves the way for true decentralization. It would be trivial for every PascalCoin user to run a full node in the form of a wallet. This is extreme decentralization at its finest, since the majority of users of other cryptocurrencies ditch full nodes due to their burdensome sizes. It is naïve to believe that storage costs will fall far enough to make hosting full nodes trivial. Take a look at the following chart outlining the trend of storage cost.

As we can see, storage costs continue to decrease, but the descent is slowing down, as is the norm with technological improvements. In the meantime, the blockchain sizes of other cryptocurrencies are increasing linearly or, in the case of smart contract engines like Ethereum, parabolically. Imagine a smart contract engine like Ethereum garnering worldwide adoption; what do you think Ethereum’s size would look like in the far future based on the following chart?

Ethereum’s future blockchain size is not looking pretty in terms of sustainable security. Sharding is not a fix for this issue since there still needs to be full nodes but that is a different topic for another time.
It is astonishing that the cryptocurrency community as a whole has passively accepted this forever-expanding-blockchain-size problem as an inescapable fate.
PascalCoin is the only cryptocurrency that has fully escaped the death vortex of forever expanding blockchain size. Its blockchain size wouldn’t exceed 10 GB even after many hundreds of years of worldwide adoption. Ethereum’s blockchain size after hundreds of years of worldwide adoption would make fine comedy.
Part #5: Simple, Short, and Ordinal Addresses
Remember how the SafeBox works by snapshotting all account balances? As it turns out, the account address system is almost as cool as the SafeBox itself.
Imagine yourself in this situation: on a very hot and sunny day, you’re wandering down the street across from your house and run into a lemonade stand – the old-fashioned kind without any QR code or credit card terminal. The kid across from you is selling a cup of lemonade for 1 Pascal, with a poster listing the payment address as 5471-55. You flip out your phone, click “Send” with 1 Pascal to the address 5471-55, and voilà – exactly one second later you’re drinking your lemonade without having paid a cent in transaction fees!
The last thing you’d want is to figure out how to copy and paste, say, the address 1BoatSLRHtKNngkdXEeobR76b53LETtpyT on the spot, wouldn’t it? Gone are the obnoxiously long addresses that plague all cryptocurrencies. The days of those unreadable addresses have to be numbered if blockchain is to reach the general public. EOS has a similar feature for readable addresses, but in a far more limited manner by comparison – and nicknames attached to addresses in GUIs don’t count, since they aren’t compatible blockchain-wide.
Not only does PascalCoin have the neat feature of short addresses (called PASAs) of up to 6 or 7 digits, but it can also incorporate in-protocol address naming, as opposed to GUI address nicknames. Suppose I want to order something from Amazon using Pascal; I simply search the word “Amazon” and the corresponding account number shows up. Pretty neat, right?
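The account format pairs an ordinal number with a short checksum so that a typo is caught before any money moves. The sketch below uses a hypothetical mod-97 checksum purely for illustration; it is not PascalCoin’s actual checksum formula:

```python
# Illustrative "NNNN-CC" account format: an ordinal account number plus a
# two-digit checksum. The mod-97 rule is hypothetical, chosen only to
# show why a mistyped address fails validation instead of losing funds.

def make_pasa(account_number: int) -> str:
    checksum = account_number % 97
    return f"{account_number}-{checksum:02d}"

def is_valid(pasa: str) -> bool:
    number = pasa.split("-")[0]
    return make_pasa(int(number)) == pasa

addr = make_pasa(5471)
assert is_valid(addr)
assert not is_valid("5471-99")   # a mistyped checksum is rejected
```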
The astute reader may gather that PascalCoin’s address system makes it necessary to commoditize addresses, and he/she would be correct. Some view this as a weakness; part #10 later in this segment addresses this incorrect perception.
Part #6: Privacy
As if the above weren’t enough, here’s another secret PascalCoin holds: it is a full-blown privacy coin. It uses two separate foundations to achieve comprehensive anonymity: an in-protocol mixer for transfer amounts and zk-SNARKs for private balances. The former has been implemented; the latter is on the roadmap. Between its 0-confirmation transactions and negligible fees, PascalCoin would become the most scalable privacy coin of all, pending the zk-SNARKs implementation.
Part #7: Smart Contracts
Next, PascalCoin will take smart contracts to the next level with a layer-2 overlay consensus system that pioneers sidechains and other smart contract implementations.
In formal terms, this layer-2 architecture will facilitate the transfer of data between PASAs which in turn allows clean enveloping of layer-2 protocols inside layer-1 much in the same way that HTTP lives inside TCP.
To summarize:
· The layer-2 consensus method is separate from the layer-1 Proof-of-Work. This layer-2 consensus method is independent and flexible. A sidechain – based on a single encompassing PASA – could apply Proof-of-Stake (POS), Delegated Proof-of-Stake (DPOS), or Directed Acyclic Graph (DAG) as the consensus system of its choice.
· Such a layer-2 smart contract platform can be written in any language.
· Layer-2 sidechains will also provide very strong anonymity since funds are all pooled and keys are not used to unlock them.
· This layer-2 architecture is ingenious in that computation is separate from layer-2 consensus, in effect removing any bottleneck.
· Horizontal scaling exists in this paradigm as there is no interdependence between smart contracts and states are not managed by slow sidechains.
· Speed and scalability are fully independent of PascalCoin.
One would be able to run the entire global financial system on PascalCoin’s infinitely scalable smart contract platform and it would still scale. In fact, this layer-2 architecture would be exponentially faster than Ethereum, even after Ethereum implements sharding.
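The HTTP-inside-TCP analogy can be sketched as follows: a layer-2 message simply rides in the payload of an ordinary layer-1 operation between two PASAs, and layer-2 nodes read only that payload. All field names here are illustrative assumptions, not PascalCoin’s actual operation format:

```python
import json

# Toy sketch of layer-2 enveloping: a layer-2 message rides inside the
# free payload of an ordinary layer-1 operation between two PASAs, much
# like HTTP rides inside TCP. Field names are illustrative.

def wrap_layer2(sender_pasa: str, receiver_pasa: str,
                protocol: str, message: dict) -> dict:
    """Build a layer-1 operation whose payload carries a layer-2 message."""
    return {
        "from": sender_pasa,
        "to": receiver_pasa,
        "amount": 0,                      # value transfer is optional
        "payload": json.dumps({"proto": protocol, "body": message}),
    }

def unwrap_layer2(operation: dict) -> dict:
    """A layer-2 node ignores layer-1 details and reads only the payload."""
    return json.loads(operation["payload"])

op = wrap_layer2("5471-55", "9182-10", "toy-sidechain/1", {"vote": "yes"})
assert unwrap_layer2(op) == {"proto": "toy-sidechain/1", "body": {"vote": "yes"}}
```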
All this is the main focus of PascalCoin’s upcoming version 5 in 2019. A whitepaper add-on for this major upgrade will be released in early 2019.
Part #8: RandomHash Algorithm
Surely there must be some tradeoffs to PascalCoin’s impressive capabilities, you might be thinking. One might bring up the fact that PascalCoin’s layer 1 is based on Proof-of-Work and is thus susceptible to mining centralization. This would be a fallacy: PascalCoin has pioneered the very first truly ASIC-, GPU-, and dual-mining-resistant algorithm, known as RandomHash, which shuts out anything that is not CPU-based and gives the power back to solo miners.
Here is the official description of RandomHash:
“RandomHash is a high-level cryptographic hash algorithm that combines other well-known hash primitives in a highly serial manner. The distinguishing feature is that calculations for a nonce are dependent on partial calculations of other nonces, selected at random. This allows a serial hasher (CPU) to re-use these partial calculations in subsequent mining saving 50% or more of the work-load. Parallel hashers (GPU) cannot benefit from this optimization since the optimal nonce-set cannot be pre-calculated as it is determined on-the-fly. As a result, parallel hashers (GPU) are required to perform the full workload for every nonce. Also, the algorithm results in 10x memory bloat for a parallel implementation. In addition to its serial nature, it is branch-heavy and recursive making it optimal for CPU-only mining.”
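The serial-reuse idea in that description can be caricatured in a toy sketch (nothing below is the real RandomHash): evaluating one nonce requires a partial calculation for a randomly selected neighbor nonce, which a serial CPU miner caches and reuses, while a one-shot parallel miner must redo it for every nonce:

```python
import hashlib

# Toy caricature of RandomHash's serial advantage. All functions here are
# stand-ins for illustration; this is not the actual algorithm.

def partial(nonce: int) -> bytes:
    """Stand-in for an expensive partial hash round for one nonce."""
    return hashlib.sha256(f"partial-{nonce}".encode()).digest()

def full_hash(nonce: int, cache: dict) -> bytes:
    # Which neighbor's partial work is needed is only known on the fly,
    # so a parallel implementation cannot pre-plan the optimal nonce set.
    neighbor = int.from_bytes(partial(nonce)[:4], "big") % 1000
    if neighbor not in cache:
        cache[neighbor] = partial(neighbor)  # a serial miner keeps this
    return hashlib.sha256(partial(nonce) + cache[neighbor]).digest()

cache = {}
for nonce in range(1000):
    full_hash(nonce, cache)

# Many neighbor partials were computed once and then reused across nonces.
print(len(cache), "distinct neighbor partials cached for 1000 nonces")
```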
One might be understandably skeptical of any Proof-of-Work algorithm that claims to solve ASIC and GPU centralization once and for all, because countless proposals for various algorithms have been thrown around since the dawn of Bitcoin. Is RandomHash truly the ASIC & GPU killer it claims to be?
Herman Schoenfeld, the inventor of RandomHash, described his algorithm as follows:
“RandomHash offers endless ASIC-design breaking surface due to its use of recursion, hash algo selection, memory hardness and random number generation.
For example, changing how round hash selection is made and/or random number generator algo and/or checksum algo and/or their sequencing will totally break an ASIC design. Conceptually if you can significantly change the structure of the output assembly whilst keeping the high-level algorithm as invariant as possible, the ASIC design will necessarily require proportional restructuring. This results from the fact that ASIC designs mirror the ASM of the algorithm rather than the algorithm itself.”
Polyminer1 (a pseudonym), a member of the PascalCoin core team who developed RHMiner (the official software for mining RandomHash), claims:
“The design of RandomHash is, to my experience, a genuine innovation. I’ve been 30 years in the field. I’ve rarely been surprised by anything. RandomHash was one of my rare surprises. It’s elegant, simple, and achieves resistance in all fronts.”
PascalCoin may have been the first to win the race for what could be described as the “God algorithm” of Proof-of-Work cryptocurrencies. Look no further than Howard Chu, one of Monero’s core developers since 2015. In September 2018, Howard declared that he had found a solution, called RandomJS, to permanently keep ASICs off the network without repeated algorithm changes – a solution that closely mirrors RandomHash. Discussing his algorithm, Howard asserted that “RandomJS is coming at the problem from a direction that nobody else is.”
Link to Howard Chu’s article on RandomJS:
Yet when Herman was asked about Howard’s approach, he responded:
“Yes, looks like it may work although using Javascript was a bit much. They should’ve just used an assembly subset and generated random ASM programs. In a way, RandomHash does this with its repeated use of random mem-transforms during expansion phase.”
In the end, PascalCoin may have implemented the most revolutionary Proof-of-Work algorithm to date – one that eclipses Howard’s burgeoning vision – and almost nobody knows about it. To learn more about RandomHash, refer to the following resources:
RandomHash whitepaper:
Technical proposal for RandomHash:
Someone might claim that PascalCoin still suffers from mining centralization after RandomHash, and this is somewhat misleading as will be explained in part #10.
Part #9: Fair Distribution and Governance
Not only does PascalCoin rest on superior technology, but it also has its roots in the correct philosophy of decentralized distribution and governance. There was no ICO or pre-mine, and the developer fund exists as a percentage of mining rewards as voted by the community. This developer fund is 100% governed by a decentralized autonomous organization – currently facilitated by the PascalCoin Foundation – that will eventually be transformed into an autonomous smart contract platform. Not only is the developer fund voted upon by the community, but PascalCoin’s development roadmap is also voted upon by the community via Protocol Improvement Proposals (PIPs).
This decentralized governance also serves as a powerful deterrent to the unseemly fork wars that befall many cryptocurrencies.
Part #10: Common Misconceptions of PascalCoin
“The branding is terrible”
PascalCoin is currently working very hard on its image and is preparing several branding and marketing initiatives in the short term. For example, two of PascalCoin’s core developers recently interviewed with the Fox Business Network; a YouTube replay of the interview will be heavily promoted.
Some people object to the name PascalCoin. First, it’s worth noting that PascalCoin is the name of the project, while Pascal is the name of the underlying currency. Second, Google and YouTube also received excessive criticism for their name choices in their early days. Look at where those companies are now – a somewhat similar situation faces PascalCoin until the name’s familiarity percolates into the public consciousness.
“The wallet GUI is terrible”
As the team consists of a small yet extremely dedicated group of developers, multiple priorities can be challenging to juggle. The lack of funding through an ICO or a pre-mine also makes it difficult to accelerate development. The core developers’ top priority is to continue full-time development of the groundbreaking technology that PascalCoin offers. In the meantime, an updated, user-friendly wallet GUI has been in the works for some time and will be released in due course. Rome wasn’t built in a day.
“One would need to purchase a PASA in the first place”
This is a complicated topic, since PASAs need to be commoditized by the SafeBox’s design: they cannot be handed out at no charge without inviting systematic abuse. This raises two seemingly valid concerns:
· As a chicken and egg problem, how would one purchase a PASA using Pascal in the first place if one cannot obtain Pascal without a PASA?
· How would the price of PASAs stay low and affordable in the face of significant demand?
With regards to the chicken and egg problem, there are many ways – some finished and some unfinished – to obtain your first PASA as explained on the “Get Started” page on the PascalCoin website:
More important, however, is the fact that there are a few methods of getting your first PASA for free. The team will also soon release another method by which you could obtain your first PASA for free via a single SMS message – probably by far the simplest and easiest way to do so. More ways to easily obtain a first PASA for free are planned down the road.
What about ensuring the PASA market at large remains inexpensive and affordable following your first (and probably free) PASA acquisition? This would be achieved in two ways:
· Decentralized governance of PASA economics, per the explanation in the FAQ section at the bottom of the PascalCoin website.
· Unlimited and free pseudo-PASAs based on layer-2 in the next version release.
“PascalCoin is still centralized after the release of RandomHash”
Did the implementation of RandomHash from version 4 live up to its promise?
The official goals of RandomHash were as follows:
(1) Implement a GPU & ASIC resistant hash algorithm
(2) Eliminate dual mining
The two goals above were achieved by every possible measure.
Yet one mining pool, Nanopool, was able to regain its hash majority after a significant but temporary dip.
The official conclusion is that, from a probabilistic viewpoint, solo miners are more profitable than pool miners. Pool mining nevertheless remains enticing for miners who 1) have limited hardware, since it provides a steady income instead of the lumpier, probabilistic income of solo mining, and 2) prefer convenient software and/or a GUI.
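That profitability tradeoff is easy to see with a toy simulation. The numbers below are my own illustrative assumptions, not PascalCoin measurements: a miner with 1% of the network hash rate has the same expected income either way, but solo income arrives as rare lumps while pool income is steady minus a fee.

```python
import random

def simulate(p_win=0.01, blocks=10_000, reward=50, pool_fee=0.01, seed=7):
    """Compare income over `blocks` blocks for a miner who finds each
    block with probability p_win (i.e., holds p_win of the hash rate)."""
    rng = random.Random(seed)
    # Solo: the full reward, but only on the rare blocks you win yourself.
    solo = sum(reward for _ in range(blocks) if rng.random() < p_win)
    # Pool: your hash-rate share of every block, minus the pool's fee.
    pool = blocks * reward * p_win * (1 - pool_fee)
    return solo, pool

solo, pool = simulate()
print(solo, pool)  # solo varies with luck; pool is a steady ~4,950
```

Before fees the expected values are equal; the pool simply trades a small fee for predictability, which is exactly why pools attract small miners.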
What is the next step, then? While the barrier to entry for solo miners has been successfully lowered, additional work remains. The PascalCoin team and community are earnestly investigating further steps to improve mining decentralization – specifically with respect to pool mining – on top of RandomHash’s successful elimination of GPU, ASIC, and dual-mining dominance.
It is likely that the PascalCoin community will promote the following two initiatives in the near future:
(1) Establish a community-driven, nonprofit mining pool with attractive incentives.
(2) Optimize RHMiner, PascalCoin’s official solo mining software, for performance upgrades.
Single-pool dominance is likely to be short-lived once more options emerge for individual CPU miners who want to avoid solo mining for whatever reason.
Let us use Bitcoin as an example. Bitcoin mining is dominated by ASICs and mining pools, but no single pool is – at the time of this writing – even close to obtaining the hash majority. With CPU solo mining a feasible option, and ASIC and GPU mining eliminated by RandomHash, PascalCoin’s future hash rate distribution looks far more promising than Bitcoin’s.
PascalCoin is the Unicorn Cryptocurrency
If you’ve read this far, let’s cut straight to the point: PascalCoin IS the unicorn cryptocurrency.
It is worth noting that PascalCoin is still a young cryptocurrency, launched at the end of 2016. This means that many features are still works in progress – zk-SNARKs, smart contracts, and pool decentralization, to name a few. However, all of the unicorn criteria appear to be within PascalCoin’s reach once its technical roadmap is mostly complete.
Based on this exposition of PascalCoin’s technology, there is every reason to believe that PascalCoin is the unicorn cryptocurrency. PascalCoin also solves two fundamental blockchain problems beyond the unicorn criteria that were previously considered unsolvable: blockchain size and a simple address system. The SafeBox pushes PascalCoin to the forefront of the cryptocurrency zeitgeist, as it is a superior solution to UTXO, Directed Acyclic Graph (DAG), Block Lattice, Tangle, and other blockchain innovations.


Author: Tyler Swob
submitted by Kosass to CryptoCurrency

Verge Currency Beginner's Guide

Verge Currency Beginner's Guide
A short Background
2008 brought the worst financial crisis the world had experienced since the Great Depression. The efforts of banks worldwide were not enough to prevent it. Shortly after, someone by the name of Satoshi Nakamoto offered an alternative: a digital currency that removes the need for a central bank. His proposal, written in the Bitcoin white paper, is summarized below:
  • A secure, decentralized network.
  • A system with economic properties.
  • No need for banks or rule makers.
  • Instant transactions without the need for a third party or government approval.
  • Bringing financial services to the world’s 2.5 billion unbanked people.
  • Total financial freedom. No one can freeze your accounts.
  • Low transaction costs. No ridiculously high transaction fees.
  • A currency with a finite supply, where no one can print money whenever they want.
In 2009, when Satoshi Nakamoto launched Bitcoin, the network consisted of computers (in crypto terms, nodes) that approve transactions – the movement of data along the chain. Anyone willing could become a participant, creating a decentralized global network and, with it, a decentralized currency free of the control of politicians or institutions.
The rules can only be changed if 51% of the network agrees on it. This way the network is completely democratized and resistant to hacking attacks.
Unlike today’s financial institutions, no one can freeze your account or prevent you from sending money. You are the only person who truly holds your wealth.
It is an open-source project: anyone can inspect the code and propose or discuss changes with the community. In turn, anyone contributing computational power to the network is paid an incentive in the form of a fractional amount of BTC.
At the core of a secure, decentralized network like Bitcoin lies blockchain technology. To put it simply, the blockchain is like a series of Lego blocks connected to each other by linked information, called transactions. These transactions contain the following data: sender, receiver, and the unique signature of the sender.
The data is converted into a “hash” before being saved into a block. Bitcoin’s hashes are generated using a cryptographic function called SHA-256. This way the information is condensed into a fixed-size fingerprint and saved in the block.
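As a rough illustration of this hashing step, the sketch below condenses a transaction-like string into a fixed-size SHA-256 fingerprint. (Bitcoin actually applies SHA-256 twice to serialized binary data; the human-readable string here is purely for demonstration.)

```python
import hashlib

# Hypothetical, human-readable transaction data -- real transactions
# are serialized binary structures, not strings like this.
tx = "sender=alice;receiver=bob;amount=0.5"

# Bitcoin applies SHA-256 twice ("double-SHA-256").
digest = hashlib.sha256(hashlib.sha256(tx.encode()).digest()).hexdigest()

# The fingerprint is always 64 hex characters, no matter the input size.
print(digest)
```

Change even one character of the input and the digest changes completely, which is what makes the hash a reliable fingerprint for a block’s contents.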
Additionally, each block in the chain contains information from the block before it. If someone tries to maliciously modify a block, the hashes of all the blocks that follow change as well, making the tampering easy to spot.
In this type of network there is only one blockchain, and all the information is kept in a public ledger shared among all participants. For the blockchain to be valid, more than 50% of the participants (nodes, weighted by computational power) must agree on it.
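The tamper-evidence described above can be sketched in a few lines of Python: each block stores the hash of the one before it, so editing any block breaks every link that follows. (Real blocks carry many more fields; this is only a toy model.)

```python
import hashlib

def block_hash(prev_hash, data):
    # Each block's hash covers its own data AND the previous block's hash.
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

# Build a tiny three-block chain from a zeroed "genesis" hash.
payloads = ["alice->bob:5", "bob->carol:2", "carol->dave:1"]
chain, prev = [], "0" * 64
for data in payloads:
    h = block_hash(prev, data)
    chain.append((data, prev, h))
    prev = h

def verify(chain):
    prev = "0" * 64
    for data, stored_prev, h in chain:
        if stored_prev != prev or block_hash(prev, data) != h:
            return False
        prev = h
    return True

print(verify(chain))  # True: untampered chain checks out
chain[0] = ("alice->bob:500", chain[0][1], chain[0][2])  # malicious edit
print(verify(chain))  # False: every link after the edit is now broken
```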
Bitcoin Today (2018)
Many, many events have happened since then. The network has grown massively, the underlying code has been improved in many ways, and more and more developers and investors have entered the cryptocurrency space.
Currently, there are proposed changes in development for the Bitcoin network that would let Bitcoin rival today’s centralized networks (Visa, Mastercard) while significantly lowering the cost of these transactions.
Many alternative cryptocurrencies have been created along the way, improving some aspects of Bitcoin and focusing on certain applications; in the crypto space, we call them altcoins.
The way Bitcoin functions has severe flaws with regard to privacy:
  • Public Ledger: Transaction information is public, meaning that transactions can be linked to a person.
  • IP Leakage: A persistent and motivated attacker will be able to associate your IP address with your bitcoin transaction.
For these reasons, it was clear there would be a need for a privacy coin. Different coins were then created with this problem in mind. Some were ‘too private,’ in the sense that they completely bypassed the public ledger – yet the public ledger is what allows merchants to provide proof of transactions, which is important for bookkeeping.
Enter Verge Currency, formerly DogecoinDark, which offers both public and private transactions on the ledger, allowing the user to choose whether each transaction is public or private.
2014 saw the birth of DogecoinDark; in 2016, it was rebranded to Verge Currency.
Verge improves upon the original Bitcoin blockchain and aims to fulfill its initial purpose of providing individuals and businesses with a fast, efficient, and decentralized way of making direct transactions while maintaining their privacy.
What is the Verge Currency Mission?
Verge Currency aims to empower people around the globe by bringing blockchain into everyday life, making it possible to transact quickly, efficiently, and privately. With Verge, businesses and individuals have flexible options for sending and receiving payments.
Verge Currency also offers helpful integrations and tools that enable users to handle both large-scale transactions between merchants and small-scale private payments.
Is Verge Currency a private company and how is it funded?
Following in the spirit of Bitcoin, Verge is open-source software and a community. It is not a company and never held an ICO. Development is funded entirely by the community and the developers themselves. Verge is currently looking into setting up an official merchandise store and an official multi-algorithm mining pool.
General technical capabilities of XVG blockchain:
Protocol: PoW (Proof of Work)
Algorithms: Scrypt, X17, Lyra2REv2, Myr-Groestl, and Blake2s
Max Coin Supply: 16.5 billion XVG
Circulating Supply: 15.2 billion XVG
Minable: Yes
Atomic Swaps: Enabled
Tx (Transaction) Speed: 5–10 seconds
Tps (Transactions per sec.): 100 (will be ~2,000 with RSK)
Tx Fee: 0.1 XVG
Privacy Options:
Tor + I2P Networks: fully obfuscated IP address; the user’s location is hidden
Stealth Addresses: enable users to anonymously receive funds to their wallet, so third parties can no longer track receiving addresses or link official wallet addresses with stealth addresses
RING CT: under development
See our blackpaper V5.0 for detailed information.
Development Updates
Marketing Updates


Verge is a community-driven project. The community is the pillar of Verge; from the past to the future, the community built Verge. The community, or Vergefam, connects everyone from around the world, regardless of cultural background. The common vision is to give everyone access to financial freedom, and the choice of privacy while transacting.
Below you can find the Verge Telegram communities from around the world;
Official Telegram
🇧🇷 🇵🇹 Brasil/Portugal/
🇨🇦 Canada
🇳🇴 🇸🇪 🇩🇰 Norway/Sweden/Denmark
🇩🇪 🇦🇹 🇨🇭 🇱🇮 Germany/Austria/Switzerland/Liechtenstein
🇵🇹 Portugal
🇪🇸 Spain
🇳🇱 Netherlands
🇹🇷 Turkey
🇫🇷 France
🇭🇷 Croatia
🇦🇱 🇽🇰 Albania/Kosovo
🇷🇴 Romania
🇭🇺 Hungary
🇷🇺 Russia
🇮🇳 India
🇲🇾 Malaysia
🇯🇵 Japan
🇰🇷 Korea
🇨🇳 China
🇿🇦 South Africa
🔌Wallet Support
🖥️ Mining support
Mass Adoption
Low fees, quick transactions, high circulating volume, multi-platform support, and the Wraith protocol are the ingredients that position Verge perfectly for mass adoption. Transact on the public ledger for everyday purchases, or stay private if you wish.
Getting Started
You can find the relevant instructions below:
See the following useful links:

Official Links
Verge Team
Block Explorer 1
Block Explorer 2
Network Status
Verge Zendesk
Last Edit: Latest development update links are added to the Tech section.
submitted by Desolatorbtc to vergecurrency

Is Crypto Currency truly at risk due to Quantum Computers, and what can you do about it?

Is Crypto Currency truly at risk due to Quantum Computers, and what can you do about it?

There is no denying that the Quantum revolution is coming. Security protocols for the internet, banking, telecommunications, etc... are all at risk, and your Bitcoins (and alt-cryptos) are next!
This article is not really about quantum computers[i], but, rather, how they will affect the future of cryptocurrency, and what steps a smart investor will take. Since this is a complicated subject, my intention is to provide just enough relevant information without being too “techy.”

The Quantum Evolution

In 1982, Nobel winning physicist, Richard Feynman, hypothesized how quantum computers[ii] would be used in modern life.
Just one year later, Apple released the “Apple Lisa”[iii] – a home computer with a 7.89MHz processor and a whopping 5MB hard drive, and, if you enjoy nostalgia, it used 5.25in floppy disks.
Today, we walk around with portable devices that are thousands of times more powerful, and yet our modern-day computers still work in a simple manner, with simple math and simple operators[iv]. They just do it so fast and efficiently that we forget what’s happening behind the scenes.
No doubt, the human race is accelerating at a remarkable speed, and we’ve become obsessed with quantifying everything - from the everyday details of life to the entire universe[v]. Not only do we know how to precisely measure elementary particles, we also know how to control their actions!
Yet, even with all this advancement, modern computers cannot “crack” cryptocurrencies without a great deal more computing power than the planet can currently supply – it could take millions, if not billions, of years.
However, what current computers can’t do, quantum computers can!
So, how can something that was conceptualized in the 1980’s, and, as of yet, has no practical application, compromise cryptocurrencies and take over Bitcoin?
To best answer this question, let’s begin by looking at a bitcoin address.

What exactly is a Bitcoin address?

Well, in layman’s terms, a Bitcoin address is used to send and receive Bitcoins, and, looking a bit closer (excuse the pun), it has two parts:[vi]
A public key that is openly shared with the world to accept payments, and a private key from which the public key is derived. The private key is made up of 256 bits of information in a (hopefully) random order. This 256-bit code is 64 characters long (in the range 0-9/a-f) and is commonly encoded as a 51- or 52-character Wallet Import Format (WIF) string; the address itself is produced by further hashing the public key (using SHA-256 and RIPEMD-160).
NOTE: Although many people talk about Bitcoin encryption, Bitcoin does not use Encryption. Instead, Bitcoin uses a hashing algorithm (for more info, please see endnote below[vii]).
Now, back to understanding the private key:
The Bitcoin address “1EHNa6Q4Jz2uvNExL497mE43ikXhwF6kZm” corresponds to the private key “5HpHagT65TZzG1PH3CSu63k8DbpvD8s5ip4nEB3kEsreAnchuDf” (in WIF), which in turn corresponds to the 256-bit private key “0000000000000000000000000000000000000000000000000000000000000001”. (This should go without saying, but do not use this address/private key – it was compromised long ago.) Although a few more calculations go on behind the scenes, these are the most relevant details.
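For the curious, the WIF string above can be reproduced from the raw 256-bit key with nothing but standard-library hashing plus Base58Check encoding: a 0x80 version byte, the 32-byte key, then a 4-byte double-SHA-256 checksum. This sketch deliberately omits the elliptic-curve math that turns the private key into the public key and address.

```python
import hashlib

# Bitcoin's Base58 alphabet (no 0, O, I, or l, to avoid confusion).
ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58(data: bytes) -> str:
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = ALPHABET[r] + out
    # Each leading zero byte is rendered as a leading '1'.
    return "1" * (len(data) - len(data.lstrip(b"\x00"))) + out

def wif(private_key: int) -> str:
    payload = b"\x80" + private_key.to_bytes(32, "big")  # 0x80 = mainnet
    check = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    return base58(payload + check)

# The (long-compromised) key "1" from the article:
print(wif(1))  # 5HpHagT65TZzG1PH3CSu63k8DbpvD8s5ip4nEB3kEsreAnchuDf
```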
Now, to access a Bitcoin address you first need the private key, from which the public key is derived. With current computers, it is classically impractical to find a private key based on a public key. Simply put, the public key can be derived from the private key, but not the other way around.
However, it has already been theorized (and is mathematically certain) that because an address is a shorter hash of a public key, multiple private keys map to the same address. This means that your Bitcoin address has multiple private keys associated with it, and if someone accidentally discovers or “cracks” any one of them, they have access to all the funds in that address.
There is even a pool of a few dedicated people hunting for these potential overlaps[viii], and they are, in fact, getting very efficient at it. The creator of the pool also has a website listing every possible Bitcoin private key/address in existence[ix], and, as of this writing, the pool averages 204 trillion keys per day!
But wait! Before you get scared and start panic selling, the probability of finding a Bitcoin address containing funds (or even being used) is highly unlikely – nevertheless, still possible!
However, the more Bitcoin users there are, the more likely a “collision” (finding overlapping private/public key pairs) becomes! You see, the security of a Bitcoin address is simply based on large numbers. How large? According to my math, about 1.157920892373×10^77 potential private keys exist – a 78-digit number. For some perspective, this entire article contains just over 14,000 characters. The total number of Bitcoin addresses is so great that the probability of finding an active address with funds is infinitesimal.
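That keyspace is easy to compute directly – 256 random bits give 2^256 possible private keys:

```python
# 256 random bits give 2^256 possible private keys.
keyspace = 2 ** 256

print(keyspace)            # a 78-digit integer
print(f"{keyspace:.3e}")   # scientific notation, ~1.158e+77
print(len(str(keyspace)))  # 78
```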

So, how do Quantum Computers present a threat?

At this point, you might be thinking, “How can a quantum computer defeat this overwhelming number of possibilities?” Well, to put it simply: Superposition and Entanglement[x].
Superposition allows a quantum bit (qubit) to be in multiple states at the same time. Entanglement allows an observer to know the measurement of a particle in any location in the universe. If you have ever heard Einstein’s phrase, “spooky action at a distance,” he was talking about Entanglement!
To give you an idea of how this works, imagine how efficient you would be if you could make your coffee, drive your car, and walk your dog all at the same time, while also knowing the temperature of your coffee before drinking, the current maintenance requirements for your car, and even what your dog is thinking! In a nutshell, quantum computers have the ability to process and analyze countless bits of information simultaneously – and so fast, and in such a different way, that no human mind can comprehend!
At this stage, it is estimated that the Bitcoin address hash algorithm will be defeated by quantum computers before 2028 (and quite possibly much sooner)! The NSA has even stated that the SHA256 hash algorithm (the same hash algorithm that Bitcoin uses) is no longer considered secure, and, as a result, the NSA has now moved to new hashing techniques, and that was in 2016! Prior to that, in 2014, the NSA also invested a large amount of money in a research program called “Penetrating Hard Targets project”[xi] which was used for further Quantum Computer study and how to break “strong encryption and hashing algorithms.” Does NSA know something they’re not saying or are they just preemptively preparing?
Nonetheless, before long, we will be in a post-quantum cryptography world where quantum computers can crack crypto addresses and take all the funds in any wallet.

What are Bitcoin core developers doing about this threat?

Well, as of now, absolutely nothing. Quantum computers are not considered a threat by Bitcoin developers nor by most of the crypto-community. I’m sure when the time comes, Bitcoin core developers will implement a new cryptographic algorithm that all future addresses/transactions will utilize. However, will this happen before post-quantum cryptography[xii]?
Moreover, even after new cryptographic implementation, what about all the old addresses? Well, if your address has been actively used on the network (sending funds), it will be in imminent danger of a quantum attack. Therefore, everyone who is holding funds in an old address will need to send their funds to a new address (using a quantum safe crypto-format). If you think network congestion is a problem now, just wait…
Additionally, there is the potential that the transition to a new hashing algorithm will require a hard fork (a soft fork may also suffice), and this could result in a serious problem because there should not be multiple copies of the same blockchain/ledger. If one fork gets attacked, the address on the other fork is also compromised. As a side-note, the blockchain Nebulas[xiii] will have the ability to modify the base blockchain software without any forks. This includes adding new and more secure hashing algorithms over time! Nebulas is due to be released in 2018.

Who would want to attack Bitcoin?

Bitcoin and cryptocurrency represent a threat to the controlling financial system of our modern economy. Entire countries have outright banned cryptocurrency[xiv] and even arrested people[xv], and while discrediting it, some countries are copying cryptocurrency to use (and control) in their economy[xvi]!
Furthermore, Visa[xvii], Mastercard[xviii], Discover[xix], and most banks act like they want nothing to do with cryptocurrency, all the while seeing the potential of blockchain technology and developing their own[xx]. Just like any disruptive technology, Bitcoin and cryptocurrencies have their fair share of enemies!
As of now, quantum computers are being developed by some of the largest companies in the world, as well as private government agencies.
No doubt, we will see a post-quantum cryptography world sooner than most realize. By that point, who knows how long “3 letter agencies” will have been using quantum technology - and what they’ll be capable of!

What can we do to protect ourselves today?

Of course, the best option is to start looking at how Bitcoin can implement new cryptographic features immediately, but it will take time, and we have seen how slow the process can be just for scaling[xxi].
The other thing we can do is use a Bitcoin address only once for outgoing transactions. When quantum computers attack Bitcoin (and other cryptocurrencies), their first target will be addresses that have outgoing transactions on the blockchain and still contain funds.
This is because cracking a Bitcoin address becomes feasible once a transaction from it is made public. When a transaction is signed, the digital signature – and the public key it validates against – are revealed on the network. From an exposed public key, a sufficiently powerful quantum computer could derive the private key exponentially faster than any classical computer.
Initially, Bitcoin Core Software might provide some level of protection because it only uses an address once, and then sends the remaining balance (if any) to another address in your keypool. However, third party Bitcoin wallets can and do use an address multiple times for outgoing transactions. For instance, this could be a big problem for users that accept donations (if they don’t update their donation address every time they remove funds). The biggest downside to Bitcoin Core Software is the amount of hard-drive space required, as well as diligently retaining an up-to-date copy of the entire blockchain ledger.
Nonetheless, as quantum computers evolve, they will inevitably render SHA256 vulnerable, and although this will be one of the first hash algorithms cracked by quantum computers, it won’t be the last!

Are any cryptocurrencies planning for the post-quantum cryptography world?

Yes, indeed, there are! Here is a short list of ones you may want to know more about:

Full disclosure:

Although I am in no way associated with any project listed above, I do hold coins in all of them, as well as Bitcoin, Litecoin, and many others.
The thoughts above are based on my personal research, but I make no claims to being a quantum scientist or cryptographer. So, don’t take my word for anything. Instead, do your own research and draw your own conclusions. I’ve included many references below, but there are many more to explore.
In conclusion, the intention of this article is not to create fear or panic, nor any other negative effects. It is simply to educate. If you see an error in any of my statements, please, politely, let me know, and I will do my best to update the error.
Thanks for reading!


[i] – A great video explaining quantum computers.
[ii] - A brief history of quantum computing.
[iii] - More than you would ever want to know about the Apple Lisa.
[iv] - Want to learn more about computer science? Here is a great crash course for it!
[v] - What does quantify mean?
[vi] - More info about Bitcoin private keys.
[vii] - A good example of the difference between hashing and encryption.
[viii] - The Large Bitcoin Collider.
[ix] - A list of every possible Bitcoin private key. This website is a clever way of enumerating private keys and their addresses 128 at a time. Since it is impossible to store all this data in a searchable database, it is not considered a threat – it is equated with looking for a single needle on the entire planet.
[x] – Brief overview of Superposition and Entanglement.
[xi] – A review of the Penetrating Hard Targets project.
[xii] - Explains post-quantum cryptography.
[xiii] - The Nebulas project has some amazing technology planned in its roadmap. It is currently in the testnet stage, with initial launch expected in a few weeks. If you don’t know about Nebulas, you should check them out.
[xiv] - Countries’ stances on cryptocurrencies.
[xv] - Don’t be a miner in Venezuela!
[xvi] - Russia’s plan for their own crypto currency.
[xvii] - A recent attack from Visa against cryptocurrency.
[xviii] - Mastercard’s position on Bitcoin.
[xix] - Discover’s position on Bitcoin.
[xx] - Mastercard is building its own blockchain.
[xxi] - News about Bitcoin capacity. Not a lot of news…
[xxii] - IOTA and quantum encryption.
[xxiii] - The whitepaper of Winternitz One-Time Signature Scheme
[xxiv] - The Cardano project roadmap.
[xxv] - More about the BLISS hash system.
[xxvi] - Home of the Ethereum project.
[xxvii] – SHA3 hash algorithm vs quantum computers.
[xxviii] - Lamport signature information.
[xxix] - Home of the Quantum Resistant Ledger project.
submitted by satoshibytes to CryptoCurrency

Gridcoin Mandatory Update Released - CBR!

Well folks, the day has finally come. Gridcoin's newest mandatory update, CBR, is here.
Betsy is ready for showtime! This release is a mandatory update for all users. This means you must update your wallet before the hard-fork date or you will be left behind. The hard fork is set at block 1,420,000, approximately 20 days from now. We expect it to occur on either November 7th or 8th. Please update before then!
The biggest change in 4.0 is, of course, our new block version, v10. This brings CBR (constant block rewards) to Gridcoin: instead of earning 1.5% APR from Proof of Stake, every block will be worth a static 10 GRC. This change follows network consensus through three separate polls, with the aim of increasing network difficulty and, by extension, the strength and security of the Gridcoin blockchain.
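To get a feel for the change, here is a back-of-the-envelope comparison of the two reward models. The ~90-second block spacing and the staking-weight figures are my own assumptions for illustration, not numbers from the release notes:

```python
# Assumption (mine): ~90-second target block spacing.
BLOCKS_PER_YEAR = 365 * 24 * 3600 // 90  # = 350,400

def interest_reward(balance, apr=0.015):
    # Old model: every balance accrues roughly 1.5% APR.
    return balance * apr

def expected_cbr_reward(my_weight, net_weight, block_reward=10):
    # New model: expected yearly income is your share of total staking
    # weight times the constant 10 GRC minted per block.
    return (my_weight / net_weight) * block_reward * BLOCKS_PER_YEAR

# Hypothetical staker: 100k GRC against a 300M GRC network weight.
print(interest_reward(100_000))                   # ~1,500 GRC/year
print(expected_cbr_reward(100_000, 300_000_000))  # ~1,168 GRC/year
```

The key difference is that under CBR the total minted per year is fixed, so an individual's income now depends on their share of network weight rather than on their balance alone.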
If you wish to see resources about how to optimize your staking for CBR, please see this excellent post by core developer @jamescowens.
Download the update from GitHub here.
Linux PPAs are now updated!
The Windows MSI can be downloaded here. Checksum.
Full changelog for the 4.0 release:
Linux nodes can now stake superblocks using forwarded contracts, #1060 (@tomasbrod).
Replace interest with constant block reward, #1160 (@tomasbrod). Fork is set to trigger at block 1,420,000.
Raise coinstake output count limit to 8, #1261 (@tomasbrod).
Port of Bitcoin hash implementation, #1208 (@jamescowens).
Minor changes to the build documentation, #1091 (@Lenni).
Allow sendmany to be used without an account specified, #1158 (@Foggyx420).
Fix cpids and validcpids not returning the correct data, #1233 (@Foggyx420).
Fix listsinceblock not showing mined blocks to change addresses, #501 (@Foggyx420).
Fix crash when raining using a locked wallet, #1236 (@Foggyx420).
Fix invalid stake reward/fee calculation (@jamescowens).
Fix divide-by-zero bug in getblockstats RPC, #1292 (@Foggyx420).
Bypass historical bad blocks on testnet, #1252 (@Quezacoatl1).
Fix macOS memory barrier warnings, #1193 (@ghost).
Remove neuralhash from getpeerinfo and node stats, #1123 (@Foggyx420).
Remove obsolete NN code, #1121 (@Foggyx420).
Remove (lower) mint limiter, #1212 (@tomasbrod).
Thank you to all of our dedicated developers for the hard work and long nights that went into making this release a reality. Thank you also to all of the dedicated folks on testnet who have been so helpful in finding and solving critical issues before we released this massive overhaul. We couldn't have done this without your help.
submitted by barton26 to gridcoin

The difference between GPU and CPU mining

The difference between GPU and CPU mining

GPU Mining
  • Coins Mined with GPU: Ethereum, Monero, Bitcoin Gold, Zcash, Electroneum, and many others
GPU (or Graphics Processing Unit) is the chip on your graphics card that does repetitive calculations for processing graphics; it was initially used mainly by gamers for better graphics. But once Ethereum came along, people started buying them up, prices skyrocketed, and now there is a shortage of gaming graphics cards on the market.
Ethereum Mining with GPUs
All Ethereum based coins use the Ethash algorithm for mining, an algorithm “designed to be ASIC-resistant via memory-hardness.” There might be several reasons behind this, one of them being the possibility of Ethereum switching from Proof of Work to Proof of Stake.
And since ASIC mining is off-limits for Ethereum, using a GPU is a good alternative.
CPU Mining
  • Coins Mined with CPU: Monero, Electroneum, and Bytecoin
The CPU is the Central Processing Unit of any computer. Basically, it is the brains of the computer.
When Bitcoin was first released, you could mine 100 coins a day using just your CPU, which is impossible today.
CPU design optimizes for quickly switching between different tasks. If a coin allows CPU mining, there’s less power in the hands of large mining farms because everyone who has a computer can easily start mining.
The hashing required for Proof of Work is a repetitive mathematical calculation. CPUs have fewer arithmetic logic units – the circuits that perform arithmetic operations – and are thus relatively slow at performing large volumes of such calculations.
The Main Difference
GPU mining is the more powerful and lucrative version of CPU mining and yields a better return on investment. GPUs offer a level of processing power that is, in some cases, up to 800 times that of a CPU.
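The "repetitive calculation" at the heart of PoW is easy to benchmark. This sketch measures raw double-SHA-256 throughput on whatever CPU runs it; pure Python is orders of magnitude slower than real mining software or hardware, but it shows the shape of the workload.

```python
import hashlib
import time

def cpu_hashrate(seconds=0.5):
    """Count double-SHA-256 hashes completed in `seconds` on this CPU."""
    count = 0
    data = b"block-header-candidate"  # stand-in for a real block header
    deadline = time.perf_counter() + seconds
    while time.perf_counter() < deadline:
        nonce = count.to_bytes(8, "big")
        hashlib.sha256(hashlib.sha256(data + nonce).digest()).digest()
        count += 1
    return count / seconds

print(f"{cpu_hashrate():,.0f} H/s")  # a GPU or ASIC would be vastly higher
```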
#mining #blockchain #ethereum #fintech #bitcoin #MiningOS #COS#CoinFly #CoinflyCOS #GPUmining #Software
submitted by coinfly to CoinFly



What are cryptocurrencies?
Cryptocurrencies are peer-to-peer technology protocols which rely on the blockchain: a system of decentralized record-keeping that allows people to exchange unmodifiable and indestructible “coins” globally, in little to no time, with little to no fees. This translates into the exchange of value, as these coins cannot be counterfeited or stolen. The concept was started by Satoshi Nakamoto (allegedly a pseudonym for a single person or an organization), who described and coded Bitcoin in 2009.
What is DigiByte?
DigiByte (DGB) is a cryptocurrency like Bitcoin. It is also a decentralized applications protocol in a similar fashion to Neo or Ethereum.
DigiByte was founded and created by Jared Tate in 2014. DigiByte allows for fast (virtually instant) and low-cost (virtually free) transactions. It is hard-capped at 21 billion coins, which will be mined over a period of 21 years. DigiByte never held an ICO and was mined/created in the same way that Bitcoin or Litecoin initially were.
DigiByte is the fastest UTXO PoW scalable block-chain in the world. We’ll cover what this really means down below.
DigiByte has put forth and applied solutions to many of the problems that have plagued Bitcoin and cryptocurrencies in general – those being:
We will address these point by point in the subsequent sections.
The DigiByte Protocol
DigiByte maintains these properties through use of various technological innovations which we will briefly address below.
Why so many coins? 21 Billion
When initially conceived, Bitcoin was the first of its kind and came into the hands of a few. The beginnings of a coin such as Bitcoin were difficult; it had to go through a lot of initial growing pains which subsequent coins did not face. It is for this reason, among others, that I believe Bitcoin was capped at 21 million – and why today it has secured a place as digital gold.
When Bitcoin was first invented, no one knew anything about cryptocurrencies; for the inventor to get them out to the public, he would have to give them away. This is probably how the first Bitcoins were passed on – for free! Then, as interest grew, so did the community. For them to build something of actual value, it would have to go through a steady growth phase, so controlling inflation through mining was extremely important. This is also likely why Bitcoin’s cap was set so low – to allow the coins to amass value without being destroyed by inflation (from mining) the way fiat is today. In my mind, Satoshi Nakamoto knew what he was doing when setting it at 21 million BTC, and must have known – even anticipated – that others would take his design and build on top of it.
At DigiByte, we are that better design, capped at 21 billion – 1000 times the supply of Bitcoin. Why, though, is the cap on DigiByte so much higher than that of Bitcoin? Because DigiByte was conceived to be used not as digital gold, nor as any sort of commodity, but as a real currency!
Today on planet Earth, we are approximately 7.6 billion people. If each person should want or need to use and live off Bitcoin, then, split equally, each person could at best own 0.00276315789 BTC. The total value of all the money on the planet is estimated to have recently passed 80 trillion dollars, which means each whole unit of Bitcoin would be worth approximately $3,809,523.81!
This is of course in an extreme case where everyone used Bitcoin for everything. But even in a more conservative scenario the fact remains that with such a low supply each unit of a Bitcoin would become absurdly expensive if not inaccessible to most. Imagine trying to buy anything under a dollar!
Not only would using Bitcoin as an everyday currency be a logistical nightmare, it would be nigh impossible, for each satoshi of a Bitcoin would be worth much, much more than is realistically manageable.
This is where DigiByte comes in and where it shines. DigiByte aims to be used world-wide as an international currency! Not to be hoarded in the same way Bitcoin is. If we were to do some of the same calculations with DigiByte we'd find that the numbers are a lot more reasonable.
At 7.6 billion people, each person could own 2.76315789474 DGB. Each whole unit of DGB would be worth approximately $3,809.52.
This is much more manageable – and remember, that is the extreme case where everyone uses DigiByte for everything! I don't expect this to happen anytime soon, but DigiByte's supply would allow us to live and transact in a much more realistic and fluid fashion, without having to divide large numbers on our phone's calculator to understand how much we owe for that cup of coffee. With DigiByte it's simple: coffee costs 1.5 DGB, the cinema 2.8 DGB, a plane ticket 500 DGB!
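As a sanity check, the per-person and per-unit figures quoted above follow directly from the hard caps – a rough sketch, where the population and world-money figures are the article's own estimates:

```python
# Back-of-the-envelope check of the supply figures quoted above.
BTC_SUPPLY = 21e6     # Bitcoin hard cap, BTC
DGB_SUPPLY = 21e9     # DigiByte hard cap, DGB
POPULATION = 7.6e9    # approximate world population
WORLD_MONEY = 80e12   # rough estimate of global money supply, USD

btc_per_person = BTC_SUPPLY / POPULATION   # ~0.00276 BTC each
dgb_per_person = DGB_SUPPLY / POPULATION   # ~2.76 DGB each
usd_per_btc = WORLD_MONEY / BTC_SUPPLY     # ~$3,809,523.81 per BTC
usd_per_dgb = WORLD_MONEY / DGB_SUPPLY     # ~$3,809.52 per DGB

print(f"{btc_per_person:.11f} BTC per person, ${usd_per_btc:,.2f} per BTC")
print(f"{dgb_per_person:.11f} DGB per person, ${usd_per_dgb:,.2f} per DGB")
```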
There is a reason for DigiByte's large supply, and it is a good one!
Decentralisation is an important concept for the block-chain and cryptocurrencies in general. This allows for a system which cannot be controlled nor manipulated no matter how large the organization in play or their intentions. DigiByte’s chain remains out of the reach of even the most powerful government. This allows for people to transact freely and openly without fear of censorship.
Decentralisation on the DigiByte block-chain is assured by having an accessible and fair mining protocol in place – this is the multi-algorithm (MultiAlgo) approach. We believe that all should have access to DigiByte, whether through purchase or by mining. Therefore, DigiByte is minable not only on dedicated mining hardware such as Antminers, but also with conventional graphics cards. The multi-algorithm approach allows users to mine on a variety of hardware types through one of the 5 mining algorithms supported by DigiByte: SHA256, Scrypt, Groestl, Skein, and Qubit.
Please note that these mining algorithms are modified and updated from time to time to assure complete decentralisation and thus ultimate security.
The problem with using only one mining algorithm, as Bitcoin and Litecoin do, is that it allows people to continually amass mining hardware and hash power: the more hash power one has, the more one can acquire. This leads to a cycle of centralisation and the creation of mining centres. It is known that a massive portion of all hash power in Bitcoin comes from China. This kind of centralisation is a natural tendency, as it is cheaper for large organisations to set up in countries with inexpensive electricity and other advantages unavailable to the average miner.
DigiByte mitigates this problem with the use of multiple algorithms. It allows for miners with many different kinds of hardware to mine the same coin on an even playing field. Mining difficulty is set relative to the mining algorithm used. This allows for those with dedicated mining rigs to mine alongside those with more modest machines – and all secure the DigiByte chain while maintaining decentralisation.
Low Fees
Low fees are maintained in DigiByte thanks to the MultiAlgo approach working in conjunction with MultiShield (originally known as DigiShield). MultiShield calls for block difficulty readjustment between every single block on the chain; currently blocks last 15 seconds. This continuous difficulty readjustment allows us to combat any bad actors which may wish to manipulate the DigiByte chain.
Manipulation may come from a large pool or a single entity with a great amount of hash power mining blocks on the chain, thus increasing the difficulty of the chain. In coins such as Bitcoin and Litecoin, difficulty is readjusted every 2016 blocks, at approximately 10-minute and 2.5-minute block times respectively – meaning that Bitcoin's difficulty is readjusted about every two weeks. This system allows large bad actors to mine a coin and then abandon it, leaving it with a difficulty level far too high for the remaining hash rate; transactions can be frozen and the chain stalled until there is a difficulty readjustment or enough hash power returns to mine the chain. In such a case users may face a choice: pay exorbitant fees or have their transactions frozen. In an extreme case the whole chain could be frozen completely for extended periods of time.
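The readjustment intervals above can be checked with a one-liner (note that Litecoin's block target is 2.5 minutes):

```python
# Wall-clock time between difficulty readjustments (illustrative arithmetic).
def readjust_interval_days(blocks_per_adjustment, block_minutes):
    return blocks_per_adjustment * block_minutes / (60 * 24)

print(readjust_interval_days(2016, 10))    # Bitcoin:  14.0 days (~2 weeks)
print(readjust_interval_days(2016, 2.5))   # Litecoin: 3.5 days
print(readjust_interval_days(1, 15 / 60))  # DigiByte: every block, ~15 seconds
```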
DigiByte does not face this problem as its difficulty is readjusted per block every 15 seconds. This innovation was a technological breakthrough and was adopted by several other coins in the cryptocurrency environment such as Dogecoin, Z-Cash, Ubiq, Monacoin, and Bitcoin Gold.
This difficulty readjustment, along with the MultiAlgo approach, allows DigiByte to maintain the lowest fees of any UTXO – PoW – chain in the world. Currently fees on the DigiByte block-chain are about 0.0001 DGB for a transaction sending 100 000 DGB. The exact fee depends on the amount sent; 100 000 DGB are currently worth around $2000.00, which puts the fee at roughly $0.000002, or 0.0002 cents. It would take about 5 000 such transactions to equal 1 penny's worth. This was tested on a Ledger Nano S set to the low-fees setting.
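A quick back-of-the-envelope on the fee figures above, using only the article's own numbers (the DGB price is implied by the quoted $2000 per 100 000 DGB):

```python
# Fee check using the article's own figures.
fee_dgb = 0.0001                    # fee observed for sending 100,000 DGB
dgb_price_usd = 2000.00 / 100_000   # implied price: ~$0.02 per DGB

fee_usd = fee_dgb * dgb_price_usd   # ~$0.000002 per transaction
tx_per_penny = 0.01 / fee_usd       # ~5,000 transactions per US cent
print(f"fee = ${fee_usd:.6f}, about {tx_per_penny:.0f} tx per penny")
```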
Fast transaction times
Fast transactions are ensured by the conjunctive use of the two aforementioned protocols. The use of MultiShield and MultiAlgo keeps mining the DigiByte chain always profitable, so there is always someone mining your transactions. MultiAlgo allows a greater amount of hash power to be spread world-wide, and this, along with 15-second block times, makes transactions near instantaneous. This speed is also ensured by the use of DigiSpeed, the protocol by which the DigiByte chain decreases block timing gradually. DigiByte started with 30-second block times in 2014; today they are set at 15 seconds. This decrease will allow for ever faster and ever more transactions per block.
Robust security + The Immutable Ledger
At the core of cryptocurrency security is decentralisation. As stated before decentralisation is ensured on the DigiByte block chain by use of the MultiAlgo approach. Each algorithm in the MultiAlgo approach of DigiByte is only allowed about 20% of all new blocks. This in conjunction with MultiShield allows for DigiByte to be the most secure, most reliable, and fastest UTXO block chain on the planet. This means that DigiByte is a proof of work (PoW) block-chain where all transactional activities are stored on the immutable public ledger world-wide. In DigiByte there is no need for the Lightning protocol (although we have it) nor sidechains to scale, and thus we get to keep PoW’s security.
There are many great debates as to the robustness or cleanliness of PoW. The fact remains that PoW block-chains remain the only systems in human history which have never been hacked and thus their security is maximal.
For an attacker to divert the DigiByte chain they would need to control over 93% of all the hashrate on one algorithm and 51% of each of the other four. This makes DigiByte vastly more resistant than Bitcoin or Litecoin to the infamous 51% attack.
Moreover, the DigiByte block-chain is currently spread over 200 000-plus servers, computers, phones, and other machines world-wide. The fact is that DigiByte is one of the easiest coins to mine – greatly aided by the recent release of the one-click miner. This allows for ever greater decentralisation, which in turn assures that there is no single point of failure and the chain is thus virtually unattackable.
On Chain Scalability
The biggest barrier for block-chains today is scalability. Visa, the credit-card company, can handle around 2000 transactions per second (TPS) today, which allows it to ensure customer security and transaction throughput nation-wide. Bitcoin currently sits at around 7 TPS and Litecoin at 28 TPS (56 TPS with SegWit). All the technological innovations I've mentioned above come together to make DigiByte the fastest and most scalable PoW block-chain in the world.
DigiByte is scalable because of DigiSpeed, the protocol through which block times are decreased and block sizes are increased. It is known that a simple increase in block size can increase the TPS of any block-chain, such is the case with Bitcoin Cash. This is however not scalable. The reason a simple increase in block size is not scalable is because it would eventually lead to some if not a great amount of centralization. This centralization occurs because larger block sizes mean that storage costs and thus hardware cost for miners increases. This increase along with full blocks – meaning many transactions occurring on the chain – will inevitably bar out the average miner after difficulty increases and mining centres consolidate.
Hardware and storage costs decrease over time following Moore's law, and DigiByte adheres to it closely. DigiSpeed calls for block sizes to double and block timing to halve every two years. Originally, at inception in 2014, DigiByte's blocks were 1 MB at 30 seconds each; in 2016 DigiByte doubled the block size and halved the block timing, following Moore's law. Moore's law observes that, in general, hardware roughly doubles in power every two years while its cost falls.
This would allow for DigiByte to scale at a steady rate and for people to adopt new hardware at an equally steady rate and reasonable expense. Thus so, the average miner can continue to mine DigiByte on his algorithm of choice with entry level hardware.
DigiByte was one of the first block-chains to adopt Segregated Witness (SegWit) in 2017, a protocol whereby part of the transactional data is removed and stored elsewhere to decrease transaction weight and thus increase scalability and speed. This lets us fit more transactions per block without increasing the block size!
DigiByte currently sits at 560 TPS and could scale to over 280 000 TPS by 2035. This dwarfs the TPS capacity – even the projected capacity – of other coins and even private companies. In essence, DigiByte could scale worldwide today and still be reliable and robust. DigiByte could even handle the cumulative transactions of the top 50 coins combined and still run smoothly below capacity. In fact, to max out DigiByte's actual current capacity (560 TPS) you would have to take all those transactions and multiply them by a factor of 10!
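The 2035 figure is consistent with the DigiSpeed doubling schedule described earlier – a sketch, assuming 560 TPS as of 2017 (the base year is my assumption) and one doubling every two years:

```python
# Projection of DigiByte TPS under the DigiSpeed doubling schedule.
# Assumes 560 TPS in a base year of 2017, doubling every two years.
def projected_tps(year, base_tps=560, base_year=2017):
    doublings = (year - base_year) // 2
    return base_tps * 2 ** doublings

print(projected_tps(2035))  # 286720 -> "over 280 000 TPS by 2035"
```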
Other Uses for DigiByte
Note that DigiByte is not only to be used as a currency. Its immense robustness, security and scalability make it ideal for hosting decentralised applications (DAPPS). DigiByte can in fact host DAPPS, and even centralised versions which rely on the chain, known as Digi-Apps. This application layer is also accompanied by a smart-contract layer.
Thus, DigiByte could host several Crypto Kitties games and more without freezing out or increasing transaction costs for the end user.
Currently there are various DAPPS being built on the DigiByte block-chain, these are done independently of the DigiByte core team. These companies are simply using the DigiByte block-chain as a utility much in the same way one uses a road to get to work. One such example is Loly – a Tinderesque consensual dating application.
DigiByte also hosts a variety of other platform projects.
The DigiByte Foundation
As previously mentioned, DigiByte was not an ICO. The DigiByte Foundation was established in 2017 by founder Jared Tate as a non-profit organization dedicated to supporting and developing the DigiByte block-chain.
DigiByte is a community effort and a community coin, to be treated as a public resource like water or air. Know that anyone can work on DigiByte; anyone can create and do as they wish. It is a permissionless system which encourages innovation and creation. If you have an idea or would like help with your project, do not hesitate to contact the DigiByte Foundation through the official website or the Telegram developers' channel.
For this reason it is ever more important to note that the DigiByte Foundation cannot exist without public support, and so I encourage all to donate to the foundation. All funds are used for the maintenance of DigiByte servers, marketing, and DigiByte development.
DigiByte Resources and Websites
Please refer to the sidebar of this sub-reddit for more resources and information.
Edit - Removed Jaxx wallet.
Edit - A new section was added to the article: Why so many coins? 21 Billion
Edit - Adjusted max capacity of DGB's TPS - Note it's actually larger than I initially calculated.
Edit – Grammar and format readjustment
I hope you’ve enjoyed my article. I originally wrote this for the reddit sub-wiki, where it most likely would not get much attention, so instead I've decided to make it a sort of introductory post – an open letter – to any newcomers to DGB or to those who are just curious.
I tried to cover every aspect of DGB, but of course I may have forgotten something! Please leave a comment down below and tell me why you're in DGB – what convinced you? For me it's the decentralised PoW that really convinced me. Plus, the transaction speed and virtually nonexistent fees – made my mouth water!
-Dereck de Mézquita
I'm a student typing this stuff on my free time, help me pay my debts? Thank you!
submitted by xeno_biologist to Digibyte

Technical discussion of Gavin's O(1) block propagation proposal

I think there isn't wide appreciation of how important Gavin's proposal is for the scalability of Bitcoin. It's the real deal, and will get us out of this sort of beta mode we've been in of a few transactions per second globally. I spent a few hours reviewing the papers referenced at the bottom of his excellent write-up and think I get it now.
If you already get it, then hang around and answer questions from me and others. If you don't get it yet, start by very carefully reading the papers referenced at the bottom of his write-up.
The big idea is twofold: fix the miner's incentives to align better with users wanting transactions to clear, and eliminate the sending of redundant data in the newblock message when a block is solved to save bandwidth.
I'll use (arbitrarily) a goal of 1 million tx per block, which is just over 1000 TPS. This seems pretty achievable, without a lot of uncertainty. Really! Read on.
Today, a miner really wants to propagate a solved block as soon as possible to not jeopardize their 25 BTC reward. It's not the cpu cost for handling the transactions on the miner's side that's the problem, it's the sending of a larger newblock message around the network that just might cause her block to lose a race condition with another solution to the block.
So aside from transactions with fees of more than 0.0008 BTC that can make up for this penalty, or simply the goodwill of benevolent pools to process transactions, there is today an incentive for miners not to include transactions in a block. The problem is BTC price has grown so high so fast that 0.0008 BTC is about 50 cents, which is high for day-to-day transactions (and very high for third-world transactions).
The whole idea centers around an old observation that since the network nodes (including miners) have already received transactions by the normal second-by-second operation of the p2p network, the newblock announcement message shouldn't have to repeat the transaction details. Instead, it can just tell people, hey, I approve these particular transactions called XYZ, and you can check me by taking your copy of those same transactions that you already have and running the hash to check that my header is correctly solved. Proof of work.
A basic way to do this would be to send around a Bloom filter in the newblock message. A receiving node would check all the messages they have, see which of them are in this solved block, and mark them out of their temporary memory pool. Using a BF calculator you can see that you need about 2MB in order to get an error rate of 10e-6 for 1 million entries. 2MB gives 16 million bits which is enough to almost always be able to tell if a tx that you know about is in the block or not.
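For cross-checking such numbers, the standard Bloom filter sizing formula is m = -n·ln(p)/(ln 2)² bits with k = (m/n)·ln 2 hash functions – a small helper (the exact figures depend on which calculator and target error rate you assume, so they may differ from the 2 MB quoted above):

```python
import math

# Standard Bloom filter sizing: m bits for n items at false-positive rate p.
def bloom_bits(n, p):
    return math.ceil(-n * math.log(p) / math.log(2) ** 2)

def optimal_hashes(m, n):
    return max(1, round(m / n * math.log(2)))

n = 1_000_000  # transactions per block, as in the example above
for p in (1e-4, 1e-6):
    m = bloom_bits(n, p)
    print(f"p={p}: {m} bits (~{m / 8 / 1e6:.1f} MB), k={optimal_hashes(m, n)}")
```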
There are two problems with this: there may be transactions in the solved block that you don't have, for whatever p2p network or policy reason. The BF can't tell you what those are. It can just tell you there were e.g. 1,000,000 tx in this solved block and you were able to find only 999,999 of them. The other glitch is that of those 999,999 it told you were there, a couple could be false positives. I think there are ways you could try to deal with this--send more types of request messages around the network to fill in your holes--but I'll dismiss this and flip back to Gavin's IBLT instead.
The IBLT works super well to mash a huge number of transactions together into one fixed-size (O(1)) data structure, to compare against another set of transactions that is really close, with just a few differences. The "few differences" part compared to the size of the IBLT is critical to this whole thing working. With too many differences, the decode just fails and the receiver wouldn't be able to understand this solved block.
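To make the mechanism concrete, here is a toy IBLT in Python – a sketch only, not Gavin's actual proposal: each key is written into K cells holding a count, a XOR of keys, and a XOR of key checksums, and the set difference of two tables is recovered by "peeling" cells whose count is ±1:

```python
import hashlib
import random

K = 4  # hash positions (one sub-table each) per key

def h(data: bytes, salt: int) -> int:
    return int.from_bytes(hashlib.sha256(bytes([salt]) + data).digest()[:8], "big")

class IBLT:
    """Minimal invertible Bloom lookup table over 64-bit keys (toy sketch)."""

    def __init__(self, m: int):
        self.m = m
        self.count = [0] * m
        self.key_sum = [0] * m
        self.chk_sum = [0] * m

    def _cells(self, key: int):
        sub = self.m // K  # one sub-table per hash, so a key's cells are distinct
        kb = key.to_bytes(8, "big")
        return [i * sub + h(kb, i) % sub for i in range(K)]

    def _apply(self, key: int, sign: int):
        chk = h(key.to_bytes(8, "big"), 255)  # key checksum guards the peeling
        for c in self._cells(key):
            self.count[c] += sign
            self.key_sum[c] ^= key
            self.chk_sum[c] ^= chk

    def insert(self, key):
        self._apply(key, +1)

    def subtract(self, other: "IBLT"):
        """In-place difference: afterwards this table encodes A-only and B-only keys."""
        for i in range(self.m):
            self.count[i] -= other.count[i]
            self.key_sum[i] ^= other.key_sum[i]
            self.chk_sum[i] ^= other.chk_sum[i]

    def peel(self):
        """Recover (only_in_A, only_in_B); decoding fails if there are too many diffs."""
        mine, theirs = set(), set()
        progress = True
        while progress:
            progress = False
            for i in range(self.m):
                if self.count[i] in (1, -1) and self.key_sum[i]:
                    key = self.key_sum[i]
                    if self.chk_sum[i] == h(key.to_bytes(8, "big"), 255):
                        (mine if self.count[i] == 1 else theirs).add(key)
                        self._apply(key, -self.count[i])
                        progress = True
        return mine, theirs

# Toy demonstration: two nodes whose mempools differ by a few transactions.
random.seed(1)
shared = [random.getrandbits(63) | 1 for _ in range(500)]
only_a = [random.getrandbits(63) | 1 for _ in range(8)]
only_b = [random.getrandbits(63) | 1 for _ in range(8)]

a, b = IBLT(400), IBLT(400)
for k in shared + only_a:
    a.insert(k)
for k in shared + only_b:
    b.insert(k)

a.subtract(b)  # the fixed-size table is all that needs to cross the wire
got_a, got_b = a.peel()
assert got_a == set(only_a) and got_b == set(only_b)
```

Note how the 500 shared keys cancel exactly in `subtract`, so only the 16 differences remain in the table – which is why its size depends on the number of differences, not on the total transaction count.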
Gavin suggests a key size of 8 B and data in 8 B chunks. I don't understand his data size – there's a big key checksum you need in order to do full add and subtract of IBLTs (let's say 8 B, although this might have to be 16 B?) that I would rather amortize over larger data chunks. The average tx is 250 B anyway. So I'm going to discuss an 8 B key and 64 B data chunks. With a count field, this gives 8 key + 64 data + 16 checksum + 4 count = 92 B. Let's round to 100 B per IBLT cell.
Let's say we want to fix our newblock message size to around 1MB, in order to not be too alarming for the change to this scheme from our existing 1MB block limit (that miners don't often fill anyway). This means we can have an IBLT with m=10K, or 10,000 cells, which with the 1.5d rule (see the papers) means we can tolerate about 6000 differences in cells, which because we are slicing transactions into multiple cells (4 on average), means we can handle about 1500 differences in transactions at the receiver vs the solver and have faith that we can decode the newblock message fully almost all the time (has to be some way to handle the occasional node that fails this and has to catch up).
So now the problem becomes, how can we define some conventions so that the different nodes can mostly agree on which of the transactions flying around the network for the past N (~10) minutes should be included in the solved block. If the solver gets it wrong, her block doesn't get accepted by the rest of the network. Strong incentive! If the receiver gets it wrong (although she can try multiple times with different sets), she can't track the rest of the network's progress.
This is the genius part around this proposal. If we define the convention so that the set of transactions to be included in a block is essentially all of them, then the miners are strongly incentivized, not just by tx fees, but by the block reward itself to include all those transactions that happened since the last block. It still allows them to make their own decisions, up to 1500 tx could be added where convention would say not to, or not put in where convention says to. This preserves the notion of tx-approval freedom in the network for miners, and some later miner will probably pick up those straggler tx.
I think it might be important to provide as many guidelines for the solver as possible to describe what is in her block, in specific terms as possible without actually having to give tx ids, so that the receivers in their attempt to decode this block can build up as similar an IBLT on their side using the same rules. Something like the tx fee range, some framing of what tx are in the early part and what tx are near the end (time range I mean). Side note: I guess if you allow a tx fee range in this set of parameters, then the solver could put it real high and send an empty block after all, which works against the incentive I mentioned above, so maybe that particular specification is not beneficial.
Measured propagation data shows, for example, a delay of about 30-40 seconds before almost all nodes have received any particular transaction, so it may be useful for the solver to include tx only up to a certain point in time, like 30 seconds ago. Any tx that is younger than this just waits until the next block, so it's not a big penalty. But some policy like this (and some way to communicate it in the absence of centralized time management among the nodes) will be important to keep the number of differences in the two sets small, below 1500 in my example. The receiver of the newblock message would know, when trying to decode it, that they should build up an IBLT on their side also with tx only from up to 30 seconds ago.
I don't understand Gavin's requirement for canonical ordering. I see that it doesn't hurt, but I don't see the requirement for it. Can somebody elaborate? It seems that's his way to achieve the same framing that I am talking about in the previous paragraph, to obtain a minimum number of differences in the two sets. There is no need to clip the total number of tx in a block that I see, since you can keep shoving into the IBLT as much as you want, as long as the number of differences is bounded. So I don't see a canonical ordering being required for clipping the tx set. The XOR (or add-subtract) behavior of the IBLT doesn't require any ordering in the sets that I see, it's totally commutative. Maybe it's his way of allowing miners some control over what tx they approve, how many tx into this canonical order they want to get. But that would also allow them to send around solved empty blocks.
What is pretty neat about this from a consumer perspective is the tx fees could be driven real low, like down to the network propagation minimum which I think as of this spring per Mike Hearn is now 0.00001 BTC or 10 "bits" (1000 satoshis), half a US cent. Maybe that's a problem--the miners get the shaft without being able to bid on which transactions they approve. If they try to not approve too many tx their block won't be decoded by the rest of the network like all the non-mining nodes running the bitpay/coinbases of the world.
Edit: 10 bits is 1000 satoshis, not 10k satoshis
submitted by sandball to Bitcoin

An in depth look into Sparkster and why I believe it is in a league of its own

Today I am writing about a project I truly believe in. I am on the same page as Ian Balina in seeing this project as an all-star ICO. This is not your average run-of-the-mill vapourware ICO with no MVP; it is a working platform with a great team behind it. You can find AMAs on YouTube (Link 1) with live demonstrations of their TPS progress to date, and you can also try out their platform for yourself on their website; these are linked at the end of the article for your convenience. Also, they have a pretty good bounty programme running at the moment, which I shall link as well (Link 2).
Please don’t consider this investment advice; I hope you will read this article and treat it as a starting point for your own research. At the time of writing the market has taken another nasty dip, but this is the time when smart investments need to be made, and I truly feel this is one of them. I would also like any of you who enjoy this article to please upvote it, check out my previous work, and stay tuned for more.
I will be diving deep into this whitepaper (Link 3) today and basing my article off videos and my personal experience on their platform. All this information can be found within their website and whitepaper. As such I imagine this is going to be a long article.
So, to begin: Sparkster is essentially a decentralised cloud platform that will allow anybody to build software in plain English via a simple drag-and-drop interface. In their whitepaper they acknowledge that this was inspired by MIT Scratch. In today's world programmers work in various code languages, each requiring its own training. For example, Solidity is one of the most popular today; it "is a contract-oriented programming language for writing smart contracts" (Wikipedia, 2018). It was developed by Ethereum's Project Solidity team for use on the Ethereum Virtual Machine and is currently used on many blockchain platforms.
Sparkster aims to provide a platform that will allow smart contracts to run at 10 million transactions per second (TPS), which would make it the fastest decentralised cloud software in the world.
Concept development
In their whitepaper they explain that this project was conceived after the founder spent 14 years working with software engineers designing and building ERP software for a start-up. Sparkster was born from the frustration of this process, and after 6 years of R&D they have the working product we see today. This is an enterprise-ready platform. They also claim to have already signed deals with large tech companies (ARM & Libelium).
They also talk about how the industry is trying to make things more practical, but it has not gone far enough. Sparkster are the market leaders here, as they are targeting an audience of 99% non-software-developers and allowing them to build software. Interestingly, at the 2018 Mobile World Congress they demonstrated the platform using AI facial recognition to detect a cleaner at a house and open a door lock; I saw this in a YouTube video, which I will link below (Link 4). This is a team which has proved it has a working product.
Claims/ Vision
  1. In their whitepaper they claim they want to become the world’s first platform where people can build their own visions into reality and create financial independence for themselves and contribute to society.
  2. Sparkster will tear down the barrier to entry for software creation; their drag-and-drop platform makes this possible. Up until yesterday I had no clue how a smart contract worked at the basic level; now I consider myself an expert software developer. Who would have thought I could throw away my old life and upskill in 24 hours? Ha.
  3. What I also love about this project is that it will empower people to bring their own ideas to the table and be able to sell them, thus creating financial independence.
  4. The Sparkster (2018) website (Link 5) suggests they will further disrupt the $200 billion cloud-computing industry and combat the extortionate prices that large centralised cloud providers like AWS, Microsoft, Google and IBM charge.
  5. This is a finished product guys, please try it yourself if you don’t believe me.
Problem today
As per Sparkster's (2018) claim, the biggest problem faced today is that organisations and individuals who wish to implement AI, IOT and smart-contract technology have limitations placed on them – most notably that their own IT departments adapt too slowly and there is a serious lack of experienced personnel in these areas. When I watched the AMA that Sajjad Daya (CEO) did with Ian Balina, he described how hard it is to convey what you want to a developer and get the result you require; the end result often does not meet your expectations. This of course leads to time wasting, as it requires much back-and-forth correspondence – he stated that this can stretch months down the line (something I have experienced in my own organisation). This traditional "software development lifecycle" is truly a slow and painful process, just as they claim in their whitepaper. Also, when changes need to be made to the software down the line, it is very expensive.
Furthermore, the team claim that most business software used today (SAP, Oracle, Microsoft, etc.) is incapable of interfacing with the technologies of the future (IOT, AI, smart contracts). The Sparkster whitepaper goes on to suggest that the talent is simply not there in the industry today to face this challenge, and that much up-skilling is required. The team believe that the high capital cost and long lead times of turning a vision into software in the traditional manner are the biggest problems facing enterprises today, as they curb innovation. I concur with this sentiment.
How they will achieve/solve this
According to their whitepaper, this platform is the solution to all of the above problems. It is a platform which targets the new era of AI, IOT and smart contracts, all tailored to non-developers, "making it accessible to the 99% who do not know how to code and don't want to learn" (Sparkster, 2018).
They will create this platform by targeting users of cell phones, notebooks, laptops and other personal devices, who in essence will all become miners on the network. In turn, users receive Spark tokens as a reward for contributing spare capacity. Using these devices is far cheaper than today's centralised systems, according to the team. They further proclaim in their whitepaper that this lower cost arises from using inexpensive nodes, and that as the network scales the cost goes down, whereas traditional cloud-computing costs remain constant. Companies will provide the value by paying for software creation.
To scale the platform, they will make personal use free but limited to a certain number of transactions per month; this restriction can be lifted by referring others. Commercial use will incur ongoing fees (licences, transaction fees, storage fees, etc.). The team also describe how the platform and the cloud are complementary, allowing users to build software 100x faster and cheaper than by traditional means, so in my view this will be a very popular mass-adoption platform for blockchain.
Their plan for growth
A marketplace will become available where users sell their software creations via peer-to-peer transactions, so value really depends on how users use the platform. Users lending the spare capacity (CPU) of their phones etc. will also be awarded Spark tokens, which can be used to offset the fees paid.
According to their whitepaper they will also focus on strategic partnerships. As mentioned above, they already have partnerships with ARM (the world's largest computer-chip designer) and Libelium (an industrial sensor and gateway distributor). They also plan to target vertical markets, specifically IOT and smart contracts, as growth is forecast to be huge in both. I personally see the use of smart contracts in society as the single biggest use case of blockchain in the future.
What is amazing about this platform is that you can actually try it for yourself on their website. I completed the 6 walkthroughs myself and was very impressed by what I experienced. I have never attempted to create anything with software, but Sparkster made the process very simple. You can literally drag and drop different interfaces together and define the behaviour of each block. It's a very simple and intuitive approach to building smart contracts. As the Sparkster (2018) whitepaper describes, you just snap together blocks that describe "what" you want without worrying about "how" it works – they even liken it to building with Lego. The walkthroughs take you from creating a simple calculator to, by the 6th lesson, a complex insurance smart contract from which premiums can be calculated and payments made automatically.
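For flavour, here is a purely hypothetical sketch, in ordinary Python, of the kind of premium logic the insurance walkthrough builds – on Sparkster this is assembled from drag-and-drop blocks rather than written as code, and the rates below are invented for illustration:

```python
# Hypothetical premium calculation, illustrating the logic only.
# The loading factors are made-up figures, not Sparkster's actual walkthrough values.
def premium(base_rate, age, accidents):
    rate = base_rate
    if age < 25:
        rate *= 1.5              # young-driver loading (illustrative)
    rate *= 1 + 0.1 * accidents  # 10% surcharge per prior accident (illustrative)
    return round(rate, 2)

print(premium(500.00, 23, 1))  # a 23-year-old with one prior accident
```

Each `if` and multiplication here corresponds to one visual block in a Sparkster "flow"; the platform's point is that the user never sees the code form at all.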
Sparkster claim that this will make the creation of smart contracts 100 times faster and cheaper than traditional software development, a claim I am starting to believe after experiencing their walkthroughs. This is a rare project that already has a working platform. Why wouldn’t you be impressed?
Most ICOs today are nothing but vapourware, asking for your money without even a minimum viable product to offer. I would advise you all to look at their AMAs on YouTube and take part in their walkthroughs, and you will see for yourself.
A more detailed look into their platform
According to Sparkster (2018), their smart software is made up of:
  1. Flows- The definition of the software, made up of all core components of the platform.
  2. Functions- Single building blocks that perform units of work and can be plugged together to build processes (e.g. an insurance policy, as seen in their walkthrough video). They also have a well-defined user interface.
  3. Documents- Basic data storage entities on the platform. They differ from functions in that they exist to retrieve, persist, update and delete data. Sparkster say they represent an entity in the real world, e.g. a user’s car insurance policy. Furthermore, storage nodes on the cloud will be rewarded for this storage and retrieval of data.
  4. Integrations- The interface to the outside world. Sparkster say they provide a simple abstraction over a third-party API or web service. What I like about this is that somebody can create an integration (e.g. a shipping quotation) and, once it is created, allow others to use it via the marketplace. Sparkster aim to let people do this without worrying about how it all works.
  5. Devices- These replicate devices in the real world, comprising commands and fields (bidirectional data transfer). In their whitepaper they use the example of a temperature probe in a greenhouse, where the temperature feeds back to the action field. It is very complex stuff.
  6. Gateways- These represent a group of devices connected to one gateway. Sparkster say these are all connected to the internet, allowing the platform to interact with them individually or as a group.
  7. Smart Contracts- This is the element I found most fascinating during the Sparkster walkthrough videos. It allows you to create smart contracts to enable transactions on the platform. Currently they use Ethereum smart contracts and IOTA smart transactions. I found the whole process very easy. They further state that all the above components can interact with the smart contracts, which was proven to me in the walkthroughs.
Their claim of 10 million TPS
From what I can understand from their whitepaper and from an AMA with their CEO, this will be a step-by-step approach to 10 million TPS, admittedly a few years down the line, but they have already proven their platform works and is running at over 50k TPS with 50 cells. They don’t seem to have hit any scalability issues just yet. And I should not need to remind you that 50k TPS is much more than other blockchain products out there.
In their whitepaper they tell us that this is designed to be a specialised blockchain for the use of “smart software”. What is important to understand is that they can reach higher TPS because they don’t have to “act” like other blockchains: most of their clients will want to keep data private, which “eliminates the necessity of maintaining global state” (Sparkster, 2018). This in turn allows them to shard their distributed hash tables into client groups, where “one shard never needs to have any awareness of another other shard” (Sparkster, 2018). They will essentially isolate cells from one another in order to scale to this level. They give a great example in their whitepaper: a company like Airbnb, with millions of customers, could put its customer data into cells (usernames broken into separate letters per cell).
Overall, their theory is that there is technically no limit to the number of TPS they can achieve; 10 million is just a target number. I have full confidence they can pull it off. What other blockchain is proving this live on air like this team is?
Decentralised cloud
Sparkster’s (2018) website describes how traditional cloud providers such as Amazon, Microsoft and Google carry huge costs relating to servers, backup power, staff, security and cooling. Decentralised cloud computing could seriously undercut these organisations. For instance, Sparkster claim that by executing small software components on one’s mobile phone these costs fall to near zero, and they envisage a world where many miners join the decentralised cloud and reduce the costs further.
Their cloud will facilitate the execution of smart software created on the platform. Their whitepaper further suggests that one can simply download the Sparkster mining app on one’s phone, which provides a runtime environment (SRE) for user-generated smart software. Companies will place bids on the exchange for their software to run in a decentralised fashion, staking the amount of Spark tokens they are willing to pay. The team envisage this as a free market where bidders can stake as much as they like and miners can ask for whatever they like. Payment is made to the miners in these tokens.
They further say compute and storage nodes can join the network and be paid in Spark tokens, but they are required to stake tokens themselves as collateral to ensure they operate honestly. Sparkster will have verification nodes to validate transactions from compute and storage nodes, and if any “bad behaviour” is found, the staked tokens are taken in the form of a “bounty”. In my opinion this will make it a very secure platform.
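The stake-and-bounty mechanism can be sketched as follows. This is a minimal illustration of the idea only: the node names, stake amounts, and all-or-nothing slashing rule are my own assumptions, not Sparkster’s actual protocol.

```python
# Illustrative sketch of stake-as-collateral with slashing.
# Names and numbers are hypothetical, not from the whitepaper.

class Node:
    def __init__(self, node_id, stake):
        self.node_id = node_id
        self.stake = stake  # Spark tokens locked as collateral

def verify_and_slash(node, work_is_valid):
    """A verification node checks a compute/storage node's work.

    If the work is invalid, the offender's stake is seized as a
    bounty; otherwise the node keeps its collateral.
    """
    if work_is_valid:
        return 0  # no bounty, stake untouched
    bounty = node.stake
    node.stake = 0
    return bounty

honest = Node("storage-1", stake=100)
cheater = Node("storage-2", stake=100)

assert verify_and_slash(honest, work_is_valid=True) == 0
assert verify_and_slash(cheater, work_is_valid=False) == 100
assert honest.stake == 100 and cheater.stake == 0
```

The point of the sketch is the economic argument: a node that misbehaves loses more (its stake) than it could gain from cheating once.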
Sparkster Technology Stack
The below image from their whitepaper shows the levels “smart software” goes through to facilitate decentralised cloud computing.
Source: Sparkster, (2018)
What is very interesting is the throughput they claim to sustain at such a high TPS. If you know anything about blockchain you will understand that this is a challenge for every chain: the more users a platform has, the more scaling is required. For instance, I remember how slow the Ethereum blockchain became during the bull run in December, which was also attributed to the increase in ICOs and DApps launching on the platform.
Sparkster claim their cloud is capable of “scaling linearly without any overhead curtailing its meteoric performance” (Sparkster, 2018). They achieve this by isolating cells within the chain. They further explain that the whole idea is to “isolate” chains, essentially creating independent blockchains which have their own hash tables and never synchronize with each other; they compare this to a human cell, which, once it splits, never shares anything with another cell. It is a very simple concept: a user’s data is stored in a specific cell, so why would another, unrelated company using the Sparkster platform need to know about or access the information in the first cell? Each cell is capable of 1,000 TPS, and because each has its own hash table, two cells give 2,000 TPS, and so on.
Essentially, data is streamed in parallel and syncing is never needed. This is huge: unlike any other blockchain, this is a platform designed for mainstream adoption. Any company can use it to store data and be sure of high throughput. As mentioned above, they are already at 50k TPS, which is far better than most blockchains today. This is a true working product, and I can see it getting to 10 million.
Time for a quick history lesson: Bitcoin uses proof of work, and Ethereum (currently also proof of work) is moving to proof of stake. These are the two most common consensus mechanisms used by blockchains today. Proof of work favours the party with the most hashing power, whereas proof of stake favours the party with the most money staked. This team has instead chosen to implement the Stellar Consensus Protocol (SCP), which they argue is a better fit.
Sparkster describe this as a commercial version of a Federated Byzantine Agreement System (FBAS), running at about 1,000 TPS. They will also implement an incentive layer to keep parties honest and minimise the risk of attack, since SCP does not have one. This will be done by awarding Spark tokens to compute nodes (which donate CPU and memory on a device) and storage nodes (which contribute storage space and network bandwidth). Clients of the platform will cover these incentives. The team believe this extra layer is required for the platform to surpass traditional cloud platforms, and I tend to agree with them.
Their whitepaper further suggests that a proof-of-work consensus will be used to calculate these incentives. This allows misbehaviour to be detected, with stakes taken from offenders by verification nodes. Pages 35-39 of the whitepaper go into detail on how these are all calculated; it is linked below for your interest.
Consistent hashing
As they don’t maintain global state, this algorithm allows the platform to “hash the clients ID and extract a bounded number” (Sparkster, 2018), which identifies the particular cell a client belongs to.
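The idea reads like a standard consistent-hash ring: hash the client ID to a bounded number, then map that number onto a cell. A minimal sketch, assuming SHA-256 and illustrative cell names (nothing below is taken from the whitepaper):

```python
import hashlib
from bisect import bisect

def h(key: str) -> int:
    """Hash a string to a bounded number (here, 32 bits)."""
    return int(hashlib.sha256(key.encode()).hexdigest(), 16) % 2**32

class Ring:
    """Minimal consistent-hash ring; cell names are illustrative."""

    def __init__(self, cells):
        self.ring = sorted((h(c), c) for c in cells)

    def cell_for(self, client_id: str) -> str:
        # Pick the first cell at or after the client's hash,
        # wrapping around the ring at the end.
        keys = [k for k, _ in self.ring]
        i = bisect(keys, h(client_id)) % len(self.ring)
        return self.ring[i][1]

ring = Ring(["cell-0", "cell-1", "cell-2"])
cell = ring.cell_for("client-42")  # deterministic: same ID, same cell
assert cell == ring.cell_for("client-42")
assert cell in {"cell-0", "cell-1", "cell-2"}
```

The ring structure (rather than a plain modulo) is the usual reason this is called *consistent* hashing: adding or removing a cell remaps only the clients adjacent to it, not the whole keyspace.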
One of the biggest fears around any data platform is privacy protection. The Sparkster team say their cloud deconstructs data into fragments, encrypts them, and disseminates them across the network of nodes, so any hack of the platform would yield meaningless returns. This is particularly important now with the EU’s General Data Protection Regulation (GDPR), as discussed in their whitepaper. They also claim to use “zk-SNARKs… a zero-knowledge proof to ensure that client data is obfuscated, even from other network participants” (Sparkster, 2018).
They also claim they can detect software intrusions such as tampering with the code, memory or threads. Once their system detects this, all client data is automatically deleted from memory along with the access keys to the Sparkster network, as claimed on their website.
In their whitepaper they also claim that any software built on the platform is “entirely bug free”. The reasoning is that even though you as the user dictate the logic, the actual underlying code is very uniform and consistent.
Their app will also use public/private keys and digital signatures, and checksums will be used to detect file tampering. In their whitepaper they also state that cached data won’t be stored, all data will be encrypted, all communication uses SSL/TLS, and they will employ third parties to detect malicious payloads in memory.
Multi chain interoperability
Sparkster can already be used with both Ethereum and IOTA, with plans to add more chains down the line, all to cater for users’ preferences. This is a very transparent platform, tailored around usability and ease.
Source: (Sparkster, 2018)
Token economics
The value model proposed in their whitepaper suggests that the global marketplace will be the value driver of the platform: people can create and sell content on an open peer-to-peer market, with the value flowing through the Spark token. Small platform fees will be charged on transactions (not on free contributions).
It is a utility token because its purpose is to facilitate payments, and it will be the only currency accepted on the platform. Once the decentralised cloud is released in Q4 2018, miners will be able to earn Spark tokens.
I believe this will be a market leader when it comes to mass adoption of blockchain; it is truly a one-model-fits-all platform, and it is growth of the platform that will drive the value of the tokens up. The Spark token is also essential to the cloud’s functioning: miners must stake tokens to ensure good behaviour, and if the opposite occurs, verification nodes claim those stakes. This makes the token essential to the smooth running of the platform.
Breakdown of token distribution
Use of funds:
In my view the team holds a huge wealth of experience. It consists of:
2 all-star advisors
4 on the leadership team
17 further team members (Sparkster Warriors)
· These team members include software engineers, developers, designers, project team leaders, programmers and digital marketers.
· There is so much experience in this team it would take all day to write about them, but a wide-encompassing team like this shows they are serious about what they are doing.
This is a not-to-be-missed ICO. I really feel this is one of the all-star ICOs this year. There is nothing more that really needs to be said; I would just advise that if you are considering this project, go to their website and test the platform for yourself. It was the walkthroughs that sold me on this project, and it is one I will be investing in.
Additional reading (Links)
Link 1- AMA with Ian Balina (All-star ICO):
Link 2- Sparkster bounty programme:
Link 3- Sparkster whitepaper:
Link 4: Sparkster founder & CEO speaking at MWC 2018:
Link 5: Sparkster website:
· (2018). Solidity. [online] Available at: [Accessed 11 Jun. 2018].
· Sparkster (2018). Build and Run Decentralized Software in Plain English. [ebook] Sparkster whitepaper, pp.1-57. Available at: [Accessed 11 Jun. 2018].
· (2018). Sparkster – Build Apps, Write No Code. [online] Available at: [Accessed 11 Jun. 2018].
submitted by Mick2018 to Sparkster [link] [comments]

Tell me if I correctly answered my own question about mining ...

So like any n00b I want to get into Bitcoin mining but I see CPUs are worthless plus the amount of electricity you'll use will make it a negative net gain etc etc.
But I've been trying to figure out what I'd make, given I have space at my work to stack up, say, 20 generic business desktop PCs running an Intel i5 processor.
The moment I start I should be profitable, since I'm not investing a penny into this (aside from my wasted time at work).
So I find TP's Bitcoin Calculator
I don't know WTF my hash rate is so I look it up here.
I don't know the exact processor so I'll start at the lowest value of 1.8 and round it up to 2. Multiplying by 20 desktops I get 40.
Fast forward the clock. Ring in 2015. Bitch about flying cars not being here yet, then open up my Bitcoin wallet to see that all of that amounted to $5 (assuming 1 BTC = $1000 USD).
Did I do the math right here?
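The back-of-the-envelope math can be sketched like this. The network hash rate, block reward, and BTC price below are illustrative round numbers I am assuming for the era, not exact historical figures:

```python
# Sanity check of the estimate above, with assumed (not exact) numbers.

def expected_btc_per_day(my_hashrate, network_hashrate,
                         blocks_per_day=144, block_reward=25.0):
    """Expected BTC/day = your share of network hash power
    times the coins minted per day."""
    return (my_hashrate / network_hashrate) * blocks_per_day * block_reward

my_rate = 2e6 * 20   # 20 desktops at ~2 MH/s each (as in the post)
network = 300e15     # assumed ~300 PH/s network hash rate (illustrative)

btc_day = expected_btc_per_day(my_rate, network)
usd_year = btc_day * 365 * 1000  # at the assumed $1000/BTC

print(f"{btc_day:.2e} BTC/day, about ${usd_year:.2f} per year")
```

Under these assumptions the 20 CPUs earn a fraction of a dollar per year before electricity, which is the same "pocket change" conclusion the poster reached.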
submitted by Kicker774 to BitcoinMining [link] [comments]


Ultimate Bitcoin Calculator. Bitcoin Mining, Profitability and Power Calculator. Calculate how much your shiny new rig is making you: daily, weekly, monthly and annual net profit, power consumption cost, break-even time. Everything you could ever need! Written in Google Go (golang), running on Google App Engine (GAE).

Bitcoin mining calculator summary: enter the hash rate of your Bitcoin mining hardware (mandatory), then enter additional optional information such as pool fees, electricity costs, etc. The more information you enter, the more accurate the result will be.

Bitcoin Mining Calculator. Got your shiny new ASIC miner? Wondering when it will pay off? If you enter your hash rate below, this page will calculate your expected earnings in both bitcoins and dollars over various time periods (day, week, and month). Find out what your expected return is depending on your hash rate and electricity cost. Find out if it's profitable to mine Bitcoin, Ethereum, Litecoin, DASH or Monero. Do you think you've got what it takes to join the tough world of cryptocurrency mining? The unique Bitcoin tool set with numerous useful features, suitable for learning and testing.
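All of these calculators run essentially the same formula: your share of the network's block rewards minus your electricity cost. A minimal sketch, where every input number is an example a user would supply, not real market data:

```python
# Core formula behind a mining profitability calculator.
# All inputs below are example values, not live figures.

def daily_profit_usd(hashrate_ths, network_ths, btc_price,
                     power_watts, electricity_usd_kwh,
                     block_reward=12.5, blocks_per_day=144):
    """Daily profit: share of the day's block rewards minus power cost."""
    revenue = (hashrate_ths / network_ths) * blocks_per_day \
        * block_reward * btc_price
    power_cost = power_watts / 1000 * 24 * electricity_usd_kwh
    return revenue - power_cost

# Example: a 14 TH/s ASIC drawing 1400 W at $0.10/kWh, against an
# assumed 40,000,000 TH/s network and an assumed $8,000 BTC price.
profit = daily_profit_usd(14, 40_000_000, 8000, 1400, 0.10)
print(f"${profit:.2f}/day")
```

Pool fees, difficulty growth, and hardware depreciation would each subtract from this, which is why the fuller calculators ask for them as optional inputs.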


How to calculate Hash power output of your cryptocurrency mining/ cloud mining

Bitcoin mining a block is difficult because the SHA-256 hash of a block's header must be lower than or equal to the target in order for the block to be accepted by the network.

By using this calculator we can calculate how much BTC a given amount of hash power generates in cryptocurrency mining and cloud mining. ...

Bitcoin Mining ROI Calculator 2017 with Genesis Mining! You can see the dashboard and hash calculator, introducing $100 on each crypto for a 2-year contract. ... Cloud Mining 2020 App 📲💻 Bitcoin ETHEREUM LiteCoin Dash 2-year contract ... hash rate definition, hash rate calculator, GPU hash rate of graphics cards.

HashOcean: how to set up, fund and get started making money; HashOcean calculator; HashOcean Bitcoin; HashOcean cloud mining ...

How to mine Bitcoin on NiceHash from start to finish; ubit PCI risers (affiliate link); easy way to get started mining crypto; download CudoMiner.
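The target rule in the first description can be demonstrated directly. This is a toy sketch: the header bytes and target below are made up for the demo, and real Bitcoin uses an 80-byte header with a specific field layout.

```python
import hashlib

def block_hash(header: bytes) -> int:
    """Bitcoin hashes the block header twice with SHA-256."""
    return int.from_bytes(
        hashlib.sha256(hashlib.sha256(header).digest()).digest(), "big")

def meets_target(header: bytes, target: int) -> bool:
    """A header is valid only if its hash is numerically <= the target."""
    return block_hash(header) <= target

# Toy "mining": try nonces until the hash falls below an easy target.
easy_target = 2**252  # roughly 1 in 16 hashes succeed
nonce = 0
while not meets_target(b"demo-header" + nonce.to_bytes(4, "big"),
                       easy_target):
    nonce += 1
print("found nonce:", nonce)
```

Lowering the target makes valid hashes exponentially rarer, which is exactly how the network tunes difficulty as total hash power grows.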
