Best mining rigs and mining PCs for Bitcoin, Ethereum and

Some hints and tips for newer players

This wipe I have progressed more than I have in past wipes, and I think a lot of it has to do with my play-style changing.
I know I am still nowhere near as stacked, stash-wise or stats-wise, as a lot of players, but I thought I might share some hints and tips for new players that have been helpful for me.
I play mainly duos, sometimes trios with a couple of mates. My play style is relatively quiet (i.e. little sprinting, lots of pausing walking to listen) until I engage the enemy, then as much aggressive flanking and pushing as possible.
So you can see where I currently am this wipe, my stash and play stats are here (it's zoomable). TL;DR: 41% survival rate, 120 mil stash value, 5.77 K/D.
So onto the tips:
1) Optimise what you bring out of raids:
2) Run. Good. Ammo
3) Keep what you loot in raid
4) Run gear appropriate to your stash
5) Make the most of your hideout
6) Once you have engaged an enemy play aggressively
7) Sound is your friend
8) When poor, scav in
9) Complete quests
10) Don't be afraid to use gear
11) Play with others
I hope these tips help at least one person! Happy to expand on any of them if anyone has questions....
submitted by TomSchofield to EscapefromTarkov [link] [comments]

Mining: Weird Time to Start, a Good Time to Think

Mining: Weird Time to Start, a Good Time to Think
Well, this was supposed to be an optimistic article about the most promising cryptos to mine, but then something happened. No one was naive enough to believe that the events unfolding around the COVID-19 pandemic would not affect global markets, but the turbulence has been very significant and, what is most unfortunate, it is still very difficult to say how soon the situation will stabilize.
Many people were already concerned that crypto mining was becoming less profitable in 2020 and would soon be pointless, but even though big companies with bigger resources have taken over most of the industry, mining cryptocurrency with video cards remains available to ordinary users and still has potential.
Despite the volatility of the cryptocurrency market, the hashrate of the Bitcoin network remains almost at the same level, which is quite a positive sign. At the moment, the most sensible option seems to be to leave mining to the large ASIC farms and return when the stock panic subsides and the prospects are clearer.
Although Bitcoin is still the most popular cryptocurrency on the market, every year the complexity of the operations needed to produce it increases and the rewards fall (after the halving in May 2020, we will be talking about 6.25 BTC per block). For many altcoins the threshold for entry is much lower, so it makes sense to look for a more profitable option among them.
But first, let’s try to understand a little what conditions we need for profitable mining.
There are several crucial aspects that determine how profitable mining will be. These include obvious things such as the price of the currency and the size of the reward for a generated block.
And this is why it is now very difficult to calculate possible income. One way or another, the market price of altcoins depends on the position of Bitcoin, which is going through hard times. For several months the world of crypto mining has been preparing for the May halving, because a reduced supply has historically led to a significant increase in prices. This time should not have been an exception, but now, when Bitcoin does not rise above $5,500 and risks falling below $3,500, we can only make vague guesses about its potential price in May. Many analysts tend to believe that closer to the middle of April the negative effect of the crisis should fade, and positive expectations from the halving, plus a large amount of cash from investors, should have a positive impact on the price of Bitcoin. Altcoins, as a rule, repeat the dynamics of the first cryptocurrency and may also continue growing toward historical highs later in the year.
Next, you should also pay attention to the mining difficulty, because it affects the time and energy spent on generating a block. Do not forget about the cost of electricity in your region, as one extra-large bill can negate all your efforts to earn money by mining.
Do not forget, either, about the cost of the mining rig itself and its amortisation.
In addition to the above, you should find out how practical the chosen currency is: whether it can be exchanged for fiat or more popular coins, what fees are charged by exchanges that work with it, and what reputation it has in general.
In order to avoid unpleasant mistakes, it is easier and more reliable to check the possible profit in one of the many online profitability calculators.
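For anyone curious what those calculators are doing under the hood, here is a rough sketch of the math; all the numbers in the example are illustrative placeholders, not real network data.

```python
# Rough sketch of the math a mining profitability calculator performs.
# All numbers below are illustrative placeholders, not real network data.

def daily_profit(my_hashrate, network_hashrate, block_reward, blocks_per_day,
                 coin_price, power_watts, electricity_per_kwh):
    """Estimate daily mining profit in fiat for a proof-of-work coin."""
    my_share = my_hashrate / network_hashrate          # fraction of blocks I expect to win
    coins_per_day = my_share * blocks_per_day * block_reward
    revenue = coins_per_day * coin_price
    power_cost = (power_watts / 1000) * 24 * electricity_per_kwh
    return revenue - power_cost

# Example: a 6-GPU rig at 180 MH/s on a hypothetical network doing 180 TH/s
print(daily_profit(
    my_hashrate=180e6,          # 180 MH/s
    network_hashrate=180e12,    # 180 TH/s (illustrative)
    block_reward=2.0,           # coins per block (illustrative)
    blocks_per_day=6500,        # ~13-15 second blocks (illustrative)
    coin_price=200.0,           # fiat price per coin (illustrative)
    power_watts=800,
    electricity_per_kwh=0.10,
))
```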

Best altcoins to mine in 2020

Monero is the currency with the strongest anonymity features, which keeps it attractive to many users and makes it one of the strongest altcoins. Its proof-of-work hashing algorithm is designed to keep out ASIC miners, so it is relatively easy to mine using an ordinary computer's processor and graphics card. AMD graphics cards are preferable for this task, but NVIDIA cards work as well. The current block reward is 2.47 XMR.
Litecoin is one of the oldest Bitcoin forks, but unlike Bitcoin it uses the "Scrypt" PoW algorithm, which allows less powerful GPUs to mine coins. Litecoin is one of the most popular and successful Bitcoin forks and is considered one of the most stable cryptocurrencies. The block mining reward is 12.5 LTC.
Ravencoin is another Bitcoin hard fork, and like Monero its X16R algorithm is practically unavailable to ASIC machines. Raven keeps gaining popularity for many reasons – it has a faster block time, a higher mining reward (5,000 RVN at the moment) and a secure messaging system.
Dogecoin is not a joke anymore. Hard to believe, but this currency, once made for fun, became one of the most valuable ones. Like Litecoin, it uses the Scrypt algorithm and is great for mining with GPUs.
One more Bitcoin fork Bitcoin Gold was made specifically to kick out ASICs and clear the road for GPUs. It may not be the fastest-growing currency, but it is definitely one of the most stable.
That’s all for today. Stay safe, cause health is our most important asset.
Follow us on Medium, Twitter, Facebook, and Reddit to get StealthEX.io updates and the latest news about the crypto world. For all requests message us via [[email protected]](mailto:[email protected])
submitted by Stealthex_io to StealthEX [link] [comments]

AIOMiner - Alpha 7 Released

Hello All!
AIOMiner Alpha 7 is here and with it comes AMD support.
In this release we have focused on giving new people the fastest way to start mining. Five clicks after install, you can be mining and on your way.
For anyone new, this tool helps you run your mining rig or desktop miner, manage your pools, and mine with ease.
But here are some key new things, read more on the GitHub Page
Quick Start: Download, Install, Help, Add Wallet, Save, Click Start
Screen Shots:
Main
Advanced
Supported Coins
ZCash, Vertcoin, BitcoinZ, Straks, MonaCoin, ZenCash, Ethereum, Hush, Komodo, Trezarcoin, Verge, Vivo, Bitcoin Gold, Zclassic, Ellaism, Pirl, Musicoin, Feathercoin, Monero, Ubiq, Expanse, Orbitcoin, Metaverse, Ethereum Classic, Sumokoin, Karbo, Electroneum, Bytecoin, Halcyon
Quick Help
Download Today
Feature Request or Chat
Community Driven, No Mining Fees, No Batch Files
submitted by xixspiderxix to gpumining [link] [comments]

Debunking myths about mining and GPUs

E: Going to bed, will contribute more tomorrow. Thanks for the discussion!
Myth: Mining is more stressful than gaming. Fact: It depends. During the old days, this was plausible, because older GPUs (Pre-polaris) are/were bottlenecked by core clock when mining the most profitable coins. Thus, miners overclocked and overvolted these cards quite frequently, especially with cheap electricity. This meant that those cards were often run hot, pushing the limits and stressing VRM and fans quite a lot. Nowadays, ethash (Ethereum) is the most profitable algorithm for AMD cards 99% of the time, and newer GPUs (Polaris) are limited by memory bandwidth and latency. Miners can underclock core to the low 1100MHz range before seeing performance drop. To save power, miners who know what they are doing also undervolt, since it is no longer necessary to sustain a high core clock. Thus, it is quite feasible to run polaris cards below 70C at a reasonable fan speed. However, dual mining (mining more than one coin at once) does increase power consumption by up to 20%, and there are also idiots who run their polaris cards OCd while mining. With the exception of a few idiots, miners treat their Polaris GPUs pretty much the same; that is, running underclocked and undervolted 24/7 with a memory strap mod and mem OC. On the other hand, former gaming cards are highly variable in use cases. Some gamers leave their cards at stock settings, some undervolt, and some OC and/or overvolt. Most of the time, these cards are thermal cycled far more often than mining cards, which is known to weaken solder. Another thing to consider is that manufacturers have learned (somewhat) from their mistakes of putting shit tier fans in GPUs, and many fans on modern GPUs are ball bearing and/or swappable. Even some budget cards, such as MSI Armor, use decent ball bearing fans. Bottom line: the risk of buying mined Polaris cards is not as high as the risk of buying older mined cards. I would not be against buying mined polaris cards, but it's not necessarily better than buying a gamer's card instead. At the end of the day, it depends more on how the owner treated it than what they used it for.
Myth: GPUs are obsolete because of FPGAs and ASICs Fact: Mostly false. Older algorithms such as scrypt and SHA256 (litecoin/dogecoin/feathercoin/bitcoin etc) are no longer feasible to mine with GPUs, but there have been multiple algorithms since then that are built to deter ASICs; most of the time this is done by making the algorithm memory-hard, because an ASIC with high memory throughput is considerably more expensive to design and manufacture. Many devs prefer their blockchain to be ASIC resistant to avoid the concentration-of-power problem that Bitcoin is having nowadays, where a giant, near-monopolistic ASIC manufacturer (Bitmain) is causing a lot of (subjective) controversy. Blockchains based on ethash (Ethereum and its forks), equihash (Zcash and its forks) and cryptonight (Monero and forks) are some examples, but there are scores of other shitcoins and a few other algos that are GPU dominant. It is almost impossible that there will be another ASIC takeover, which is what was responsible for the stop in GPU demand in the bitcoin and litecoin days. Bottom line: ASICs no longer threaten GPU miners, or the demand for GPUs
Myth: Ethereum switching to Proof of Stake will kill mining soon Fact: Doomsayers have been preaching about proof of stake since late 2015. It has always been "coming soon." The fact is, the Ethereum roadmap goes from proof of work (mining) -> Casper (mining + PoS) -> Metropolis (PoS). Currently, the release date of Casper is not even announced yet, nor is it being tested in a (public) testnet. Proof of Stake might one day take over, but mining is here to stay for a while yet. Another thing to consider is that there are tons of other GPU mineable blockchains, and although Ethereum is biggest, it is certainly feasible that mining stays profitable even after Ethereum goes PoS (if it ever does). However, it is possible that profits will be low enough to discourage new miners. Bottom line: It's very unlikely. E: I screwed up the roadmap; here is a better source than me with some interesting information: https://www.ethnews.com/ethereums-vitalik-buterin-gives-keynote-on-metropolis
Myth: The current Ethereum demand spike is a bubble Opinion: Honestly, I don't know. I would not be surprised if stricter regulations on ICOs come sooner or later, which would fuck with Ether prices. There is also the inherent volatility of cryptocurrencies. However, it is also possible that blockchain technology continues to gain traction; that is, the price could just as easily go up as go down. Although it's fun to read about other people's opinions, only time-travelling wizards can tell you when it will become economical again to upgrade your poor HD5770. Bottom line: No one knows.
Myth: Miners will "steal" all the RX Vegas Fact: Only a reckless miner would buy Vegas on release, since mining performance is not known. In fact, it is possible that it can't mine at all (or at some stupidly low speed) until devs add support to existing miners. It would be even more reckless than gamers who buy without seeing benchmarks, since at least gamers can expect the games to actually run. It's also not necessarily the case that Vega will be good once miners do add support. Maybe there will be enough reckless miners to affect supply, maybe not. Of course, it is possible that miners will deplete the supply after it is demonstrated that Vega is good for mining. Bottom line: Most miners won't preorder, but it's possible that a significant number will. E: Important to remember that even if mining demand isn't high, doesn't mean that supply will be plentiful.
Myth: Nvidia cards SUCK at mining Fact: Mostly false. They USED to suck in the old pre-Maxwell days, but now they are actually more efficient at mining Ethereum and Zcash compared to AMD cards, even after both cards are undervolted. The flipside is that they (used to) cost more for the equivalent hashrate. For reference, my old 5xRX470 rig drew just under 800W when mining ETH only and hashed at 150MH/s. My current 6xGTX1060 rig draws just over half of that (<450W) and hashes at about 135MH/s. Certainly not as good in raw performance, but they are viable nonetheless, especially given the AMD GPU shortage. In fact, Nvidia cards (1060 and especially 1070) are becoming scarce as well. Bottom line: Nvidia is still the underdog when it comes to mining, but far from irrelevant nowadays.
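To put those two rigs side by side, here is a quick sketch computing hashrate per watt using the figures quoted above:

```python
# Quick efficiency comparison using the rig figures quoted above.
rigs = {
    "5x RX 470":   {"hashrate_mh": 150, "power_w": 800},
    "6x GTX 1060": {"hashrate_mh": 135, "power_w": 450},
}

for name, r in rigs.items():
    efficiency = r["hashrate_mh"] / r["power_w"]  # MH/s per watt
    print(f"{name}: {efficiency:.3f} MH/s per watt")
# The 1060 rig trades ~10% raw hashrate for roughly 60% better efficiency.
```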
Myth: 4GB cards will be obsolete for mining soon Fact: FALSE. The Ethereum DAG is not even 3GB yet, and won't be for a few months. The recent reports of 4GB Polaris cards slowing down soon due to DAG size are about limited TLB capacity, not VRAM restrictions. Polaris cards will still be able to mine ETH forks such as Expanse and UBIQ without diminished speed, and even if they are used to mine ETH, it is not that much of a performance hit at first. It would certainly not make Polaris useless or undesirable for mining anytime soon. Tahiti GPUs already suffer from this issue and Hawaii is the most resistant to it. I have not benched Nvidia at a later epoch.
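As a rough illustration of why DAG growth eventually matters, here is a sketch of the approximate Ethash DAG size per epoch using the nominal constants from the Ethash spec; it ignores the prime-number adjustment the real algorithm applies, so treat the output as approximate.

```python
# Approximate Ethash DAG size by epoch (nominal constants from the Ethash spec;
# the real algorithm trims the size to a prime multiple, ignored here).
DATASET_BYTES_INIT   = 2**30   # ~1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23   # ~8 MiB added per epoch
EPOCH_LENGTH         = 30_000  # blocks per epoch

def approx_dag_gib(block_number):
    epoch = block_number // EPOCH_LENGTH
    return (DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch) / 2**30

for block in (4_000_000, 5_000_000, 6_000_000):
    print(f"block {block:>9,}: ~{approx_dag_gib(block):.2f} GiB")
```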
Myth: Creating miner-bashing posts on Reddit will help alleviate the GPU supply problem Fact: False, you are simply giving cryptocurrencies and mining more exposure to the general public, increasing demand.
Myth: Mining-specific GPUs will solve the shortage problems Opinion: There's not enough info to tell yet, but I am a skeptic for the following reasons. First, no display limits the resale value of the card for obvious reasons. IMO, the whole point of crypto mining from a profitability standpoint is to have a hedge against coin volatility (hardware is still worth something if the coin crashes). Otherwise it is much less effort to just buy and hold the coin. If the hardware is useless without demand from other (significant) sources, then it doesn't make much sense to buy it unless the price is extremely low. I'm sure that cost-downing the PCB and warranty will make for a cheap card, but it has to be extremely cheap and plentiful in supply, or else miners will buy whatever they can get. I could envision "failed" chips (not meeting spec of consumer editions) being stuck in miner cards, but I doubt there are enough to meet demand without ramping up production as a whole, which carries its own risks. I guess that it would help a little, but probably not solve the problems. Alternatively, since modern GPUs are bottlenecked by RAM when mining, it might be enticing to miners to have the fastest (GDDR5) RAM on the market (probably the 9gbps chips from the 1060 6G 9gbps edition, although I don't have one to test). However, my previous points still apply; buying such a card without display outputs carries a big risk. Bottom line: It's not a great idea, unless they are super cheap or use really good RAM.
Hope this helped; if you have any further questions I will try to answer them. I'm both a gamer and miner who uses both AMD and Nvidia roughly equally and don't favor one group over another. I've mined and gamed on all high end AMD GPUs since Tahiti (except Tonga) and all Pascal cards except 1050ti.
submitted by key_smash to Amd [link] [comments]

The Problem with PoW

The Problem with PoW
Miners have always had it rough..
"Frustrated Miners"

The Problem with PoW
(and what is being done to solve it)

Proof of Work (PoW) is one of the most commonly used consensus mechanisms entrusted to secure and validate many of today’s most successful cryptocurrencies, Bitcoin being one. Battle-hardened and having weathered the test of time, Bitcoin has demonstrated the undeniable strength and reliability of the PoW consensus model through sheer market saturation, and of course, its persistency.
In addition to the cost of powerful computing hardware, miners prove that they are benefiting the network by expending energy in the form of electricity, by solving and hashing away complex math problems on their computers, utilizing any suitable tools that they have at their disposal. The mathematics involved in securing proof of work revolve around unique algorithms, each with their own benefits and vulnerabilities, and can require different software/hardware to mine depending on the coin.
Because each block has a unique and entirely random hash, or “puzzle” to solve, the “work” has to be performed for each block individually and the difficulty of the problem can be increased as the speed at which blocks are solved increases.
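To make the "puzzle" concrete, here is a minimal proof-of-work sketch (a toy illustration, not any specific coin's algorithm): keep trying nonces until the hash of the block data plus the nonce falls below a target; lowering the target is what raises the difficulty.

```python
import hashlib

def mine(block_data: bytes, target: int) -> int:
    """Search for a nonce whose SHA-256 of (block_data + nonce) is below target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "little")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Lower target -> more leading zero bits required -> more work on average.
easy_target = 2**256 >> 16   # ~65,000 hashes on average
nonce = mine(b"example block header", easy_target)
print("found nonce:", nonce)
```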

Hashrates and Hardware Types

While proof of work is an effective means of securing a blockchain, it inherently promotes competition amongst miners seeking higher and higher hashrates due to the rewards earned by the node who wins the right to add the next block. In turn, these higher hash rates benefit the blockchain, providing better security when it’s a result of a well distributed/decentralized network of miners.
When Bitcoin first launched its genesis block, it was mined exclusively by CPUs. Over the years, various programmers and developers have devised newer, faster, and more energy efficient ways to generate higher hashrates; some by perfecting the software end of things, and others, when the incentives are great enough, create expensive specialized hardware such as ASICs (application-specific integrated circuit). With the express purpose of extracting every last bit of hashing power, efficiency being paramount, ASICs are stripped down, bare minimum, hardware representations of a specific coin’s algorithm.
This gives ASICs a massive advantage in terms of raw hashing power and energy consumption over CPUs/GPUs, but with the significant drawbacks of being very expensive to design and manufacture, translating to a high economic barrier for the casual miner. Because they are hardware representations of a single targeted algorithm, this means that if a project decides to fork and change algorithms suddenly, your powerful brand-new ASIC becomes a very expensive paperweight. The high costs of developing and manufacturing ASICs, and the associated risks, make them unfit for mass adoption at this time.
Somewhere on the high end, in the vast hashrate expanse created between GPU and ASIC, sits the FPGA (field programmable gate array). FPGAs are basically ASICs that make some compromises with efficiency in order to have more flexibility, namely they are reprogrammable and often used in the “field” to test an algorithm before implementing it in an ASIC. As a precursor to the ASIC, FPGAs are somewhat similar to GPUs in their flexibility, but require advanced programming skills and, like ASICs, are expensive and still fairly uncommon.

2 Guys 1 ASIC

One of the issues with proof of work incentivizing the pursuit of higher hashrates is in how the network calculates block reward coinbase payouts and rewards miners based on the work that they have submitted. If a coin generated, say a block a minute, and this is a constant, then what happens if more miners jump on a network and do more work? The network cannot pay out more than 1 block reward per 1 minute, and so a difficulty mechanism is used to maintain balance. The difficulty will scale up and down in response to the overall nethash, so if many miners join the network, or extremely high hashing devices such as ASICs or FPGAs jump on, the network will respond accordingly, using the difficulty mechanism to make the problems harder, effectively giving an edge to hardware that can solve them faster, balancing the network. This not only maintains the block a minute reward but it has the added side-effect of energy requirements that scale up with network adoption.
Imagine, for example, if one miner gets on a network all alone with a CPU doing 50 MH/s and is getting all 100 coins that can possibly be paid out in a day. Then, if another miner jumps on the network with the same CPU, each miner would receive 50 coins in a day instead of 100, since they are splitting the required work evenly, despite the fact that the net electrical output has doubled along with the work. Electricity costs miners money and is a factor in driving up coin price along with adoption, and since more people are now mining, the coin is less centralized. Now let's say a large corporation has found it profitable to manufacture an ASIC for this coin, knowing they will make their money back mining it or selling the units to professionals. They join the network doing 900 MH/s and will be pulling in 90 coins a day, while the two guys with their CPUs each get 5 now. Those two guys aren't very happy, but the corporation is. Not only does this negatively affect the miners, it compromises the security of the entire network by centralizing the coin supply and hashrate, opening the doors to double spends and 51% attacks from potential malicious actors. Uncertainty of motives and questionable validity in a distributed ledger do not mix.
When technology advances in a field, it is usually applauded and welcomed with open arms, but in the world of crypto things can work quite differently. One of the glaring flaws in the current model and the advent of specialized hardware is that it’s never ending. Suppose the two men from the rather extreme example above took out a loan to get themselves that ASIC they heard about that can get them 90 coins a day? When they join the other ASIC on the network, the difficulty adjusts to keep daily payouts consistent at 100, and they will each receive only 33 coins instead of 90 since the reward is now being split three ways. Now what happens if a better ASIC is released by that corporation? Hopefully, those two guys were able to pay off their loans and sell their old ASICs before they became obsolete.
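A minimal sketch of the payout arithmetic in the example above: with a fixed number of coins issued per day, each miner's slice is simply their fraction of the total network hashrate.

```python
# Daily payout split for the worked example above: rewards scale with
# your share of total hashrate, not with your absolute hashrate.
DAILY_COINS = 100

def payouts(hashrates_mh):
    total = sum(hashrates_mh.values())
    return {name: DAILY_COINS * rate / total for name, rate in hashrates_mh.items()}

print(payouts({"cpu_1": 50, "cpu_2": 50}))                     # 50 coins each
print(payouts({"cpu_1": 50, "cpu_2": 50, "asic": 900}))        # 5, 5, 90
print(payouts({"asic_1": 900, "asic_2": 900, "asic_3": 900}))  # ~33.3 each
```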
This system, as it stands now, only perpetuates a never ending hashrate arms race in which the weapons of choice are usually a combination of efficiency, economics, profitability and in some cases control.

Implications of Centralization

This brings us to another big concern with expensive specialized hardware: the risk of centralization. Because they are so expensive and inaccessible to the casual miner, ASICs and FPGAs predominantly remain limited to a select few. Centralization occurs when one small group or a single entity controls the vast majority of hash power and, as a result, coin supply, and is able to exert its influence to manipulate the market or, in some cases, the network itself (usually the case with dishonest nodes or bad actors).
This is entirely antithetical of what cryptocurrency was born of, and since its inception many concerted efforts have been made to avoid centralization at all costs. An entity in control of a centralized coin would have the power to manipulate the price, and having a centralized hashrate would enable them to affect network usability, reliability, and even perform double spends leading to the demise of a coin, among other things.
The world of crypto is a strange new place, with rapid advancements across many fields, economies, and borders, leaving plenty of room for improvement; while it may feel like a never-ending game of catch-up, there are many talented developers and programmers working around the clock to bring us all more sustainable solutions.

The Rise of FPGAs

With the recent implementation of the commonly used coding language C++, and due to their overall flexibility, FPGAs are becoming somewhat more common, especially in larger farms and in industrial settings; but they still remain primarily out of the hands of most mining enthusiasts and almost unheard of to the average hobby miner. Things appear to be changing though, one example of which I'll discuss below, and it is thought by some that we will soon see a day when mining with a CPU or GPU just won't cut it any longer, and the market will be dominated by FPGAs and specialized ASICs, bringing with them efficiency gains for proof of work while also carelessly leading us all towards the next round of spending.
A perfect real-world example of the effect specialized hardware has had on the crypto community was recently discovered involving a fairly new project called VerusCoin and a fairly new, relatively more economically accessible FPGA. The FPGA is designed to target specific alt-coins whose algos do not require RAM overhead. It was discovered that the company had released a new algorithm, kept secret from the public, which could effectively mine Verus at 20x the speed of GPUs, which were the next fastest hardware type mining on the Verus network.
Unfortunately this was done with a deliberately secret approach, calling the Verus algorithm “Algo1” and encouraging owners of the FPGA to never speak of the algorithm in public channels, admonishing a user when they did let the cat out of the bag. The problem with this business model is that it is parasitic in nature. In an ecosystem where advancements can benefit the entire crypto community, this sort of secret mining approach also does not support the philosophies set forth by the Bitcoin or subsequent open source and decentralization movements.
Although this was not done in the spirit of open source, it does hint to an important step in hardware innovation where we could see more efficient specialized systems within reach of the casual miner. The FPGA requires unique sets of data called a bitstream in order to be able to recognize each individual coin’s algorithm and mine them. Because it’s reprogrammable, with the support of a strong development team creating such bitstreams, the miner doesn’t end up with a brick if an algorithm changes.

All is not lost thanks to.. um.. Technology?

Shortly after discovering FPGAs on the network, the Verus developers quickly designed, tested, and implemented a new, much more complex and improved algorithm via a fork that enabled Verus to transition smoothly from VerusHash 1.0 to VerusHash 2.0 at block 310,000. Since the fork, VerusHash 2.0 has demonstrated doing exactly what it was designed for: equalizing hardware performance relative to the device being used, while enabling CPUs (the most widely available "ASICs") to mine side by side with GPUs at a profit, and it appears this will also apply to other specialized hardware. This is something no other project has been able to do until now. Rather than pursue the folly of so many other projects before it – attempting to be "ASIC proof" – Verus effectively achieved and presents to the world an entirely new model of "hardware homogeny". As the late, great Bruce Lee once said: "Don't get set into one form, adapt it and build your own, and let it grow, be like water."
In the design of VerusHash 2.0, Verus has shown it doesn’t resist progress like so many other new algorithms try to do, it embraces change and adapts to it in the way that water becomes whatever vessel it inhabits. This new approach- an industry first- could very well become an industry standard and in doing so, would usher in a new age for proof of work based coins. VerusHash 2.0 has the potential to correct the single largest design flaw in the proof of work consensus mechanism- the ever expanding monetary and energy requirements that have plagued PoW based projects since the inception of the consensus mechanism. Verus also solves another major issue of coin and net hash centralization by enabling legitimate CPU mining, offering greater coin and hashrate distribution.
Digging a bit deeper, it turns out the Verus development team are no rookies. The lead developer, Michael F Toutonghi, has spent decades in the field programming and is a former Vice President and Technical Fellow at Microsoft, recognized founder and architect of Microsoft's .Net platform, ex-Technical Fellow of Microsoft's advertising platform, ex-CTO of Parallels Corporation, and an experienced distributed computing and machine learning architect. The project he helped create employs a diverse array of technologies and security features to form one of the most advanced and secure cryptocurrencies to date. A brief description of what makes VerusCoin special, quoted from a community member:
"Verus has a unique and new consensus algorithm called Proof of Power which is a 50% PoW/50% PoS algorithm that solves theoretical weaknesses in other PoS systems (Nothing at Stake problem for example) and is provably immune to 51% hash attacks. With this, Verus uses the new hash algorithm, VerusHash 2.0. VerusHash 2.0 is designed to better equalize mining across all hardware platforms, while favoring the latest CPUs over older types, which is also one defense against the centralizing potential of botnets. Unlike past efforts to equalize hardware hash-rates across different hardware types, VerusHash 2.0 explicitly enables CPUs to gain even more power relative to GPUs and FPGAs, enabling the most decentralizing hardware, CPUs (due to their virtually complete market penetration), to stay relevant as miners for the indefinite future. As for anonymity, Verus is not a "forced private", allowing for both transparent and shielded (private) transactions...and private messages as well"

If other projects can learn from this and adopt a similar approach or continue to innovate with new ideas, it could mean an end to all the doom and gloom predictions that CPU and GPU mining are dead, offering a much needed reprieve and an alternative to miners who have been faced with the difficult decision of either pulling the plug and shutting down shop or breaking down their rigs to sell off parts and buy new, more expensive hardware…and in so doing present an overall unprecedented level of decentralization not yet seen in cryptocurrency.
Technological advancements led us to the world of secure digital currencies and the progress being made with hardware efficiencies is indisputably beneficial to us all. ASICs and FPGAs aren’t inherently bad, and there are ways in which they could be made more affordable and available for mass distribution. More than anything, it is important that we work together as communities to find solutions that can benefit us all for the long term.

In an ever changing world where it may be easy to lose sight of the real accomplishments that brought us to this point one thing is certain, cryptocurrency is here to stay and the projects that are doing something to solve the current problems in the proof of work consensus mechanism will be the ones that lead us toward our collective vision of a better world- not just for the world of crypto but for each and every one of us.
submitted by Godballz to CryptoCurrency [link] [comments]

FYI How to Pi Coin Cryptocurrency Mining Made Simple for Everyday People (Passive Income)

FYI How to Pi Coin Cryptocurrency Mining Made Simple for Everyday People (Passive Income)

Pi Coin Cryptocurrency Mining Made Simple for Everyday People (Passive Income)


Combining Cryptocurrency Mining with Traveling

Chances are, you’ve heard about the buzzing prices of cryptocurrencies like Bitcoin but probably know jack about anything else of it. Understanding how Bitcoin and Blockchain technology works can be a bit intimidating and overwhelming for new users into the Crypto-space. But hey, I’m no expert either and I just dabble in it to know enough not to miss out.
As someone who aspires to become a digital nomad and travel the world, you may want to consider looking to start now. And here I can show you how easy it is to start with zero knowledge of Bitcoin, Blockchain, and cryptocurrency.
If you know how to download an app to your smartphone or tablet, then you are ready to earn cryptocurrency. You use your smartphone for pretty much everything, from communicating, paying bills, and watching videos on YouTube all day, to looking for restaurants and so on. So why not also use it to make you money every single day, when it doesn't cost you anything more than downloading an app?
The tides are ever-changing, and to survive, you must sail with the wind. Not against it. Don’t leave free money on the table and read on.
Pi Network (Minepi.com) is getting increasingly popular every day and is perhaps one of the fastest-growing networks in the Cryptocurrency world as of 2019. It’s no wonder you would want to get as many members under you as quickly as possible.
Mining for Pi Coin cryptocurrency takes almost no effort but to simply make a single tap on the lightning symbol once a day. After that, you can continue to use your phone as normal or shut the screen off. The Pi coin is continuously mined for the next 24 hours in the background of your device without using massive amounts of energy unlike mining for Bitcoin.
As the Pi Network continues to grow with 1000’s of new miners signing up every day, the mining rate will be halved accordingly, which means it’ll be harder and slower to get Pi coins. So, it’s quite obvious to be one of those people to get a really early start on this popular rising cryptocurrency.
But why am I sharing my secrets? Aren’t you afraid of the competition? No, not really. Because for 1, I am a believer in Pi vision so I want to help it grow by helping you grow. Plus, even if this information is out there, I doubt that everyone who reads this will actually put in the effort to actually grow (please prove me wrong).
Please read on and learn the secret!
24 hours a day, 7 days a week = 168 hours a week; 4 weeks a month = 672 hours a month.
If you passively mine 1 Pi an hour, that is roughly 672 Pi a month. If the Pi coin is listed on markets at a price of $0.05, that would be about $33.60 of passive income a month, from just one tap on your mobile a day.

What Is Pi Network?

Pi Network is a small connected group of people within a security circle stitched along with other smaller security circles to create a “trust graph” that will help users know who to trust and transact with. The security circle is used to validate one’s identity to allow seamlessly and trusted transactions in the Pi cryptocurrency marketplace. This is to secure trust in the network that no fraudulent activities can take place.

The Core Team Members

Dr. Nicolas Kokkalis – Head of Technology. Stanford Ph.D. and instructor of Stanford’s first decentralized applications class; combining distributed systems and human-computer interactions to bring cryptocurrency to everyday people.
Dr. Chengdiao Fan – Head of Product Stanford Ph.D. in Computational Anthropology harnessing social computing to unlock human potential on a global scale.
Vincent McPhillip – Head of Community Yale and Stanford-trained social movement builder on a mission to democratize how society defines, creates, and distributes wealth.

  • Their Mission is to build a cryptocurrency and smart contracts platform secured and operated by everyday people like us, but with simplicity.
  • Their vision is to make Cryptomining and spending as easy as “Pie”, making it the world’s most inclusive peer-to-peer marketplace that is fueled by Pi.
· The network has about 70k daily active members mining every day on the Pi Network app and is growing incredibly fast. Once they reach the first threshold of 100k users, they will HALVE the mining rate. That’s right, so mining speed for you will become lesser and lesser as more users join the network. However, if you join today and become a (PI)oneer and help steer the Pi network into the right general direction. You’ll get a larger piece of the PI(E) and grow it into massive savings until it hits the exchange market.
· They are going to release an update during Q4 2019 that will enable users to send their Pi coins to any other user on the network. This will be the beginning of seeing the true value of the coin. Personally, for me, I’m all in for the what if factor, especially when all it took was to download a single app.
· You don’t have to spend any extra money to mine so long as you have a smartphone or a tablet.

· Invitation Into The Pi Network

· Currently, Pi network is in beta and to join the network is through an invitation code only by someone who is already a member. You can join under my name and be added into my security circle.

What Can You Do with Your Pi Coins?

In the future:


  • Pioneers can wager Pi to engage the attention of other members of the network, by sharing content (e.g., text, images, videos) or asking questions
  • Trade Pi coins with other members
  • Use Pi to purchase goods in the Pi marketplace
  • Use Pi for advertisement
  • Exchange Pi for ETH or BTC which can be exchanged into fiat money
  • Much more as the network grows to mass adoption
As they are in beta now, you get to mine Pi coins at higher rates since there are fewer users on the network. Like Bitcoin for example; mining Bitcoin 10 years ago you would be able to get a few Bitcoin every hour or so. Now, you’ll get somewhere around 0.000001 BTC per hour since there are millions of miners now (mining rate differs for everyone depending on their rig).
Think of Pi like in the early stages of Bitcoin when nobody really knew or understood what its technology is really all about. Imagine if you knew then what you know now. Wouldn’t you mine it like crazy? It goes exactly the same for Pi coin.

Need To Know More?

https://minepi.com/white-paper https://minepi.com/faq
Comment down below if you joined so I can add you into my security circle.
Thanks for reading, You Like? Share It!
submitted by SandraThelenhere to PiNetworkMining [link] [comments]

AN INTRODUCTION TO DIGIBYTE

DigiByte

What are cryptocurrencies?
Cryptocurrencies are peer-to-peer technology protocols which rely on the block-chain: a system of decentralized record keeping which allows people to exchange unmodifiable and indestructible "coins" globally in little to no time and with little to no fees – this translates into the exchange of value, as these coins cannot be counterfeited or stolen. This concept was started by Satoshi Nakamoto (allegedly a pseudonym for a single person or organization), who described and coded Bitcoin in 2009.
What is DigiByte?
DigiByte (DGB) is a cryptocurrency like Bitcoin. It is also a decentralized applications protocol in a similar fashion to Neo or Ethereum.
DigiByte was founded and created by Jared Tate in 2014. DigiByte allows for fast (virtually instant) and low-cost (virtually free) transactions. DigiByte is hard capped at 21 billion coins, which will be mined over a period of 21 years. DigiByte was never an ICO and was mined/created in the same way that Bitcoin or Litecoin initially were.
DigiByte is the fastest UTXO PoW scalable block-chain in the world. We’ll cover what this really means down below.
DigiByte has put forth and applied solutions to many of the problems that have plagued Bitcoin and cryptocurrencies in general – those being:
We will address these point by point in the subsequent sections.
The DigiByte Protocol
DigiByte maintains these properties through use of various technological innovations which we will briefly address below.
Why so many coins? 21 Billion
When initially conceived Bitcoin was the first of a kind! And came into the hands of a few! The beginnings of a coin such as Bitcoin were difficult, it had to go through a lot of initial growth pains which following coins did not have to face. It is for this reason among others why I believe Bitcoin was capped at 21 million; and why today it has thus secured a place as digital gold.
When Bitcoin was first invented no one knew anything about cryptocurrencies, for the inventor to get them out to the public he would have to give them away. This is how the first Bitcoins were probably passed on, for free! But then as interest grew so did the community. For them to be able to build something and create something which could go on to have actual value, it would have to go through a steady growth phase. Therefore, the control of inflation through mining was extremely important. Also, why the cap for Bitcoin was probably set so low - to allow these coins to amass value without being destroyed by inflation (from mining) in the same way fiat is today! In my mind Satoshi Nakamoto knew what he was doing when setting it at 21 million BTC and must have known and even anticipated others would take his design and build on top of it.
At DigiByte, we are that better design and capped at 21 billion. That's 1000 times larger than the supply of Bitcoin. Why though? Why is the cap on DigiByte so much higher than that of Bitcoin? Because DigiByte was conceived to be used not as a digital gold, nor as any sort of commodity, but as a real currency!
Today on planet Earth, we are approximately 7.6 billion people. If each person should want or need to use and live off Bitcoin; then equally split at best each person could only own 0.00276315789 BTC. The market cap for all the money on the whole planet today is estimated to have recently passed 80 trillion dollars. That means that each whole unit of Bitcoin would be worth approximately $3,809,523.81!
$3,809,523.81
This is of course in an extreme case where everyone used Bitcoin for everything. But even in a more conservative scenario the fact remains that with such a low supply each unit of a Bitcoin would become absurdly expensive if not inaccessible to most. Imagine trying to buy anything under a dollar!
Not only would using Bitcoin as an everyday currency be a logistical nightmare but it would be nigh impossible. For each Satoshi of a Bitcoin would be worth much, much, more than what is realistically manageable.
This is where DigiByte comes in and where it shines. DigiByte aims to be used world-wide as an international currency! Not to be hoarded in the same way Bitcoin is. If we were to do some of the same calculations with DigiByte we'd find that the numbers are a lot more reasonable.
At 7.6 billion people, each person could own 2.76315789474 DGB. Each whole unit of DGB would be worth approximately $3,809.52.
$3,809.52
This is much more manageable and remember in an extreme case where everyone used DigiByte for everything! I don't expect this to happen anytime soon, but with the supply of DigiByte it would allow us to live and transact in a much more realistic and fluid fashion. Without having to divide large numbers on our phone's calculator to understand how much we owe for that cup of coffee! With DigiByte it's simple, coffee cost 1.5 DGB, the cinema 2.8 DGB, a plane ticket 500 DGB!
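The per-person and per-coin figures above fall straight out of the supply caps; here is a quick sketch reproducing them from the article's own assumptions (7.6 billion people, roughly $80 trillion of money worldwide):

```python
# Reproducing the per-person and per-coin figures from the article's assumptions.
PEOPLE = 7.6e9
WORLD_MONEY_USD = 80e12

for name, supply in {"Bitcoin": 21e6, "DigiByte": 21e9}.items():
    per_person = supply / PEOPLE            # coins per person, split evenly
    per_coin = WORLD_MONEY_USD / supply     # value per coin in the extreme case
    print(f"{name}: {per_person:.8f} coins per person, ${per_coin:,.2f} per coin")
```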
There is a reason for DigiByte's large supply, and it is a good one!
Decentralisation
Decentralisation is an important concept for the block-chain and cryptocurrencies in general. This allows for a system which cannot be controlled nor manipulated no matter how large the organization in play or their intentions. DigiByte’s chain remains out of the reach of even the most powerful government. This allows for people to transact freely and openly without fear of censorship.
Decentralisation on the DigiByte block-chain is assured by having an accessible and fair mining protocol in place – this is the multi-algorithm (MultiAlgo) approach. We believe that all should have access to DigiByte whether through purchase or by mining. Therefore, DigiByte is minable not only on dedicated mining hardware such as Antminers, but also through use of conventional graphics cards. The multi-algorithm approach allows for users to mine on a variety of hardware types through use of one of the 5 mining algorithms supported by DigiByte. Those being:
Please note that these mining algorithms are modified and updated from time to time to assure complete decentralisation and thus ultimate security.
The problem with using only one mining algorithm such as Bitcoin or Litecoin do is that this allows for people to continually amass mining hardware and hash power. The more hash power one has, the more one can collect more. This leads to a cycle of centralisation and the creation of mining centres. It is known that a massive portion of all hash power in Bitcoin comes from China. This kind of centralisation is a natural tendency as it is cheaper for large organisations to set up in countries with inexpensive electricity and other such advantages which may be unavailable to the average miner.
DigiByte mitigates this problem with the use of multiple algorithms. It allows for miners with many different kinds of hardware to mine the same coin on an even playing field. Mining difficulty is set relative to the mining algorithm used. This allows for those with dedicated mining rigs to mine alongside those with more modest machines – and all secure the DigiByte chain while maintaining decentralisation.
Low Fees
Low fees are maintained in DigiByte thanks to the MultiAlgo approach working in conjunction with MultiShield (originally known as DigiShield). MultiShield calls for block difficulty readjustment between every single block on the chain; currently blocks last 15 seconds. This continuous difficulty readjustment allows us to combat any bad actors which may wish to manipulate the DigiByte chain.
Manipulation may be done by a large pool or a single entity with a great amount of hash power mining blocks on the chain, thus increasing the difficulty of the chain. In some coins such as Bitcoin or Litecoin, difficulty is readjusted every 2016 blocks, with blocks of approximately 10 minutes and 2.5 minutes respectively – meaning that Bitcoin's difficulty is readjusted about every two weeks. This system can allow large bad actors to mine a coin and then abandon it, leaving it with a difficulty level far too high for the present hash rate – and so transactions can be frozen and the chain stopped until there is a difficulty readjustment and/or enough hash power to mine the chain. In such a case users may be faced with a choice: pay exorbitant fees or have their transactions frozen. In an extreme case the whole chain could be frozen completely for extended periods of time.
DigiByte does not face this problem as its difficulty is readjusted per block every 15 seconds. This innovation was a technological breakthrough and was adopted by several other coins in the cryptocurrency environment such as Dogecoin, Z-Cash, Ubiq, Monacoin, and Bitcoin Gold.
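As a toy illustration of per-block retargeting (this is not MultiShield's actual code, just the general idea): after every block, scale the difficulty by how far the last block time strayed from the 15-second target, clamping each step so a sudden hashrate swing cannot whipsaw the chain.

```python
# Toy per-block difficulty retarget (illustrative only; not MultiShield itself).
TARGET_SECONDS = 15
MAX_STEP = 1.25  # clamp each adjustment to +/-25% per block

def retarget(difficulty: float, last_block_seconds: float) -> float:
    ratio = TARGET_SECONDS / max(last_block_seconds, 1e-9)
    ratio = max(1 / MAX_STEP, min(MAX_STEP, ratio))  # clamp the step size
    return difficulty * ratio

diff = 1000.0
for observed in (15, 5, 5, 60, 60, 15):   # a hashrate spike, then a drop-off
    diff = retarget(diff, observed)
    print(f"block took {observed:>2}s -> new difficulty {diff:,.1f}")
```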
This difficulty readjustment along with the MultiAlgo approach allows DigiByte to maintain the lowest fees of any UTXO PoW chain in the world. Currently fees on the DigiByte block-chain are about 0.0001 DGB per transaction of 100,000 DGB sent. This depends on the amount sent, and currently 100,000 DGB are worth around $2,000.00, which puts the fee at roughly $0.000002 (0.0002 cents). It would take about 5,000 such transactions for the fees to add up to one penny. This was tested on a Ledger Nano S set to the low fees setting.
Fast transaction times
Fast transactions are ensured by the conjunctive use of the two aforementioned technology protocols. The use of MultiShield and MultiAlgo allows mining of the DigiByte chain to always be profitable, and thus there is always someone mining your transactions. MultiAlgo allows there to be a greater amount of hash power spread world-wide; this, along with 15-second block times, allows transactions to be near instantaneous. This speed is also ensured by the use of DigiSpeed, the protocol by which the DigiByte chain will decrease block timing gradually. Initially DigiByte started with 30-second block times in 2014, which today are set at 15 seconds. This decrease will allow for ever faster and ever more transactions per block.
Robust security + The Immutable Ledger
At the core of cryptocurrency security is decentralisation. As stated before decentralisation is ensured on the DigiByte block chain by use of the MultiAlgo approach. Each algorithm in the MultiAlgo approach of DigiByte is only allowed about 20% of all new blocks. This in conjunction with MultiShield allows for DigiByte to be the most secure, most reliable, and fastest UTXO block chain on the planet. This means that DigiByte is a proof of work (PoW) block-chain where all transactional activities are stored on the immutable public ledger world-wide. In DigiByte there is no need for the Lightning protocol (although we have it) nor sidechains to scale, and thus we get to keep PoW’s security.
There are many great debates as to the robustness or cleanliness of PoW. The fact remains that PoW block-chains remain the only systems in human history which have never been hacked and thus their security is maximal.
For an attacker to divert the DigiByte chain they would need to control over 93% of all the hashrate on one algorithm and 51% of the other four. And so DigiByte is immune to the infamous 51% attack to which Bitcoin and Litecoin are vulnerable.
Moreover, the DigiByte block-chain is currently spread over 200 000 plus servers, computers, phones, and other machines world-wide. The fact is that DigiByte is one of the easiest to mine coins there is – this is greatly aided by the recent release of the one click miner. This allows for ever greater decentralisation which in turn assures that there is no single point of failure and the chain is thus virtually un-attackable.
On Chain Scalability
The biggest barrier for block-chains today is scalability. Visa the credit card company can handle around 2000 transactions per second (TPS) today. This allows them to ensure customer security and transactional rates nation-wide. Bitcoin currently sits at around 7 TPS and Litecoin at 28 TPS (56 TPS with SegWit). All the technological innovations I’ve mentioned above come together to allow for DigiByte to be the fastest PoW block-chain in the world and the most scalable.
DigiByte is scalable because of DigiSpeed, the protocol through which block times are decreased and block sizes are increased. It is known that a simple increase in block size can increase the TPS of any block-chain, such is the case with Bitcoin Cash. This is however not scalable. The reason a simple increase in block size is not scalable is because it would eventually lead to some if not a great amount of centralization. This centralization occurs because larger block sizes mean that storage costs and thus hardware cost for miners increases. This increase along with full blocks – meaning many transactions occurring on the chain – will inevitably bar out the average miner after difficulty increases and mining centres consolidate.
Hardware costs and storage costs decrease over time following Moore's law, and DigiByte adheres to it. DigiSpeed calls for the increase in block sizes and decrease in block timing every two years by a factor of two. This means that originally DigiByte's block sizes were 1 MB at 30 seconds each at inception in 2014; in 2016 DigiByte increased the block size by two and decreased the block timing by the same factor, in line with Moore's law, which observes that hardware roughly doubles in capability for the same cost about every two years.
This would allow for DigiByte to scale at a steady rate and for people to adopt new hardware at an equally steady rate and reasonable expense. Thus so, the average miner can continue to mine DigiByte on his algorithm of choice with entry level hardware.
DigiByte was one of the first block chains to adopt segregated witness (SegWit in 2017) a protocol whereby a part of transactional data is removed and stored elsewhere to decrease transaction data weight and thus increase scalability and speed. This allows us to fit more transactions per block which does not increase in size!
DigiByte currently sits at 560 TPS and could scale to over 280 000 TPS by 2035. This dwarfs any of the TPS capacities; even projected/possible capacities of some coins and even private companies. In essence DigiByte could scale worldwide today and still be reliable and robust. DigiByte could even handle the cumulative transactions of all the top 50 coins in coinmarketcap.com and still run smoothly and below capacity. In fact, to max out DigiByte’s actual maximum capacity (today at 560 TPS) you would have to take all these transactions and multiply them by a factor of 10!
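For what it's worth, the 2035 figure lines up with DigiSpeed's doubling cadence; here is a quick sketch checking the projection from the article's own numbers (560 TPS today, doubling every two years, assuming the schedule is counted from roughly 2017):

```python
# Checking the article's projection: 560 TPS, doubling every two years.
tps_now, start_year, end_year = 560, 2017, 2035   # start year is an assumption
doublings = (end_year - start_year) // 2
print(tps_now * 2**doublings)   # 560 * 2**9 = 286,720 TPS, i.e. "over 280,000"
```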
Other Uses for DigiByte
Note that DigiByte is not only to be used as a currency. Its immense robustness, security and scalability make it ideal for building decentralised applications (DAPPS) which it can host. DigiByte can in fact host DAPPS and even centralised versions which rely on the chain which are known as Digi-Apps. This application layer is also accompanied by a smart contract layer.
Thus, DigiByte could host several Crypto Kitties games and more without freezing out or increasing transaction costs for the end user.
Currently there are various DAPPS being built on the DigiByte block-chain, these are done independently of the DigiByte core team. These companies are simply using the DigiByte block-chain as a utility much in the same way one uses a road to get to work. One such example is Loly – a Tinderesque consensual dating application.
DigiByte also hosts a variety of other platform projects such as the following:
The DigiByte Foundation
As previously mentioned DigiByte was not an ICO. The DigiByte foundation was established in 2017 by founder Jared Tate. Its purpose is as a non-profit organization dedicated to supporting and developing the DigiByte block-chain.
DigiByte is a community effort and a community coin, to be treated as a public resource as water or air. Know that anyone can work on DigiByte, anyone can create, and do as they wish. It is a permissionless system which encourages innovation and creation. If you have an idea and or would like to get help on your project do not hesitate to contact the DigiByte foundation either through the official website and or the telegram developer’s channel.
For this reason, it is ever more important to note that the DigiByte foundation cannot exist without public support. And so, this is the reason I encourage all to donate to the foundation. All funds are used for the maintenance of DigiByte servers, marketing, and DigiByte development.
DigiByte Resources and Websites
DigiByte
Wallets
Explorers
Please refer to the sidebar of this sub-reddit for more resources and information.
Edit - Removed Jaxx wallet.
Edit - A new section was added to the article: Why so many coins? 21 Billion
Edit - Adjusted max capacity of DGB's TPS - Note it's actually larger than I initially calculated.
Edit – Grammar and format readjustment
Hello,
I hope you’ve enjoyed my article. I originally wrote this for the reddit sub-wiki, where it generally will most likely not get a lot of attention. So instead I've decided to make this a sort of introductory post, an open letter, to any newcomers to DGB or those who are just curious.
I tried to cover every aspect of DGB, but of course I may have forgotten something! Please leave a comment down below and tell me why you're in DGB? What convinced you? Me it's the decentralised PoW that really convinced me. Plus, just that transaction speed and virtually no fees! Made my mouth water!
-Dereck de Mézquita
I'm a student typing this stuff on my free time, help me pay my debts? Thank you!
D64fAFQvJMhrBUNYpqUKQjqKrMLu76j24g
https://digiexplorer.info/address/D64fAFQvJMhrBUNYpqUKQjqKrMLu76j24g
submitted by xeno_biologist to Digibyte [link] [comments]

Why GPU Prices Went Up And Where They Will Go In The Future (Informed Speculation)

I am seeing a lot of comments in this sub in regards to the GPU apocalypse 2.0. I wanted to offer some insight into what is going on and what I believe is going to happen in the future. As a long time lurker in this sub, I believe in speaking the truth and getting past all the bs.
First, I need to tell you guys right now. I am a miner. I have 16 1070s and 12 1060 3gbs mining in unison in my basement (which most veteran miners would consider a mid-sized operation). My personal gaming rig is still powered by my GTX 980 and I mine with it as well when I am not using it. All that being said, I have been in the mining game for years and I saw this apocalypse coming with the climbing profitability (which is why about three weeks ago I ordered 12 1060s). In July, when the GPU apocalypse 1.0 happened, everyone was happy because a 6x1070 rig could make you upwards of $20 a day. Then when everyone started mining it dropped to $15 on average, then again to $12. In late November to early December I started to see a huge but gradual climb to over $20 a day again. I admitted all of that to establish myself as a mining veteran, NOT to troll anyone. If you want to try and predict GPU apocalypse 3.0 yourself, visit whattomine and calculate what 6x1070s are making and compare them to the numbers I have above. If they start to climb near $20 a day, you will know winter is coming again.
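If you want to sanity-check the whattomine numbers yourself, the underlying math is just your rig's share of the network hashrate times the coins issued per day, converted to dollars, minus electricity. A minimal sketch, where every number in the example call is an illustrative assumption (roughly Jan-2018 Ethereum-like figures), not live data:

# Estimate daily profit in USD for a rig mining a PoW coin.
# All example numbers below are assumptions for illustration only --
# plug in live values from a site like whattomine.
def daily_mining_profit(my_hashrate, net_hashrate, block_reward,
                        blocks_per_day, coin_price, rig_kw, kwh_cost):
    my_share = my_hashrate / net_hashrate               # fraction of network work
    coins_per_day = my_share * block_reward * blocks_per_day
    revenue = coins_per_day * coin_price
    electricity = rig_kw * 24 * kwh_cost
    return revenue - electricity

print(daily_mining_profit(
    my_hashrate=6 * 30e6,    # six 1070s at ~30 MH/s each (assumption)
    net_hashrate=150e12,     # network hashrate in H/s (assumption)
    block_reward=3,          # coins per block (assumption)
    blocks_per_day=6000,     # ~14 second blocks (assumption)
    coin_price=900,          # USD per coin (assumption)
    rig_kw=0.9,              # rig power draw in kW (assumption)
    kwh_cost=0.10))          # USD per kWh (assumption)
# -> roughly $17 per day with these made-up inputs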
For the GPU market, I strongly believe that the market in the next few years will shift. There are two reasons for that. The first is I think that GPU mining is here to stay. Remember that our profitability is NOT completely linked to crypto prices, but to the difficulty of a particular coin. If the difficulty of ethereum were to double today and the price stayed the same, I would be mining zcash etc. Even if the prices of ALL coins tanked, it wouldn't affect my profit as much as many of you seem to think it would. The reason is that when the reward goes down, many stop mining a coin and so the difficulty drops. It's just human behavior. That is why most days I make the exact same amount of USD $$$ even though the price of most coins shot up. It IS possible for profitability to drop as hard as it's gone up recently, but the low profitability would cause a new coin to utilize all that untapped hashrate. There is a huge financial incentive to tap into all of this GPU power, and that is why I strongly believe it is going to continue to be profitable long term. Just look at how much ethereum has eaten into the Bitcoin dominance by market cap in the past year or two.
The second reason is I believe that this generation of cards has experienced demand the likes of which we have never seen! I mean seriously, those of us that got into PC gaming in the last 10 years, we have not seen demand like this EVER. Like you guys have been saying, this is going to cause a VERY healthy second hand market soon, which is a great thing. If the Volta profitability from the Titan V is anything to go by, that manufacturing process is going to be the new go-to for crypto mining. The only reason the Titan isn't coveted by crypto miners is the price. At 200W its crypto numbers are actually the strongest per watt of anything out there (which will very likely translate down into the next gen 1080, 1070, and 1060 respectively). I don't think everyone is gonna just shut off their rx 570s, 1060s, 1070s, and 1080s just to upgrade to Volta, but all NEW rigs in the future will obviously be built with the best tech. Those old rigs will stay powered on until they are literally unprofitable to run OR they are so low-profit that they are no longer worth the effort to maintain.
For the two reasons above (continued profitability and better GPUs in the future) I do believe we are heading towards a situation where crypto miners get the latest gen cards and gamers get one generation back. The FPS/$ on the used market has mostly always been better than the new market, but we are heading for a time where that effect is going to be compounded. I strongly believe the go-to-cards for gamers once Volta spreads to miners is going to be rx 570-1080ti depending on budget.
One final piece of bad news is there is a new floor to GPU prices. Miners will ALWAYS buy whatever is going to ROI fastest and make the most money. If 1070s dropped to 100 bucks and Volta 2070s are $450 but only twice as good, we will demand 1070s until the price goes up beyond $200. That's just an example, but you get the idea. I think that second hand GPUs for the previous gen will be selling for slightly less than what they were MSRP at launch.
Personally, mining has changed my life for the better. My wife is preggo and we are about to move to a single income family because of the revenue generated by mining (about $100 every day with the cards listed above). I don't understand why gamers aren't subsidizing their purchase of GPUs by mining with new, easy software like Nicehash and then cashing it out. I get not everyone has the up-front capital to get a 1080ti for $1300, but why not get a 1060 3gb for a little under 300 then set it to mine when you aren't using it? After a few months you will have effectively brought the price of your GPU under 200, even after electricity costs. Another benefit is that, technically, you mining with that 1060 will slightly decrease the profit of miners like me. If you really wanna affect the bottom line of miners, mine with the cards you already have. It increases the difficulty and makes profit lower MUCH more so than a drop in price would. There is plenty of profit to go around. I would say you could buy games on steam with the resulting Bitcoin, but they eliminated that. You CAN buy from Newegg straight up or even from Amazon at a discount using purse.io in order to justify the extra price for the card. There are a TON of options of how you can spend your BTC.
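To put the subsidize-your-GPU idea into rough numbers, here is a hypothetical payback sketch; the card price, the per-day profit and the hours mined are all assumptions, not measurements from my rig:

# Hypothetical payback estimate for a single gaming card that mines when idle.
card_price = 290              # USD paid for the GPU (assumption)
profit_per_mining_day = 2.50  # USD per full day of mining, after electricity (assumption)
hours_mining_per_day = 18     # card only mines when you are not gaming (assumption)

effective_daily = profit_per_mining_day * hours_mining_per_day / 24
days_to_halve_cost = (card_price / 2) / effective_daily
print(f"About {days_to_halve_cost:.0f} days to earn back half the card's price")
# -> About 77 days with these made-up inputs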
I certainly don't know everything, but if you guys have any questions please let me know and I will try my best to answer them. I am sure an even more experienced miner will jump in if I don't know the answer and respond as well.
TL;DR: GPU apocalypse 2.0 happened because the profit of 6x1070s passed $20 per day again (all cards are high profit but I use 1070s as a reference). I believe gamers are going to get one generation of GPU back starting when the full Volta stack comes out.
submitted by compound-interest to pcmasterrace [link] [comments]

The Problem with PoW


Miners have always had it rough..
"Frustrated Miners"


The Problem with PoW
(and what is being done to solve it)

Proof of Work (PoW) is one of the most commonly used consensus mechanisms entrusted to secure and validate many of today’s most successful cryptocurrencies, Bitcoin being one. Battle-hardened and having weathered the test of time, Bitcoin has demonstrated the undeniable strength and reliability of the PoW consensus model through sheer market saturation, and of course, its persistency.
In addition to the cost of powerful computing hardware, miners prove that they are benefiting the network by expending energy in the form of electricity, by solving and hashing away complex math problems on their computers, utilizing any suitable tools that they have at their disposal. The mathematics involved in securing proof of work revolve around unique algorithms, each with their own benefits and vulnerabilities, and can require different software/hardware to mine depending on the coin.
Because each block has a unique and entirely random hash, or “puzzle” to solve, the “work” has to be performed for each block individually and the difficulty of the problem can be increased as the speed at which blocks are solved increases.
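As a toy illustration of that “puzzle” (a sketch of the general principle, not any particular coin's real algorithm), proof of work amounts to finding a nonce whose hash falls below a target, and raising the difficulty simply means lowering that target:

import hashlib

# Toy proof of work: find a nonce so that sha256(header + nonce) has
# `difficulty_bits` leading zero bits. Real coins use their own algorithms
# and difficulty encodings; this only demonstrates the principle.
def mine(header: bytes, difficulty_bits: int) -> int:
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

nonce = mine(b"example block header", difficulty_bits=20)
print("found nonce:", nonce)   # takes ~2**20 (about a million) hashes on average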
Hashrates and Hardware Types
While proof of work is an effective means of securing a blockchain, it inherently promotes competition amongst miners seeking higher and higher hashrates due to the rewards earned by the node who wins the right to add the next block. In turn, these higher hash rates benefit the blockchain, providing better security when it’s a result of a well distributed/decentralized network of miners.
When Bitcoin first launched its genesis block, it was mined exclusively by CPUs. Over the years, various programmers and developers have devised newer, faster, and more energy efficient ways to generate higher hashrates; some by perfecting the software end of things, and others, when the incentives are great enough, create expensive specialized hardware such as ASICs (application-specific integrated circuit). With the express purpose of extracting every last bit of hashing power, efficiency being paramount, ASICs are stripped down, bare minimum, hardware representations of a specific coin’s algorithm.
This gives ASICS a massive advantage in terms of raw hashing power and also in terms of energy consumption against CPUs/GPUs, but with significant drawbacks of being very expensive to design/manufacture, translating to a high economic barrier for the casual miner. Due to the fact that they are virtual hardware representations of a single targeted algorithm, this means that if a project decides to fork and change algorithms suddenly, your powerful brand-new ASIC becomes a very expensive paperweight. The high costs in developing and manufacturing ASICs and the associated risks involved, make them unfit for mass adoption at this time.
Somewhere on the high end, in the vast hashrate expanse created between GPU and ASIC, sits the FPGA (field programmable gate array). FPGAs are basically ASICs that make some compromises with efficiency in order to have more flexibility, namely they are reprogrammable and often used in the “field” to test an algorithm before implementing it in an ASIC. As a precursor to the ASIC, FPGAs are somewhat similar to GPUs in their flexibility, but require advanced programming skills and, like ASICs, are expensive and still fairly uncommon.
2 Guys 1 ASIC
One of the issues with proof of work incentivizing the pursuit of higher hashrates is in how the network calculates block reward coinbase payouts and rewards miners based on the work that they have submitted. If a coin generated, say a block a minute, and this is a constant, then what happens if more miners jump on a network and do more work? The network cannot pay out more than 1 block reward per 1 minute, and so a difficulty mechanism is used to maintain balance. The difficulty will scale up and down in response to the overall nethash, so if many miners join the network, or extremely high hashing devices such as ASICs or FPGAs jump on, the network will respond accordingly, using the difficulty mechanism to make the problems harder, effectively giving an edge to hardware that can solve them faster, balancing the network. This not only maintains the block a minute reward but it has the added side-effect of energy requirements that scale up with network adoption.
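A minimal sketch of that balancing act, in the spirit of Bitcoin-style retargeting (the window length and the clamp below are illustrative assumptions, not any specific coin's rules): if the last window of blocks came in faster than intended, difficulty rises proportionally; if slower, it falls.

# Difficulty retargeting sketch: scale difficulty by how fast blocks actually
# arrived versus how fast they were supposed to arrive over the last window.
def retarget(old_difficulty: float, actual_window_seconds: float,
             expected_window_seconds: float, max_step: float = 4.0) -> float:
    ratio = expected_window_seconds / actual_window_seconds
    ratio = max(1.0 / max_step, min(max_step, ratio))   # clamp the swing per window
    return old_difficulty * ratio

# 2016 blocks expected at 60 s each, but mined twice as fast (new ASICs joined):
print(retarget(1_000.0, actual_window_seconds=30 * 2016,
               expected_window_seconds=60 * 2016))      # -> 2000.0, difficulty doubles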
Imagine, for example, if one miner gets on a network all alone with a CPU doing 50 MH/s and is getting all 100 coins that can possibly be paid out in a day. Then, if another miner jumps on the network with the same CPU, each miner would receive 50 coins in a day instead of 100 since they are splitting the required work evenly, despite the fact that the net electrical output has doubled along with the work. Electricity costs miners money and is a factor in driving up coin price along with adoption, and since more people are now mining, the coin is less centralized. Now let’s say a large corporation has found it profitable to manufacture an ASIC for this coin, knowing they will make their money back mining it or selling the units to professionals. They join the network doing 900 MH/s and will be pulling in 90 coins a day, while the two guys with their CPUs each get 5 now. Those two guys aren’t very happy, but the corporation is. Not only does this negatively affect the miners, it compromises the security of the entire network by centralizing the coin supply and hashrate, opening the doors to double spends and 51% attacks from potential malicious actors. Uncertainty of motives and questionable validity in a distributed ledger do not mix.
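The payout arithmetic in that example is just a proportional split of a fixed daily issuance. A quick check of the figures above (100 coins per day, two 50 MH/s CPUs and one 900 MH/s ASIC, all taken from the example itself):

# Split a fixed daily coin issuance in proportion to each miner's hashrate.
def daily_payouts(hashrates_mhs, coins_per_day=100):
    total = sum(hashrates_mhs)
    return [round(coins_per_day * h / total, 2) for h in hashrates_mhs]

print(daily_payouts([50]))           # [100.0]            one CPU mining alone
print(daily_payouts([50, 50]))       # [50.0, 50.0]       a second CPU joins
print(daily_payouts([50, 50, 900]))  # [5.0, 5.0, 90.0]   the ASIC joins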
When technology advances in a field, it is usually applauded and welcomed with open arms, but in the world of crypto things can work quite differently. One of the glaring flaws in the current model and the advent of specialized hardware is that it’s never ending. Suppose the two men from the rather extreme example above took out a loan to get themselves that ASIC they heard about that can get them 90 coins a day? When they join the other ASIC on the network, the difficulty adjusts to keep daily payouts consistent at 100, and they will each receive only 33 coins instead of 90 since the reward is now being split three ways. Now what happens if a better ASIC is released by that corporation? Hopefully, those two guys were able to pay off their loans and sell their old ASICs before they became obsolete.
This system, as it stands now, only perpetuates a never ending hashrate arms race in which the weapons of choice are usually a combination of efficiency, economics, profitability and in some cases control.
Implications of Centralization
This brings us to another big concern with expensive specialized hardware: the risk of centralization. Because they are so expensive and inaccessible to the casual miner, ASICs and FPGAs predominantly remain limited to a select few. Centralization occurs when one small group or a single entity controls the vast majority of hash power and, as a result, the coin supply, and is able to exert its influence to manipulate the market or, in some cases, the network itself (usually the case of dishonest nodes or bad actors).
This is entirely antithetical to what cryptocurrency was born of, and since its inception many concerted efforts have been made to avoid centralization at all costs. An entity in control of a centralized coin would have the power to manipulate the price, and having a centralized hashrate would enable them to affect network usability and reliability, and even perform double spends leading to the demise of a coin, among other things.
The world of crypto is a strange new place, with rapidly growing advancements across many fields, economies, and borders, leaving plenty of room for improvement; while it may feel like a never-ending game of catch up, there are many talented developers and programmers working around the clock to bring us all more sustainable solutions.
The Rise of FPGAs
With the recent implementation of the commonly used programming language C++, and due to their overall flexibility, FPGAs are becoming somewhat more common, especially in larger farms and in industrial settings; but they still remain primarily out of the hands of most mining enthusiasts and almost unheard of by the average hobby miner. Things appear to be changing though, one example of which I’ll discuss below, and it is thought by some that soon we will see a day when mining with a CPU or GPU just won’t cut it any longer, and the market will be dominated by FPGAs and specialized ASICs, bringing with them efficiency gains for proof of work, while also carelessly leading us all towards the next round of spending.
A perfect real-world example of the effect specialized hardware has had on the crypto-community was recently discovered involving a fairly new project called VerusCoin (https://veruscoin.io/) and a fairly new, relatively more economically accessible FPGA. The FPGA is designed to target specific alt-coins whose algos do not require RAM overhead. It was discovered the company had released a new algorithm, kept secret from the public, which could effectively mine Verus at 20x the speed of GPUs, which were the next fastest hardware types mining on the Verus network.
Unfortunately this was done with a deliberately secret approach, calling the Verus algorithm “Algo1” and encouraging owners of the FPGA to never speak of the algorithm in public channels, admonishing a user when they did let the cat out of the bag. The problem with this business model is that it is parasitic in nature. In an ecosystem where advancements can benefit the entire crypto community, this sort of secret mining approach also does not support the philosophies set forth by the Bitcoin or subsequent open source and decentralization movements.
Although this was not done in the spirit of open source, it does hint to an important step in hardware innovation where we could see more efficient specialized systems within reach of the casual miner. The FPGA requires unique sets of data called a bitstream in order to be able to recognize each individual coin’s algorithm and mine them. Because it’s reprogrammable, with the support of a strong development team creating such bitstreams, the miner doesn’t end up with a brick if an algorithm changes.
All is not lost thanks to.. um.. Technology?
Shortly after discovering FPGAs on the network, the Verus developers quickly designed, tested, and implemented a new, much more complex and improved algorithm via a fork that enabled Verus to transition smoothly from VerusHash 1.0 to VerusHash 2.0 at block 310,000. Since the fork, VerusHash 2.0 has demonstrated doing exactly what it was designed for- equalizing hardware performance relative to the device being used while enabling CPUs (the most widely available “ASICs”) to mine side by side with GPUs at a profit, and it appears this will also apply to other specialized hardware. This is something no other project has been able to do until now. Rather than pursue the folly of so many other projects before it- attempting to be “ASIC proof”, Verus effectively achieved and presents to the world an entirely new model of “hardware homogeny”. As the late, great, Bruce Lee once said- “Don’t get set into one form, adapt it and build your own, and let it grow, be like water.”
In the design of VerusHash 2.0, Verus has shown it doesn’t resist progress like so many other new algorithms try to do, it embraces change and adapts to it in the way that water becomes whatever vessel it inhabits. This new approach- an industry first- could very well become an industry standard and in doing so, would usher in a new age for proof of work based coins. VerusHash 2.0 has the potential to correct the single largest design flaw in the proof of work consensus mechanism- the ever expanding monetary and energy requirements that have plagued PoW based projects since the inception of the consensus mechanism. Verus also solves another major issue of coin and net hash centralization by enabling legitimate CPU mining, offering greater coin and hashrate distribution.
Digging a bit deeper, it turns out the Verus development team are no rookies. The lead developer Michael F Toutonghi has spent decades programming in the field and is a former Vice President and Technical Fellow at Microsoft, recognized founder and architect of Microsoft's .Net platform, ex-Technical Fellow of Microsoft's advertising platform, ex-CTO of Parallels Corporation, and an experienced distributed computing and machine learning architect. The project he helped create makes use of a diverse array of technologies and security features to form one of the most advanced and secure cryptocurrencies to date. A brief description of what makes VerusCoin special, quoted from a community member-
"Verus has a unique and new consensus algorithm called Proof of Power which is a 50% PoW/50% PoS algorithm that solves theoretical weaknesses in other PoS systems (Nothing at Stake problem for example) and is provably immune to 51% hash attacks. With this, Verus uses the new hash algorithm, VerusHash 2.0. VerusHash 2.0 is designed to better equalize mining across all hardware platforms, while favoring the latest CPUs over older types, which is also one defense against the centralizing potential of botnets. Unlike past efforts to equalize hardware hash-rates across different hardware types, VerusHash 2.0 explicitly enables CPUs to gain even more power relative to GPUs and FPGAs, enabling the most decentralizing hardware, CPUs (due to their virtually complete market penetration), to stay relevant as miners for the indefinite future. As for anonymity, Verus is not a "forced private", allowing for both transparent and shielded (private) transactions...and private messages as well"
If other projects can learn from this and adopt a similar approach, or continue to innovate with new ideas, it could mean an end to all the doom and gloom predictions that CPU and GPU mining are dead. It would offer a much needed reprieve and an alternative to miners who have been faced with the difficult decision of either pulling the plug and shutting down shop, or breaking down their rigs to sell off parts and buy new, more expensive hardware…and in so doing present an overall unprecedented level of decentralization not yet seen in cryptocurrency.
Technological advancements led us to the world of secure digital currencies and the progress being made with hardware efficiencies is indisputably beneficial to us all. ASICs and FPGAs aren’t inherently bad, and there are ways in which they could be made more affordable and available for mass distribution. More than anything, it is important that we work together as communities to find solutions that can benefit us all for the long term.
In an ever changing world where it may be easy to lose sight of the real accomplishments that brought us to this point one thing is certain, cryptocurrency is here to stay and the projects that are doing something to solve the current problems in the proof of work consensus mechanism will be the ones that lead us toward our collective vision of a better world- not just for the world of crypto but for each and every one of us.
submitted by Godballz to EtherMining [link] [comments]

Announcement: AIOMiner Now Supports Vertcoin!

Hello All!
AIOMiner Alpha 7 is here and with it comes AMD support as well as Vertcoin Support!
In this release we have given new people the fastest way to start mining. In 5 clicks after install you can be mining and on your way.
For anyone new: this is a tool that helps you run your mining rig or desktop miner, manage your pools, and mine with ease.
But here are some key new things, read more on the GitHub Page:
Quick Start: Download, Install, Help, Add Wallet, Save, Click Start
Screen Shots:
Main
Advanced
Supported Coins: ZCash, Vertcoin, BitcoinZ, Straks, MonaCoin, ZenCash, Ethereum, Hush, Komodo, Trezarcoin, Verge, Vivo, Bitcoin Gold, Zclassic, Ellaism, Pirl, Musicoin, Feathercoin, Monero, Ubiq, Expanse, Orbitcoin, Metaverse, Ethereum Classic, Sumokoin, Karbo, Electroneum, Bytecoin, Halcyon
Quick Help
Download Today
Discord
Community Driven, No Mining Fees, No Batch Files
submitted by The_Brutally_Honest to vertcoin [link] [comments]

The rise of specialized hardware (particularly FPGAs) and its impact on the mining community

The rise of specialized hardware (particularly FPGAs) and its impact on the mining community

Proof of Work (PoW) is one of the most commonly used consensus mechanisms entrusted to secure and validate many of today’s most successful cryptocurrencies, Bitcoin being one. Battle-hardened and having weathered the test of time, Bitcoin has demonstrated the undeniable strength and reliability of the PoW consensus model through sheer market saturation, and of course, its persistency.

In addition to the cost of powerful computing hardware, miners prove that they are benefiting the network by expending energy in the form of electricity, by solving and hashing away complex math problems on their computers, utilizing any suitable tools that they have at their disposal. The mathematics involved in securing proof of work revolve around unique algorithms, each with their own benefits and vulnerabilities, and can require different software/hardware to mine depending on the coin.

Because each block has a unique and entirely random hash, or “puzzle” to solve, the “work” has to be performed for each block individually and the difficulty of the problem can be increased as the speed at which blocks are solved increases.

Hashrates and Hardware Types
While proof of work is an effective means of securing a blockchain, it inherently promotes competition amongst miners seeking higher and higher hashrates due to the rewards earned by the node who wins the right to add the next block. In turn, these higher hash rates benefit the blockchain, providing better security when it’s a result of a well distributed/decentralized network of miners.

When Bitcoin first launched its genesis block, it was mined exclusively by CPUs. Over the years, various programmers and developers have devised newer, faster, and more energy efficient ways to generate higher hashrates; some by perfecting the software end of things, and others, when the incentives are great enough, create expensive specialized hardware such as ASICs (application-specific integrated circuit). With the express purpose of extracting every last bit of hashing power, efficiency being paramount, ASICs are stripped down, bare minimum, hardware representations of a specific coin’s algorithm.

This gives ASICS a massive advantage in terms of raw hashing power and also in terms of energy consumption against CPUs/GPUs, but with significant drawbacks of being very expensive to design/manufacture, translating to a high economic barrier for the casual miner. Due to the fact that they are virtual hardware representations of a single targeted algorithm, this means that if a project decides to fork and change algorithms suddenly, your powerful brand-new ASIC becomes a very expensive paperweight. The high costs in developing and manufacturing ASICs and the associated risks involved, make them unfit for mass adoption at this time.

Somewhere on the high end, in the vast hashrate expanse created between GPU and ASIC, sits the FPGA (field programmable gate array). FPGAs are basically ASICs that make some compromises with efficiency in order to have more flexibility, namely they are reprogrammable and often used in the “field” to test an algorithm before implementing it in an ASIC. As a precursor to the ASIC, FPGAs are somewhat similar to GPUs in their flexibility, but require advanced programming skills and, like ASICs, are expensive and still fairly uncommon.

The Arms Race of the Geek
One of the issues with proof of work incentivizing the pursuit of higher hashrates lies in how the network calculates block reward (coinbase) payouts and rewards miners for the work they have submitted. If a coin generates, say, a block a minute, and this is a constant, then what happens when more miners jump on the network and do more work? The network cannot pay out more than one block reward per minute, so a difficulty mechanism is used to maintain balance. The difficulty scales up and down in response to the overall nethash: if many miners join the network, or extremely high-hashing devices such as ASICs or FPGAs jump on, the network responds by making the problems harder, effectively giving the edge to hardware that can solve them faster and keeping the network in balance. This not only maintains the block-a-minute reward, but it has the added side effect of energy requirements that scale up with network adoption.
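A minimal sketch of such a retargeting rule, loosely modeled on a Bitcoin-style difficulty adjustment; the window length, the 4x clamp, and the numbers are illustrative assumptions, not the rule of any particular coin.

```python
def retarget_difficulty(old_difficulty: float,
                        actual_timespan_s: float,
                        expected_timespan_s: float,
                        max_factor: float = 4.0) -> float:
    """If blocks arrived faster than the target rate (more hashpower joined),
    scale difficulty up proportionally; if slower, scale it down.
    The clamp keeps any single adjustment within a 4x band."""
    ratio = expected_timespan_s / actual_timespan_s
    ratio = max(1.0 / max_factor, min(max_factor, ratio))
    return old_difficulty * ratio

# 100 blocks targeted at one per minute (6000 s expected) actually arrived
# in 3000 s, so the difficulty doubles to restore the block-a-minute pace.
print(retarget_difficulty(1_000.0, actual_timespan_s=3_000, expected_timespan_s=6_000))  # 2000.0
```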

Imagine, for example, if one miner gets on a network all alone with a CPU doing 50 MH/s and is getting all 100 coins that can possibly be paid out in a day. Then, if another miner jumps on the network with the same CPU, each miner would receive 50 coins a day instead of 100, since they are splitting the required work evenly, despite the fact that the net electrical output has doubled along with the work. Electricity costs miners money and is a factor in driving up coin price along with adoption, and since more people are now mining, the coin is less centralized. Now let’s say a large corporation has found it profitable to manufacture an ASIC for this coin, knowing they will make their money back mining it or selling the units to professionals. They join the network doing 900 MH/s and will be pulling in 90 coins a day, while the two guys with their CPUs each get 5 now. Those two guys aren’t very happy, but the corporation is. Not only does this negatively affect the miners, it compromises the security of the entire network by centralizing the coin supply and hashrate, opening the door to double-spends and 51% attacks from potential malicious actors. Uncertainty of motives and questionable validity in a distributed ledger do not mix.

When technology advances in a field, it is usually applauded and welcomed with open arms, but in the world of crypto things can work quite differently. One of the glaring flaws in the current model, given the advent of specialized hardware, is that it is never-ending. Suppose the two men from the rather extreme example above take out a loan to get themselves that ASIC they heard about, the one that can get them 90 coins a day. When they join the other ASIC on the network, the difficulty adjusts to keep daily payouts consistent at 100, and they will each receive only 33 coins instead of 90, since the reward is now being split three ways. Now what happens if a better ASIC is released by that corporation? Hopefully, those two guys were able to pay off their loans and sell their old ASICs before they became obsolete.
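To make the arithmetic in these scenarios explicit, a quick sketch; the hashrates and the 100-coins-per-day cap are the figures from the example above, and the helper name is my own.

```python
def expected_daily_coins(hashrates_mhs: dict, daily_coins: float = 100.0) -> dict:
    """Long-run expected payout is proportional to each miner's share of the
    total network hashrate; the difficulty mechanism keeps daily issuance fixed."""
    total = sum(hashrates_mhs.values())
    return {name: round(daily_coins * rate / total, 1) for name, rate in hashrates_mhs.items()}

# The scenarios described above (hashrates in MH/s):
print(expected_daily_coins({"cpu_1": 50, "cpu_2": 50}))                     # 50 coins each
print(expected_daily_coins({"cpu_1": 50, "cpu_2": 50, "asic": 900}))        # 5, 5, 90
print(expected_daily_coins({"asic_1": 900, "asic_2": 900, "asic_3": 900}))  # ~33.3 each
```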

This system, as it stands now, only perpetuates a never-ending hashrate arms race in which the weapons of choice are usually a combination of efficiency, economics, profitability, and in some cases control.

Implications of Centralization
This brings us to another big concern with expensive specialized hardware: the risk of centralization. Because they are so expensive and inaccessible to the casual miner, ASICs and FPGAs predominantly remain limited to a select few. Centralization occurs when one small group or a single entity controls the vast majority of the hash power and, as a result, the coin supply, and is able to exert its influence to manipulate the market or, in some cases, the network itself (usually the case with dishonest nodes or bad actors).

This is entirely antithetical to what cryptocurrency was born of, and since its inception many concerted efforts have been made to avoid centralization at all costs. An entity in control of a centralized coin would have the power to manipulate the price, and a centralized hashrate would let them degrade network usability and reliability, and even perform double-spends leading to the demise of a coin, among other things.

The world of crypto is a strange new place, with rapidly growing advancements across many fields, economies, and borders, leaving plenty of room for improvement; while it may feel like a never-ending game of catch-up, there are many talented developers and programmers working around the clock to bring us all more sustainable solutions.

The Rise of FPGAs
With recent support for the commonly used programming language C++, and due to their overall flexibility, FPGAs are becoming somewhat more common, especially in larger farms and industrial settings; but they still remain primarily out of the hands of most mining enthusiasts and almost unheard of to the average hobby miner. Things appear to be changing, though (one example of which I’ll discuss below), and some think we will soon see a day when mining with a CPU or GPU just won’t cut it any longer and the market will be dominated by FPGAs and specialized ASICs, bringing with them efficiency gains for proof of work while also carelessly leading us all towards the next round of spending.

A real-world example of the effect specialized hardware has had on the crypto community was recently discovered, involving a fairly new project called Verus Coin (https://veruscoin.io/) and a fairly new, relatively more economically accessible FPGA. The FPGA is designed to target specific alt-coins whose algos do not require RAM overhead. It was discovered that the company had released a new algorithm, kept secret from the public, which could effectively mine Verus at 20x the speed of GPUs, the next-fastest hardware type mining on the Verus network.

Unfortunately this was done with a deliberately secret approach, calling the Verus algorithm “Algo1” and encouraging owners of the FPGA to never speak of the algorithm in public channels, admonishing a user when they did let the cat out of the bag. The problem with this business model is that it is parasitic in nature. In an ecosystem where advancements can benefit the entire crypto community, this sort of secret mining approach also does not support the philosophies set forth by the Bitcoin or subsequent open source and decentralization movements.

Although this was not done in the spirit of open source, it does hint to an important step in hardware innovation where we could see more efficient specialized systems within reach of the casual miner. The FPGA requires unique sets of data called a bitstream in order to be able to recognize each individual coin’s algorithm and mine them. Because it’s reprogrammable, with the support of a strong development team creating such bitstreams, the miner doesn’t end up with a brick if an algorithm changes.

Inclusive Hardware Equalization, Security, Decentralization
Shortly after discovering FPGAs on the network, the Verus developers quickly designed, tested, and implemented a new, much more complex and improved algorithm via a fork that enabled Verus to transition smoothly from VerusHash 1.0 to VerusHash 2.0 at block 310,000. Since the fork, VerusHash 2.0 has demonstrated doing exactly what it was designed for: equalizing hardware performance relative to the device being used, while enabling CPUs (the most widely available “ASICs”) to mine side by side with GPUs at a profit, and it appears this will also apply to other specialized hardware. This is something no other project has been able to do until now. Rather than pursue the folly of so many other projects before it, attempting to be “ASIC proof,” Verus effectively achieved and presents to the world an entirely new model of “hardware homogeny.” As the late, great Bruce Lee once said: “Don’t get set into one form, adapt it and build your own, and let it grow, be like water.”

In the design of VerusHash 2.0, Verus has shown it doesn’t resist progress like so many other new algorithms try to do; it embraces change and adapts to it in the way that water becomes whatever vessel it inhabits. This new approach, an industry first, could very well become an industry standard, and in doing so would usher in a new age for proof-of-work-based coins. VerusHash 2.0 has the potential to correct the single largest design flaw in the proof of work consensus mechanism: the ever-expanding monetary and energy requirements that have plagued PoW-based projects since the mechanism’s inception. Verus also solves another major issue, coin and nethash centralization, by enabling legitimate CPU mining and offering greater coin and hashrate distribution.

If other projects adopt Verus’ new algorithm, VerusHash 2.0, it could mean an end to all the doom-and-gloom predictions that CPU and GPU mining are dead, offering a much-needed reprieve and an alternative for miners who have been faced with the difficult decision of either pulling the plug and shutting down shop or breaking down their rigs to sell off parts and buy new, more expensive hardware… and, in so doing, it would present an unprecedented level of decentralization not seen before in cryptocurrency.

Technological advancements led us to the world of secure digital currencies and the progress being made with hardware efficiencies is indisputably beneficial to us all. ASICs and FPGAs aren’t inherently bad, and there are ways in which they could be made more affordable and available for mass distribution. More than anything, it is important that we work together as communities to find solutions that can benefit us all for the long term.

In an ever-changing world where it may be easy to lose sight of the real accomplishments that brought us to this point, one thing is certain: VerusHash 2.0 is a shining beacon of hope and a lasting testament to the project’s unwavering dedication to its vision of a better world, not just for the world of crypto but for each and every one of us.
submitted by Godballz to CryptoTechnology [link] [comments]

Console gaming is hardly different from PC gaming, and much of what people say about PC gaming to put it above console gaming is often wrong.

I’m not sure about you, but for the past few years, I’ve been hearing people go on and on about PCs’ “superiority” to the console market. People cite various reasons why they believe gaming on a PC is “objectively” better than console gaming, often for reasons related to power, costs, ease of use, and freedom.
…Only problem: much of what they say is wrong.
There are many misconceptions being thrown around about PC gaming vs console gaming that I believe need to be addressed. This isn’t about “PC gamers being wrong” or “consoles being the best,” absolutely not. I just want to cut through some of the stuff people use to put down console gaming and show that console gaming is incredibly similar to PC gaming. I mean, yes, this is coming from someone who mainly games on console, but I’m also getting a new PC that I will game on as well, not to mention the 30 PC games I already own and play. I’m not particularly partial to one over the other.
Now I will mainly be focusing on the PlayStation side of the consoles, because I know it best, but much of what I say will apply to Xbox as well. Just because I don’t point out many specific Xbox examples, doesn’t mean that they aren’t out there.

“PCs can use TVs and monitors.”

This one isn’t so much of a misconception as it is the implication of one, and overall just… confusing. This is in some articles and the pcmasterrace “why choose a PC” section, where they’re practically implying that consoles can’t do this. I mean, yes, as long as the ports of your PC match up with your screen(s) inputs, you could plug a PC into either… but you could do the same with a console, again, as long as the ports match up.
I’m guessing the idea here is that gaming monitors often use DisplayPort, as do most dedicated GPUs, and consoles are generally restricted to HDMI… But even so, monitors often have HDMI ports. In fact, PC Magazine has just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080.
I mean, even if the monitor/TV doesn’t have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don’t match your monitor/TV… use an adapter. I don’t know what the point of this argument is, but it’s made a worrying number of times.

“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."

Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC.
Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go!
Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered.
Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy!
Want Wii-style motion controls? Been around since the PS3. If you prefer the form factor of the Xbox One controller but you own a PS4, Hori’s got you covered. And of course, if keyboard and mouse is what keeps you on PC, there’s a PlayStation-compatible solution for that. Want to use the keyboard and mouse that you already own? Where there’s a will, there’s a way.
Of course, these aren’t isolated examples; there are plenty of options for each of these kinds of controllers. You don’t have to be on PC to enjoy alternate controllers.

“On PC you could use Steam Link to play anywhere in your house and share games with others.”

PS4 Remote play app on PC/Mac, PSTV, and PS Vita.
PS Family Sharing.
Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console.
In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system).
PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game.
Need I say more?

“Gaming is more expensive on console.”

Part one, the Software
This is one that I find… genuinely surprising. There have been a few times I’ve mentioned that part of the reason I chose a PS4 is for budget gaming, only to be told that “games are cheaper on Steam.” To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about disks.
Dirt Rally, a hardcore racing sim game that’s… still $60 on all 3 platforms digitally… even though its successor is out.
So does this mean you have to pay full retail for this racing experience? Nope, because disk prices.
Just Cause 3, an insane open-world experience that could essentially be summed up as “break stuff, screw physics.” And it’s a good example of where the Steam price is lower than PSN and XBL:
Not by much, but still cheaper on Steam, so cheaper on PC… Until you look at the disk prices.
See my point? Often times the game is cheaper on console because of the disk alternative that’s available for practically every console-available game. Even when the game is brand new.
Dirt 4 - Remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disk for a discounted price. And again, this is for a game that came out 2 months ago, and even it’s predecessor’s digital cost is locked at $60. Of course, I’m not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted for about the same amount.
Part 2: the Subscription
Now… let’s not ignore the elephant in the room: PS Plus and Xbox Gold. Now these would be ignorable, if they weren’t required for online play (on the PlayStation side, it’s only required for PS4, but still). So yes, it’s still something that will be included in the cost of your PS4 or Xbox One/360, assuming you play online. Bummer, right?
Here’s the thing: although you have to factor this $60 cost into the cost of your console, you can make it balance out at worst, and make it work out for you as a budget gamer at best. As nice as it would be to not have to deal with the price if you don’t want to, it’s not really a problem if you use it correctly.
Imagine going to a new restaurant. This restaurant has some meals that you can’t get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. Now you can have the main course, sit down and enjoy your steak or pasta, but if you want to have a side to have a full meal, you have to pay an annual fee.
Sounds shitty, right? But here’s the thing: not only does this membership allow you to have sides with your meal, but it also allows you to eat two meals for free every month, and also gives you exclusive discounts for other meals, drinks, and desserts.
Let’s look at PS Plus for a minute: for $60 per year, you get:
  • 2 free PS4 games, every month
  • 2 free PS3 games, every month
  • 1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
  • Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
  • access to online multiplayer
So yes, you’re paying extra because of that membership, but what you get with that deal pays for it and then some. In fact, let’s ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita-only plus 12 Vita-compatible games, up to 72 free games every year. Even if you only own one of these consoles, that’s still 24 free games a year. Sure, maybe you get games for the month that you don’t like; then just wait until next month.
In fact, let’s look at Just Cause 3 again. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it’s, again, a $60 digital game. That means with this one download, you’ve balanced out your $60 annual fee. Meaning? Every free game after that is money saved, every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 more that will only add icing to that budget cake. Though, you could just count games as paying off PS Plus until you hit $60 in savings, but still.
All in all, PS Plus, and Xbox Gold which offers similar options, saves you money. On top of that, again, you don't need to have these to get discounts, but with these memberships, you get more discounts.
Now, I’ve seen a few Steam games go up for free for a week, but what about being free for an entire month? Not to mention that even if you want to talk about Steam Summer Sales, what about the PSN summer sale or, again, disc discounts? A lot of research and math would be needed to see if every console gamer would save money compared to every Steam gamer for the same games, but at the very least? The costs will balance out, at worst.
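If you want to sanity-check that break-even claim with your own numbers, here is a tiny sketch; the dollar figures are purely illustrative (only the $60 fee and the $60 Just Cause 3 giveaway come from the text above), and the function name is my own.

```python
def net_yearly_value(subscription_cost: float, freebies_you_wanted: list,
                     member_discount_savings: float = 0.0) -> float:
    """Rough break-even check: value of the monthly freebies you would actually
    have bought, plus member-only discount savings, minus the annual fee."""
    return sum(freebies_you_wanted) + member_discount_savings - subscription_cost

# Illustrative numbers only: a $60 fee offset by one $60 giveaway (the Just Cause 3
# month mentioned above) plus a couple of smaller titles and some sale savings.
print(net_yearly_value(60.0, [60.0, 20.0, 15.0], member_discount_savings=25.0))  # 60.0
```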
Part 3, the Systems
  • Xbox and PS2: $299
  • Xbox 360 and PS3: $299 and $499, respectively
  • Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that’s $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy, right? So-called budget systems, such a rip-off.
Well, keep in mind that the generations here aren’t short.
The 6th generation, from the launch of the PS2 to the launch of the next generation of consoles, lasted 5 years, or 6 based on the launch of the PS3 (though you could say it was 9 or 14, since the Xbox wasn’t discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count from the launch of the Xbox 360 or the PS3. The 8th gen so far has lasted 4 years. That’s 17 years that the console money is spread over. If you had a Netflix subscription at its original $8 monthly plan for that amount of time, that would be over $1,600 total.
And let’s be fair here, just like you could upgrade your PC hardware whenever you wanted, you didn’t have to get a console from launch. Let’s look at PlayStation again for example: In 2002, only two years after its release, the PS2 retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, $100-$200 lower than the retail cost. The PS4? You could’ve either gotten the Uncharted bundle for $350, or one of the PS4 Slim bundles for $250. This all brings it down to $750 - $850, which again, is spread over a decade and a half. This isn’t even counting used consoles, sales, or the further price cuts that I didn’t mention.
Even if that still sounds like a lot of money to you, even if you’re laughing at the thought of buying new systems every several years, because your PC “is never obsolete,” tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives— that adds up. You don’t need to replace your entire system to spend a lot of money on hardware.
Even if you weren’t upgrading for the sake of upgrading, I’d be amazed if the hardware you’ve been pushing by gaming lasted for even a third of that 17-year period. Computer parts aren’t designed to last forever, and really won’t when you’re pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6-8 years, if you’ve got the high-end stuff. But let’s assume you bought a system 17 years ago that was a beast for its time, something so powerful that even if its parts have degraded over time, it’s still going strong. Problem is: you will have to upgrade something eventually.
Even if you’ve managed to get this far into the gaming realm with the same 17-year-old hardware, I’m betting you didn’t do it with a 17-year-old operating system. How much did Windows 7 cost you? Or 8.1? Or 10? Oh, and don’t think you can skirt the cost by getting a pre-built system; the cost of Windows is embedded into the cost of the machine (why else would Microsoft allow their OS to go on so many machines?).
Sure, Windows 10 was a free upgrade for a year, but that’s only half of its lifetime: you can’t get it for free now, and haven’t been able to for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway.
Point is, as much as one would like to say that they didn’t need to buy a new system every so often for the sake of gaming, that doesn’t mean they haven’t been paying for hardware, and even if they’ve only been PC gaming recently, you’ll be spending money on hardware soon enough.

“PC is leading the VR—“

Let me stop you right there.
If you added together the total number of Oculus Rifts and HTC Vives sold to this day, and threw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold.
Why could this possibly be? Well, for a simple reason: affordability. The systems needed to run the PC headsets costs $800+, and the headsets are $500 - $600, when discounted. PSVR on the other hand costs $450 for the full bundle (headset, camera, and move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console, or a $400 console, the latter recommended. Even if you want to say that the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone.
If anything, PC isn’t leading the VR gaming market, the PS4 is. It’s the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP “PS5…”), it won’t be long until the PlayStation line can use the same VR games as PC.
Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you'd only be able to play on PC, but there are also some games you'd only be able to play on PSVR.
…Though to be fair, if we’re talking about VR in general, these headsets don’t even hold a candle to, surprisingly, Gear VR.

“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”

This one is based on the idea that consoles are so “low spec” that when a developer has to take them into account, they can’t design the game to be nearly as good as it would be otherwise. I mean, have you ever seen the minimum specs for games on Steam?
GTA V
  • CPU: Intel Core 2 Quad CPU Q6600 @ 2.40GHz (4 CPUs) / AMD Phenom 9850 Quad-Core Processor (4 CPUs) @ 2.5GHz
  • Memory: 4 GB RAM
  • GPU: NVIDIA 9800 GT 1GB / AMD HD 4870 1GB (DX 10, 10.1, 11)
Just Cause 3
  • CPU: Intel Core i5-2500k, 3.3GHz / AMD Phenom II X6 1075T 3GHz
  • Memory: 8 GB RAM
  • GPU: NVIDIA GeForce GTX 670 (2GB) / AMD Radeon HD 7870 (2GB)
Fallout 4
  • CPU: Intel Core i5-2300 2.8 GHz/AMD Phenom II X4 945 3.0 GHz or equivalent
  • Memory: 8 GB RAM
  • GPU: NVIDIA GTX 550 Ti 2GB/AMD Radeon HD 7870 2GB or equivalent
Overwatch
  • CPU: Intel Core i3 or AMD Phenom™ X3 8650
  • Memory: 4 GB RAM
  • GPU: NVIDIA® GeForce® GTX 460, ATI Radeon™ HD 4850, or Intel® HD Graphics 4400
Witcher 3
  • Processor: Intel CPU Core i5-2500K 3.3GHz / AMD CPU Phenom II X4 940
  • Memory: 6 GB RAM
  • Graphics: Nvidia GPU GeForce GTX 660 / AMD GPU Radeon HD 7870
Actually, bump up all the memory requirements to 8 GBs, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs to even open the games. It’s almost as if the devs didn’t worry about console specs when making a PC version of the game, because this version of the game isn’t on console. Or maybe even that the consoles aren’t holding the games back that much because they’re not that weak. Just a hypothesis.
But I mean, the devs are still ooobviously having to take weak consoles into mind right? They could make their games sooo much more powerful if they were PC only, right? Right?
No. Not even close.
iRacing
  • CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
  • Memory: 8 GB RAM
  • GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
Playerunknown’s Battlegrounds
  • CPU: Intel Core i3-4340 / AMD FX-6300
  • Memory: 6 GB RAM
  • GPU: nVidia GeForce GTX 660 2GB / AMD Radeon HD 7850 2GB
These are PC-only games. That’s right, no consoles to hold them back; they don’t have to worry about whether an Xbox One could handle it. Yet they don’t require anything more than the multiplatform games.
Subnautica
  • CPU: Intel Haswell 2 cores / 4 threads @ 2.5Ghz or equivalent
  • Memory: 4GB
  • GPU: Intel HD 4600 or equivalent - This includes most GPUs scoring greater than 950pts in the 3DMark Fire Strike benchmark
Rust
  • CPU: 2 ghz
  • Memory: 8 GB RAM
  • DirectX: Version 11 (they don’t even list a GPU)
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, then why aren’t they going all-out and making games that no console could even dream of supporting?
Low-end PCs.
What, did you think people only game on Steam if they spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs close their games out to players who don’t have the strongest of PCs, then they’d be losing out on a pretty sizable chunk of their potential buyers.
Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t, and B: even though Ford doesn’t make the fastest cars overall, they still manage to make cars that are awesome on their own, they don’t even need to be compared to anything else to know that they make good cars.
I want to go back to that previous point though, developers having to deal with low-end PCs, because it’s integral to the next point:

“PCs are more powerful, gaming on PC provides a better experience.”

This one isn’t so much of a misconception as it is… misleading.
Did you know that, according to the Steam Hardware & Software Survey (July 2017), the percentage of Steam gamers who use a GPU less powerful than a PS4 Slim’s is well over 50%? Things get dismal when compared to the PS4 Pro (or Xbox One X). On top of that, the percentage of PC gamers who own an Nvidia 10-series card is about 20% (about 15% for 1060, 1070, and 1080 owners).
Now to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But, the number of Steam gamers with as much RAM or more than a PS4 or Xbox One is less than 50%, which can really bottleneck what those CPUs can handle.
These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day and age, consoles have definitely caught up.
Sure, we could mention the fact that even 1% of Steam accounts represents over 1 million accounts, but that doesn’t really matter compared to the 10s of millions of 8th gen consoles sold; looking at it that way, sure the number of Nvidia 10 series owners is over 20 million, but that ignores the fact that there are over 5 times more 8th gen consoles sold than that.
Basically, even though PCs run on a spectrum, saying they're more powerful “on average” is actually wrong. Sure, they have the potential for being more powerful, but most of the time, people aren’t willing to pay the premium to reach those extra bits of performance.
Now why is this important? What matters are the people who spent the premium cost for premium parts, right? Because of the previous point: PCs don’t have some ubiquitous quality over the consoles, developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you get to PS4-Pro-level specs. If every Steam player were to get a PS4 Pro, it would be an upgrade for over 60% of them, and 70% of them would be getting an upgrade with the Xbox One X.
Sure, you could still make the argument that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match up to a $1,000 PC build. It’s the same as paying more for car parts, in the end you get a better car. However, there is a certain problem with that…

“You pay a little more for a PC, you get much more quality.”

The idea here is that as you pay more for PC parts, performance increases at a faster rate than price does. Problem: that’s not how technology works. Paying twice as much doesn’t get you twice the quality the majority of the time.
For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
  • 1.8 TFLOP
  • 1.35 GHz base clock
  • 2 GB VRAM
  • $110
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs.
Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
  • 2.1 TFLOP
  • 1.29 GHz base clock
  • 4 GB VRAM
  • $140 retail
This is pretty good. You only increase the price by about 27%, and you get roughly a 17% increase in floating point speed and a 100% increase (double) in VRAM. Sure, you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU Boss, the Ti managed 66 fps, or a 22% increase in frame rate for Battlefield 4, and a 54% increase in mHash/second in bitcoin mining. The cost increase is worth it, for the most part.
But let’s get to the real meat of it; what happens when we double our budget? Surely we should see a massive increase performance, I bet some of you are willing to bet that twice the cost means more than twice the performance.
The closest price comparison for double the cost is the GTX 1060 (3 GB), so let’s get a look at that.
  • 3.0 TFLOP
  • 1.5 GHz base clock
  • 3 GB VRAM
  • $200 retail
Well… not that substantial, I’d say. About a 67% increase in floating point speed over the 1050, an 11% increase in base clock speed, and 1 GB less VRAM than the 1050 Ti (though 1 GB more than the base 1050). For [almost] doubling the price, you don’t get much.
Well, surely raw specs don’t tell the full story, right? Let’s look at some real-world comparisons. Once again, according to GPU Boss, there’s a 138% increase in hashes/second for bitcoin mining, and at 99 fps, an 83% frame rate increase in Battlefield 4. Well then, raw specs do not tell the whole story!
Here’s another one, the 1060’s big brother… or, well, slightly-more-developed twin.
  • 3.9 TFLOP
  • 1.5 GHz base clock
  • 6 GB VRAM
  • $250 retail
Seems reasonable, another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t look at the specs for the full story.
I did do a GPU Boss comparison, but for the BF4 frame rate, I had to look at Tom’s Hardware (sorry miners, GPU boss didn’t cover the mHash/sec spec either). What’s the verdict? Well, pretty good, I’d say. With 97 FPS, a 79% increase over the 1050— wait. 97? That seems too low… I mean, the 3GB version got 99.
Well, let’s see what Tech Power Up has to say...
94.3 fps. 74% increase. Huh.
Alright alright, maybe that was just a dud. We can gloss over that I guess. Ok, one more, but let’s go for the big fish: the GTX 1080.
  • 9.0 TFLOP
  • 1.6 GHz base clock
  • 8 GB VRAM
  • $500 retail
That jump in floating point speed definitely has to be something, and 4 times the VRAM? Sure it’s 5 times the price, but as we saw, raw power doesn’t always tell the full story. GPU Boss returns to give us the run down, how do these cards compare in the real world?
Well… a 222% (over three-fold) increase in mHash speed, and a 218% increase in FPS for Battlefield 4. That’s right, for 5 times the cost, you get 3 times the performance. Truly, the raw specs don’t tell the full story.
Increase the cost by 27% and you increase the frame rate in our example game by 22%. Increase the cost by 83% and you increase the frame rate by 83%. Sounds good, but increase the cost by 129% and you only get a 79% increase in frame rate (performance now lags cost by 50 percentage points). Increase the cost by 358% and you get a 218% increase in frame rate (a 140-point lag). That’s not paying “more for much more power,” that’s a steep drop-off after the third-cheapest option.
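To make that arithmetic easy to reproduce, here is a small sketch using the prices and Battlefield 4 frame rates cited in this section; the 54 fps and 172 fps figures for the 1050 and 1080 are back-solved from the percentages quoted above, so treat them as estimates rather than benchmark results.

```python
# Approximate prices (USD) and Battlefield 4 frame rates cited in this section.
cards = {
    "GTX 1050":     (110, 54),
    "GTX 1050 Ti":  (140, 66),
    "GTX 1060 3GB": (200, 99),
    "GTX 1060 6GB": (250, 97),
    "GTX 1080":     (500, 172),
}

base_price, base_fps = cards["GTX 1050"]
for name, (price, fps) in cards.items():
    cost_up = (price / base_price - 1) * 100   # % price increase over the 1050
    fps_up = (fps / base_fps - 1) * 100        # % frame rate increase over the 1050
    print(f"{name:12}  cost +{cost_up:4.0f}%   fps +{fps_up:4.0f}%   gap {cost_up - fps_up:+5.0f} pts")
```

The widening gap column is the drop-off described above: past the 1060 3GB, each extra dollar buys proportionally less performance.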
In fact, did you know that you have to get to the 1060 (6GB) before you could compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6GB) you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X.
On another note, let’s look at a PS4 Slim…
  • 1.84 TFLOP
  • 800 MHz base clock
  • 8 GB VRAM
  • $300 retail
…Versus a PS4 Pro.
  • 4.2 TFLOP
  • 911 MHz base clock
  • 8 GB VRAM
  • $400 retail
128% increase in floating point speed, 13% increase in clock speed, for a 25% difference in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1, the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I’ll leave it up to this bloke. Not to even mention that you can even get the texture buffs in 4K. Just like how you get a decent increase in performance based on price for the lower-cost GPUs, the same applies here.
It’s even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less of a benefit you get per dollar. Hardware Unboxed covers this in a video comparing different levels of Intel CPUs. One thing to note is that the highest i7 option (6700K) in this video was almost always within 10 FPS (though for a few games, 15 FPS) of a certain CPU in that list for just about all of the games.
…That CPU was the lowest i3 (6100) option. The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or less difference in frame rate. Even the lowest Pentium option (G4400, $63) was often able to keep up with the i7.
The CPU and GPU are usually the most expensive and power-consuming parts of a build, which is why I focused on them (other than the fact that they’re the two most important parts of a gaming PC, outside of RAM). With both, this “pay more to get much more performance” idea is pretty much the inverse of the truth.

“The console giants are bad for game developers, Steam doesn't treat developers as bad as Microsoft or especially Sony.”

Now one thing you might’ve heard is that the PS3 was incredibly difficult for developers to make games for, which for some fueled the idea that console hardware is difficult to develop on compared to PC… but this ignores a very basic idea that we’ve already touched on: if the devs don’t want to make the game compatible with a system, they don’t have to. In fact, this is why Left 4 Dead and other Valve games aren’t on PS3; they didn’t want to work with its hardware, calling it “too complex.” This didn’t stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team.
This also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out in the same year as Left 4 Dead (2008) on PS3. Apparently, plenty of other dev teams didn’t have much of a problem with the PS3’s hardware, or at the very least, they got used to it soon enough.
On top of that, when developing the 8th gen consoles, both Sony and Microsoft sought to use CPUs that were easier for developers to work with, which included making decisions that considered the consoles’ use for apps beyond gaming. What’s more, using their single-chip proprietary CPUs is cheaper and more energy efficient than buying pre-made CPUs and boards, which is a far better reason for using them than some conspiracy about Sony and MS trying to make devs’ lives harder.
Now, console exclusives are apparently a point of contention: it’s often said that exclusivity can cause developers to go bankrupt. However, exclusivity doesn’t have to be a bad thing for the developer. For example, when Media Molecule had to pitch their game to a publisher (Sony, coincidentally), they didn’t end up being tied into something detrimental to them.
Their initial funding lasted for 6 months. From then, Sony offered additional funding, in exchange for Console Exclusivity. This may sound concerning to some, but the game ended up going on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion.
Does this sound like a company that has it out for developers? There are plenty of examples that people will use to put Valve in a good light, but even Sony is comparatively good to developers.

“There are more PC gamers.”

The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is double that of 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn't an apples-to-apples comparison, sure, so if you want to, say, compare the monthly number of Steam users to console? Steam has about half of what consoles do, at 67 million.
Now, back to the 65 million total user figure for Steam: the best reference I could find for PlayStation’s number was an article giving the number of registered PSN accounts in 2013, 150 million. In a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn’t double, it sextupled. Considering how the PS4 is already at 2/3 of the number of sales the PS3 had, even though it’s currently 3 years younger than its predecessor, I’m sure this trend is at least generally consistent.
For example, let’s look at DOOM 2016, an awesome fast-paced shooting title with graphics galore… Of course, as a single platform, it sold best on PC/Steam. 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales.
But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th gen systems. Meaning: this game sold best on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales for physical copies of the games, so the number of PS4 and Xbox sales, when digital sales are included, is even higher than 3 million.
This isn’t uncommon, by the way.
Even with the games where the PC sales are higher than either of the consoles, there are generally more console sales in total. But, to be fair, this isn’t anything new. The number of PC gamers hasn’t dominated the market; the percentages have always been about this much. PC can end up being the largest single platform for games, but consoles usually sell more copies in total.
EDIT: There were other examples but... Reddit has a 40,000-character limit.

"Modding is only on PC."

Xbox One is already working on it, and Bethesda is helping with that.
PS4 isn't far behind either. You could argue that these are what would be the beta stages of modding, but that just means modding on consoles will only grow.

What’s the Point?

This isn’t to say that there’s anything wrong with PC gaming, and this isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to put down someone or something out of spite. This is about showing that PCs and consoles are overall pretty similar because there isn’t much dividing them, and that there isn’t anything wrong with being a console gamer. There isn’t some chasm separating consoles and PCs; at the end of the day they’re both computers that are (generally) designed for gaming. This is about unity as gamers, to try to show that there shouldn’t be a massive divide just because of the computer system you game on. I want gamers to be in an environment where specs don’t separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game and should be able to have healthy interactions regardless of platform.
I’m well aware that this isn’t going to fix… much, but this needs to be said: there isn’t a huge divide between PCs and consoles; they’re far more similar than people think. There are upsides and downsides that each has and the other doesn’t. There’s so much more I could touch on, like how you can use SSDs or 3.5-inch hard drives with both, or that even though PC part prices go down over time, so do console prices, but I just wanted to touch on the main points people use to needlessly separate the two kinds of systems (looking at you, PCMR) and correct them, to get the point across.
I thank anyone who takes the time to read all of this, and especially anyone who doesn’t take what I say out of context. I also want to note that, again, this isn’t “anti-PC gamer.” If it were up to me, everyone would be a hybrid gamer.
Cheers.
submitted by WhyyyCantWeBeFriends to unpopularopinion [link] [comments]



