CPU Mining: The Ultimate Guide To The Best CPU Coins

Alienware Alpha R1 is 2020

Alienware Alpha R1 in 2020*

Mistyped the title...
This is going to be a simple guide to help any R1 owner upgrade and optimize their Alpha.

Upgradable Parts

(In order of importance)
Storage Unit:
HDD OUT
SSD IN
This is by far the easiest upgrade to make and the most effective.
https://www.newegg.com/p/pl?N=100011693%20600038463
Any of those will work; it just needs to be a 2.5-inch SATA drive.
How to Replace Video

WIFI Card:
This is like a $5-15 upgrade. Go find an Intel 7265NGW off eBay and swap it in for your current Wi-Fi card. If you don't want to buy used, then here.
How to Replace Video

RAM:
RAM prices have tanked because of bitcoin mining, so this has become quite a cheap upgrade as well. I'd recommend 16GB just because why not, but if you're tight on cash, 8GB is fine.
https://www.newegg.com/p/pl?N=100007609%20601190332%20601342186%20600000401&Order=BESTMATCH
How to Replace Video

CPU:
This required the most research. I'd recommend you look through this first. The CPU socket only supports parts in the 35W-50W range, according to a developer of the Alpha (Source). The socket type is LGA 1150.
If you're going cheap, the i5-4590T (35W) and i5-4690S (65W) are both great options.
i5-4590T
i5-4690S
The i5-4690T (45W) is also great but is hard to find from a trustworthy source for a reasonable price.
If you're willing to spend $100+, then it's easily the i7-4790T (45W). That is probably the best processor to put in the Alpha. All 45W will be used, giving you 3.9 GHz turbo. The T series apparently runs the best on the R1, according to this Reddit post.
How to Replace Video

GPU:
Coming Soon!

Maxed out Alpha R1 specs: i7-4790T, 1TB Samsung SSD, 16GB DDR3, Nvidia GeForce GTX 860M.
(Upgrading to anything better than that is pointless.)

Optimizing the Alpha R1

Peripherals

submitted by Kidd-Valley to AlienwareAlpha [link] [comments]

Anything in my current, fairly old (but water-cooled!), PC worth using in a new one?

I started building computers around the year 2000 and have never really done a complete build from scratch (for myself) after my first. I'd upgrade a part here and there, and over time everything has been replaced multiple times. However, I'm thinking, due to an upgrade hiatus (it took me a LONG time to "beat" Skyrim :-P), I'm at the end of the road. I'm close to the conclusion that, for the second time in my life, it makes sense for a fresh new build. I figure I'd run this past y'all first.
My next computer I'll use for both fun and work. On the fun side, it would ideally play modern games (particularly, I'm eyeing Elder Scrolls VI and Baldur's Gate III) on decent settings on my 34" widescreen monitor. Work-wise, it needs to be able to run multiple Docker containers and let me do other things (take notes in Notion, Google Docs, etc.) while on a CPU-crushing video call. The budget is $1,500.
Here is my current setup and thoughts on each component:
Photos: https://imgur.com/a/xyM07dx

Things that may be useful:
Operating System: Windows 10 Professional (from upgrading from Windows 7... the DVD is hopefully somewhere)
PSU: Corsair TX850W - It has been trusty for the last eight years, but may not have the needed connectors for today's stuff.
Hard Drive: Crucial MX100 512 GB SATA SSD - 2.5-Inch, No performance complaints (specs claim 6.0 Gb/s), although I'm running out of storage space.
Optical Drive: Pioneer DVD-RW - Do people still put these in new computers? I also have an external USB DVD drive I could use in a pinch.
Case: Chieftec Dragon Mid Tower - this old case is steel and heavy as shit, which is actually nice as my dogs and toddlers are unlikely to knock it over inadvertently. It has a window, which I like, although cable management is a massive pain in the ass. I was never too fond of the door that covers the buttons and optical drive, and I lost it long ago.
Cooling: Custom water cooling setup - I water-cooled in 2002, overclocking my Athlon XP 1700+ from 1.4 GHz to 2.5 GHz. It was awesome. The radiator and T-valve are the original gangsters. I'm on my fifth pump, with my last three being the Swiftech MCP655-B, which I like. The current water block is some D-Tek for the old CPU socket. The radiator is an old Chevy Impala radiator (I think) that this guy I met on a 3DMark (now Futuremark) forum (jb2cool?) custom modified, making a shroud that houses two 120mm fans. I had to drill the shit out of my case to mount this thing in there. I'm very nostalgic about this setup, but it would also be a huge pain to fit into a new case.
Monitor: LG 34UM67-P - 34" IPS widescreen; 5 ms, 2560x1080, 60 Hz. Is 60 Hz too slow these days?
Keyboard and mouse: Logitech Cordless Wave - USB dongle; wrists feel OK, no complaints

Things that probably will not be useful:
Motherboard: Gigabyte P45T-ES3G - I'm pretty sure I won't be reusing this. I bought it to replace a more bad-ass motherboard that died when my previous power supply failed and took it out with it. I do like that it had dual BIOS, though.
CPU: Intel Core 2 Quad Q6600 - Been impressed with this CPU lasting as long as it has. I wet sanded it down to a mirror finish ready to overclock the shit out of it, but then never got to it as life got in the way.
Memory: 4x4GB PC3-12800 DDR3 - G.Skill Ripjaws; ancient technology. Note: I want more than 16GB of RAM in my next build.
GPU: Asus Geforce GTX 460 - My previous GTX 460 died at the height of bitcoin, and any modern GPU was stupidly expensive. Replacing mine was only $30 on eBay, so that's the route I went.

tl;dr: are any of the above components still worthwhile in a modern PC build?
submitted by Zugwalt to buildapc [link] [comments]

CryptoTab Browser

CryptoTab Browser
The browser is based on Chrome, so all Chrome extensions work. The setup is pretty easy; just make sure you have a Bitcoin address to cash out to. Mining speed depends on the type of CPU you have - mine, for example, is a quad core, and pretty old too. You can run it at full speed, half speed, or off, and there are a number of settings you can set, for example so it only mines when the machine is idle.
https://preview.redd.it/yktg4d5b6n651.png?width=2560&format=png&auto=webp&s=8be76cc6a7dc78aa68895c35d35a75b1e653c3bf
The browser, when not mining, is really good and quick too. To be honest, I don't use it as a browser as such; it's just something I leave running in the background to make a little BTC, plus I like to thrash the shit out of my CPU. I know it's going to cost more in electricity than it earns - yep, probably. I do this for fun, not to get rich, and to see different ways to make a little by doing as little as possible 😁
To grow your mining network and earn more, use the referral link. They say that making the browser your default will make you mine more. Remember, this is mining via an algorithm, not your standard way of mining.
https://preview.redd.it/0e1d9ble6n651.png?width=1078&format=png&auto=webp&s=91c42a499b559f9d44b18eac2a6ad4005b40ce16
The minimum payout is 0.00001 BTC. Yes, they do pay out - twice a day at the moment, until they bring in the auto payment system. Even though it was a low amount, they paid me without any fuss or bother - not like Brave, where you know that come payout day, shit will hit the fan.
https://preview.redd.it/4vsjlmdi6n651.png?width=999&format=png&auto=webp&s=9fb6ff6f13be9170ad747ebb005be756ffddde95
Cryptotab Browser referral Link
submitted by Grumpy_brit to Grumpys_Crypto [link] [comments]

Why Runelite's GPU renderer is one of the most important improvements to OSRS ever.

In a world of "gameplay versus graphics", a GPU renderer improves both

Not only does this new GPU renderer improve game responsiveness and framerate by a huge amount, but it's going to be so radically more efficient that it can afford to have longer draw distances. Not just this, but these distant map tiles will be clickable! Very exciting - every single task, skill, and activity will be smoother and more enjoyable.
Disclaimer: This language and information have been simplified for average gamers. Go away, sweaty "AKTHUALLY" brainlets.

OSRS currently uses a CPU renderer straight out of 2003

It's really REALLY bad! At least, by modern standards. It could not be more opposite to what modern computers pursue. It's not Jagex's fault, it's just old... Very VERY old! It's a huge undertaking, and Jagex has been too busy knocking mobile absolutely out of the park, and I'd do the same if I were them - so don't think this is some kind of rag on Jagex. Anyways, some may be surprised that this renderer is still managing to hurt computers today. How can software first written in 2003-2004 (FOR COMPUTERS OF THAT ERA) be laggy and stuttery on computers today? The answer is simple: resizable mode, and individual CPU core speed.
Resizable mode takes a game window that used to be 765x503 (the majority of which used to be a fixed GUI canvas, but not with the new mode!) and renders it at resolutions as high as 3840x2160, maybe even higher. Do you know how many pixels that is? Over 8 million. Do you know how many pixels the original renderer was designed to expect? Just under 390,000. That's over 21x the work being thrown at modern CPUs. Cores aren't anywhere near 21x faster than they were at the close of the single-core era, which is why players with 4k monitors need to see therapists after long play sessions.
Surely CPUs have gotten faster since the mid 2000s! They have, but not quite in the way that a single-threaded (single core) CPU renderer would expect... CPU manufacturers have been focusing on power draw, temperatures, core count, and special architectural improvements like GPU integration and controller integration. Comparatively, improving individual core speed hasn't been as much of a focus as it was prior to the multi-core era - and no, I'm not talking about the useless gigahertz(TM) meme measurement, I'm talking about actual overall work done by the core. As a result, today's CPUs have developed down a much different path than what this CPU renderer would benefit from - per-core speed hasn't grown anywhere near the amount that resizable mode demands, especially considering modern cores were designed on the assumption that software wouldn't pile all of its work onto just one of them.
We're throwing over 21x the work at CPUs that, in most cases, have only been getting 5-15% faster per-core performance every year.
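A quick back-of-the-envelope check of that arithmetic in Python (the 10%/year growth rate is just an illustrative midpoint of the 5-15% range above, not a measured figure):

    import math

    fixed_mode = 765 * 503        # 384,795 pixels the old renderer was designed for
    uhd = 3840 * 2160             # 8,294,400 pixels in 4K resizable mode
    ratio = uhd / fixed_mode      # ~21.6x more pixels per frame

    # At ~10% per-core improvement per year, how long until cores are 21x faster?
    years = math.log(ratio) / math.log(1.10)
    print(f"{ratio:.1f}x the pixels; ~{years:.0f} years of 10%/yr per-core gains to match")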

What is a "frame"?

Think of a frame as a painting. Your GPU renderer (or CPU cough cough) is responsible for using your GPU to paint an empty canvas, and turn it into a beautiful and complete picture. First, it draws the skybox(if there is one, it's gonna just fill with black in the case of OSRS). Then, it draws all the visible geometry from back to front, with all the lighting and effects. Then, it draws the GUI elements over the top. It does everything, one pixel at a time. Its job is to draw these paintings as quickly as possible (ideally, so you perceive movement) and present them to your monitor, one at a time, forever... until you close the game. Think of a GPU renderer as a talented artist with hundreds of arms (GPU cores).
If your GPU is able to paint this picture in 16.6 milliseconds (frame time measurements are always in milliseconds), then you'll have a frame rate of 60 frames per second, as 1000 ms / 16.6 ms is about 60. Sometimes your renderer struggles, though. Sometimes it can only complete a frame in 100 milliseconds (10 FPS). You can't wave a magic wand when this happens. If you want a higher framerate, you need to either upgrade your hardware or change your software. By change software, I mean either make it more efficient at the work it's told to do, or give it less work. RuneLite has done the former. An example of the latter would be lowering resolution, turning graphical details down, turning off filtering, etc. Games usually call this set of controls the "Graphics settings". Luckily, OSRS is so lightweight it will likely never need a graphics settings menu.
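For anyone who wants to play with the conversion, frame time and frame rate are just reciprocals of each other (a trivial Python sketch):

    def fps_from_frame_time(ms):
        return 1000.0 / ms

    def frame_time_from_fps(fps):
        return 1000.0 / fps

    print(fps_from_frame_time(16.6))   # ~60 FPS
    print(fps_from_frame_time(100.0))  # 10 FPS
    print(frame_time_from_fps(50))     # 20 ms budget per frame at OSRS's 50 FPS cap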
(Think of a CPU renderer as a painter with no artistic ability and, in the case of quad core, four arms...but he's only allowed to paint with one, while the other 3 sit idle. Also, he has to constantly stop painting to return to his normal duties! No fun! The CPU is better off at its own desk, letting the GPU handle the painting.)

A GPU renderer improves frame rates

Not that this matters much currently, as the game is capped at 50 FPS anyways... but it's still going to be huge for low-end systems, or high-end systems with high-res monitors. There's also the future, though... Once a GPU renderer is out, they could someday uncap the framerate (which, according to Mod Atlas, would only smooth the character's camera, as all animations run at 2 FPS anyways).
I expect that an update like this will make fixed mode a solid 50FPS on literally everything capable of executing the game. Fixed mode was already easy to run on everything except for old netbooks and Windows Vista desktops, so this really wouldn't be a surprise.

A GPU renderer improves frame times

Frame times are just as important as frame rates. Your frame rate is how many frames are drawn over the course of a second. But, as described previously, each "painting" is done individually. Sometimes the painter takes longer to do something! What if there's a glowing projectile flying past the camera, or something else momentary that's intensive? The painter has to take the time to paint that, resulting in a handful of frames over the course of that second taking much more time than the others. When your frame rate is high and frame times are consistent, this is perceived as incredibly smooth motion.
Ideally, all of our frames are completed in the same amount of time, but this isn't the case. Sometimes "distractions" will come up, and cause the painter to devote an extra 10-20ms to it before returning to the rest of the painting. In bad scenarios, this actually becomes visible, and is referred to as micro stutter. Having a dedicated GPU renderer doing the work ensures this is very uncommon. A GPU has hundreds or thousands of cores. If some get distracted, others reach out and pick up the workload. Everything is smooth, distributed, and uninterrupted.
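To see why averages hide stutter, here's a toy comparison (the frame times are made up, purely illustrative): two runs with similar average FPS, where one has a handful of slow frames.

    smooth  = [16.6] * 100                      # every frame takes 16.6 ms
    stutter = [14.0] * 97 + [60.0, 70.0, 80.0]  # similar average, three spikes

    for name, times in (("smooth", smooth), ("stutter", stutter)):
        avg_fps = 1000 * len(times) / sum(times)
        worst = sorted(times)[int(len(times) * 0.99) - 1]  # ~99th percentile frame
        print(f"{name}: {avg_fps:.0f} FPS average, {worst:.0f} ms worst frames")

The "stutter" run actually has the higher average FPS, but its worst frames take 4-5x longer than the typical frame - and that's exactly what you perceive as micro stutter.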
You may recall Mod Atlas talking about frame times when he posted about his GPU renderer last year: https://twitter.com/JagexAtlas/status/868131325114552321
Notice the part where he says it takes 25+ms on the CPU, but only takes 4-5ms on the GPU! That's 200-250 frames per second, if the framerate were uncapped! Also, side note: Just because a frame is completed in 1ms doesn't always mean your framerate will be 1000FPS. If your framerate is capped, then the painter will sit and wait after completing and presenting a frame until it's time to start painting again. This is why capping your framerate can be good for power usage, as demonstrated on mobile! Your GPU can't suck up your battery if it's asleep 90% of the time!

A GPU renderer is more efficient

Instead of piling all computational workloads and graphical workloads onto one single CPU core (rest in peace 8+ core users), a GPU renderer takes graphical work off the CPU and does it itself. I'd estimate the majority of all the work was graphical, so this will make a pretty noticeable difference in performance, especially on older systems. Before, having OSRS open while using other software would have a noticeable performance impact on everything. Especially on older computers. Not anymore! CPUs will run cooler, software will run better, and your computer may even use less power overall, since GPUs are much better at efficient graphical work than CPUs are!

All computers are already equipped to run this very VERY well

Most of the computers we have today are designed with two things: a good GPU, and an okay CPU. This isn't 2003 anymore. GPUs have made their way into everything, and they're prioritized over CPUs. They're not used just for games anymore, entire operating systems rely on them not just for animations and graphical effects, but entire computing tasks. GPUs are responsible for everything from facial recognition to Bitcoin mining these days. Not having a good one in your computer will leave you with a pretty frustrating experience - which is why every manufacturer makes sure you have one. Now, thanks to RuneLite, these will no longer be sitting idle while your poor CPU burns itself alive.

This new GPU renderer will make OSRS run much better on low end systems

Low end systems are notorious for having garbage like Intel Atom or Celeron in them. Their GPU is alright, but the CPU is absolutely terrible. Using the GPU will give them a boost from 5-15FPS in fixed mode, to around 50. At least, assuming they were made after the GPGPU revolution around 2010.

This new GPU renderer will make OSRS run much better on high end systems

High end systems tend to have huge GPUs and huge monitors. Right now, your GPU is asleep while your 4k monitor brings the current CPU renderer to its knees, on the verge of committing sudoku. Letting your GPU take on all that work will make your big and beautiful monitor handle OSRS without lag or stutter.

This new GPU renderer will open the possibility of plugins that build on top of it

One that comes to mind is a 2x/3x/4x GUI scaler. Scaling things in a graphics API is much easier than scaling them in some convoluted custom CPU renderer that was first designed to run in Internet Explorer 5.

It's easier to customize graphical variables in a GPU renderer than it is a glitchy old CPU renderer

Want night time? Change the light intensity. Want cel-shaded comic book appearance for some stupid reason? It's easy. Want to hit 60FPS on a Raspberry Pi? Change your render distance to 2 tiles. Now that the graphical work has been offloaded to a graphics API that's been literally designed to easily modify these things, the sky is the limit. See my past posts on this topic:
Big round of applause for the RuneLite team, and Jagex for allowing them to continue development. Without RuneLite, OSRS would be half the game it is today. Here's to their continued success, with or without Jagex integrating their code into the main game!
submitted by Tizaki to 2007scape [link] [comments]

Running a Bitcoin node on a $11.99 SBC

Running a Bitcoin node on a $11.99 SBC
Just wanted to let you guys know that I'm successfully running a (pruned) Bitcoin node + TOR on a $11.99 single board computer (Rock Pi S).
The SBC contains a Rockchip RK3308 quad-core Cortex-A35 64-bit processor, 512MB RAM, RJ45 Ethernet and a USB2 port, and I'm using a 64GB SD card. It runs a version of Armbian (410MB free). There's a new version available that even gives you 480MB RAM, but I'm waiting for Bitcoin Core 0.19 before upgrading.
To speed things up I decided to run Bitcoin Core on a more powerful device to sync the whole blockchain to an external HDD. After that I made a copy and ran it in pruned mode to end up with the last 5GB of the blockchain. I copied the data to the SD card and ran it on the Rock Pi S. After verifying all blocks it runs very smoothly. Uptime at the moment is 15 days.
I guess you could run a full node as well if you put in a 512GB SDcard.
The Rock Pi S was sold out, but if anybody is interested, they started selling a new batch of Rock Pi S v1.2 from today.
Screenshot of resources being used
Bitcoin Core info
Around 1.5 GB is being transferred every day
---
Some links and a short How to for people that want to give it a try:
  1. This is the place where I bought the Rock Pi S.
  2. Here you find more information about Armbian on the Rock Pi S. Flash it to your SDCard. Follow these instructions.
  3. Disable ZRAM swap on Armbian. If you don't do this, eventually Bitcoin Core will crash. Edit the config file and disable it:

         nano /etc/default/armbian-zram-config
         ENABLED=false
  4. Enable swap on Armbian:

         sudo fallocate -l 1G /swapfile
         sudo chmod 600 /swapfile
         sudo mkswap /swapfile
         sudo swapon /swapfile
         sudo swapon --show
         sudo cp /etc/fstab /etc/fstab.bak
         echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
  5. Set up the UFW firewall:

         sudo ufw default deny incoming
         sudo ufw default allow outgoing
         sudo ufw allow ssh    # allow ssh connections or else we won't be able to log in
         sudo ufw allow 8333   # port 8333 is used for bitcoin nodes
         sudo ufw allow 9051   # port 9051 is used for tor
         sudo ufw logging on
         sudo ufw enable
         sudo ufw status
  6. Add a user "satoshi" so you don't run Bitcoin Core as root:

         sudo adduser satoshi --home /home/satoshi --disabled-login
         sudo passwd satoshi            # change passwd
         sudo usermod -aG sudo satoshi  # add user to sudo group
  7. Download (ARM64 version) and install Bitcoin Core Daemon
  8. Download and install TOR (optional). I followed two guides. This one and this one.
  9. Create a bitcoin.conf config file in the .bitcoin directory. Mine looks like this:

         daemon=1
         prune=5000
         dbcache=300
         maxmempool=250
         onlynet=onion
         proxy=127.0.0.1:9050
         bind=127.0.0.1
         # Add seed nodes
         seednode=wxvp2d4rspn7tqyu.onion
         seednode=bk5ejfe56xakvtkk.onion
         seednode=bpdlwholl7rnkrkw.onion
         seednode=hhiv5pnxenvbf4am.onion
         seednode=4iuf2zac6aq3ndrb.onion
         seednode=nkf5e6b7pl4jfd4a.onion
         seednode=xqzfakpeuvrobvpj.onion
         seednode=tsyvzsqwa2kkf6b2.onion
         # And/or add some nodes
         addnode=gyn2vguc35viks2b.onion
         addnode=kvd44sw7skb5folw.onion
         addnode=nkf5e6b7pl4jfd4a.onion
         addnode=yu7sezmixhmyljn4.onion
         addnode=3ffk7iumtx3cegbi.onion
         addnode=3nmbbakinewlgdln.onion
         addnode=4j77gihpokxu2kj4.onion
         addnode=5at7sq5nm76xijkd.onion
         addnode=77mx2jsxaoyesz2p.onion
         addnode=7g7j54btiaxhtsiy.onion
         addnode=a6obdgzn67l7exu3.onion
  10. Start the Bitcoin daemon:

         bitcoind -listenonion
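Once it's running, you can sanity-check the node with bitcoin-cli getblockchaininfo, or over JSON-RPC. Below is a minimal sketch of the latter in Python; it assumes you've added rpcuser and rpcpassword lines to bitcoin.conf (the credentials shown are placeholders, not part of the setup above):

    import json
    import requests  # pip install requests

    payload = json.dumps({"jsonrpc": "1.0", "id": "check",
                          "method": "getblockchaininfo", "params": []})
    resp = requests.post("http://127.0.0.1:8332",
                         auth=("yourrpcuser", "yourrpcpassword"),  # placeholder credentials
                         data=payload).json()

    info = resp["result"]
    print("chain:", info["chain"])      # "main" on mainnet
    print("blocks:", info["blocks"])    # current block height
    print("pruned:", info["pruned"])    # True for this setup
    print("progress:", f'{info["verificationprogress"]:.4%}')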
Please note that I'm not a professional, so if anything above is not 100% correct, let me know and I will change it - but this is my setup at the moment.
submitted by haste18 to Bitcoin [link] [comments]

Once again I am asking for your assistance. Years ago you helped me build my 1st PC. It is time to upgrade. Included pictures of battle station and other questions.

Hi everyone. My build is starting to show its age - the more I try to do, the more I feel it. At the bottom of the post you can see my current build.
What is your intended use for this build? The more details the better.
If gaming, what kind of performance are you looking for? (Screen resolution, framerate, game settings)
What is your budget (ballpark is okay)?
In what country are you purchasing your parts?
Post a draft of your potential build here (specific parts please). Consider formatting your parts list. Don't ask to be spoonfed a build (read the rules!).
PCPartPicker Part List
Type | Item | Price
---|---|---
CPU | Intel Core i7-9700K 3.6 GHz 8-Core Processor | $369.99 @ Best Buy
CPU Cooler | Deepcool CASTLE 360EX 64.4 CFM Liquid CPU Cooler | $141.99 @ Newegg
Motherboard | ASRock Z370 Taichi ATX LGA1151 Motherboard | $299.99 @ Amazon
Memory | G.Skill Ripjaws V 16 GB (2 x 8 GB) DDR4-3600 Memory | $76.99 @ Newegg
Memory | G.Skill Ripjaws V 16 GB (2 x 8 GB) DDR4-3600 Memory | $76.99 @ Newegg
Storage | Samsung 850 EVO-Series 500 GB 2.5" Solid State Drive | -
Video Card | Gigabyte GeForce RTX 2080 Ti 11 GB WINDFORCE Video Card | $1099.99 @ Newegg
Case | Fractal Design Define R5 (Black) ATX Mid Tower Case | $154.72 @ Amazon
Power Supply | EVGA SuperNOVA G2 850 W 80+ Gold Certified Fully Modular ATX Power Supply | -
Monitor | Asus ROG Strix XG438Q 43.0" 3840x2160 120 Hz Monitor | $1099.99 @ Amazon
Prices include shipping, taxes, rebates, and discounts
Total $3320.65
Generated by PCPartPicker 2020-04-15 01:24 EDT-0400
Provide any additional details you wish below.
Here is the computer I got help on years ago on this sub.
PCPartPicker Part List
Type | Item | Price
---|---|---
CPU | Intel Core i5-2500K 3.3 GHz Quad-Core Processor | -
Motherboard | Asus P8Z68 Deluxe ATX LGA1155 Motherboard | -
Memory | Kingston HyperX Fury Blue 16 GB (2 x 8 GB) DDR3-1600 Memory | -
Storage | Samsung 850 EVO-Series 500 GB 2.5" Solid State Drive | -
Video Card | MSI Radeon RX 480 4 GB GAMING X Video Card | -
Case | Fractal Design Define R5 (Black) ATX Mid Tower Case | $154.72 @ Amazon
Power Supply | Thermaltake TR2 600 W 80+ Bronze Certified ATX Power Supply | -
Prices include shipping, taxes, rebates, and discounts
Total $154.72
Generated by PCPartPicker 2020-04-15 01:34 EDT-0400
Questions I Have
Thanks for the help!
submitted by joeweezy10 to buildapc [link] [comments]

Any good reason to buy a mobile Ryzen cpu?

I don't buy laptops new, and when I do buy one I try to get the most out of my graphics. Before you AMD ass lickers ban me: I like AMD. If I was going to build a PC it would at least have an AMD CPU and maybe a 5700 XT. I bought myself an Alienware - 32GB RAM, i7 6th gen and GTX 1070 - for £600. Why I N T E L and N V I D I A? Easy answer. A full AMD machine is S H I T. I can still see AMD fanboys saying "OMG FX were still good". NO! You can call me whatever you want, but I like the better side. I liked Intel until this year, which is when AMD really took over. Anyways, I got sidetracked there. I only have one PC (a bitcoin miner) with a GTX 1080 and an I N T E L Core 2 Quad (mining is GPU-intensive but not CPU-intensive), and AMD does a very bad job in mining. So I only use a laptop. I'm going to change my laptop in about 3 years, and it will be a 2-3 year old but still capable laptop. SO DOES THAT MEAN IT'S GOING TO BE AMD? Cause you know, AMD is the best. Here I'm going to disappoint you. I have to be on the go and I can't have a PC. I never said I like AMD LAPTOP CPUs. In 3 years I'm getting a laptop with a 6-core CPU. Sadly, AMD doesn't offer you 6 cores in a laptop. BUT WAIT, 7NM!!! Nope, 12nm. BUT...... BUT IT'S CHEAPER. I don't care; the laptop is going to be cheap anyway. All higher-end mobile AMD CPUs have 4 cores. Here I want to start a discussion and a petition for 6/8-core mobile Ryzen CPUs. And you know, since AMD likes pushing things, make 12-core mobile CPUs.
submitted by X_dimmy69_X to AyyMD [link] [comments]

Curiosity/Motivation/Logic and why stablecoins are the future

From the Prohashing mining pool forums, at https://forums.prohashing.com/viewtopic.php?f=11&t=6428:
-----------------------------

In my last post, I showed why my confidence in there being more than one more bubble is too low to justify remaining heavily invested in cryptocurrencies. In this article I want to expand upon that reasoning by talking a little bit about human factors that lead me to believe that stablecoins pose a great risk to traditional cryptocurrencies.
Defining CML
People differ in a number of ways, and they express all sorts of personality traits. However, in my interactions with people in all areas of life, I've noticed that one characteristic seems to differentiate people more than any other. I'll refer to this characteristic as "CML" throughout the rest of this post, as the best way I was able to describe it is a sequence of curiosity, motivation, and logic. People who exhibit this trait use those three steps to evaluate and act when faced with most situations, while people who do not exhibit this trait fail to do so. An overwhelming majority of people do not possess the "CML" trait and its absence increasingly hinders their abilities to understand and succeed in the world as technology and social structures become increasingly complex.
Here are a few examples of common scenarios people face in life.
When presented with new information or with a decision, high-CML people are curious about how things work. They are motivated to learn more about the topic. They use logic to think through why things are the way they are, and arrive at a logical conclusion based upon the new knowledge they gained by being curious and motivated. In contrast, when low-CML people are presented with new information or a decision, they lack the curiosity and motivation to improve their knowledge, and often just do what is most common in society.
You can tell that a person is low-CML if he says phrases like "that's dumb," "you're weird," or "because I don't like it." The response in some forums where I reposted the last post about bubbles and the singularity was many low-CML people stating it was "insane" or "delusional." High-CML people who disapprove of something would instead say "that point is wrong because..." The first phrases demonstrate a lack of thought about the topic, while the last shows that the person spent some time considering the topic, even though they both come to the same conclusion. You can probably picture several people you know who are low-CML, and may know someone who is high-CML.
CML is not related to intelligence, and low-CML people are not dumb. While there are some people who unfortunately have severe disabilities and will never be able to understand most topics, 99% of people can gain enough knowledge of almost any topic to make good decisions if they are willing to spend a little energy on learning about it. Even complex topics, like computer programming, are within reach of most people. While learning how to avoid race conditions in Javascript is a challenge, it's not difficult to understand the difference between a client and a server, how a single-core processor differs from a quad-core processor, or that a computer consists of memory, a CPU, and a hard drive. Consider how many people spend 12 hours a day looking at their phone screens, but have never bothered to understand what the purpose of a graphics processor is.
Low-CML people innately believe that they do not have the ability to learn or think logically. Therefore, they take the easiest way out on almost everything, even though that repeatedly leads to suboptimal outcomes for them. They make the same mistakes over and over, despite the fact that there is almost always a way to put in 5% more effort to get out something that is 50% better.
The world's problems
As technology continues to advance at an ever increasing pace, CML is becoming the core trait that divides humanity. Increasingly, people are becoming divided into two camps - those who understand the basics of how computers work, and those who do not. And the difference between people who understand the basics is not intelligence, education, or age, but whether a person is low-CML or high-CML.
At the core of most of the political issues of today is a battle between low-CML people who believe they are powerless against technological change, and high-CML people who take the time to understand these changes. Trump, Johnson, and (to a lesser extent so far) Le Pen have been effective at rallying people who do not exhibit the curiosity to learn about why the world is the way it is. Their opponents are people who have put careful thought into the issues and come to a reasoned belief.
Unfortunately, the number of people who are motivated to learn new information and remain informed about the world is far lower than the number of people who never examine the reasons why anything is true. One of the reasons why fake news is so prevalent and effective is apparently that many people share articles after having read only the headline. The politicians above recognize this low motivation to read the article and create false soundbytes that they know low-CML people will not take the time to fact-check.
An enormous amount of effort is now being spent on making products inferior to what they used to be in order to cater to low-CML people. For example, when Windows booted in 1995, it showed a list of the drivers being loaded. Then, in 2005, there was a progress bar. Now, there's a spinning or pulsing Windows logo with no information about what is happening at all. Even though these changes didn't affect the stability of Windows or the load time in any way, Microsoft hid useful information, probably because a marketing department found that low-CML people had a negative reaction to seeing code they said was "nerdy" or "weird."
How CML relates to cryptocurrencies
Now that you're aware of what CML is, it should be easy to explain why I believe that stablecoins are the first real threat to cryptocurrency.
In a recent conversation, I discussed Purse.io with someone. I had mentioned that my Purse orders were being regularly filled at 33% discounts, and that I had saved about $3000 during the past year by using Purse. I asked why he hadn't used Purse, given that he earned much less than I did and that $3000 to him would be life changing. His response was immediate and typical of a low-CML person: because bitcoins have too much volatility. I explained to him that volatility isn't a factor, because you can buy the amount of bitcoin cash you need, send it to Purse, and spend it immediately, all within 10 minutes. The next response was that there were crashes in cryptocurrencies, so I pointed out that while crashes did occur, it is extremely rare for the price of bitcoin cash to fall by 33% in 10 minutes - so even if there were a crash, you could still save money.
In the end, that person never did sign up for Purse - and that should be a huge warning flag to everyone. Purse is as close to a "killer app" for cryptocurrency as there ever will be. On bulk trash collection weekends near where I live, there are huge pickup trucks owned by people who supplement their income by driving hundreds of miles around the neighborhoods picking up metal to sell it at a few cents per pound to a scrapyard, costing hundreds of dollars in gas and maintenance to scrape out a miniscule profit. These same people could sign up for Purse and order necessities, like toothpaste and soap, saving more money in an hour than the few bucks they can make (and that's before taxes) in an entire night picking up trash, simply because they think Purse is too complicated.
The enormous discounts on Purse - the maximum of 33% - remain. In any efficient market, one would expect these discounts to decline to be close to what one can achieve by gaming the credit card system, where one can get 5% cash back on Amazon with some cards. The belief that cryptocurrency is too complicated and volatile is so anathema to low-CML people that they are willing to ignore thousands of dollars in savings because they aren't willing to try it and form their own opinions.
Why stablecoins will become dominant
Stablecoins are the exact type of product that appeals to low-CML people, because they are exactly the same thing as government-backed money is. They are just backed by corporations instead. Science fiction has, for 50 years, been predicting what is happening with stablecoins, where eventually corporations gain so much power that they buy entire planets and mine them for minerals. The only difference these authors failed to predict is that instead of employees of the huge corporations spending company scrip, they will be spending cryptocurrencies created by the companies. These stablecoins can be backed by more than one asset across a wide range of classes, such as gold, bitcoins, real estate, and other things, to prevent inflation or deflation better than today's currencies do.
One of the reasons why stablecoins will become dominant is that low-CML people aren't willing to question what money is backed by, as many cryptocurrency enthusiasts do, or learn about economics. They won't care that their money is backed by facebook instead of the United States. As long as it appears to be worth the same amount, that will be fine with them. They won't look into whether facebook actually is in good financial condition to back that promise, just as many people share headlines without even a cursory glance to see whether they have any possibility of being true.
Unbacked cryptocurrencies have turned into a circus. After an entire decade, they still aren't used for everyday purchases, and the volatility in the past week has been more ridiculous than ever. Low-CML people are not motivated to spend a few minutes learning about why these coins are valuable and useful. If they had been motivated, these markets wouldn't be in the absurd state they are in now.
Conclusion
In conclusion, I've had to change my outlook from years ago after realizing that stablecoins are likely to suck up most of the world's money over the next ten years. Unlike bitcoin, they are run by corporations that can make a profit by advertising the coins and getting people to use them. Low-CML people, who are the majority of people in society, follow what they are told without being willing to understand why they are told it. As the incredible Purse discounts show, low-CML people are so unwilling to understand existing coins that they will pay 50% more for some goods, just so they don't have to use bitcoin cash.
The existing cryptocurrencies will still be around, and they will still appreciate greatly in value from what they are worth now. But I now expect their usage to continue to be limited to speculation and trading. If 1% of the world has used bitcoins so far, then I doubt that more than 10% of the population will ever own unpegged coins, despite 100% of people eventually using cryptocurrency. Bitcoin will become an even more valuable currency, but it will not become the dominant currency for everyday use because low-CML people will not take the time to understand it.
If you are trying to predict the future value of bitcoins or litecoins, the most important statistic you should be evaluating is what you believe the percentage of high-CML people in the world is. Since almost all the people reading this article are high-CML (given its length and the uncommon opinions presented), and most high-CML people associate with like people, I think they overestimate the percentage of high-CML people in the world. My belief is that the percentage is less than 10%, which is why stablecoins will dominate and bitcoins are very unlikely to ever meet the seven-figure valuations some users are predicting.
submitted by MattAbrams to BitcoinMarkets [link] [comments]

Xthinner/Blocktorrent development status update -- Jan 12, 2018

Edit: Jan 12, 2019, not 2018.
Xthinner is a new block propagation protocol which I have been working on. It takes advantage of LTOR to give about 99.6% compression for blocks, as long as all of the transactions in the block were previously transmitted. That's about 13 bits (1.6 bytes) per transaction. Xthinner is designed to be fault-tolerant, and to handle situations in which the sender and receiver's mempools are not well synchronized with gracefully degrading performance -- missing transactions or other decoding errors can be detected and corrected with one or (rarely) two additional round trips of communication. My expectation is that when it is finished, it will perform about 4x to 6x better than Compact Blocks and Xthin for block propagation. Relative to Graphene, I expect Xthinner to perform similarly under ideal circumstances (better than Graphene v1, slightly worse than Graphene v2), but much better under strenuous conditions (i.e. mempool desynchrony).
The current development status of Xthinner is as follows:
  1. Python proof-of-concept encodedecoder -- done 2018-09-15
  2. Detailed informal writeup of the encoding scheme -- done 2018-09-29
  3. Modify TxMemPool to allow iterating on a view sorted by TxId -- done 2018-11-26
  4. Basic C++ segment encoder -- done 2018-11-26
  5. Basic c++ segment decoder -- done 2018-11-26
  6. Checksums for error detection -- done 2018-12-09
  7. Serialization/deserialization -- done 2018-12-09
  8. Prefilled transactions, coinbase handling, and non-mempool transactions -- done 2018-12-25
  9. Missing/extra transactions, re-requests, and handling mempool desynchrony for segment decoding -- done 2019-01-12
  10. Block transmission coupling the block header with one or more Xthinner segments -- 50% done 2019-01-12
  11. Missing/extra transactions, re-requests, and handling mempool desynchrony for block decoding -- done 2019-01-12
  12. Integration with Bitcoin ABC networking code
  13. Networking testing on regtest/testnet/mainnet with real blocks
  14. Write BIP/BUIP and formal spec
  15. Bitcoin ABC pull request and begin of code review
  16. Unit tests, performance tests, benchmarks -- started
  17. Bitcoin Unlimited pull request and begin of code review
  18. Alpha release of binaries for testing or low-security block relay networks
  19. Merging code into ABC/BU, disabled-by-default
  20. Complete security review
  21. Enable by default in ABC and/or BU
  22. (Optional) parallelize encoding/decoding of blocks
Following is the debugging output from a test run done with coherent sender/recipient mempools and a 1.25 million tx block, edited for readability:
    Testing Xthinner on a block with 1250003 transactions with sender mempool size 2500000 and recipient mempool size 2500000
    Tx/Block creation took 262 sec, 104853 ns/tx (mempool)
    CTOR block sorting took 2467 ms, 987 ns/tx (mempool)
    Encoding is 1444761 pushBytes, 2889520 1-bit commands, 103770 checksum bytes
    total 1910345 bytes, 12.23 bits/tx
    Single-threaded encoding took 2924 ms, 1169 ns/tx (mempool)
    Serialization/deserialization took 1089 ms, 435 ns/tx (mempool)
    Single-threaded decoding took 1912314 usec, 764 ns/tx (mempool)
    Filling missing slots and handling checksum errors took 0 rounds and 12 usec, 0 ns/tx (mempool)
    Blocks match!
    *** No errors detected
If each transaction were 400 bytes on average, this block would be 500 MB, and it was encoded in 1.9 MB of data, a 99.618% reduction in size. Real-world performance is likely to be somewhat worse than this, as it's not likely that 100% of the block's transactions will always be in the recipient's mempool, but the performance reduction from mempool desychrony is smooth and predictable. If the recipient is missing 10% of the sender's transactions, and has another 10% that the sender does not have, the transaction list is still able to be successfully transmitted and decoded, although in that case it usually takes 2.5 round trips to do so, and the overall compression ratio ends up being around 71% instead of 99.6%.
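As a quick sanity check of those figures (plain arithmetic in Python, using only the numbers from the run above):

    txs = 1_250_003
    encoded_bytes = 1_910_345
    bits_per_tx = encoded_bytes * 8 / txs        # ~12.23 bits/tx, matching the log
    block_bytes = txs * 400                      # ~500 MB at 400 bytes/tx
    reduction = 1 - encoded_bytes / block_bytes  # ~99.618% size reduction
    print(f"{bits_per_tx:.2f} bits/tx, {reduction:.3%} reduction")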
Anybody who wishes can view the WIP Xthinner code here.
Once Xthinner is finished, I intend to start working on Blocktorrent. Blocktorrent is a method for breaking a block into small, independently verifiable chunks for transmission, where each chunk is about one IP packet (a bit less than 1500 bytes) in size. In the same way that Bittorrent was faster than Napster, Blocktorrent should be faster than Xthinner.
Currently, one of the big limitations on block propagation performance is that a node cannot forward the first byte of a block until the last byte of the block has been received and completely validated. Blocktorrent will change that, and allow nodes to forward each IP packet shortly after that packet was received, regardless of whether any other packets have also been received and regardless of the order in which the packets are received. This should dramatically improve the bandwidth utilization efficiency of nodes during block propagation, and should reduce the block propagation latency for reaching the full network quite a lot -- my current estimate is about a 10x improvement over Xthinner.
Blocktorrent achieves this partial validation of small chunks by taking advantage of Bitcoin blocks' Merkle tree structure. Chunks of transactions are transmitted in a packet along with enough data from the rest of the Merkle tree's internal nodes to allow that chunk of transactions to be validated back to the Merkle root, the block header, and the mining PoW, thereby ensuring that the packet being forwarded is not invalid spam data used solely for a DoS attack. (Forwarding DoS attacks to other nodes is bad.) Each chunk will contain an Xthinner segment to encode its TXIDs.
My performance target with Blocktorrent is to be able to propagate a 1 GB block in about 5-10 seconds to all nodes in the network that have 100 Mbps connectivity and quad-core CPUs. Blocktorrent will probably perform a bit worse than FIBRE at small block sizes, but better at very large block sizes, all without the trust and centralized infrastructure that FIBRE uses.
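For intuition, the Merkle-branch check that makes per-chunk validation possible looks something like the sketch below. This is not jtoomim's Blocktorrent code, just the standard Bitcoin-style Merkle proof verification (double SHA-256 over concatenated 32-byte hashes) that the scheme builds on:

    import hashlib

    def double_sha256(data: bytes) -> bytes:
        return hashlib.sha256(hashlib.sha256(data).digest()).digest()

    def verify_branch(leaf: bytes, branch: list, index: int, root: bytes) -> bool:
        """Check a leaf hash against the Merkle root, given the sibling
        hashes (branch) on the path from the leaf up to the root."""
        h = leaf
        for sibling in branch:
            if index & 1:                    # we are the right child
                h = double_sha256(sibling + h)
            else:                            # we are the left child
                h = double_sha256(h + sibling)
            index >>= 1
        return h == root

A receiver who has the block header (and its PoW) can run this check on each arriving chunk, which is what makes it safe to forward packets before the whole block has arrived.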
submitted by jtoomim to btc [link] [comments]

Geez, I'm middle aged now. Help me tweak my build.

I have a toddler now, and I can't completely re-educate myself for 2019 parts, so I'm hoping you all can help me figure out if/how I'm going wrong, or if I can get better value for money.
 
My rig from 2010 is on its last legs, and I'm looking to replace.
(In case anyone's nostalgic, it's an i5-760//4GB//Radeon HD5800)
 
I don't want anyone to build this for me - I'm just looking for advice.
My budget is in the 600-900 range. I'm not looking to max-out my budget, but I'd love to know if there are places where I can get better value for money.
I'm in Connecticut.
Keyboard, mouse, speakers, and monitor are separate. I'm fine on my own with that.
 
Use Case - probably pretty light.
Productivity: I'd like to be able to either run dual monitors or a single 4K display (not both at the same time).
Gaming: Single monitor, 1080p, and not necessarily the latest and greatest. I'm a patient gamer - my next game will be Axiom Verge or something N64-era.
Overclocking is not expected.
My toddler may end up using it for awhile, but I'm sure he'll need a new one by the time he's 8 or so.
 
Build:
 
PCPartPicker Part List
Type | Item | Price
---|---|---
CPU | AMD Ryzen 3 2200G 3.5 GHz Quad-Core Processor | $79.89 @ OutletPC
Thermal Compound | Arctic Silver 5 High-Density Polysynthetic Silver 3.5 g Thermal Paste | $6.16 @ Amazon
Motherboard | ASRock B450M PRO4 Micro ATX AM4 Motherboard | $79.78 @ OutletPC
Memory | G.Skill Ripjaws V Series 16 GB (2 x 8 GB) DDR4-3200 Memory | $77.99 @ Newegg
Storage | Samsung 970 Evo 500 GB M.2-2280 NVME Solid State Drive | $89.89 @ OutletPC
Storage | Western Digital Caviar Blue 1 TB 3.5" 7200RPM Internal Hard Drive | $44.84 @ Amazon
Video Card | Sapphire Radeon RX 580 8 GB PULSE Video Card | $169.99 @ Newegg
Case | NZXT H500 ATX Mid Tower Case | $69.99 @ Best Buy
Power Supply | Corsair CX (2017) 450 W 80+ Bronze Certified ATX Power Supply | $44.89 @ OutletPC
Operating System | Microsoft Windows 10 Home OEM 64-bit | $99.99 @ Best Buy
Prices include shipping, taxes, rebates, and discounts
Total (before mail-in rebates) $798.41
Mail-in rebates -$35.00
Total $763.41
Generated by PCPartPicker 2019-08-13 22:25 EDT-0400
 
Pointed questions:
  1. Am I wasting money with both a 2200G AND dedicated graphics? If so, will the CPU be enough for me, or ought I go with a different chip/card combo?
  2. I kind of guessed at the Graphics card, based on the stickied builds and number of reviews. I'm happy to hear suggestions.
  3. Do I need a separate cooler/heatsink? (Judging from the stickied builds, no?) Does the CPU come with a stock one?
  4. I could save a few bucks with a 0.25TB SSD. Meh, I'll go with 0.5TB.
  5. I could probably go for 8GB memory, but 16GB may make it last longer.
  6. HDD reliability is pretty important to me. Any insights on manufacturer reputations are very welcome.
  7. Boring cases are fine.
 
Thanks to all for your help!
 
Edit: Learning to format.
 
 
Edit 2:
Guys, GUYS!
There's been a lot of really good suggestions here. Thanks to everyone.
 
But we're not maxing out the budget for the sake of it.
Check out the use case - or the title! I gave up on current-level graphics and FPS games some time ago. I'm not paying $250 for a graphics card (or competing with Bitcoin miners).
 
In fact, how far could I downgrade my graphics card, and still hit my targets for desktop apps (and still be able to do much older games)?
For games, let's target: "I could ably play Minecraft without gameplay problems, but the graphics might be mid-level."
 
Here's my "Mark 2":
 
Upgraded the processor - thanks, dar! The 2600X was only $10 extra so I went with that.
Changed the Storage solution to a SSD. - thanks lild.
Changed PSU. - thanks lild.
Removed Windows 10 - will look into that, but it's not something that needs to go into compatibility/performance discussions.
submitted by QuicklyReged to buildmeapc [link] [comments]

Technical Cryptonight Discussion: What about low-latency RAM (RLDRAM 3, QDR-IV, or HMC) + ASICs?

The Cryptonight algorithm is described as ASIC resistant, in particular because of one feature:
A megabyte of internal memory is almost unacceptable for the modern ASICs. 
EDIT: Each instance of Cryptonight requires 2MB of RAM. Therefore, any Cryptonight multi-processor is required to have 2MB per instance. Since CPUs come incredibly well loaded with on-die memory (i.e. 32MB of L3 on Threadripper, 16MB of L3 on Ryzen, and plenty of L2+L3 on Skylake servers), it seems unlikely that ASICs would be able to compete well vs CPUs.
In fact, a large number of people seem to be incredibly confident in Cryptonight's ASIC resistance. And indeed, anyone who knows how standard DDR4 works knows that DDR4 is unacceptable for Cryptonight. GDDR5 similarly doesn't look like a very good technology for Cryptonight, focusing on high-bandwidth instead of latency.
This suggests that only custom RAM on an ASIC would be able to handle the 2MB that Cryptonight uses. Solid argument, but to my eyes it seems to be missing a critical point of analysis.
What about "exotic" RAM, like RLDRAM3 ?? Or even QDR-IV?

QDR-IV SRAM

QDR-IV SRAM is absurdly expensive. However, it's a good example of "exotic RAM" that is available on the marketplace. I'm focusing on it because QDR-IV is really simple to describe.
QDR-IV costs roughly $290 for 16Mbit x 18 bits. It is true static RAM. The 18-bit width is 8 data bits per byte plus 1 parity bit, because QDR-IV is usually designed for high-speed routers.
QDR-IV has none of the speed or latency issues of DDR4 RAM. There are no "banks", there are no "refreshes", there is no "obliterate the data as you load it into the sense amplifiers". There's no "auto-precharge" as you load the data from the sense amps back into the capacitors.
Anything that could have caused latency issues is gone. QDR-IV is about as fast as you can get latency-wise. Every clock cycle you can specify an address, and QDR-IV will generate a response every clock cycle. In fact, QDR means "quad data rate", as the SRAM performs 2 reads and 2 writes per clock cycle. There is a slight amount of latency: 8 clock cycles for reads (7.5 nanoseconds), and 5 clock cycles for writes (4.6 nanoseconds). For those keeping track at home: AMD Zen's L3 cache has a latency of 40 clocks, aka 10 nanoseconds at 4GHz.
Basically, QDR-IV BEATS the L3 latency of modern CPUs. And we haven't even begun to talk software or ASIC optimizations yet.

CPU inefficiencies for Cryptonight

Now, if that weren't bad enough... CPUs have a few problems with the Cryptonight algorithm.
  1. AMD Zen and Intel Skylake CPUs transfer from L3 -> L2 -> L1 cache in 64-byte cache lines. Cryptonight only uses 16 of those bytes, which means 75% of L3 cache bandwidth is wasted on 48 bytes that will never be used per inner loop of Cryptonight. An ASIC would transfer only 16 bytes at a time, instantly increasing the effective RAM speed 4-fold.
  2. AES-NI instructions on Ryzen / Threadripper can only be done one-per-core. This means a 16-core Threadripper can at most perform 16 AES encryptions per clock tick. An ASIC can perform as many as you'd like, up to the speed of the RAM.
  3. CPUs waste a ton of energy: there's L1 and L2 caches which do NOTHING in Cryptonight. There are floating-point units, memory controllers, and more. An ASIC which strips things out to only the bare necessities (basically: AES for Cryptonight core) would be way more power efficient, even at ancient 65nm or 90nm designs.

Ideal RAM access pattern

For all y'all who are used to DDR4, here's a special trick with QDR-IV or RLDRAM: you can pipeline accesses. What does this mean?
First, it should be noted that Cryptonight's RAM access pattern is a serially dependent read-modify-write chain: it reads 16 bytes from a pseudorandom address, does a small amount of math, writes 16 bytes back, and derives the next address from the result.
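Here's a structural sketch of that loop in Python - not the real Cryptonight math, just its shape, to show why a single instance can never hide the RAM latency:

    def inner_loop(scratchpad, state, iterations=524288):
        mask = len(scratchpad) - 1      # scratchpad length is a power of two
        for _ in range(iterations):
            addr = state & mask         # address derived from the prior result
            value = scratchpad[addr]    # dependent read: pays the full RAM latency
            state = (state * 6364136223846793005 + value) % 2**64  # stand-in for the AES round
            scratchpad[addr] = state    # write back
        return state

    # A 2MB scratchpad of 16-byte words would be 131072 entries:
    result = inner_loop([0] * 131072, state=12345, iterations=1000)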
QDR-IV and RLDRAM3 still have latency involved. Assuming 8-clocks of latency, the naive access pattern would be:
  1. Read
  2. Stall
  3. Stall
  4. Stall
  5. Stall
  6. Stall
  7. Stall
  8. Stall
  9. Stall
  10. Write
  11. Stall
  12. Stall
  13. Stall
  14. Stall
  15. Stall
  16. Stall
  17. Stall
  18. Stall
  19. Read #2
  20. Stall
  21. Stall
  22. Stall
  23. Stall
  24. Stall
  25. Stall
  26. Stall
  27. Stall
  28. Write #2
  29. Stall
  30. Stall
  31. Stall
  32. Stall
  33. Stall
  34. Stall
  35. Stall
  36. Stall
This isn't very efficient: the RAM sits around waiting. Even with "latency reduced" RAM, you can see that the RAM still isn't doing very much. In fact, this is why people thought Cryptonight was safe against ASICs.
But what if we instead ran four instances in parallel? That way, there is always data flowing.
  1. Cryptonight #1 Read
  2. Cryptonight #2 Read
  3. Cryptonight #3 Read
  4. Cryptonight #4 Read
  5. Stall
  6. Stall
  7. Stall
  8. Stall
  9. Stall
  10. Cryptonight #1 Write
  11. Cryptonight #2 Write
  12. Cryptonight #3 Write
  13. Cryptonight #4 Write
  14. Stall
  15. Stall
  16. Stall
  17. Stall
  18. Stall
  19. Cryptonight #1 Read #2
  20. Cryptonight #2 Read #2
  21. Cryptonight #3 Read #2
  22. Cryptonight #4 Read #2
  23. Stall
  24. Stall
  25. Stall
  26. Stall
  27. Stall
  28. Cryptonight #1 Write #2
  29. Cryptonight #2 Write #2
  30. Cryptonight #3 Write #2
  31. Cryptonight #4 Write #2
  32. Stall
  33. Stall
  34. Stall
  35. Stall
  36. Stall
Notice: we're doing 4x the Cryptonight in the same amount of time. Now imagine if the stalls were COMPLETELY gone. DDR4 CANNOT do this. And that's why most people thought ASICs were impossible for Cryptonight.
Unfortunately, RLDRAM3 and QDR-IV can accomplish this kind of pipelining. In fact, that's what they were designed for.

RLDRAM3

As good as QDR-IV RAM is, it's way too expensive. RLDRAM3 is almost as fast, but is way more complicated to use and describe. Due to the lower cost of RLDRAM3, I'd assume any ASIC for Cryptonight would use RLDRAM3 instead of the simpler QDR-IV. RLDRAM3 at 32Mbit x 36 bits costs $180 in quantities of 1, and would support up to 64 parallel Cryptonight instances (in contrast, an $800 AMD 1950X Threadripper supports 16 at best).
Such a design would basically operate at the maximum speed of RLDRAM3. In the case of a x36-bit bus at 2133 MT/s, we're talking about 2133M / (burst length 4 x 4 reads/writes x 524288 inner loops) == 254 full Cryptonight hashes per second.
254 Hashes per second sounds low, and it is. But we're talking about literally a two-chip design here. 1-chip for RAM, 1-chip for the ASIC/AES stuff. Such a design would consume no more than 5 Watts.
If you were to replicate the ~5W design 60-times, you'd get 15240 Hash/second at 300 Watts.

RLDRAM2

Depending on cost calculations, going cheaper and "making more" might be a better idea. RLDRAM2 is widely available at only $32 per chip at 800 MT/s.
Such a design would theoretically support 800M / (4 x 4 x 524288) == 95 Cryptonight hashes per second.
The scary part: the RLDRAM2 chip there only uses 1W of power. Together, you get 5 Watts again as a reasonable power estimate. x60 would be 5700 hashes/second at 300 Watts.
Here's Micron's whitepaper on RLDRAM2: https://www.micron.com/~/media/documents/products/technical-note/dram/tn4902.pdf . RLDRAM3 is the same but denser, faster, and more power efficient.
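The hash-rate arithmetic for both RAM generations, in one place (this just reproduces the formulas above in Python; no new assumptions):

    def hashes_per_sec(mega_transfers, burst_len=4, accesses=4, inner_loops=524288):
        # One Cryptonight hash = 524288 inner loops, each costing
        # 4 reads/writes of burst length 4.
        return mega_transfers * 1e6 / (burst_len * accesses * inner_loops)

    print(hashes_per_sec(2133))       # RLDRAM3 @ 2133 MT/s -> ~254 H/s per chip pair
    print(hashes_per_sec(800))        # RLDRAM2 @ 800 MT/s  -> ~95 H/s per chip pair
    print(hashes_per_sec(2133) * 60)  # 60 boards -> ~15256 H/s at ~300W total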

Hybrid Memory Cube

Hybrid Memory Cube is "stacked RAM" designed for low latency. As far as I can tell, Hybrid Memory Cube allows an insane amount of parallelism and pipelining. It'd be the future of an ASIC Cryptonight design. The existence of Hybrid Memory Cube is more about "Generation 2" or later; in effect, it demonstrates that future designs can be lower power and higher speed.

Realistic ASIC Sketch: RLDRAM3 + Parallel Processing

The overall board design would be the ASIC, which would be a simple pipelined AES ASIC that talks with RLDRAM3 ($180) or RLDRAM2 ($30).
It's hard for me to estimate an ASIC's cost without the right tools or design. But a multi-project wafer service like MOSIS offers "cheap" access to 14nm and 22nm nodes. Rumor is that this is roughly $100k per run for ~40 dies, suitable for research and development. Mass production would require further investment, but mass production at the ~65nm node is rumored to cost in the single-digit millions of dollars, or maybe even just six figures.
So realistically speaking: it'd take ~$10 Million investment + a talented engineer (or team of engineers) who are familiar with RLDRAM3, PCIe 3.0, ASIC design, AES, and Cryptonight to build an ASIC.

TL;DR:

submitted by dragontamer5788 to Monero [link] [comments]

Want to upgrade my PC, would like advice

Hi all,
I built my current PC for ~£600 about five years ago, so its starting to get a bit outdated and occasionally laggy. I'm looking to upgrade but I don't know where to start!
I think I want to be upgrading my GPU and SSD, but I'm not sure if this would mean that I need to update my mobo/PSU/cooler too. If something else is the bottleneck here, or you need any other information, please let me know!
My current setup looks like this: PCPartPicker Part List
Type Item Price
CPU Intel - Core i5-4430 3 GHz Quad-Core Processor -
Motherboard Asus - H81M-K Micro ATX LGA1150 Motherboard -
Memory Kingston - 16 GB (2 x 8 GB) DDR3-1600 Memory £78.62 @ Amazon UK
Storage SanDisk - Ultra Plus 128 GB 2.5" Solid State Drive -
Storage Seagate - Barracuda 1 TB 3.5" 7200RPM Internal Hard Drive -
Video Card XFX - Radeon R9 280X 3 GB Double Dissipation Video Card -
Case Cooler Master - N200 MicroATX Mini Tower Case £42.33 @ Amazon UK
Power Supply EVGA - 500 W 80+ Bronze Certified ATX Power Supply £52.98 @ CCL Computers
Optical Drive Samsung - SH-224DB/BEBE DVD/CD Writer -
Thanks for any advice!
submitted by Sibblin to buildapc [link] [comments]

Vertnode - An automated solution for installing Vertcoin node(s) on Single Board Computers

Hello Vertcoin Community,
Eager to contribute to the Vertcoin Community I began creating step by step walkthrough guides on how to get a Vertcoin node up and running on a Raspberry Pi, Raspberry Pi Zero and Intel NUC. Along with information to get a Vertcoin node up and running was also optional steps to install p2pool-vtc.
I decided that while this step by step guide might be helpful to a few, a setup script may prove to be useful to a wider range of people. I have this script at a point where I think it may be productive to share it with a bigger audience. For those who are brave, have this hardware sitting around, or like to tinker with projects: I invite you to test this setup script. If you run into errors, any sort of verbose console output of the error proves to be extremely helpful in troubleshooting.
The script was designed to produce a “headless” server... meaning we will not be using a GUI to configure Vertcoin or check to see how things are running. In fact, once the server is set up, you will only interact with it using command line calls over SSH.
Why run a headless node on a Single Board Computer?
The idea is to have this full node be simple, low-power, with optimized memory usage and something that “just runs” in your basement, closet, etc.
Required: USB Flash Drive 6GB - 32GB
Please note that the script was designed for Single Board Computers first and looks for an accessible USB Flash Drive to use for storing the blockchain and swap file, as constant writing to a microSD can degrade the health of the microSD.
Supports

Hardware

All of the hardware listed above is hardware that I have personally tested / am testing on myself. The plan is to continue expanding my arsenal of single board computers and continue to add support for more hardware to ensure as much compatibility as possible.
Functionality
It is worth noting that LIT can be run with multiple configurations; the ones displayed in the Post Installation Report reflect values that run LIT with the Vertcoin mainnet. Please be aware that the Vertcoin testnet chain has not been mined 100% of the time in the past, so if you make transactions on the Vertcoin testnet that do not go through, it is likely because the chain has stopped being mined.
BE CAREFUL WITH YOUR COINS, ONLY TEST WITH WHAT YOU ARE OKAY WITH LOSING IF YOU USE THE MAINNET.

Vertcoin Testnet Coins

https://tvtc.blkidx.org/faucet/
I've included some documentation on LIT I created which includes information I found to be useful: https://github.com/e-corp-sam-sepiol/vertnode/blob/master/docs/lit.md
Please visit the mit-dci/lit github repository for the most up to date information on lit: https://github.com/mit-dci/lit

Vertnode | Automated Vertcoin Node Installation Script

https://github.com/e-corp-sam-sepiol/vertnode

Recommended: Use Etcher to install the chosen OS to your microSD card / USB flash drive.

If you intend on installing Ubuntu Server 16.04 to your Intel NUC please use Etcher to install the .iso to your USB flash drive.
https://etcher.io/
PLEASE NOTE THIS SCRIPT MAY GIVE AN ERROR. THIS IS THE NATURE OF TESTING. PLEASE REPORT YOUR ERRORS IF YOU WANT THEM TO BE FIXED/RESOLVED. THANK YOU FOR BETTERING THE DEVELOPMENT OF THIS SCRIPT.

Ubuntu Server 16.04 Setup Details

You can use different clients to ssh into your node. One option on Windows is PuTTY or Git Bash, the latter of which is included in the desktop version of Git. If you are using Linux you can simply open a new terminal window and ssh to the IP address of your node (the hardware you intend to install the Vertcoin node on).
You will need to know the IP address of your node, this can be found on your router page.
ssh 192.168.1.5 -l pi
For example, this command uses ssh to log in to 192.168.1.5 using the -l login name of pi. The IP address of your node will likely be different for you; in this example I am logging into a Raspberry Pi, which has a default login name of pi.
A brief list of commands that can be used to check on the Vertcoin node status:
vertcoin-cli getblockchaininfo | Grab information about your blockchain
vertcoin-cli getblockcount | Grab the current count of blocks on your node
vertcoin-cli getconnectioncount | Grab the current count of connections to your node. A number of connections larger than 8 means that you have incoming connections to your node. The default settings are to make 8 outgoing connections. If you want incoming connections please port forward your Raspberry Pi in your Router settings page.
vertcoin-cli getpeerinfo | Grab the information about the peers you have connected to / are connected to
vertcoin-cli getnettotals | Grab network data, how much downloaded/upload displayed in bytes
tail -f ~/.vertcoin/debug.log | Output the latest lines in the Vertcoin debug.log to see verbose information about the Vertcoin daemon (ctrl+c to stop)
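If you'd rather check several of these at once, below is a small hypothetical Python wrapper that just shells out to the vertcoin-cli commands listed above (it assumes vertcoin-cli is on your PATH and the daemon is running):

```python
#!/usr/bin/env python3
# Hypothetical convenience script: print a one-line status summary of a
# Vertcoin node by shelling out to the vertcoin-cli commands listed above.
import subprocess

def cli(*args: str) -> str:
    """Run vertcoin-cli with the given arguments and return its output."""
    return subprocess.check_output(["vertcoin-cli", *args], text=True).strip()

if __name__ == "__main__":
    blocks = cli("getblockcount")
    peers = int(cli("getconnectioncount"))
    # More than 8 connections implies incoming connections (see above).
    note = " (incoming connections OK)" if peers > 8 else ""
    print(f"Blocks: {blocks} | Peers: {peers}{note}")
```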
Thank you to all who have helped me and inspired me thus far, @b17z, @jamesl22, @vertcoinmarketingteam, @canen, @flakfired, @etang600, @BDF, @tucker178, @Xer0
This work is dedicated to the users of Vertcoin, thank you for making this possible.
7/20/2018 Thank you @CommodoreAmiga for the incredibly generous tip <3
You can reach me @Sam Sepiol#3396 on the Vertcoin Discord, here on reddit or @ [email protected]
submitted by ecorp-sam-sepiol to vertcoin [link] [comments]

Glad to see my intuition being right (on Ethereum)

Soon after Ethereum was announced, on January 24, 2014, I made a comment:
But there might be a problem with resource usage... Let's say I own a lot of bitcoins and I do not want Ethereum to exist.
So I'll run multiple high-performance, clustered nodes and use them to process transactions which will consume as much resources as possible. Soon running Ethereum nodes requires 1 TB of RAM.
People say: "What the fuck? Clearly making scripts Turing-complete was a bad idea". And Ethereum is abandoned as a broken project... (Few people can afford to run full nodes, so it is as good as centralized.)
This attack might costs many millions USD, but if that helps to protect my Bitcoin investment, it makes sense.
Note that this was written before any details on Ethereum were settled, just general thoughts based on Ethereum's idea of running "Turing-complete scripts".
So it looks like this kind of scenario is unfolding now, 2.5 years after I wrote that comment:
  1. September 18, 2016: All geth nodes crash due to an out-of-memory bug. A specially crafted block makes geth, the most popular Ethereum node software, request huge amounts of RAM, and thus crash. According to some reports, 85% of all Ethereum nodes were running Geth at the time. All of them were crashing, and services (and wallets) which relied on them couldn't function.
  2. September 22: "Today the network was attacked by a transaction spam attack that repeatedly called the EXTCODESIZE opcode (see trace sample here), thereby creating blocks that take up to ~20-60 seconds to validate due to the ~50,000 disk fetches needed to process the transaction. The result of this was a ~2-3x reduction in the rate of block creation while the attack was taking place; there was NO consensus failure". Ethereum blocks should normally appear each ~15 seconds, but they take ~20-60 seconds to validate. Thus a normal node just couldn't keep up with blocks. Thankfully, miners got slowed down too, so there was "NO consensus failure" this time.
  3. September 25: "attacker has changed strategy ... Basically, it's now a quadratic memory complexity attack but using CALL instead of EXTCODESIZE. However because the gas limit is only 1.5m, the effect is lower, so geth nodes are just running more slowly and not crashing outright. "
jtoomim shared some details on what it's like to run an Ethereum node:
On my nodes, I'm seeing up to 16 GiB of virtual memory being used. This crashed one of my nodes twice, since it only had 8 GiB of RAM and 2 GiB of swap. I added more swap space, and that seems to have helped the crashing. I also changed the db cache size according to the blog post recommendations, and I'm now making it through the attack blocks in about 5 seconds on that machine. My other server has 16 GiB of RAM and a 4.4 GHz quad-core CPU, and it makes it through the attack blocks in about 2-3 seconds. Both have SSDs and are running Parity 1.3.
With geth, some of these blocks take up to 2 minutes to verify.
So it seems like fairly decent server-class hardware is necessary to keep up with the Ethereum blockchain now, and that's if you run Parity, the heavily optimized Ethereum implementation.
Ethereum devs try to mitigate the issue by recommending miners to increase transaction fees (gas price) and reduce block size (gas limit). This could hurt apps/users, if there were any.
Now, this attack isn't going to kill Ethereum, of course. It's more like a warning. The cost of the attack is estimated to be on the scale of $5000 per day, so it's not some kind of large-scale attempt to kill Ethereum.
I think things could be much worse if an attacker also had an access to significant amounts of mining hashpower: this would have allowed him to mine huge blocks at zero cost.
Also Ethereum node hardware requirements might grow due to demands of legitimate applications.
submitted by killerstorm to Bitcoin [link] [comments]

Console gaming is hardly different from PC gaming, and much of what people say about PC gaming to put it above console gaming is often wrong.

I’m not sure about you, but for the past few years, I’ve been hearing people go on and on about PCs "superiority" to the console market. People cite various reasons why they believe gaming on a PC is “objectively” better than console gaming, often for reasons related to power, costs, ease-of-use, and freedom.
…Only problem: much of what they say is wrong.
There are many misconceptions being thrown about in PC gaming vs console gaming that I believe need to be addressed. This isn’t about “PC gamers being wrong,” or “consoles being the best,” absolutely not. I just want to cut through some of the stuff people use to put down console gaming, and show that console gaming is incredibly similar to PC gaming. I mean, yes, this is coming from someone who mainly games on console, but I'm also getting a new PC that I will game on as well, not to mention the 30 PC games I already own and play. I’m not particularly partial to one over the other.
Now I will mainly be focusing on the PlayStation side of the consoles, because I know it best, but much of what I say will apply to Xbox as well. Just because I don’t point out many specific Xbox examples, doesn’t mean that they aren’t out there.

“PCs can use TVs and monitors.”

This one isn’t so much of a misconception as it is the implication of one, and overall just… confusing. This is in some articles and the pcmasterrace “why choose a PC” section, where they’re practically implying that consoles can’t do this. I mean, yes, as long as the ports of your PC match up with your screen(s) inputs, you could plug a PC into either… but you could do the same with a console, again, as long as the ports match up.
I’m guessing the idea here is that gaming monitors often use Displayport, as do most dedicated GPUs, and consoles are generally restricted to HDMI… But even so, monitors often have HDMI ports. In fact, PC Magazine has just released their list of the best gaming monitors of 2017, and every single one of them has an HDMI port. A PS4 can be plugged into these just as easily as a GTX 1080.
I mean, even if the monitor/TV doesn’t have HDMI or AV to connect with your console, just use an adapter. If you have a PC with ports that don’t match your monitor/TV… use an adapter. I don’t know what the point of this argument is, but it’s made a worrying amount of times.

“On PC, you have a wide range of controller options, but on console you’re stuck with the standard controller."

Are you on PlayStation and wish you could use a specific type of controller that suits your favorite kind of gameplay? Despite what some may believe, you have just as many options as PC.
Want to play fighting games with a classic arcade-style board, featuring the buttons and joystick? Here you go!
Want to get serious about racing and get something more accurate and immersive than a controller? Got you covered.
Absolutely crazy about flying games and, like the racers, want something better than a controller? Enjoy!
Want Wii-style motion controls? Been around since the PS3. If you prefer the form factor of the Xbox One controller but you own a PS4, Hori’s got you covered. And of course, if keyboard and mouse it what keeps you on PC, there’s a PlayStation compatible solution for that. Want to use the keyboard and mouse that you already own? Where there’s a will, there’s a way.
Of course, these aren’t isolated examples, there are plenty of options for each of these kind of controllers. You don’t have to be on PC to enjoy alternate controllers.

“On PC you could use Steam Link to play anywhere in your house and share games with others.”

PS4 Remote play app on PC/Mac, PSTV, and PS Vita.
PS Family Sharing.
Using the same PSN account on multiple PS4s/Xbox Ones and PS3s/360s, or using multiple accounts on the same console.
In fact, if multiple users are on the same PS4, only one has to buy the game for both users to play it on that one PS4. On top of that, only one of them has to have PS Plus for both to play online (if the one with PS Plus registers the PS4 as their main system).
PS4 Share Play; if two people on separate PS4s want to play a game together that only one of them owns, they can join a Party and the owner of the game can have their friend play with them in the game.
Need I say more?

“Gaming is more expensive on console.”

Part one, the Software
This is one that I find… genuinely surprising. There have been a few times I’ve mentioned that part of the reason I chose a PS4 is for budget gaming, only to be told that “games are cheaper on Steam.” To be fair, there are a few games on PSN/XBL that are more expensive than they are on Steam, so I can see how someone could believe this… but apparently they forgot about disks.
Dirt Rally, a hardcore racing sim game that’s… still $60 on all 3 platforms digitally… even though its successor is out.
So does this mean you have to pay full retail for this racing experience? Nope, because disk prices.
Just Cause 3, an insane open-world experience that could essentially be summed up as “break stuff, screw physics.” And it’s a good example of where the Steam price is lower than PSN and XBL:
Not by much, but still cheaper on Steam, so cheaper on PC… Until you look at the disk prices.
See my point? Often times the game is cheaper on console because of the disk alternative that’s available for practically every console-available game. Even when the game is brand new.
Dirt 4 - Remember that Dirt Rally successor I mentioned?
Yes, you could either buy this relatively new game digitally for $60, or just pick up the disk for a discounted price. And again, this is for a game that came out 2 months ago, and even it’s predecessor’s digital cost is locked at $60. Of course, I’m not going to ignore the fact that Dirt 4 is currently (as of writing this) discounted on Steam, but on PSN it also happens to be discounted for about the same amount.
Part 2: the Subscription
Now… let’s not ignore the elephant in the room: PS Plus and Xbox Gold. Now these would be ignorable, if they weren’t required for online play (on the PlayStation side, it’s only required for PS4, but still). So yes, it’s still something that will be included in the cost of your PS4 or Xbox One/360, assuming you play online. Bummer, right?
Here’s the thing, although that’s the case, although you have to factor in this $60 cost with your console, you can make it balance out, at worst, and make it work out for you as a budget gamer, at best. As nice as it would be to not have to deal with the price if you don’t want to, it’s not like it’s a problem if you use it correctly.
Imagine going to a new restaurant. This restaurant has some meals that you can’t get anywhere else, and fair prices compared to competitors. Only problem: you have to pay a membership fee to have the sides. Now you can have the main course, sit down and enjoy your steak or pasta, but if you want to have a side to have a full meal, you have to pay an annual fee.
Sounds shitty, right? But here’s the thing: not only does this membership allow you to have sides with your meal, but it also allows you to eat two meals for free every month, and also gives you exclusive discounts for other meals, drinks, and desserts.
Let’s look at PS Plus for a minute: for $60 per year, you get:
  • 2 free PS4 games, every month
  • 2 free PS3 games, every month
  • 1 PS4/PS3 and Vita compatible game, and 1 Vita-only game, every month
  • Exclusive/Extended discounts, especially during the weekly/seasonal sales (though you don’t need PS Plus to get sales, PS Plus members get to enjoy the best sales)
  • access to online multiplayer
So yes, you’re paying extra because of that membership, but what you get with that deal pays for it and then some. In fact, let’s ignore the discounts for a minute: you get 24 free PS4 games, 24 free PS3 games, and 12 Vita only + 12 Vita compatible games, up to 72 free games every year. Even if you only one of these consoles, that’s still 24 free games a year. Sure, maybe you get games for the month that you don’t like, then just wait until next month.
In fact, let’s look at Just Cause 3 again. It was free for PS Plus members in August, which is a pretty big deal. Why is this significant? Because it’s, again, a $60 digital game. That means with this one download, you’ve balanced out your $60 annual fee. Meaning? Every free game after that is money saved, every discount after that is money saved. And this is a trend: every year, PS Plus will release a game that balances out the entire service cost, then another 23 more that will only add icing to that budget cake. Though, you could just count games as paying off PS Plus until you hit $60 in savings, but still.
All in all, PS Plus, and Xbox Gold which offers similar options, saves you money. On top of that, again, you don't need to have these to get discounts, but with these memberships, you get more discounts.
Now, I’ve seen a few Steam games go up for free for a week, but what about being free for an entire month? Not to mention that; even if you want to talk about Steam Summer Sales, what about the PSN summer sale, or again, disc sale discounts? Now a lot of research and math would be needed to see if every console gamer would save money compared to every Steam gamer for the same games, but at the very least? The costs will balance out, at worst.
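To put rough numbers on the membership math above, here's a back-of-the-envelope sketch; the $60 fee and game counts are from the post, while the per-game figures are my assumptions that you should tune to your own habits:

```python
# Back-of-the-envelope check on the PS Plus value argument. The $60/yr fee
# is from the post; games_used and avg_value below are assumptions.
PLUS_FEE = 60        # annual PS Plus cost, USD (from the post)
games_used = 4       # assumption: free monthly games per year you'd actually have bought
avg_value = 25       # assumption: average price you'd have paid per game, USD

net = games_used * avg_value - PLUS_FEE
print(f"Net yearly value of the membership: ${net}")  # positive => it pays for itself
```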
Part 3, the Systems
  • Xbox and PS2: $299
  • Xbox 360 and PS3: $299 and $499, respectively
  • Xbox One and PS4: $499 and $399, respectively.
Rounded up a few dollars, that’s $1,000 - $1,300 in day-one consoles, just to keep up with the games! Crazy right? So called budget systems, such a rip-off.
Well, keep in mind that the generations here aren’t short.
The 6th generation, from the launch of the PS2 to the launch of the next generation consoles, lasted 5 years, or 6 years based on the launch of the PS3 (though you could say it was 9 or 14, since the Xbox wasn’t discontinued until 2009, and the PS2 was supported all the way to 2014, a year after the PS4 was released). The 7th gen lasted 7 - 8 years, again depending on whether you count the launch of the Xbox 360 or the PS3. The 8th gen so far has lasted 4 years. That’s 17 years that the console money is spread over. If you had a Netflix subscription at its original $8 monthly plan for that amount of time, it would total over $1,600.
And let’s be fair here, just like you could upgrade your PC hardware whenever you wanted, you didn’t have to get a console from launch. Let’s look at PlayStation again for example: In 2002, only two years after its release, the PS2 retail price was cut from $300 to $200. The PS3 Slim, released 3 years after the original, was $300, $100-$200 lower than the retail cost. The PS4? You could’ve either gotten the Uncharted bundle for $350, or one of the PS4 Slim bundles for $250. This all brings it down to $750 - $850, which again, is spread over a decade and a half. This isn’t even counting used consoles, sales, or the further price cuts that I didn’t mention.
Even if that still sounds like a lot of money to you, even if you’re laughing at the thought of buying new systems every several years, because your PC “is never obsolete,” tell me: how many parts have you changed out in your PC over the years? How many GPUs have you been through? CPUs? Motherboards? RAM sticks, monitors, keyboards, mice, CPU coolers, hard drives— that adds up. You don’t need to replace your entire system to spend a lot of money on hardware.
Even if you weren’t upgrading for the sake of upgrading, I’d be amazed if hardware you’ve been pushing with gaming lasted even a third of that 17-year period. Computer parts aren’t designed to last forever, and really won’t when you’re pushing them with intensive gaming for hours upon hours. Generally speaking, your components might last you 6-8 years, if you’ve got the high-end stuff. But let’s assume you bought a system 17 years ago that was a beast for its time, something so powerful that even if its parts have degraded over time, it’s still going strong. Problem is: you will have to upgrade something eventually.
Even if you’ve managed to get this far into the gaming realm with the same 17 year old hardware, I’m betting you didn’t do it with a 17 year Operating System. How much did Windows 7 cost you? Or 8.1? Or 10? Oh, and don’t think you can skirt the cost by getting a pre-built system, the cost of Windows is embedded into the cost of the machine (why else would Microsoft allow their OS to go on so many machines).
Sure, Windows 10 was a free upgrade for a year, but that’s only half of its lifetime: you can’t get it for free now, and haven’t been able to for the past year. On top of that, the free period was an upgrade; you had to pay for 7 or 8 first anyway.
Point is, as much as one would like to say that they didn’t need to buy a new system every so often for the sake of gaming, that doesn’t mean they haven’t been paying for hardware, and even if they’ve only been PC gaming recently, you’ll be spending money on hardware soon enough.

“PC is leading the VR—“

Let me stop you right there.
If you add together the total number of Oculus Rifts and HTC Vives sold to this day, and threw in another 100,000 just for the sake of it, that number would still be under the number of PSVR headsets sold.
Why could this possibly be? Well, for a simple reason: affordability. The systems needed to run the PC headsets cost $800+, and the headsets are $500 - $600, when discounted. PSVR on the other hand costs $450 for the full bundle (headset, camera, and Move controllers, with a demo disc thrown in), and can be played on either a $250 - $300 console, or a $400 console, the latter recommended. Even if you want to say that the Vive and Rift are more refined, a full PSVR set, system and all, could cost just over $100 more than a Vive headset alone.
If anything, PC isn’t leading the VR gaming market, the PS4 is. It’s the system bringing VR to the most consumers, showing them what the future of gaming could look like. Not to mention that as the PlayStation line grows more powerful (4.2 TFLOP PS4 Pro, 10 TFLOP “PS5…”), it won’t be long until the PlayStation line can use the same VR games as PC.
Either way, this shows that there is a console equivalent to the PC VR options. Sure, there are some games you'd only be able to play on PC, but there are also some games you'd only be able to play on PSVR.
…Though to be fair, if we’re talking about VR in general, these headsets don’t even hold a candle to, surprisingly, Gear VR.

“If it wasn’t for consoles holding devs back, then they would be able to make higher quality games.”

This one is based on the idea that because of how “low spec” consoles are, that when a developer has to take them in mind, then they can’t design the game to be nearly as good as it would be otherwise. I mean, have you ever seen the minimum specs for games on Steam?
GTA V
  • CPU: Intel Core 2 Quad CPU Q6600 @ 2.40GHz (4 CPUs) / AMD Phenom 9850 Quad-Core Processor (4 CPUs) @ 2.5GHz
  • Memory: 4 GB RAM
  • GPU: NVIDIA 9800 GT 1GB / AMD HD 4870 1GB (DX 10, 10.1, 11)
Just Cause 3
  • CPU: Intel Core i5-2500k, 3.3GHz / AMD Phenom II X6 1075T 3GHz
  • Memory: 8 GB RAM
  • GPU: NVIDIA GeForce GTX 670 (2GB) / AMD Radeon HD 7870 (2GB)
Fallout 4
  • CPU: Intel Core i5-2300 2.8 GHz/AMD Phenom II X4 945 3.0 GHz or equivalent
  • Memory: 8 GB RAM
  • GPU: NVIDIA GTX 550 Ti 2GB/AMD Radeon HD 7870 2GB or equivalent
Overwatch
  • CPU: Intel Core i3 or AMD Phenom™ X3 8650
  • Memory: 4 GB RAM
  • GPU: NVIDIA® GeForce® GTX 460, ATI Radeon™ HD 4850, or Intel® HD Graphics 4400
Witcher 3
  • Processor: Intel CPU Core i5-2500K 3.3GHz / AMD CPU Phenom II X4 940
  • Memory: 6 GB RAM
  • Graphics: Nvidia GPU GeForce GTX 660 / AMD GPU Radeon HD 7870
Actually, bump up all the memory requirements to 8 GBs, and those are some decent specs, relatively speaking. And keep in mind these are the minimum specs to even open the games. It’s almost as if the devs didn’t worry about console specs when making a PC version of the game, because this version of the game isn’t on console. Or maybe even that the consoles aren’t holding the games back that much because they’re not that weak. Just a hypothesis.
But I mean, the devs are still ooobviously having to take weak consoles into mind right? They could make their games sooo much more powerful if they were PC only, right? Right?
No. Not even close.
iRacing
  • CPU: Intel Core i3, i5, i7 or better or AMD Bulldozer or better
  • Memory: 8 GB RAM
  • GPU: NVidia GeForce 2xx series or better, 1GB+ dedicated video memory / AMD 5xxx series or better, 1GB+ dedicated video memory
Playerunknown’s Battlegrounds
  • CPU: Intel Core i3-4340 / AMD FX-6300
  • Memory: 6 GB RAM
  • GPU: nVidia GeForce GTX 660 2GB / AMD Radeon HD 7850 2GB
These are PC-only games. That’s right, no consoles to hold them back; they don’t have to worry about whether an Xbox One could handle it. Yet they don’t require anything more than the multiplatform games.
Subnautica
  • CPU: Intel Haswell 2 cores / 4 threads @ 2.5Ghz or equivalent
  • Memory: 4GB
  • GPU: Intel HD 4600 or equivalent - This includes most GPUs scoring greater than 950pts in the 3DMark Fire Strike benchmark
Rust
  • CPU: 2 ghz
  • Memory: 8 GB RAM
  • DirectX: Version 11 (they don’t even list a GPU)
So what’s the deal? Theoretically, if developers don’t have to worry about console specs, then why aren’t they going all-out and making games that no console could even dream of supporting?
Low-end PCs.
What, did you think people only game on Steam if they spent at least $500 on gaming hardware? Not all PC gamers have gaming-PC specs, and if devs close their games out to players who don’t have the strongest of PCs, then they’d be losing out on a pretty sizable chunk of their potential buyers.
Saying “devs having to deal with consoles is holding gaming back” is like saying “racing teams having to deal with Ford is holding GT racing back.” A: racing teams don’t have to deal with Ford if they don’t want to, which is probably why many of them don’t, and B: even though Ford doesn’t make the fastest cars overall, they still manage to make cars that are awesome on their own, they don’t even need to be compared to anything else to know that they make good cars.
I want to go back to that previous point though, developers having to deal with low-end PCs, because it’s integral to the next point:

“PCs are more powerful, gaming on PC provides a better experience.”

This one isn’t so much of a misconception as it is… misleading.
Did you know that according to the Steam Hardware & Software Survey (July 2017), the percentage of Steam gamers who use a GPU less powerful than a PS4 Slim’s is well over 50%? Things get dismal when compared to the PS4 Pro (or Xbox One X). On top of that, the percentage of PC gamers who own an Nvidia 10-series card is about 20% (about 15% for the 1060, 1080 and 1070 owners).
Now to be fair, the large majority of gamers have CPUs with considerably high clock speeds, which is the main factor in CPU gaming performance. But, the number of Steam gamers with as much RAM or more than a PS4 or Xbox One is less than 50%, which can really bottleneck what those CPUs can handle.
These numbers are hardly better than they were in 2013, all things considered. Sure, a PS3/360 weeps in the face of even a $400 PC, but in this day in age, consoles have definitely caught up.
Sure, we could mention the fact that even 1% of Steam accounts represents over 1 million accounts, but that doesn’t really matter compared to the 10s of millions of 8th gen consoles sold; looking at it that way, sure the number of Nvidia 10 series owners is over 20 million, but that ignores the fact that there are over 5 times more 8th gen consoles sold than that.
Basically, even though PCs run on a spectrum, saying they're more powerful “on average” is actually wrong. Sure, they have the potential for being more powerful, but most of the time, people aren’t willing to pay the premium to reach those extra bits of performance.
Now why is this important? What matters are the people who spent the premium cost for premium parts, right? Because of the previous point: PCs don’t have some ubiquitous quality over the consoles, developers will always have to keep low-end PCs in mind, because not even half of all PC players can afford the good stuff, and you have to look at the top quarter of Steam players before you get to PS4-Pro-level specs. If every Steam player were to get a PS4 Pro, it would be an upgrade for over 60% of them, and 70% of them would be getting an upgrade with the Xbox One X.
Sure, you could still make the argument that when you pay more for PC parts, you get a better experience than you could with a console. We can argue all day about budget PCs, but a console can’t match up to a $1,000 PC build. It’s the same as paying more for car parts, in the end you get a better car. However, there is a certain problem with that…

“You pay a little more for a PC, you get much more quality.”

The idea here is that the more you pay for PC parts, the performance increases at a faster rate than the price does. Problem: that’s not how technology works. Paying twice as much doesn’t get you twice the quality the majority of the time.
For example, let’s look at graphics cards, specifically the GeForce 10 series cards, starting with the GTX 1050.
  • 1.8 TFLOP
  • 1.35 GHz base clock
  • 2 GB VRAM
  • $110
This is our reference, our basis of comparison. Any percentages will be based on the 1050’s specs.
Now let’s look at the GTX 1050 Ti, the 1050’s older brother.
  • 2.1 TFLOP
  • 1.29 GHz base clock
  • 4 GB VRAM
  • $140 retail
This is pretty good. You only increase the price by about 27%, and you get roughly a 17% increase in floating point speed and a 100% increase (double) in VRAM. Sure you get a slightly lower base clock, but the rest definitely makes up for it. In fact, according to GPU Boss, the Ti managed 66 fps, a 22% increase in frame rate for Battlefield 4, and a 54% increase in mHash/second in bitcoin mining. The cost increase is worth it, for the most part.
But let’s get to the real meat of it; what happens when we double our budget? Surely we should see a massive increase in performance; I bet some of you are expecting that twice the cost means more than twice the performance.
The closest price comparison for double the cost is the GTX 1060 (3 GB), so let’s get a look at that.
  • 3.0 TFLOP
  • 1.5 GHz base clock
  • 3 GB VRAM
  • $200 retail
Well… not substantial, I’d say. About a 67% increase in floating point speed, an 11% increase in base clock speed, and a 1GB decrease in VRAM. For [almost] doubling the price, you don’t get much.
Well surely raw specs don’t tell the full story, right? Well, let’s look at some real-world comparisons. Once again, according to GPU Boss, there’s a 138% increase in hashes/second for bitcoin mining, and at 99 fps, an 83% frame rate increase in Battlefield 4. Well then, raw specs do not tell the whole story!
Here’s another one, the 1060’s big brother… or, well, slightly-more-developed twin.
  • 3.9 TFLOP
  • 1.5 GHz base clock
  • 6 GB VRAM
  • $250 retail
Seems reasonable, another $50 for a decent jump in power and double the memory! But, as we’ve learned, we shouldn’t look at the specs for the full story.
I did do a GPU Boss comparison, but for the BF4 frame rate, I had to look at Tom’s Hardware (sorry miners, GPU boss didn’t cover the mHash/sec spec either). What’s the verdict? Well, pretty good, I’d say. With 97 FPS, a 79% increase over the 1050— wait. 97? That seems too low… I mean, the 3GB version got 99.
Well, let’s see what Tech Power Up has to say...
94.3 fps. 74% increase. Huh.
Alright alright, maybe that was just a dud. We can gloss over that I guess. Ok, one more, but let’s go for the big fish: the GTX 1080.
  • 9.0 TFLOP
  • 1.6 GHz base clock
  • 8 GB VRAM
  • $500 retail
That jump in floating point speed definitely has to be something, and 4 times the VRAM? Sure it’s 5 times the price, but as we saw, raw power doesn’t always tell the full story. GPU Boss returns to give us the run down, how do these cards compare in the real world?
Well… a 222% (over three-fold) increase in mHash speed, and a 218% increase in FPS for Battlefield 4. That’s right, for 5 times the cost, you get 3 times the performance. Truly, the raw specs don’t tell the full story.
You increase the cost by 27%, you increase frame rate in our example game by 22%. You increase the cost by 83%, you increase the frame rate by 83%. Sounds good, but if you increase the cost by 129%, and you get a 79% (-50% cost/power increase) increase in frame rate. You increase it by 358%, and you increase the frame rate by 218% (-140% cost/power increase). That’s not paying “more for much more power,” that’s a steep drop-off after the third cheapest option.
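Here's that comparison as a small Python script you can rerun with your own prices. The prices and Battlefield 4 frame rates are the ones quoted above; the 1050's ~54 fps baseline and the 1080's ~172 fps are back-computed from the quoted percentage increases, so expect a point or two of rounding drift:

```python
# Cost vs. frame-rate scaling for the cards discussed above, relative to
# the GTX 1050. Baseline fps and the 1080 fps are back-computed from the
# percentage increases quoted in the post.
cards = {                    # name: (price USD, Battlefield 4 fps)
    "GTX 1050":     (110, 54.0),
    "GTX 1050 Ti":  (140, 66.0),
    "GTX 1060 3GB": (200, 99.0),
    "GTX 1060 6GB": (250, 94.3),
    "GTX 1080":     (500, 172.0),
}
base_price, base_fps = cards["GTX 1050"]
for name, (price, fps) in cards.items():
    cost_up = (price / base_price - 1) * 100
    perf_up = (fps / base_fps - 1) * 100
    print(f"{name:>12}: +{cost_up:4.0f}% cost, +{perf_up:4.0f}% fps")
```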
In fact, did you know that you have to get to the 1060 (6GB) before you can compare the GTX line to a PS4 Pro? Not to mention that at $250, the price of a 1060 (6GB), you could get an entire PS4 Slim bundle, or that you have to get to the 1070 before you beat the Xbox One X.
On another note, let’s look at a PS4 Slim…
  • 1.84 TFLOP
  • 800 MHz base clock
  • 8 GB VRAM
  • $300 retail
…Versus a PS4 Pro.
  • 4.2 TFLOP
  • 911 MHz base clock
  • 8 GB VRAM
  • $400 retail
A 128% increase in floating point speed and a 13% increase in clock speed, for a 33% increase in cost. Unfortunately there is no Battlefield 4 comparison to make, but in BF1, the frame rate is doubled (30 fps to 60) and the textures are taken to 11. For what that looks like, I’ll leave it up to this bloke. Not to even mention that you can get the texture buffs in 4K. Just like how you get a decent increase in performance based on price for the lower-cost GPUs, the same applies here.
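And the same ratio math for the two PS4s, reusing the idea from the GPU sketch (figures from the post):

```python
# PS4 Slim vs. PS4 Pro, using the figures listed above.
slim = {"TFLOPs": 1.84, "clock MHz": 800, "price USD": 300}
pro  = {"TFLOPs": 4.2,  "clock MHz": 911, "price USD": 400}
for key in slim:
    print(f"{key}: +{(pro[key] / slim[key] - 1) * 100:.0f}%")
# TFLOPs: +128%, clock MHz: +14%, price USD: +33%
```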
It’s even worse when you look at the CPU for a gaming PC. The more money you spend, again, the less of a benefit you get per dollar. Hardware Unboxed covers this in a video comparing different levels of Intel CPUs. One thing to note is that the highest i7 option (6700K) in this video was almost always within 10 FPS (though for a few games, 15 FPS) of a certain CPU in that list for just about all of the games.
…That CPU was the lowest i3 (6100) option. The lowest i3 was $117 and the highest i7 was $339, a 189% price difference for what was, on average, a 30% or less difference in frame rate. Even the lowest Pentium option (G4400, $63) was often able to keep up with the i7.
The CPU and GPU are usually the most expensive and power-consuming parts of a build, which is why I focused on them (other than the fact that they’re the two most important parts of a gaming PC, outside of RAM). With both, this “pay more to get much more performance” idea is pretty much the inverse of the truth.

“The console giants are bad for game developers, Steam doesn't treat developers as bad as Microsoft or especially Sony.”

Now one thing you might’ve heard is that the PS3 was incredibly difficult for developers to make games for, which for some fueled the idea that console hardware is difficult to develop on compared to PC… but this ignores a very basic idea that we’ve already touched on: if the devs don’t want to make the game compatible with a system, they don’t have to. In fact, this is why Left 4 Dead and other Valve games aren’t on PS3: they didn’t want to work with its hardware, calling it “too complex.” This didn’t stop the game from selling well over 10 million units worldwide. If anything, this was a problem for the PS3, not the dev team.
This also ignores that games like LittleBigPlanet, Grand Theft Auto IV, and Metal Gear Solid 4 all came out in the same year as Left 4 Dead (2008) on PS3. Apparently, plenty of other dev teams didn’t have much of a problem with the PS3’s hardware, or at the very least, they got used to it soon enough.
On top of that, when developing the 8th gen consoles, both Sony and Microsoft sought to use CPUs that were easier for developers, which included making decisions that considered apps for the consoles’ usage beyond gaming. Also, using their single-chip proprietary CPUs is cheaper and more energy efficient than buying pre-made CPUs and boards, which is a far better reason for using them than some conspiracy about Sony and MS trying to make devs' lives harder.
Now, console exclusives are apparently a point of contention: it’s often said that exclusive can cause developers to go bankrupt. However, exclusivity doesn’t have to be a bad thing for the developer. For example, when Media Molecule had to pitch their game to a publisher (Sony, coincidentally), they didn’t end up being tied into something detrimental to them.
Their initial funding lasted for 6 months. From then, Sony offered additional funding, in exchange for Console Exclusivity. This may sound concerning to some, but the game ended up going on to sell almost 6 million units worldwide and launched Media Molecule into the gaming limelight. Sony later bought the development studio, but 1: this was in 2010, two years after LittleBigPlanet’s release, and 2: Media Molecule seem pretty happy about it to this day. If anything, signing up with Sony was one of the best things they could’ve done, in their opinion.
Does this sound like a company that has it out for developers? There are plenty of examples that people will use to put Valve in a good light, but even Sony is comparatively good to developers.

“There are more PC gamers.”

The total number of active PC gamers on Steam has surpassed 120 million, which is impressive, especially considering that this number is double that of 2013’s figure (65 million). But the number of monthly active users on Xbox Live and PSN? About 120 million (1, 2) total. EDIT: You could argue that this isn't an apples-to-apples comparison, sure, so if you want to, say, compare the monthly number of Steam users to console? Steam has about half of what consoles do, at 67 million.
Now, back to the 65 million total user figure for Steam, the best I could find for reference for PlayStation's number was an article giving the number of registered PSN accounts in 2013, 150 million. In a similar 4-year period (2009 - 2013), the number of registered PSN accounts didn’t double, it sextupled, or increased by 6 fold. Considering how the PS4 is already at 2/3 of the number of sales the PS3 had, even though it’s currently 3 years younger than its predecessor, I’m sure this trend is at least generally consistent.
For example, let’s look at DOOM 2016, an awesome fast-paced shooting title with graphics galore… Of course, on a single platform, it sold best on PC/Steam: 2.36 million Steam sales, 2.05 million PS4 sales, 1.01 million Xbox One sales.
But keep in mind… when you add the console sales together, you get over 3 million sales on the 8th gen systems. Meaning: this game sold best on console. In fact, the Steam sales have only recently surpassed the PS4 sales. By the way, VGChartz only shows sales for physical copies of the games, so the number of PS4 and Xbox sales, when digital sales are included, is even higher than 3 million.
This isn’t uncommon, by the way.
Even with the games where the PC sales are higher than either of the consoles, there generally are more console sales in total. But, to be fair, this isn’t anything new. The number of PC gamers hasn’t dominated the market; the percentages have always been about this much. PC can end up being the largest single platform for a game, but consoles usually sell more copies in total.
EDIT: There were other examples but... Reddit has a 40,000-character limit.

"Modding is only on PC."

Xbox One is already working on it, and Bethesda is helping with that.
PS4 isn't far behind either. You could argue that these are what would be the beta stages of modding, but that just means modding on consoles will only grow.

What’s the Point?

This isn’t to say that there’s anything wrong with PC gaming, and this isn’t to exalt consoles. I’m not here to be the hipster defending the little guy, nor to be the one to try to put down someone/thing out of spite. This is about showing that PCs and consoles are overall pretty similar, because there isn’t much dividing them, and that there isn’t anything wrong with being a console gamer. There isn’t some chasm separating consoles and PCs; at the end of the day they’re both computers that are (generally) designed for gaming. This is about unity as gamers, to try to show that there shouldn’t be a massive divide just because of the computer system you game on. I want gamers to be in an environment where specs don't separate us; whether you got a $250 PS4 Slim or just built a $2,500 gaming PC, we’re here to game and should be able to have healthy interactions regardless of your platform.
I’m well aware that this isn’t going to fix… much, but this needs to be said: there isn’t a huge divide between the PC and consoles, they’re far more similar than people think. There are upsides and downsides that one has that the other doesn’t on both sides. There’s so much more I could touch on, like how you could use SSDs or 3.5 inch hard drives with both, or that even though PC part prices go down over time, so do consoles, but I just wanted to touch on the main points people try to use to needlessly separate the two kinds of systems (looking at you PCMR) and correct them, to get the point across.
I thank anyone who takes the time to read all of this, and especially anyone who doesn’t take what I say out of context. I also want to note that, again, this isn’t “anti-PC gamer.” If it were up to me, everyone would be a hybrid gamer.
Cheers.
submitted by WhyyyCantWeBeFriends to unpopularopinion [link] [comments]

[USA-CA] [H] Nearly-complete PC build (i5 CPU, 16GB RAM, MSI motherboard, Fractal Design case, Asus ROG Strix RX480 OC GPU) [W] Local cash, trade for Oculus Rift Touch

After some recent misadventures (I started out trying to upgrade this PC, but basically ended up building a new one by accident), I'd like to find a new home for my previous build.
As of right now, this rig is missing only a power supply and SSD or HDD (I migrated both items into my new rig), but is otherwise fine. This machine was built and used over the past couple years for light gaming and home-lab work, and the GPU was purchased new by me from Amazon and wasn't ever used for bitcoin mining or anything like that. I've found that it handles most modern games quite well at 1080p. Here are the specs of what I'm including:
Also, here's an album of timestamps.
I'm asking $350 (open to reasonable offers) local cash in the San Francisco Bay Area. I live in the East Bay and work in the South Bay, so I'm happy to meet up somewhere mutually convenient. I'm also open to trades for an Oculus Rift Touch setup, if anyone has one they're looking to trade. Let me know!
submitted by wowbobwow to hardwareswap [link] [comments]

Graphics Card Upgrade Suggestions

I've been holding off on upgrading my 670 due to the bitcoin mining price gouging; however, I think it's about that time to upgrade. I'm not interested in the 1080ti or 2080ti due to the crazy high prices. Even the 2080 seems a bit pricey at the moment. My question is, should I go with the 2070 or look to the 1070/1070ti series? I should note that I am also looking into purchasing Battlefield V, so those free downloads with the 2070 purchase are tempting.

My current PC specs are below.

LOOKING TO BUY:
2070

APPROXIMATE PURCHASE DATE:
ASAP

BUDGET RANGE:
Prefer to stay under $550

USAGE FROM MOST TO LEAST IMPORTANT:
Gaming; other tasks are minimal

CURRENT GPU AND POWER SUPPLY:
Gigabyte GeForce GTX 670 2GB
PC Power & Cooling 750W ATX12V / EPS12V

CURRENT MONITOR:
Acer Predator XB271HU 27" Monitor (1440p)

OTHER RELEVANT SYSTEM SPECS:
Intel Core i5-3570K 3.4GHz Quad-Core Processor
Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler
ASRock Z77 Extreme4 ATX LGA1155 Motherboard
G.SKILL TridentX Series 16GB (2 x 8GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800)
Seagate Barracuda 1TB 3.5" 7200RPM Internal Hard Drive
Cooler Master HAF 922 ATX Mid Tower Case

PREFERRED WEBSITE(S) FOR PARTS:
Newegg
Amazon

COUNTRY OF ORIGIN:
USA

PARTS PREFERENCES:
Nvidia

OVERCLOCKING/SLI OR CROSSFIRE:
No

MONITOR RESOLUTION:
1440p

ADDITIONAL COMMENTS:
There's an added incentive for me that some of the card listings come with Battlefield V (Newegg).
submitted by Kevin9809 to buildapc [link] [comments]

Mega FAQ (Or: Please come here for your questions first)

Qbundle Guide (Step by step setup & Bootstrap) https://burstwiki.org/wiki/QBundle
1( I want to mine or activate my account. Where do I find the initial coins?
You only need 1; an outgoing transaction or reward reassignment will set the public key. Get them from:
https://www.reddit.com/burstcoinmining/comments/7q8zve/initial_burstcoin_requests/
Or (Faucet list)
https://faucet.burstpay.net/ (if this is empty, come back later)
http://faucet.burst-coin.es
Or
https://forums.getburst.net/c/new-members-introductions/getting-started-initial-burstcoin-requests
2( I bought coins on Bittrex and want to move to my new wallet, but can't. Why?
Bittrex will only send to accounts with a public key (not a Burst requirement) so see number 1 and either set the name on the account (IF you will not mine) or set the reward recipient to the pool. Either action will enable the account and allow for transfers from Bittrex.
3( I sent coins from Poloniex/anywhere to Bittrex and they don’t show up after a considerable time. Why?
You need to set an unencrypted message on the transaction, informing Bittrex which account to send the funds to (this is in the directions on Bittrex). Did you do this? Contact Bittrex support with all the details and eventually you will get your funds.
4( How much can I make on Burst?
https://explore.burst.cryptoguru.org/tool/calculate
Gives you an average over time assuming a few things like: Average luck/100% uptime/no overlapping/fees on pool/good plot scan time (<20 seconds) if you do not have all of these, you may not see that number.
5( If I use SSDs, would I make more money?
No, it’s 95% capacity and 5% scan time that determine success. More plot area = better deadlines = better chance of forging a block, or better rates from a pool.
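For intuition on why capacity is king, here's a toy expected-value model. This is an illustration, not the linked calculator's exact formula; the network capacity and average reward below are made-up round numbers:

```python
# Toy model: your expected share of blocks is roughly your plot capacity
# divided by total network capacity. Network size and average block
# reward are made-up round numbers; use the linked calculator for real numbers.
my_tb      = 40        # your total plot size, TB
network_tb = 200_000   # assumed total network capacity, TB
blocks_day = 360       # Burst targets a 4-minute block time => ~360 blocks/day
avg_reward = 800       # assumed average block reward, BURST

burst_per_day = (my_tb / network_tb) * blocks_day * avg_reward
print(f"~{burst_per_day:.1f} BURST/day before pool fees")  # ~57.6
```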
6( What is ‘solo’ and ‘pool’ (wasn’t his name Chewbacca?)
Solo is where you attempt to ‘forge’ (mine) a block by yourself; you get 100% of the block reward and fees. But you only receive funds if you forge, no burst for coming in second place.
Pools allow a group of miners to ‘pool’ together their resources, and when a miner wins, they give the pool the winnings (this is done via the reward assignment you completed earlier). The winnings are then divided according to the pool's percentages and methods, and Burst is sent out according to pool rules (minimum pay-out, time, etc.)
7( I have been mining for 2 days and my wallet doesn’t show any Burst. Why?
Mining solo: it is win-or-lose, nothing in between, and winning is luck and plot size. Pool mining: because it costs 1 Burst to send Burst, the pools have either a time requirement (every X days) or a minimum amount (100+ Burst), so you need to research your pool. Some pools (CryptoGuru and similar) allow you to set the limit to be met before sending.
8( How do I see what I have pending?
On CryptoGuru-based pools, it’s the ‘Pending (burst)’ column; on other pools, look for the numbers next to your Burst ID. One is paid and the other pending.
9( I’m part of a pool and I forged a block, but I didn’t recieve the total value of the block, why?
A pool has 2 basic numbers that denote the pay-out method, in the format ‘XX-XX’ (e.g. 50-50). The first number is the % paid to the block forger (miner) and the second is the retained value, which is paid to historic ‘shares’ (or, past blocks that the pool didn’t win, but had a miner that was ‘close’ to winning with a good submitted deadline).
Examples of pools:
0-100 (good for <40TB)
20-80 (30-80TB)
50-50 (60-200TB)
80-20 (150-250)
100-0 (solo mine, 150+ TB)
Please note that there is an overlap as this is personal preference and just guidance; a higher historical share value means a smoother pay-out regime, which some people prefer. If fees are not factored in, or are the same on different pools, the pay-out value will be the same over a long enough period.
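For the avoidance of doubt, here's a minimal sketch of how such an ‘XX-XX’ split divides a single won block (the reward figure is illustrative):

```python
# How a pool's "XX-XX" setting splits a won block between the forger and
# the historic shares. The 1000 BURST reward is illustrative.
def split_block(reward: float, forger_pct: int) -> tuple[float, float]:
    forger_cut = reward * forger_pct / 100
    historic_cut = reward - forger_cut   # paid against past good deadlines
    return forger_cut, historic_cut

print(split_block(1000, 50))   # 50-50 pool -> (500.0, 500.0)
print(split_block(1000, 20))   # 20-80 pool -> (200.0, 800.0)
```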
10( Is XXX model of hard drive good? Which one do you recommend?
CHEAP is best. If you have 2 new hard drives, both covered by warranty, get the one with the lowest cost per TB (expressed as $/TB, calculated by dividing the cost by the number of terabytes), because plot size is KING.
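The same $/TB rule in code form (a trivial sketch; the drive prices are invented examples):

```python
# Pick the drive with the lowest cost per terabyte.
# Prices below are invented example numbers, not quotes.
drives = {"8 TB drive A": (160, 8), "10 TB drive B": (230, 10)}
for name, (price, tb) in drives.items():
    print(f"{name}: ${price / tb:.2f}/TB")
best = min(drives, key=lambda d: drives[d][0] / drives[d][1])
print("Cheapest per TB:", best)
```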
11( How many drives can I have on my machine?
For best performance, you can have up to 2 drives per thread (3 on a new, fast AVX2 CPU). So that quad-core Core 2 Quad can have up to 8 drives, but a more modern i7 with 4 cores + hyper-threading can squeeze 8 * 3, or 24 drives. (Performance while scanning will suffer.)
12( Can I game while I mine?
Some people have done so, but you cannot have the ‘maximum’ number of drives and play games generally.
13( Can I mine Burst and GPU mine other coins?
Yes, if you CPU Mine Burst.
14( I’m GPU plotting Burst and GPU mining another coin, my plots are being corrupted, why?
My advice is dedicating a GPU to either mining or plotting, don’t try to do both.
15( What is a ‘plot’?
A plot is a file that contains hashes; these hashes are used to mine Burst. A plot is tied to an account, but plots can be created (with the same account ID) on other machines and connected back to your miner(s).
16( Where can I trade/buy/sell Burst?
A list of exchanges is maintained on https://www.reddit.com/burstcoin/ (on the right, ‘Exchanges’ tab) the biggest at the moment are Bittrex and Poloniex, some offer direct Fiat-to-Burst purchase (https://indacoin.com for example)
17( Do I have to store my Burst off the exchange?
No, but it’s safer from hackers who target exchanges, if you cannot guarantee the safety or security of your home computer from Trojans etc, then it might be best to leave on an exchange (but enable 2FA security on your account PLEASE!)
18( What security measures can I take to keep my coin safe?
When you create an account, sign out and back in to your wallet (to make sure you have copied the passphrase correctly) and keep multiple copies of the passphrase (at least one physically printed or written down and kept in a safe place; better in 2 places). Do not disclose the passphrase to anyone. Finally, use either a local wallet or a trusted web wallet (please research before using any web wallet).
19( How can I help Burst?
Run a wallet, which will act as a node (or, if you're a programmer, contact the Dev team). Bring attention to Burst (without ‘shilling’ or trying to get people to buy). And help translate into your local language.
Be a productive member of the community and contribute experience and knowledge if you can, or help others get into Burst.
20( Will I get coins on the fork(s) and where will they be?
There will be no new coin, and no new coins to be given away/air-dropped, etc.; the forks are upgrades to Burst and there will not be a ‘classic’ or ‘new’ Burst.
21( Will I need to move my Burst off of the exchange for the fork?
No, your transactions are on the block chain, which will be used on the fork, they will be visible after the move; nothing will need to be done on your side.
22( Where can I read about the progress of Burst and news in general on the community?
There is no finer place than https://www.burstcoin.ist/
23( What are the communities for Burst and the central website?
Main website: https://www.burst-coin.org/
Reddit: https://www.reddit.com/burstcoin and https://www.reddit.com/burstcoinmining/
Burstforum.net: https://www.burstforum.net/
Getburst forum: https://forums.getburst.net/
Official Facebook channel: https://m.facebook.com/groups/398967360565392
(these are the forums that are known to be supporting the current Dev Team)
Other ways to talk to the community:
Discord: https://discordapp.com/invite/RPhpjVv
Telegram (General): https://t.me/burstcoin
Telegram (Mining): https://t.me/BurstCoinMining
24( When will Burst partner up with a company?
Burst is a currency, the USD does not ‘partner up’ with a company, the DEV team will not partner up and give over to special interests.
25( Why is the DEV team anonymous?
They prefer anonymity, as it allows them to work without constant scrutiny and questions unless they wish to engage, plus the aim is for Burst to become a major contender, and this brings issues with security. They will work and produce results, they owe you nothing and if you cannot see the vision they provide then please do not ‘invest’ for short term gain.
26( When moon/Lambo/$100/make me rich?
My crystal ball is still broken, come back to the FAQ later for answer (seriously, this is a coin to hold, if you want to day-trade, good luck to you)
27( How can I better educate myself and learn about Dymaxion?
Read about the Dymaxion here: https://www.reddit.com/burstcoin/wiki/dymaxion
28( My reads are slow, why?
There are many reasons for this. If your computer has a decent spec, it’s likely due to USB3 hub issues, or plugging into a USB2 hub; another cause can be multiple plots in the same folder. It’s best to visit the mining subreddit; they can help more than a simple FAQ. https://www.reddit.com/burstcoinmining/
29( I have a great idea for Burst (not proof of stake related)?
Awesome! Please discuss with the DEV team on discord https://discordapp.com/invite/RPhpjVv
(Please be aware that this is a public forum, you need to find who to ask/tell)
30( I have a great idea for Burst (Proof of stake related)?
No. If you want a POS coin, find a POS coin. On the tangle which is being implemented, a POS/POW/POC coin can be created, but BURST will always be POC-mined. You are welcome to implement a proof-of-stake coin on this!
31( Will the Dev team burn any coins?
Burst is not an ICO, so any coins will need to be bought to be burnt. You are welcome to donate, but the DEV team have no intention of burning any coins, or increasing the coin cap.
32( When will there be an IOS wallet?
IOS wallet is completed; we are waiting for it to go on the app store. Apple is the delaying factor.
33( Why do overlapping plots matter?
Plots are like collections of lottery tickets (where only one ticket can win). Having 2 copies of the same ticket is not useful, and it means that you have less coverage of ‘all’ the possible numbers. It’s not good; avoid it.
34( My local wallet used to run, I synchronised it before, and now it says ‘stopped’. When I start it, it stops after a few seconds; what should I do?
I suggest that you change the database type to Portable MariaDB (in Qbundle, at the top, select ‘Database’, then ‘Change database’) and then re-import the database from scratch (see 35).
35( Synchronising the block chain is slow and I have the patience of a goldfish. What can I do?
In Qbundle, select ‘Database’, then ‘Bootstrap chain’, make sure the CryptoGuru repository is selected, and click ‘Start Import’. This will download and quickly populate the local database (I suggest Portable MariaDB, see 34) (lol, loop).
36( What will the block reward be next month/will the block rewards run out in 6 months?
https://www.ecomine.earth/burstblockreward/ Rewards will carry on into 2026, but transaction fees will be a bigger % by then, and so profitable mining will continue.
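If you want a feel for the decline without visiting the calculator, here is a back-of-envelope sketch. The constants (10,000 BURST starting reward, minus 5% per month) are the commonly cited schedule, not taken from the consensus code, so treat them as assumptions:

```python
# Rough sketch of the commonly cited Burst block reward schedule:
# 10,000 BURST at launch, decaying 5% per month of blocks.

def block_reward(month):
    """Approximate block reward (BURST) during the given month."""
    return int(10000 * 0.95 ** month)

for m in (0, 12, 48, 100):
    print(f"month {m:3}: ~{block_reward(m)} BURST per block")
```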
37( How can I get started with Burst (wallet/mining/everything) and I need it in a video
https://www.youtube.com/watch?v=LJLhw37Lh_8 Watch and be enlightened.
38( Can I mine on multiple machines with the same account?
Yes, if you want to pool mine this can be done (but be prepared for small issues like reported size being incorrect, and be sure to keep question 33 in mind).
39( Why do some of my drives take forever to plot?
Most likely they are SMR drives. It’s best to plot onto an SSD and then move the finished plot (or part of a plot) across to the SMR drive, as this is much quicker. SMR drives are fine on reads; it’s random writes that they are terrible at.
So to fill an SMR drive quickly, plot to a non-SMR drive (better still, an SSD) in as big a chunk as possible (fewer files is better) and then move the files across, as sketched below. A version of XPlotter, called SPlotter, can do this easily.
https://github.com/NoParamedic/SPlotter
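A minimal sketch of that staging workflow, with placeholder paths (the plotting step itself belongs to whichever plotter you prefer; this just does the sequential move, which is the part SMR drives handle well):

```python
# Sketch: move finished plots from a fast staging drive to an SMR drive.
# Paths are placeholders; run after the plotter finishes a file.

import shutil
from pathlib import Path

STAGING = Path("D:/plots_staging")  # fast SSD / non-SMR drive
ARCHIVE = Path("F:/plots")          # slow SMR destination

def move_finished_plots():
    ARCHIVE.mkdir(parents=True, exist_ok=True)
    for plot in STAGING.iterdir():
        # skip in-progress files (many plotters use a temp suffix)
        if plot.is_file() and not plot.name.endswith(".tmp"):
            shutil.move(str(plot), str(ARCHIVE / plot.name))
            print(f"moved {plot.name}")

if __name__ == "__main__":
    move_finished_plots()
```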
40( I have a great idea; why not get listed on more exchanges!!
Exchanges list coins for two reasons:
  1. Clients email, REQUESTING Burst, and provide details like:
    i. https://www.burst-coin.org/information-for-exchanges
  2. The coin pays (often A LOT, seriously we’ve been asked for 50 BTC)
I suggest you speak with your exchange and ask when they will offer Burst.
41( Do you have a roadmap?
https://www.burst-coin.org/roadmap
42( Why is the price of Burst going up/down/sideways/looping through time?
The price of Burst is still quite dependent upon Bitcoin: if Bitcoin gains, the value of Burst gains; if Bitcoin drops, Burst also drops. If there is news for Burst, we will see movement independent of Bitcoin. Variations can also come from people buying or selling in bulk. And there are ‘pump and dump’ schemes, which we detest, that can cause price spikes that have nothing to do with news or Bitcoin, just sad people taking advantage of others.
43( Where is the best place to go with my mining questions?
https://www.reddit.com/burstcoinmining/
or https://t.me/BurstCoinMining
44( What hardware do you advise me to buy, is this computer good?
See question 43 for specific questions on hardware; it depends on so many variables. The ‘best’ in my opinion is a 36-bay Supermicro storage server; usually they have dual 6-core CPUs and space for 36 drives. No USB cables, a plotting and mining monster. Anything else, DYOR.
45( Where do you buy your hard drives?
I have bought most from eBay in job lots, including some refurbished drives with short warranties. Everything else I have bought from Amazon.
46( Can I mine on my Google drive/cloud based storage?
In short: no. If you want to try, be my guest, but expect to get to maybe 1 TB and then find that your local connection isn’t fast enough, or that shortly after, your account is blocked for various reasons.
47( Can I mine on my NAS?
Some NAS boxes can mine by themselves (if it can run the miner, it can scan locally), but generally they’re not very fast; good for maybe 16 TB? Having a plot on a NAS and mining from another computer depends on the network speed between the NAS and the scanning computer. I believe you can scan about 8 TB (maybe a bit more) and keep the scan times acceptable, but YMMV.
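The ~8 TB figure is easy to sanity-check: each mining round reads only one scoop, i.e. 1/4096 of the plot (4096 scoops per nonce), so the question is whether the link can deliver that within a block. A rough sketch (the 30-second target scan time is my assumption):

```python
# Back-of-envelope: can a gigabit link scan an 8 TB plot fast enough?
# Each round reads one scoop = 1/4096 of the plot.

plot_bytes = 8 * 10**12          # 8 TB plot
read_per_round = plot_bytes / 4096
target_scan_s = 30               # finish well inside a ~4-minute block
required_mb_s = read_per_round / target_scan_s / 10**6

print(f"read per round: {read_per_round / 10**9:.2f} GB")
print(f"needed speed  : {required_mb_s:.0f} MB/s (gigabit tops out ~110 MB/s)")
```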
48( How can I set up a node?
No need to set up a node; just set up a wallet (version 2.0.4) or Qbundle (2.2) and it will do the rest.
49( Are the passphrases secured?
I’ll leave it to the efforts of a few people to show how secure a 12-word passphrase is: https://burstforum.net/topic/4766/the-canary-burst-early-warning-system Key point: brute-forcing it would take around 13,537,856,339,904,134,474,012,675,034 years.
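You can reproduce the order of magnitude yourself. A back-of-envelope sketch; the 1,626-word list size (the classic Electrum-style list) and the guess rate are assumptions for illustration, so the exact figure differs from the one quoted above:

```python
# Rough estimate of 12-word passphrase strength under assumed parameters.

wordlist_size = 1626           # assumed word list (Electrum-style)
words = 12
guesses_per_second = 10**9     # a very generous attacker

combinations = wordlist_size ** words
seconds_per_year = 365.25 * 24 * 3600
years = combinations / guesses_per_second / seconds_per_year

print(f"{combinations:.3e} combinations -> ~{years:.2e} years to search")
```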
50( I logged into my account (maybe with a different burst ID) and see no balance!!
I have dealt with this very issue multiple times, and there are only three possibilities:
  1. You have typed in the password incorrectly
  2. You have copy-pasted the password incorrectly
  3. You are trying to log into a ‘local wallet’ which the block chain has not finished updating
The last one generally shows the same Burst ID but an old balance. No, this is not a security problem. And yes, Windows loves to add a space after the passphrase you copy, and that space is important in getting to your account.
51( Are there channels for my language?
Telegram:
Spanish: https://t.me/burstcoin_es
German: https://t.me/Burstcoinde
Italian: https://t.me/BurstCoinItalia
Forum:
Spanish: https://burst-coin.es/index.php/forum/index
Discord:
Spanish: https://discordapp.com/invite/RaaGna9
Bulgarian: https://discord.gg/r4uzTd
(there are others, please contact me to put up)
52( I am mining in a pool, and it says that my effective capacity is lower than I actually have, why?
  1. If you've not been mining for >48 hours, or just added additional capacity, it will take time.
  2. The value fluctuates (normally, +-5% but can be up to 10% at times)
  3. Read the ‘Quick info’ tab about adjusting your deadline to compensate for changes (revisit once a month for best results)
  4. If you have overlapping plots it will also be lower so be aware of this (see question 33)
53( What pool should I join?
First of all, read question 9. After you have understood that it depends on your size (and how patient you are), select from the following list: https://www.ecomine.earth/burstpools/
54( What miner to use?
I use Blago’s miner; there are many out there, but Blago’s works for me for CPU mining, and it can be found in Qbundle.
55( What Wallet to use (I use windows)?
Qbundle: https://github.com/PoC-Consortium/Qbundle/releases/ guide: https://burstwiki.org/wiki/QBundle
56( What Wallet to use (Linux)?
https://package.cryptoguru.org/ for Debian and Ubuntu.
For Mac, read: https://www.ecomine.earth/macoswalletinstallguide/
57( Will I need to 'replot' after POC2 (second fork) happens?
No. There will be a tool which will optimise your plots, and it is not CPU-intensive (it basically re-shuffles your plot), just IO-intensive. You do not need to replot.
TurboPlotter and https://github.com/PoC-Consortium/Utilities/tree/master/poc1to2.pl are tools that will/can be used to perform the optimization, but PLEASE wait with the optimization until after the hard fork.
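To see why the conversion is IO-bound rather than CPU-bound, here is an illustrative sketch of the per-nonce reshuffle as it is commonly described (the exact mirroring detail is my assumption, and real converters work on whole optimized files; use the official tools above on actual plots):

```python
# Sketch: the commonly described PoC1 -> PoC2 reshuffle of one nonce.
# No hashing is redone; the second 32 bytes of scoop i are swapped with
# the second 32 bytes of mirror scoop (4095 - i). Pure data movement.

SCOOP = 64      # bytes per scoop
SCOOPS = 4096   # scoops per nonce (one nonce = 256 KiB)

def poc1_to_poc2_nonce(nonce: bytearray) -> bytearray:
    for i in range(SCOOPS // 2):
        j = SCOOPS - 1 - i
        a = i * SCOOP + 32  # second half of scoop i
        b = j * SCOOP + 32  # second half of mirror scoop j
        nonce[a:a + 32], nonce[b:b + 32] = nonce[b:b + 32], nonce[a:a + 32]
    return nonce
```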
58( Will the transaction fee always be 1 burst?
No, dynamic fees are coming in the next fork.