Granted, they can't do what this person's high-spec workstation can do, but they handle most of the computing tasks people use (or used) noisy, fan-cooled computers with clacking disks for, and in many cases do those tasks better.
And unless I'm just losing my hearing, my smartphone is completely silent as long as I don't accidentally press the Golem Invoker, er, Siri button.
4-6 months ago I built a new workstation for work. I had used one of those Corsair closed-loop water coolers with the prior build, so I set one up on this guy. A month or so later my workstation was running really sluggishly, and I realized it was drastically throttling the CPU because of heat. I installed some software to spin up the fans on the cooler to keep it under 100C, but now it gets pretty loud whenever I run a heavy CPU workload.
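If you suspect the same kind of heat throttling, a quick first check (on Linux, at least; this is a sketch assuming the common sysfs layout, and zone numbering varies by machine) is to read the thermal zone directly:

```python
from pathlib import Path

def read_temp_c(raw: str) -> float:
    """sysfs thermal zones report millidegrees Celsius as plain text."""
    return int(raw.strip()) / 1000.0

# Common location on Linux; thermal_zone0 may not be the CPU on every box.
zone = Path("/sys/class/thermal/thermal_zone0/temp")
if zone.exists():
    print(f"Zone 0: {read_temp_c(zone.read_text()):.1f} C")
```

If that number sits near 100C at idle, the cooler (or its pump) is the prime suspect.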
Now, this is a pretty heavy duty workstation, 64GB of RAM and 3 displays. But, if I were doing a new machine for home to be quiet, I think it'd be a NUC. Then I'll put the box that has the 6 drive ZFS array in a closet and call it good.
Their highest end is available with ECC ram and a discrete graphics card: https://fit-iot.com/web/product/airtop2-build-to-order/
And the lowest end fitlet2 can be configured with an atom in a reasonable configuration for around $300 (it's $130-$200 for case/cpu/motherboard depending on which CPU you configure).
Gigabyte also has their BRIX products, which are similar to the NUC.
I was at a hotel this weekend, and at check-in they had a monitor with a "ThinkVantage" slotted into the back of it. That might be a nice setup.
Eventually I just bought a cheap USB desktop fan and ran it facing the NUC.
It has made me think that instead of getting a NUC, for a quiet desktop system I should have just gotten a second used MBP and run it permanently docked in clamshell mode with the monitors and keyboard attached (with the added benefit of being able to go portable when I want to).
Neither the NUC nor my MBP is completely silent, but for my purposes I seldom tax either of them to the point where the fans become audible enough to be annoying. Still, the difference in performance between them is apparent in things like iteration time on web development and IDE responsiveness.
You could hear the whine from across the room a few seconds before a call would ring through.
People always asked how I managed to answer the phone so fast. Electromagnetic Supplementary Perception, of course.
It was clearly electromagnetically "noisy", but I don't recall ever having heard any of my phones make any unexpected audio noise... (My old and abused rock-concert- and motorcycle-weary ears probably can't hear anything as high as inverter whine any more, though...)
Also, GSM phones used TDMA, keying the transmitter on and off to occupy one of the time slots on a given channel (eight of them, I believe).
This is practically asking for EMC issues.
LTE, on the other hand, transmits continuously (I believe; I don't work in RF engineering anymore, but I try to read up on new tech every now and then :) and causes much less interference than the constant on/off keying of TDMA.
Some were barely noticeable, while one in particular (a Droid Turbo) was so loud I could hear it getting ready to receive a call from another room. This was regardless of whether they were plugged in or not, although charger whine was its own separate issue.
Thankfully it does seem to be getting better over time- my current S8 is, as far as I can tell, genuinely silent.
There's a TDMA modulation frequency at 217Hz and this interferes with all sorts of nearby audio devices. CDMA and WCDMA phones have a much broader interference spectrum, which is why you don't hear it much anymore.
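That 217Hz figure falls straight out of the GSM frame timing: eight time slots of roughly 577 microseconds each make one frame, and a phone transmitting in one slot per frame keys its power amplifier on and off at the frame rate. A quick sanity check:

```python
# GSM TDMA timing: 8 slots of ~576.9 us each per frame; the handset
# bursts once per frame, so the keying rate is the frame rate.
slot_us = 576.9
frame_s = 8 * slot_us * 1e-6
burst_rate_hz = 1 / frame_s
print(round(burst_rate_hz, 1))  # ~216.7 Hz -- the familiar "GSM buzz"
```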
I considered returning it, but I find it charming. I miss the days when you could tell exactly what your PC was doing by all the sounds it was making, and I find dead silent electronics to be elegant but a little sad.
The easiest way to experience this is to plug in headphones and hear the clicking before/after a sound is made.
My Samsung Chromebook from 2012 has an ARM processor, solid state storage, and no fans. It is pretty slow by today's standards though.
I think several companies make cases for the Intel NUC boards that radiate the heat away and have no fans, too.
My Samsung Chromebook 3 gets a touch warm but never uncomfortably so like my 2012 Retina MacBook, which lets you really feel it when your code is inefficient. (Granted, the Chromebook is a lot less powerful)
525 lines / 2 for interlacing * 60 fields per second = 15750
They were still using CRT TVs in 2012 when I finished high school. I wouldn't be surprised if there were still plenty of schools with CRT TVs and VCRs for educational material.
I find coil whine a worse background sound than the lower frequency fan hum.
I was worried about performance, but it has been very acceptable. It depends what you need it for, but I can run 2 monitors, a Linux VM and atom all while streaming HD video. Or I can do light web browsing for 10 hours on battery. I love it.
On the rare occasions I need more power, I spin up a spot instance.
If you use Linux on a laptop, install "tlp". It optimizes battery life without a noticeable reduction in performance.
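For the curious, TLP is configured through a plain settings file. A few illustrative entries (setting names and the file location differ between TLP versions, so treat these as a sketch and verify against your installed version's defaults, e.g. via `tlp-stat`):

```
# /etc/tlp.conf (or /etc/default/tlp on older releases)
TLP_ENABLE=1
CPU_SCALING_GOVERNOR_ON_AC=performance
CPU_SCALING_GOVERNOR_ON_BAT=powersave
SATA_LINKPWR_ON_BAT="min_power"
```

The defaults are sensible, which is why "just install it" is usually enough.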
Obviously newer models are different from older models, and Air models probably don't have the fans that Pro models do, but the claim that Macbooks don't have fans is a bit too broad to be true.
This is why these 'simple' naming schemes are confusing.
In Jobs' 2x2 matrix, the portable half was initially populated by iBook and PowerBook, later by MacBook and MacBook Pro.
It's unfortunate that Apple has confusing brand names, but the fact remains that the Macbook indeed has no fans, so the original commenter who hears fan noise is obviously using a different model of laptop.
So if you say that recent Macbooks have no fans, that may well be true. But it's not true for all Macbooks.
I mean it depends on what you do - but for many people it's a realistic solution.
Apple have put faster chips in their smartphones than in their laptops.
This is all alluded to in the article and the macrumors post that the article is based upon, here are some quotes:
> "Sure, that doesn’t mean the A11 Bionic can do all the things a desktop CPU does."
> "Though the iPhone X and the iPhone 8 offer impressive Geekbench scores, how that translates to real world performance remains to be seen."
There's no question that the iPhone chips deliver amazing performance, but there's a reason people still lug Macbooks around.
Geekbench always seems like an odd benchmark - the variability between runs alone is kind of odd. If I could run a compiler on an iPhone, for example, would I really see similar performance to my MBP?
It also doesn't pass the smell test: even Atom CPUs are preferred over high-end ARM for netbooks, and a Xeon is way more powerful than any Atom.
Whereas on a normal computer, a light goes on and the fans start spinning, not because it's useful but just to indicate that it's on now. It's a miracle.
Then it dawned on me: I was currently logged in on the machine on his desk.
Some of the Chromebooks and such have no moving parts. I'm probably taking my Pinebook to the next conference I attend.
Can't you just use any USB-C docking station?
This is due to the form factor, not the capability of the devices. A high end smart phone is more than capable of producing a good desktop experience.
> Even though this system is not meant to be a gaming rig, there’s no harm in putting in the best GPU you can without blowing the thermals.
You can get rid of that sound as well: just flip the switch on the left side of the phone.
Under-performant computers can be, and have been, silent for a while. A phone falls into that category.
The trouble is making a good performance computer silent.
And even the case the article advertises is pure garbage. I had the smaller ones (and the author should really have bought the black anodized one!). It works fine while underpowered, but as soon as you hit 5-hour compile/rendering levels of workload, that thing cannot move heat away without airflow. Period.
Smartphones are capable of computing and are sometimes very powerful, but the analogy is totally out of whack in my opinion.
If you can live without using an actual computer, it means you don't really need computers. You can check your email and browse the web on a Kindle, on your TV, or even in your car.
Everything is a computer, then.
I think a computer is a productivity tool. Smartphones (and I'd definitely say tablets, too) are to consume content, not produce it. Some companies (most notably Apple I think) believe otherwise, but I think they'll have to come to realize that smartphones and tablets are horrible to produce most kinds of content.
You're just making up your own definition of "computer" and then claiming a smartphone isn't one because it doesn't match your made up definition.
You have programs, you have a UI to control them, they have a CPU and memory.
Otherwise everything is a computer.
Maybe cut to the chase: what specific capability is an iPhone lacking that every “real computer” has?
It's possible that I could be somewhat productive on a tablet in an emergency, but not as my main machine, as Apple suggests people should do.
I have tried using an iPad Pro for productivity, and it's living hell.
Wouldn’t that mean most servers aren’t computers? Not to mention the DOS machines I grew up with, or everything made before the Xerox Alto?
> I have tried using an iPad Pro for productivity, and it's living hell.
Not going to disagree with you there. But that doesn’t make it not-a-computer.
If we want to classify devices, we need to group them somehow. Otherwise we call them devices and call it a day.
We already classify them: smartphones, tablets, laptops, desktops, servers are all groups of computers.
At that point, you're running a mouse-centric, multi-window OS with a wide array of software, that can run basically whatever you want.
So that's a computer, definitely, right? When does it stop being a computer? If you disconnect the display? Is it using a stylus instead of a mouse? Maybe the software keyboard instead of a hardware keyboard? (But then is the MS Surface not a computer when you detach the keyboard case?)
That said, I still think smartphones qualify as computers even by your productivity definition. Newer smartphones would sit somewhere above older netbooks on a ranking of overall utility.
I can on mine…
Later I changed desks to one that had one of those built in computer cabinets made of thick particle board. That did as much to silence a pc as all the tens of hours of effort I had put into meticulously researching and specc'ing the build before.
Super annoying compared to the rest of the build being a beast of a machine and watercooled that's so quiet I'm more likely to hear the noise floor on speakers than the PC (which is on the desk, next to said speakers).
a) Maybe coil whine is an intrinsic factor in the manufacture of graphics cards, similar to dead pixels on displays. "Luck of the draw" when obtaining one is the only way to win. Cycle through RMAs until you get one with little to no coil whine.
b) Or, it depends which company you buy from: each of Asus, EVGA, Gigabyte, MSI, Zotac et al are supposedly better or worse than the others.
c) Or, it's not a problem with the GPU at all; rather, it's an indication of a poor quality power supply (PSU).
I've never seen an informed analysis from an industry engineer who has a goddamn clue what they are talking about. NVIDIA could probably enlighten us all with an exact-science explanation, but that seems unlikely. My uneducated guess is that the situation is closest to option 'a' above, and that rejecting units for coil whine during quality control would drastically reduce production yield.
Instead, ping a few review sites and see if any of them are willing to take a crack at it.
Fortunately it only happens at high load, which is while playing games. That doesn't help for quiet scenes, though.
Is there a way to fix this sound in video cards? I'll have to investigate.
Most cases made today don’t have any significant dampening material. It’s pretty trivial to add some to the panels without significantly affecting cooling capacity.
That would require making your own GPU PCB, and maybe a year's worth of studying on electronics and power supply design.
A year’s worth of power supply design? Have you looked at the LM7805 datasheet? The circuit is one regulator and two capacitors...
And designing a 250W-capable linear regulator is not as simple as just hooking up a LM7805.
So with a modern CPU and GPU, you're talking ~400W of power to the actual components and nearly 500W wasted in the linear regulators. This of course also means you'd need a 1000W PSU as a bare minimum.
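The arithmetic behind that kind of claim is simple: a series-pass linear regulator drops the full input-to-output voltage difference at the full load current, so its efficiency is roughly Vout/Vin regardless of how cleverly it's built. A sketch with made-up but plausible numbers (the specific rail voltages and currents here are illustrative, not from any particular card):

```python
def linear_reg(v_in, v_out, i_load):
    """Power delivered vs. power burned in a series-pass linear regulator:
    the pass element dissipates (v_in - v_out) * i_load."""
    p_load = v_out * i_load
    p_diss = (v_in - v_out) * i_load
    eff = p_load / (p_load + p_diss)
    return p_load, p_diss, eff

# Hypothetical: a 1.0 V core rail drawing 150 A, fed linearly from a
# 2.2 V intermediate rail.
p_load, p_diss, eff = linear_reg(2.2, 1.0, 150)
print(round(p_load), round(p_diss), round(eff, 2))  # 150 180 0.45
```

Note the efficiency is just 1.0/2.2; feeding the same rail straight from 12 V linearly would waste over 10 W for every watt delivered, which is why switching VRMs (and their coils) are unavoidable.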
Yeah and good luck driving a 200W linear element (if it even exists, lol) with a few op amps--the driver which should deliver a few amps into the gate/base of the pass element, which in and of itself is a pretty difficult challenge.
LMFAO you can't be any more wrong. You need /much/ more careful design to get GPU-compliant performance. The dI/dt on modern ASICs is insane, and you need an insane regulator to deal with it.
In the end I went with a Thinkpad and having seen the issues people I know have had with the XPS15 I'm pretty glad I did.
I had a Logitech G500 with awful coil whine and the opposite problem: it would stop whining when moving and start when idle. I suspect it had to do with the power saving mode that lots of mice have, where the laser power supply ramps down to dim the laser illumination after a period of no detected motion.
coil whine is highly unnerving while low fan sound is relaxing.
we like stimulus: the clicky keys of my old HP48 are neat, the insertion sequence of a Pioneer 32x slot-in CD drive was amazingly subtle; not long ago I revived an old HP tape drive, and the tape rolling and the head gear were also beautiful.
Also, it was as cute as it was informative: it's a clear state-change side channel. Often software notifications about hardware are so decoupled that you don't trust them; plus they're invasive, unlike a tiny LED, a click, or a tiny motor ramping up.
Now it’s just when the fan on my laptop starts taxiing for takeoff, which can take a lot longer.
I like the sound HDD make when grinding (except when I don't know the reason for the grinding... looking at you svchost.exe).
> coil whine is highly unnerving while low fan sound is relaxing.
Which is why I have been putting off getting a new laptop for years now. Most seem to suffer from coil whine and I can't stand it (to the point that I ended up using an old Eee PC 1000HE rather than a brand new 16-inch VAIO some years ago).
This might be "SuperFetch".
However my graphics card (RX 480) is quite loud, and one bearing is making noises.
But the Fractal Design case has really dampened the sound. For my home server, it's using a RM500 (which also never turns the fan on), and a low-profile Noctua CPU cooler. No other fans, but I do hear the 6 HDDs spinning and seeking when it's real quiet in the room.
I don't hear any coil whine, except when using headphones plugged into my desktop's speakers--probably the result of the speaker system's power supply. Klipsch ProMedia, if you're curious.
As far as noise reduction for CPU cooling, I'd suggest buying more air cooling than you need for a modest-TDP CPU. Between that, my fanless Seasonic power supply, and an SSD, the only noise I can ever hear from my machines is from the GPU.
I'm looking at buying a new machine soon, and was actually looking at a Fractal Design case with either one of those 2 cooling systems.
This is an Ivy Bridge (IIRC) Core i5 at 4.0GHz.
Seems obvious in hindsight, but I had no fan (pump) speed warning or anything.
I was tired of the never-ending quest for silence, so I bought three 50-ft DVI cables and a couple of USB 3 cables of the same length and put the PC in the attic.
It worked great, except any hardware issues resulted in a trip to the attic.
I work from home, so i'm on it several hours every workday, combined with the fact that I tend to have multiple things in-progress all the time means it would be a giant pain in the ass to shut it down fully.
Still kinda miss that couch.
This was my second silent PC. The first one still had moving disks, but I went for as few fans as possible, and had passive coolers on the internals. This time I did the opposite: lots of fans, but have them spin as slow as possible. This works very well.
But coil whine and electronic hums are easy to overlook when you're choosing parts. It's worth looking at not just the fans and the power use (more power needs more cooling), but also the quality of the electronics.
But when I walk around to the back side of my desk I can hear some electronic buzz from one of my monitors. What's funny about it: I've had that monitor for a few years now, and before I built this silent PC and turned my desk to face another direction, I never noticed the buzz from the monitor :D
I had a Geforce 280 that would scream like hell whenever it was at full power and its framerate went below about 10 or above about 100 FPS. I was glad when it broke some other way and got replaced under warranty.
There are 5 fans in my tower, two of them on the CPU cooler; only one of those two runs constantly, at just 200rpm. All the others aren't running most of the time.
I don't need my computer to run at a cool 30°C all the time. The hardware can run very hot without any issues. And when all the fans eventually kick in under load, it will always keep under 70°C anyway.
And it's doing fine. Its cooling is slightly over-sized since I want to keep fans spinning at very low speed, but I haven't seen much difference compared to when it was outside.
That's a silly and extremist position to take. "Insecure" is relative, not absolute. It's a certainty that the software he's running has far more vulnerabilities and a much longer history of them. I don't know his exact use case, but arguably it isn't one where Spectre is particularly more severe than even a userland, non-priv-escalation vuln (e.g. ransomware doesn't require root access to hold all your files hostage).
> Eliminate the moving parts (e.g. fans, HDDs) and you eliminate the noise — it’s not that complicated.
Ha! And yet it deserved a detailed blog post. I'm surprised he would say this even after the amount of effort he spent.
Imo the AMD drivers are way better than Nvidia's. They're included in the kernel and therefore open source, whereas Nvidia's proprietary drivers have horrible support for a lot of compositors and lack support for DDC/CI over DisplayPort. The Nouveau drivers are better in compatibility (though slower), but are unable to change the clock speed (which stays at the minimum).
The AMD drivers "just worked". Selling my RX480 for a GTX 1070 was the worst decision I made when it came to compatibility. Now I can't even get Vsync to work with this Nvidia crap.
And Windows has been behaving weirdly as well since I installed the latest drivers: a black bar on top of full-screen programs after waking from hibernation, the HD audio driver not letting PulseAudio start (I need it to get sound from WSL), and crashes when multiple 3D-accelerated VMs are open. Restarting the GPU driver (with either the shortcut or through Device Manager) solves all of these issues, and they only occur with the latest driver.
And the crappiest part is that there's nobody that can help. Getting someone from nvidia to respond on their forums is basically luck, and I'm not a huge company that can get their reps to get someone to help me.
Which puts AMD GPUs in a weird situation where you can expect them not to work very well when new, but to improve until you can forget about them. (The inverse of Nvidia GPUs, which work OK when new but slowly lose compatibility over time.)
Also, you can't even compare NVidia's drivers to them, since they don't even support Wayland properly!
AMD linux support was downright abysmal pre ~2015.
NSG S0, once out, will most likely be the go-to case for such setups. Until then, an HDPLEX H5 is cool.
My desk has an H5 on it, housing an i7 8700 (non-K) and a GTX 1060. The TIM under the heatspreader is replaced with Thermal Grizzly Conductonaut, and Thermal Grizzly Kryonaut is used as every other TIM that the case setup needs. The CPU is on stock clocks with a voltage offset of -30 mV. The GPU has the power target reduced to 90% and clocks increased by 130 MHz, so it is effectively undervolted as well. The PSU is a Seasonic Ultra Prime Titanium 650.

Prime95 with AVX throttles really, really fast (under a minute, perhaps), but that is a very unrealistic load. Non-AVX stress tests and FurMark take a while to start throttling (20 minutes?), as the thermal capacity of the aluminum case is quite big. After hours of gaming, the GPU and CPU float around 80 C while providing full stock performance. I don't do 3D rendering (other than in-game) or video en/decoding, so I haven't had long, real-world, full loads to see how temperatures behave with those.
From the discussion I've had and forums I've read, I think that people are afraid of putting more power in passive cases and having their components at "high" temperatures, despite those being rated for them.
I suppose Blender would thermally throttle the CPU as well. If you run any non-Xeon/non-laptop Intel chip (newer than the 2000 series) and care about temperatures, delid the bugger (Xeons are soldered; laptop chips don't have an IHS). Intel uses something that's worse than toothpaste, plus tons of glue, between the die and the IHS. If you see temperature deltas of more than 9-10C between the cores under full load, the thermal paste between the die and the IHS might have missing spots or have dried out. In your case, removing the IHS altogether would provide decent results.
You might wish to check the VRMs, they are rated at 125C but if the case is hellishly hot inside, they might not be able to dissipate the heat.
Metal is an incredibly good conductor on its own, and the properties of thermal paste (typically) are just barely better than air. So long as your cpu and heatsink are fairly flat surfaces and mashed together physically, it seems like either forgoing or having the absolute minimum amount of paste is ideal. I've used a razor to leave an absolutely minimal layer of paste (e.g. filling in sub-millimeter surface structure) on my latest build, and cpu temperatures are well within a reasonable range. But I'm also not trying to OC the cpu or anything.
I am not certain how you have managed to come to such a conclusion. The thermal conductivity of air is around 0.03 W/(m·K). Good non-conductive thermal paste is around 12.5 W/(m·K) (over 400 times better than air), electrically conductive pastes are in the region of ~40-80 W/(m·K), and aluminium is 237 W/(m·K). Air also expands when heated, pushing the cooler and CPU apart.
Normally, if you choose between "too much" and "too little" paste, you pick the former. The mounting pressure pushes out the excess.
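Those conductivity numbers translate into thermal resistance via R = t / (k·A) for a flat layer. A sketch with an illustrative geometry (the 10x10 mm patch and 25 micron gap are assumptions for the example, not measurements) shows why even a microscopic air gap dominates:

```python
def r_thermal(thickness_m, k, area_m2):
    """Conduction resistance of a flat layer, R = t / (k * A), in K/W."""
    return thickness_m / (k * area_m2)

area = 0.01 * 0.01   # 10 mm x 10 mm contact patch (illustrative)
gap = 25e-6          # 25 micron gap left by surface roughness

print(round(r_thermal(gap, 0.03, area), 2))   # air-filled gap: ~8.33 K/W
print(round(r_thermal(gap, 12.5, area), 4))   # same gap with paste: ~0.02 K/W
```

At 100 W, the air-filled gap alone would account for a temperature rise of several hundred kelvin, while the paste-filled one contributes a couple of degrees, which is exactly why paste matters even though it conducts far worse than the metal on either side.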
I would be extremely surprised if increased pressure due to air at higher temperature played any role whatsoever unless the bolts connecting the heatsink and cpu were very loose. If anything, I'd expect the increased conductivity of air at higher temperatures to dominate.
I'd also expect there to be effects at the metal-paste and paste-metal interfaces which reduce the effective system conductivity (i.e. phonons are much more likely to reflect in this scenario than in a metal-metal interface).
A fun thing to try is running a modern low-end CPU (the latest i3s, Pentiums, Celerons) without its cooler. Not advised by Intel, of course, but you might get into your OS of choice before it even starts throttling. I'm somewhat comforted by the fact that a CPU automatically powers off once it reaches something above 100 C (103, maybe?) and throttles a few degrees before that. Those temperatures shouldn't leave the silicon damaged.
In practice, thermal paste is a must. If you don't like pastes (I personally don't; they get everywhere by accident and can be tough to remove), try an IC Graphite Thermal Pad, which is reusable and rivals really good, if not the best, thermal pastes, according to the limited number of reviews I've seen. I think its practicality beats better results in non-highest-end applications.
Smallest amount of TIM, spread all over. NO "PEA" METHOD! All over!
The Cool Laboratory Liquid Metal stuff is the best but hard to work with.
The CPU is delidded! I've got another i3 4300 delidded as well running under a NoFan CR-80EH. Delid + Conductonaut + Kryonaut made the difference between throttling vs hovering around 90 C in FurMark + Prime 95. When integrated graphics aren't used, the CPU runs cooler, of course, and didn't throttle with MX4 thermal paste and no delid.
I do fear that VRMs are running too hot. When selecting components, I picked those that come with some heatsinks on VRMs at least. The motherboard is an AsRock Fatal1ty Z370 Gaming-ITX/ac (non ITX motherboard wouldn't fit in the case anyway with an ATX power supply). The graphics card is Gigabyte's cheapest offering and has a small sink across the VRMs. I'm hoping that undervolting will help keep the VRMs in check.
There are multiple scenes to render as benchmark (I guess BMW one is the shortest/most popular). https://www.blender.org/download/demo-files/
I enjoyed the earlier days of "Silent PC" building, ten or fifteen years ago. For example, building a silent tower or desktop for a DAW or softsynth back then in a recording/studio environment required some ingenuity. SSD? Not on a hobbyist budget. I recall one build, not mine, fully immersed in a bucket of (mineral?) oil for passive heat dispersion.
Today, as a new reference point, any MacBook Pro within the last few years may qualify as truly silent for many people's everyday usage. It does for me. And when I do heat up the CPU/GPU with heavy tasks, the fans spin up but then they go away completely as soon as the hard work is done. Back to silent.
No more spinning platters or crappy fan bearings or poorly engineered airflow nowadays. :-)
There's no hacker pride in buying off-the-shelf, so the performance bar for DIY is higher. Progress!
This is admittedly pointless pedantry.
Of course it can’t be completely silent. Heat generates air movement which is “sound”.
By 0dB he means 0dB SPL which is give or take correct.
It's really surreal and a number of people actually became nauseated at the sensation.
Linus did a silent PC build which even in a sound proofed case and at idle was about 14dB and broke 20dB under load: https://www.youtube.com/watch?v=RXZrWqCT7R0
Even the high-end microphone used to record the sound level in this video produced its own 7dB of noise.
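Since uncorrelated noise sources add as powers rather than as dB values, you can back out roughly what the PC alone contributed to that reading. A back-of-envelope sketch (treating the mic self-noise and the PC as independent sources, which is an idealization):

```python
import math

def db_subtract(total_db, noise_db):
    """Remove an uncorrelated noise-floor contribution from a dB reading:
    convert to power, subtract, convert back."""
    return 10 * math.log10(10 ** (total_db / 10) - 10 ** (noise_db / 10))

# ~14 dB measured at idle with a mic whose self-noise is 7 dB:
print(round(db_subtract(14, 7), 1))  # PC alone is ~13.0 dB
```

In other words, at these levels the mic's own noise barely inflates the figure, so 14 dB at idle really is about the PC.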
(This is unabashed pedantry. But I'm on my lunch break, so...)
No point in making your PC less noisy than the noise floor.
So you either need to carefully look at pictures / guesstimate the location of components, or hope that someone else on the internet has figured it all out for you. Alternatively, you could just be ridiculously lucky.
But I don't really believe in luck personally. If this guy has been doing these builds for as long as he claims, he probably had a fair bit of thought go into the particulars of this build. The kinds of thoughts that don't go very well into blog posts because they're uninteresting (but very important). Like, does X combination of components fit or do I do Y instead?
With respect to the build itself: I'm surprised he was able to get 60+W CPU and a 60+W GPU in there. Most silent builds I see basically use laptop parts (30W or less) to keep the heat down.
32GB of RAM will be plenty, and 8 Zen cores are plenty strong. The GPU is a bit weak for gaming, but he should be able to play plenty of lower-end games without much issue, even at 1080p / 60fps (Overwatch, probably, for example). He probably can't run Witcher 3 at 4k on Ultra, but such a GPU would blow out his thermal design completely.
But, like you say, even though I had done all that, there was still the possibility that I might have missed something and would need to return/replace some parts and rework the plan. My mouse hovered over the "Order" button for quite a while before I finally committed. An anxious moment.
Virtually none of that methodical and boring research/prep work made it into the post — it's just not that interesting. Necessary, yes, but not something that the vast majority of readers would want to read.
The setup should be able to handle a 105W CPU and 75W GPU. At this point 65 + 75W is confirmed. When I OC the 1600 I should be able to verify up to ~95 + 75W.
The strategy was to start with a really well engineered passive case, then select components that could be pushed right up to the thermal limits. It's worked well so far. I'll keep publishing all of my test results for those that may be interested.
Edit: I currently have two gtx1050 in it running at full power, and can't hear the fans at all.
I often run the engine to heat my water and the very slightest change of tone has my ears pricking up, panicking with "I've not heard that sound before! What's up with it? Is it broken? Am I going to sink??" running through my head.
Fans don't tell the whole story. Network traffic is super useful to see as well. Also, if your computer never gets hot enough, it can be doing plenty of unwanted background work without setting off the fans.
There are also power button remotes that could work, depending on distance: https://www.amazon.com/dp/B01MQUANS8/
I screwed four lengths of 4" x 2" between two roofing joists to make two trapeziums. I then screwed a 3/4" thick sheet of chipboard to the bottom and then wrapped the top and sides in roofing felt (some protection in case of roof failure).
The loft is lagged 200mm deep, with 3/4" chipboard on top, which deadens the noise nicely. The Dell was running its fans quite a lot today when I went up to check on something; it was 25C outside in Somerset, UK today.
On a more serious note, would it be possible to achieve said silence without too much compromise by some variant of water cooling, with a good sized reservoir? And use the heat to drive circulation rather than a pump?
Because there is no pump, the water will flow much more slowly, so you need much bigger pipes. Think at least 3x the diameter you would think you need.
Also, because you are depending on a thermosiphon, you have to ensure the water flow works with gravity. The heat source must be at the bottom and the radiator at the top. You don't get any flexibility to route around something that might be in the way; the pipes must always slope in the correct direction.
If you are using anything other than water, you need to pay attention to specific gravity (and how it changes with temperature) and viscosity. Probably something else I'm not thinking of, too.
It probably can work. I have an antique tractor that uses such a system: a 2.2L, 16hp engine with 24L of coolant, and the pipes between the engine and radiator are noticeably bigger than those on my truck with its 7.3L, 250HP engine. Remember that this was designed to run at just under the boiling point of water (antifreeze is too modern). You probably want your computer considerably cooler, which implies an even larger system than a similar active one.
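For scale, the flow rate such a loop actually has to sustain follows from m_dot = P / (cp · dT). A sketch with assumed numbers (200 W of heat and a 10 K allowed water temperature rise are illustrative choices, not from the thread):

```python
def coolant_flow_lpm(power_w, delta_t_k, cp=4186.0, density=1000.0):
    """Volumetric water flow needed to carry `power_w` of heat with a
    coolant temperature rise of `delta_t_k`: m_dot = P / (cp * dT)."""
    mass_flow = power_w / (cp * delta_t_k)   # kg/s
    return mass_flow / density * 1000 * 60   # litres per minute

# Illustrative: a 200 W CPU+GPU loop allowed a 10 K coolant temperature rise.
print(round(coolant_flow_lpm(200, 10), 2))  # ~0.29 L/min
```

A pump delivers that trivially; a thermosiphon has to achieve it from buoyancy alone, which is exactly why the pipes (and the tractor's plumbing) end up so oversized.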
Instead, the other day on YouTube (I don't recall how) I landed on a 3M infomercial about their Novec dielectric liquid used for immersion cooling. That stuff looked interesting, but they didn't spill (pun ^__^;) details about cost or health issues, though.
Both of those have a vastly higher global warming effect than 'just' using pentane, but that is rather flammable, so one would probably want to ensure the oxygen concentration in the room is too low to form a flammable mixture. This would probably mean a human couldn't breathe without an external oxygen supply, but that should not be more expensive if it is a suitably low-physical-maintenance location, i.e. the systems are installed once and at most swapped out when obsolete or in case of component failure. A human could probably make do with an oxygen tube in their nose and some way to prevent the exhaled oxygen from sticking around in the (probably low-airflow) room.
The cost for FC-72 is about $300/kg; keep in mind the density is about 2.3 times that of water. Novec should be somewhat below that, I think (otherwise there would be little reason to offer it).
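Those two figures combine into a painful fill cost. A quick sketch, using the $300/kg price and 2.3x-water density from above; the 12 L fill volume is purely my assumption for a dual-socket case:

```python
# Back-of-the-envelope cost to fill a case with FC-72.
PRICE_PER_KG = 300.0    # USD/kg, figure quoted above
DENSITY = 2.3           # kg per litre (water is ~1 kg/L)
fill_litres = 12.0      # assumed fill volume for a dual-socket case

mass_kg = fill_litres * DENSITY
cost = mass_kg * PRICE_PER_KG
print(f"{fill_litres:.0f} L of FC-72 is about {mass_kg:.1f} kg, roughly ${cost:,.0f}")
```

So the coolant alone lands in the high four figures, which is why it only pencils out inside a "sub-$15k ultimate workstation" budget rather than a hobbyist one.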
I considered soldering the LGA pins to the CPU, but due to the craziness of this idea I haven't researched it further.
The thing is that a dual-socket, 32-core-per-socket EPYC, clocked/overvolted for full stability and an overvolting-induced lifespan of 5 years, with no power-delivery problems anywhere between the interposer (the thing the dies are mounted to, which connects them to each other and provides the LGA pads) and the wall socket, would probably be the most affordable (TCO, i.e. including electricity and maintenance) system for non-distributed compiling of reasonably parallelizable software. It would be 'the' ultimate workstation in the sub-$15k range. That figure is for the full system, including the single-quantity amount of, in this case, Fluorinert (which is not toxic; there was even research into filling the lungs with it for high-G environments, as it prevents lung collapse, but circulating the liquid fast enough proved difficult due to its much higher viscosity compared to air, so they dropped it), a custom-manufactured containment case from a local metal shop, and work from a plumber to provide suitable piping. The only thing not included is the manufacturing and mounting of the nozzle that provides the required flow rate across the CPU to keep up with the heat.
Short answer: mineral oil does not provide suitable cooling for high-TDP processors.
One is the good old P1 125MHz router we had 10 to 15 years ago. The only active cooling in that system was in the PSU, and even that throttled its fan off during normal operation. It had zero moving parts outside of the PSU. You couldn't even get fans for those CPUs. These old systems were fun :)
And beyond that, this makes me think of the computing requirements of coal mines, and I think mining in general. In a mine, a piece of compute equipment has a defined maximum energy emission per square inch of surface. If you exceed that, you risk coal dust explosions, which are rather inconvenient, loud, and adverse to throughput. The stuff in the article is very similar to industrial computers in such a setup. Quite interesting.
The thing was all text/ncurses.
The terminal was an under-clocked PII with all the fans removed, connected to the server using a serial cable. That was kind of necessary, since the environment was quite hostile and sticky (spilled beer, some fat deposits, etc.).
If I recall correctly before the PII, it was a VT-100 console, so even more basic than the PII.
It cannot power a single mid-range video card today or pretty much any modern equipment. I would have loved to be able to use it.
Anybody making fanless PSU's today?
It would be an exotic option for sure. And not something you'll find pre-built.
Massive heat sink you say?
This is cpu and gpu fully passive. Amazing. But $1000.
There are a lot of PSUs nowadays that will run with the fans completely off up to ~75% of their rated power, which is adequate for most situations.
Speaking of silent computing; unfortunately my monitor is "louder" than my PC in desktop use for most brightness levels. Anyway, personally I've given up the goal of doing a solid-state PC; a single fan chosen correctly in a case is virtually silent and makes the cooling design much easier; in my experience most PSUs and GPUs produce similar amounts of noise even without fans (and with bad luck, they can be worse).
They have more, that are non-standard ATX.
Or you could just submerge the whole machine in mineral oil. That technique really sucks, though. As cool as it is to have a fully submerged computer, it's hell to clean literal liquid laxative out of a PCIe slot when it's time to upgrade the graphics card.
The most bizarre thing I've seen: when a single power brick wasn't enough, they used two power bricks, as on the Zotac Zbox EN1080 mini PC.
Is any system honestly 100% “securable”? I would argue that any computer, regardless of cpu manufacturer, is “unsecurable”.
And I believe there's Spectre v2 and/or Meltdown v2 now as well.
You just can't win.
I imagine any computer that approaches maximum security also approaches minimum utility.
Presumably the author just thinks that Intel is a little too low on security compared to the other offerings, and not that the other offerings are actually "secure".
Information can be extracted from airgapped computers by measuring magnetic fluctuations, so this isn't quite true!
Perhaps it is impossible to build a computer without an output mechanism.
Unless you limit your scope to say, Earth, and somehow build a computer outside of Earth's light cone. It is thus causally disconnected from Earth and has no output mechanisms that can be read from anyone on Earth.
But this is more about security engineering philosophy. The Intel management engine relies on a lot of security-by-obscurity, which is not the best way. But I think that Intel chips are passably secure as long as you watch what you're doing.
(I can personally recommend both the vendor and the card!)
From personal experience: I tried to silently cool an RX 550 (50 W, vs. 75 W for the 1050 Ti) without a VRM heatsink, and the VRM temperatures reached 100-103°C during a torture test. Adding a fan lowered the temperatures by 20°C.
I have more small heat sinks I could put on the mosfets, but they are relatively low profile so I'm not sure they would improve things much.
A single, large heat sink, with long fins (that would protrude above and beyond the nearby capacitors) would seem to make more sense — but given there are no mounting points I could use to clamp it down I'd be a bit concerned about the long-term effect of torque.
Opinions and suggestions are welcome.
Same basic chip architecture as the much more expensive 1070/1080. Not something you'd want to run newer 3D games on, but perfect for media center use.
If you wanted a setup slightly more capable of 3D games then the 1050 would definitely be a consideration.
I'm itching to stuff one of the recently released Raven Ridge APUs in a 3-liter case with a PicoPSU and have a portable, reasonably powerful desktop that I can stuff in a backpack. I'm waiting for slightly more mature Linux support, at least a stable 4.17 kernel...
It may sound (no pun intended) like an exaggeration, but I think more powerful computers going completely silent (fanless) will be the next most noticeable breakthrough since we got Retina displays on computers in 2012.
I doubt that. A while back I removed all of the moving parts from an old 486 and netbooted it, only to find out I could actually hear the network card making very faint screeching noises whenever there was network activity. You've probably heard the same effect on a much larger scale, at much lower frequency, when you hear the 60Hz hum of a "silent" transformer.
It can’t be heard from 1cm away
Not only did that processor not exist at the time, but even if it did there were going to be a bunch of compatibility issues with it on Ubuntu (like there always are when new hardware is released). I simply didn't want to have to deal with those sorts of issues for — potentially — months.
Maybe the Ryzen 5 3400G(E) will be a compelling upgrade? We shall see.
This is just random FUD.
(1) AMD boards are vulnerable to Spectre - https://www.amd.com/en/corporate/speculative-execution-previ....
(2) There are other vulnerabilities that affect AMD boards - https://community.amd.com/community/amd-corporate/blog/2018/...
(3) Vulnerabilities are found everywhere. That's not the first time vulnerabilities are found, and it won't be the last. How is that an argument for anything exactly?
Irrespective of how serious you consider these vulnerabilities, the way Intel handled them can very well be a "final straw".
I worked at a VFX company where we did interactive client sessions (think: zoom! enhance!), which required a totally silent setup. In some cases that meant trying to silence an entire rack of disks (a Baselight 8, http://erwanlecloirec.typepad.fr/digital_flipbook/2000/10/ba... the big glowing thing at the bottom of the page).
Obviously this is impossible, so we used DVI senders (https://www.eastwoodsoundandvision.com/blackmagic-design-mic... they are HDMI nowadays) and remote USB
This has the advantage that we can have a full bore machine with no compromises.
A decade ago, I was obsessed with fans, and at one point I had more than 10 in my workstation. The noise it created kind of resonated with me: I could tell when a job was CPU-, HDD-, or GPU-intensive; I guess it felt more lively!
But after the smartphone boom, I gravitated toward power-efficient, noiseless systems (smartphone/tablet/SBCs/Chromebook), and after Meltdown/Spectre (pun intended) that preference was only reinforced.
Anyone looking for a low-cost, completely silent yet portable system in a laptop form factor can take a look at Chromebooks; now that Google has announced Linux app support, their usefulness will only grow.
100% quiet, not super fast, but fast enough.
Sadly, that's only useful for 60Hz, as high refresh rates require a short displayport cable to work. For this reason I no longer use this solution.
If you're ok with 60Hz and have compatible room layout I recommend this. The biggest advantage is that it's completely PC agnostic, which makes upgrades way easier, as standard cooling solutions are fine.
I've just written a front end launcher for the apps/games, and I found some working USB drivers for DOS that makes transferring software over nice and easy.
I'll probably post a Show HN when I'm done.
 22cm x 22cm x 4cm
 DOOM as it was meant to be heard.
And yes, you need the EPIA Soundblaster drivers for DOS, but that's not that different from normal Soundblaster cards. It was a bit of an uphill struggle getting it working, but it's quite simple once you get the right version of the drivers.
In my case it's the monitor that, of all my components, has the worst coil whine (actually the PWM brightness regulation).
I also went with 32GB RAM because I make multiplayer games and I need to be able to run multiple clients at the same time, which takes a lot of memory.
This build is my last computer; peak Moore's law will make it so.
Would love to know what application this was built for.
As a recording studio enthusiast, I can appreciate the pursuit of silent PCs.
What other use cases?
So we have the usual suspects: Watching videos and movies, staying abreast of current affairs, listening to music.
Then some not so-usual suspects: Research and self-education, software development, and a simulation that I've been running and tweaking for several years.
This computer is on for an average of about 15 hours every day of the week, and is pretty-much always doing something. It spends virtually no uptime idling.
I have three monitors and a laptop, and not one of them is inaudible. They all become inaudible when I set their brightness to 100%, but that is not acceptable to me.
I was able to mostly fix my Dell U3011 by replacing power supply capacitors.
These were placed in the worst possible spot, bathing in hot ascending air, assuring a shortened lifespan. One could wonder if it was done intentionally...
Because of the 35W CPU, the system was cheap and easy to build.
What I have learned from it is that my screen makes a quiet buzzing sound... :)
There was also some ready solution like this, that used heat dissipation through the walls of the case: https://airtop-pc.com/airtop/natural-airflow-technology/
Someone suggested in the comments that I should just wear earplugs instead. This was my response to him: "But the rest of the world doesn’t make annoying noises that I want to block out — only the computer does. I still want to hear the birds tweeting in the trees and the wind blowing past the awnings and the rain hitting the roof and the ice-cream truck driving by and my wife chuckling at something she’s watching and the cats running along the floor and the microwave oven heating a meal and the kettle finishing its boil. I’m not trying to disconnect from the rest of the world — I’m trying to reconnect to it."
A lot of the most interesting and enjoyable sounds in life are very, very faint. It doesn't take much noise to drown them out. Even a single fan will do it. Now I can hear those noises and get 'work' done at the same time. It's really hard to describe, but 'magical' comes close.
Thanks for asking.
Of interest: JayzTwoCents did a clickbaity video where they daisy-chained 4 radiators together. This would be one of those entertainment-value-only stunts, except that they discovered they could shut off all the fans. With only a pump running, you can passively cool both a top-end CPU and a high-end GPU. If someone can put the pump inside a nice machined aluminum case, this could be a DIY recipe for a fanless silent PC.
However, it's usually significantly better, cheaper and easier to build an "almost" silent system, just by using good heatsinks and very slow fans. A small amount of airflow is usually significantly better than none at all.
Possibly caused by the piezoelectric effect on an SMD ceramic capacitor, making it vibrate and hit the PCB surface at an audible frequency.
Now all I can hear is everyone else's PCs whining around me (and they're not even that close!).
The thing is not perfect; the case can get quite hot, especially in summer, but it works reasonably well.
I use the thing as a media center (xbmc/kodi), and it's quite nice to have a fanless/zero noise setup for this usage.
It overheated once (Linux/Ubuntu suspended automatically), my room was also 28°C or something and I'd been encoding videos non-stop for hours.
For those that like the looks but want moar powerrr, Streacom have an actively-cooled version on the drawing board. Do an image search for "Streacom DA4" if you're interested.
Overclocking, fanless PCs, etc. are all fine if you can make sure there is no problem when the ambient temperature rises to 40°C, or even higher if the machine is accidentally exposed to sunlight. I don't trust most existing fanless solutions to work reliably under such conditions.
I started doing this because quiet computers just weren't available at decent horsepower. Rad to see a build like this!
I'd also like to present the loudest computer I've ever seen as a counterexample.
Things without moving parts emit sound. LED lights are a simple example. Anytime you have a wave, be it mechanical or electrical, some of the harmonics will 'bleed' into the acoustic spectrum.
The firmware hides the read-speed decrease, or prolongs the detection of it, by shuffling data around in the background.
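One crude way to check for that kind of old-data slowdown is to time sequential reads of files of different ages and compare. This is a sketch, not a rigorous benchmark; the big caveat, noted in the comments, is the OS page cache:

```python
import os
import time

def read_speed_mb_s(path, block_size=4 * 1024 * 1024):
    """Sequentially read `path` and return throughput in MB/s.

    Caveat: on a re-read the OS page cache will dominate the numbers;
    drop caches first (e.g. /proc/sys/vm/drop_caches on Linux) or use
    files much larger than RAM for the result to reflect the SSD itself."""
    size = os.path.getsize(path)
    start = time.perf_counter()
    with open(path, "rb") as f:
        # Read in fixed-size chunks until EOF.
        while f.read(block_size):
            pass
    elapsed = time.perf_counter() - start
    return (size / (1024 * 1024)) / elapsed

# Hypothetical usage: compare an old, untouched file against a fresh copy.
# print(read_speed_mb_s("/data/old_archive.bin"))
# print(read_speed_mb_s("/data/fresh_copy.bin"))
```

If old files consistently read much slower than a freshly written copy of the same data, the drive's background refresh isn't keeping up.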
Poor choice of SSDs; I'd go with the SanDisk Extreme Pro or any other version, even Intel.
This was before solid state storage options were affordable.
It was a Pentium 1 (can't remember the model) and I had disconnected the PSU fan. Ran for a long time and I used it mostly as a router.
The new 600W: https://seasonic.com/prime-titanium-fanless
I'm actually on the lookout for a wired, backlit, white/aluminium, chiclet, TKL keyboard. Basically something like the Apple Keyboard but sans NumKeypad, backlit and keyed for Linux/PC. Bonus points if it has a column of macro keys on the left.
If anyone knows of one please drop me a line.
This is a somewhat deep rabbit hole, but what I will tell you is that you'll probably have a hard time finding what you want in a chiclet keyboard because there's not much of a market for chiclet keyboards among keyboard enthusiasts. Chiclet is both a switching mechanism and a keycap; most of the keyboard-enthusiast community is working with setups that involve separate switches and keycaps that are interchangeable.
What you probably want in practice is the following characteristics: a specific key layout, all of the keys should be the same height, the key travel should be low (i.e., the amount of distance you have to depress the keys before registering a keystroke). No idea if you want the keys to make noise or not, or how stiff you want the springs to be, but that's customizable too. If you want chiclet keys because you want low key travel, you have a lot of options. If you want something really thin, there are way, way fewer options.
A single left column of macro keys is not something I've ever seen, but the one keyboard I can think off of hand that has a similar layout is the Red Scarf II, which has two columns of macro keys on the left side; unfortunately it's not currently in production. Some people use an external num pad and put the num pad on the left for this purpose. https://www.massdrop.com/buy/red-scarf-ii-ver-b-custom-mecha...
I personally use a KBD75, which is an aluminum body, tenkeyless layout with a right-hand column, and it's all programmable with QMK so you can make any key do whatever you want. For example, I have a key that, when I press it, reverses the position of the Alt and Win keys, so that I can switch between layouts for either PC or Mac and have the Alt, Command, and Windows keys always in their correct location. Images and build information here: https://imgur.com/a/5pSva2A
you can get keycaps that have a flat profile that are interchangeable with any MX-Compatible switch. DSA Granite is a super popular flat-profile keyset like that: https://pimpmykeyboard.com/dsa-granite-keyset/
it's pretty common to add rubber o-rings to the keys to reduce the travel. But chiclet keys are generally going to restrict the rest of your options.
Anyway reddit.com/r/mechanicalkeyboards has a bunch of info on all this stuff.
Although my highest priority (by far) is low noise, I do prefer short travel/registration and a soft landing as well.
Prompted by your post, I did some more research and decided to upgrade(?) my gaming keyboard to a Logitech G Pro TKL https://www.logitechg.com/en-us/product/pro-gaming-keyboard. I haven't tried Romer-G switches so it will be interesting to see how they turn out.
Meanwhile, now that Corsair's exclusive rights to the Cherry MX Silent (Red) have expired, I'm hoping to see a lot more keyboards with that switch make their way into the market. The Corsair Strafe RGB was tempting, but the light spill from under the caps was far too gaudy and distracting for my liking.
Cherry MX Silent (Red) switches with landing pads and o-rings seems like a combination that would work well (for me) for gaming.
For the DB4 and daily driver usage, though, the hunt goes on. I'll keep using the wired Apple Keyboard until someone else comes up with a TKL clone.
To make up for it, I'm using a mechanical Razer Blackwidow. It's so loud you can get distracted by the sound of your own typing :)
Semi-passive power supply, passive GTX1050 and a noctua CPU cooler that is just barely audible when the room is silent.
The coil whine is irritating though.
I think the iMac Pro is much better in this regard but it's not sustainable as a workstation machine.
It is a problem with some motherboards, and one way to get rid of it is to use another audio chip (for example, a small USB device that acts as a sound card).
Personally, I use an Audioquest Dragonfly USB DAC/headphone amp, which gives me the option to use either headphones (of my choosing) or speakers. (And unlike a larger "breakout box" USB sound card, it's quite portable, being about the same size and shape as USB thumb drives used to be.) It's a good fit for me, but if you need mic input, there are other devices - or you can still use the mic in of your regular sound card/integrated audio.
On one particular susceptible computer I did a bit of testing, and scrolling white background pages made the noise, but black background ones didn't! Scrolling a completely white page also made the noise. And it depended on app, some apps were immune, it seemed to only happen on GPU-accelerated apps, like browsers.
I don't think I've heard scrolling noise since I stopped using analog video signals.
This was back in the CRT days, so I'm not sure if it's still an issue today, but it took me forever to figure out what was going on and drove me crazy when I noticed it.
That being said, within a year of using Windows 10 four (!) of my hard drives have begun making loud screeching sounds. They're technically fine, but still unusable.
It's really pretty nice in theory. The reality is that some components still emit a good bit of noise, and unlike fans, the noise they tend to emit isn't very pleasant. Power-hungry graphics cards are by far some of the worst offenders; without being masked by fan noise, they can be incredibly loud and irritating. An open bench PC is definitely cool, but was kind of a mistake in my case.
Fan tech has improved to the point that I think, more often than not, you really want to have them. Otherwise it's time spent trying to eliminate other noise sources that would otherwise be conveniently and usually pleasantly masked by a little wooshing.