Home servers are greener than cloud (blog.freedombone.net)
bloody-crow 1617 days ago [-]
> I didn't have any quantitative estimates then, and still don't now. However, it's likely that a world in which there is one server per household or per street would be more electrically efficient than the current world of billionaire cloud servers.

This is the core of the article, and it's basically guesswork. From everything I've read, I tend to believe the opposite, actually: cloud is a lot greener than home servers.

antisthenes 1617 days ago [-]
Even just applying the basic principles of efficiencies of scale, it's almost guaranteed to be true that cloud is greener.

Utility-scale power, utility-scale chip production, exabyte-scale storage racks, generally more efficient chips (Xeons vs. desktop models), more efficient server PSUs.

Not to mention less overhead from things like deliveries. You only need one truck to deliver the hardware to a data center, whereas you might need 30, 50, or 100 to deliver computers to individual households.

> But given that a home server can run on 10W of electrical power, and potentially off of a solar panel I found this unpersuasive. I didn't have any quantitative estimates then, and still don't now.

This is laughable. It's not an argument for individual servers in households; if anything, it's an argument for utility-scale solar and for software efficient enough to run on 10W CPUs.

LargoLasskhyfv 1616 days ago [-]
And all that for running utility-scale software bloat somewhere else, requiring networking infrastructure because the data isn't where it belongs. Talk about gridlock, and hiccups because some backhoe or fat-fingered admin tested the reliability of the redundancy. As can be seen again and again when all sorts of services degrade or fail globally because some cloud experienced a lightning strike, instead of just a single household, block, or city.

Yah, sure!

verdverm 1617 days ago [-]
If you use Google Cloud, they run on 100% renewables.
makerofspoons 1617 days ago [-]
AWS exceeded 50% renewable energy usage for 2018. My house is powered by gas.

"I didn't have any quantitative estimates then, and still don't now. However, it's likely that a world in which there is one server per household or per street would be more electrically efficient than the current world of billionaire cloud servers."

My neighbors are powered by gas too. The only way this would work is if we all also bought solar, as the author suggests; however, encouraging your average homeowner to not only run a home server but also invest in solar for it is a non-starter.

sp332 1617 days ago [-]
The argument is a little shaky, but I can see it. In a datacenter, heat is just wasted energy that requires more energy to cool. Running a server is not the most efficient way to heat a home, but at least the heat isn't just waste.

Also, I suspect that a home server could just sleep the CPU a fair percentage of the time. On top of that, you don't need to run a billing system that charges different users for various things.
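
As a rough back-of-envelope of what duty cycling might buy, assuming (purely for illustration) 3W idle, 15W under load, and the CPU busy 10% of the time:

    # Hypothetical figures, not measurements: average draw of a
    # home server that sleeps most of the time.
    idle_w, busy_w, duty_cycle = 3.0, 15.0, 0.10

    avg_w = idle_w * (1 - duty_cycle) + busy_w * duty_cycle
    kwh_per_year = avg_w * 24 * 365 / 1000

    print(f"average draw: {avg_w:.1f} W")              # ~4.2 W
    print(f"energy used:  {kwh_per_year:.0f} kWh/yr")  # ~37 kWh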

manacit 1617 days ago [-]
> But given that a home server can run on 10W of electrical power, and potentially off of a solar panel I found this unpersuasive. I didn't have any quantitative estimates then, and still don't now. However, it's likely that a world in which there is one server per household or per street would be more electrically efficient than the current world of billionaire cloud servers.

Sorry, but I don't buy it. DCs are frequently powered by solar (https://www.google.com/about/datacenters/renewable/, https://sustainability.fb.com/innovation-for-our-world/susta...), and this article gives no actual evidence other than a home server being able to run on 10W. Solar and wind projects are successful in part because of economies of scale. While local-scale grids may be great, using them to power the services that run on cloud infrastructure today doesn't seem like a wonderful application.

I have no doubt that the aggregate sum of all of my cloud usage across Google, FB, Amazon, etc. amounts to more than 10W, but if you summed up all of the different pieces of those services that I use, I doubt you'd ever be able to scale them all down to something that could run at home, forget doing it on 10W. There may be a small sliver of that which is possible (e.g. email), but the fact of the matter is that it's almost irrelevant.

Coincidentally, the author of this post maintains a project that develops a home server system, which means there's plenty of vested interest in pushing this non-analysis.

moron4hire 1617 days ago [-]
That said, I'd love a "solar-powered Raspberry Pi in a box" product to just buy and set up in my backyard somewhere for my personal website.
sliken 1617 days ago [-]
Largely ignored by this article is the fact that everyone needs a router. NASs are far from rare, and storage on the consumer side of a relatively slow internet connection has real advantages.

So it's not really a 10 watt local server vs. a share of a cloud. It should be something more like 0.5 watts to upgrade your home router into something that could provide internet services (rough numbers sketched below). Likely just double the RAM and provide a microSD card for storage, and that would be all you need. A 256GB microSD card, even a fast one, is around $50.00.

With a bit of smarts, like a CDN or even IPFS, home servers could collectively make quite a bit of sense with close to zero power overhead.
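
To put that router-upgrade point in perspective, here's a sketch with assumed figures (a standalone 10W box vs. an extra 0.5W on a router that's powered on anyway):

    # Assumed figures, not measurements: a dedicated 10 W home server
    # vs. ~0.5 W of extra load on a router that is already running.
    hours_per_year = 24 * 365

    dedicated_kwh = 10.0 * hours_per_year / 1000    # ~88 kWh/yr
    incremental_kwh = 0.5 * hours_per_year / 1000   # ~4.4 kWh/yr

    print(f"dedicated server: {dedicated_kwh:.0f} kWh/yr")
    print(f"router upgrade:   {incremental_kwh:.1f} kWh/yr")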

oakwhiz 1617 days ago [-]
I think I disagree somewhat with the premise of this article. Computers are more efficient in general when they are fully loaded. A home server is rarely fully utilized and doesn't realize the full potential of multitasking/virtualization. It is more efficient to share different tasks among the same set of disks and memory. On the cloud, anything you didn't use could theoretically be given to someone else.
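
As a toy illustration of that utilization point, with assumed idle/peak power figures rather than measured ones:

    # Assumed power curve: a machine drawing 100 W idle and 200 W at
    # full load. Energy per unit of work drops as utilization rises,
    # which favors heavily shared machines.
    def joules_per_work_unit(utilization, idle_w=100.0, max_w=200.0):
        watts = idle_w + (max_w - idle_w) * utilization
        work_units_per_second = utilization  # normalize: 1 unit/s at 100%
        return watts / work_units_per_second

    print(joules_per_work_unit(0.10))  # 1100 J/unit (mostly idle)
    print(joules_per_work_unit(0.70))  # ~243 J/unit (well shared)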

The hardware lifecycle should also be considered. Although home environments often recycle equipment, cloud environments are in a better position to get more bang per watt, and can upgrade their SAN storage capacity easily to handle changes in demand.

I suppose there is a lot of inefficiency in hitting all those switches and routers on the way to a cloud's network. However, that's shared with other users as well.

sp332 1617 days ago [-]
I think the hardware lifecycle heavily favors the home server. This is reflected in the much higher cost of cloud computing vs owning the hardware. Cloud components have to be standardized and are swapped out en masse every few years.
Yetanfou 1617 days ago [-]
The price of cloud computing is higher than that of owning and running the hardware, but whether that implies the cost of running a commercial cloud is higher than that of running individual servers on customer premises is doubtful. I assume that the likes of Amazon, Microsoft, IBM and Oracle do not run commercial cloud operations at a loss.
ekianjo 1617 days ago [-]
So the proof that it is greener than cloud is where?
nwah1 1617 days ago [-]
The article claims it is a gut feeling. Surprised this post made it here.
tryptophan 1617 days ago [-]
Fun fact: CPUs/GPUs/etc. are almost 100% efficient at converting electricity to heat, as they don't give off light energy the way a heater with glowing coils does.

So, theoretically, heating your house with CPUs would be greener than using central heating...

If you live in a cold area, a home server's heat wouldn't be wasted; in fact, it would theoretically lower your heating bill (by 0.01% or less, but hey!) as less electricity would be used on the less efficient heater.
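
For scale, a rough sketch using the article's 10W figure and an assumed electricity price (and it only counts during the heating season, in a home heated by electric resistance):

    # A 10 W server treated as a resistive heater: essentially all of
    # its draw ends up as heat in the room it sits in.
    server_w = 10.0
    kwh_per_year = server_w * 24 * 365 / 1000        # ~88 kWh

    price_per_kwh = 0.15                             # assumed $/kWh
    print(f"heat delivered: {kwh_per_year:.0f} kWh/yr")
    print(f"heating value:  ${kwh_per_year * price_per_kwh:.0f}/yr")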

tzs 1617 days ago [-]
But wait...what happens to that small amount of energy that a heating coil gives off as light? I'd expect that most of those photons end up getting absorbed by something somewhere else in the room, and ultimately the energy still ends up as heat.
LargoLasskhyfv 1616 days ago [-]
There is a French company that does this, in a few variations:

[1] https://www.qarnot.com/

leifmetcalf 1617 days ago [-]
Heat pumps are more than 100% efficient
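Rough numbers, assuming a typical COP of around 3: the same electricity moves roughly three times as much heat through a heat pump as a server gives off as a resistor, so server heat is a comparatively poor use of those watts.

    # Assumed coefficient of performance (COP) for a heat pump.
    cop = 3.0
    electricity_w = 10.0

    heat_from_server_w = electricity_w           # resistive: 10 W of heat
    heat_from_heat_pump_w = electricity_w * cop  # ~30 W of heat moved indoors

    print(heat_from_heat_pump_w / heat_from_server_w)  # ~3x
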
moron4hire 1617 days ago [-]
Centralization is (always?) less efficient per unit of work than decentralization, because centralization requires a hierarchy of management overhead to work. Centralization makes things other than the unit of work more efficient, things like aggregate tracking. Those things are typically only of concern to the organization doing the management and have almost nothing to do with the goals of the party for whom the work is being done.

Take, for example, developing a website. Maybe it's a site for a small municipality to post community updates. This work could be done by a small, local consulting group or a gigantic, multi-national one. All else being equal, the cost of the project will necessarily be higher with the large consultancy than with the small.

And experience bears this out. I've been in both large and small consultancies, and the only difference was how many layers of management were on top of the decision process. There is no "economy of scale" for developing to a customer's needs. And there is nothing in the management overhead of the giganto-corp that improves the project for the customer. All that overhead has to get paid for in some way, so it necessarily leads to higher project costs (which probably means massive budget overruns, because the big corp probably priced the project at or under the local corp in order to close the deal).

Large, centralized systems become something like hedges for the management org. They lose money on a long tail of small projects only to make it back with more on a few large winners that can be milked. We look at the bottom line and say "everything is up! It must be good!", but we don't stop to look at the individual failures. Whereas if it were all decentralized, we wouldn't have a metric to look at to determine what direction the aggregate is going in, but that long tail would probably be served a lot better.
